Saturday, 13 August 2011

computer network and communications

INTRODUCTION TO COMPUTER NETWORKS AND COMMUNICATIONS

COMPUTER NETWORK

A computer network is a system of interconnected computers and peripheral devices. For example, it may connect computers, printers, scanners and cameras. Using hardware and software, these interconnected computing devices can communicate with each other through defined rules of data communications. In a network, computers can exchange and share information and resources.
A computer network may operate on wired connections or wireless connections. When two or more networks are linked or connected and are able to communicate with one another using suitable hardware and software, it is called an internetwork.
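To make the idea of computers exchanging information concrete, here is a minimal Python sketch of two programs talking over a network connection. It uses the standard `socket` library on the local loopback address; the message text is just an example, and the operating system picks a free port.

```python
import socket
import threading

# Server side: listen on the loopback interface; port 0 asks the
# operating system to pick any free port.
server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server.bind(("127.0.0.1", 0))
server.listen(1)
port = server.getsockname()[1]

received = []

def accept_one():
    conn, _ = server.accept()
    received.append(conn.recv(1024))   # read up to 1024 bytes
    conn.close()

t = threading.Thread(target=accept_one)
t.start()

# Client side: connect and send a message as raw bytes.
client = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
client.connect(("127.0.0.1", port))
client.sendall(b"hello, network")
client.close()

t.join()
server.close()
print(received[0].decode())   # hello, network
```

The same pattern scales up: real networks simply run many such conversations at once, following agreed rules (protocols) for how the bytes are structured.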

COMMUNICATIONS

Communications is about the transfer of information from a sender, across a distance, to a receiver. Using electricity, radio waves or light, information and data in the form of codes are transmitted through a physical medium such as wire, cable, or even the atmosphere.
Therefore, in order to make communications possible from computers, across telephones and radios and back to computers and other digital devices again, there must be a signal translator, which we call a modem. The modem, which is short for modulator-demodulator, converts digital signals into analog signals and back again into digital signals so that information can move across the telephone line.
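The modulation idea can be illustrated with a toy Python sketch (not a real modem): each bit becomes a short analog tone at one of two frequencies, and the receiver recovers the bit by estimating the tone's frequency from zero crossings. The frequencies, sample rate, and samples-per-bit below are arbitrary choices for this illustration.

```python
import math

SAMPLE_RATE = 8000            # samples per second (arbitrary choice)
FREQ_FOR_BIT = {0: 1200, 1: 2200}   # tone frequency per bit value
SAMPLES_PER_BIT = 80          # 10 ms of signal per bit

def modulate(bits):
    """Digital -> analog: turn each bit into a short sine tone."""
    samples = []
    for bit in bits:
        f = FREQ_FOR_BIT[bit]
        for n in range(SAMPLES_PER_BIT):
            samples.append(math.sin(2 * math.pi * f * n / SAMPLE_RATE))
    return samples

def demodulate(samples):
    """Analog -> digital: count zero crossings to estimate each tone's frequency."""
    bits = []
    for i in range(0, len(samples), SAMPLES_PER_BIT):
        chunk = samples[i:i + SAMPLES_PER_BIT]
        crossings = sum(
            1 for a, b in zip(chunk, chunk[1:]) if a < 0 <= b or b < 0 <= a
        )
        freq = crossings * SAMPLE_RATE / (2 * len(chunk))
        bits.append(0 if abs(freq - 1200) < abs(freq - 2200) else 1)
    return bits

message = [1, 0, 1, 1, 0, 0, 1]
print(demodulate(modulate(message)) == message)  # True
```

A real modem adds timing recovery, error handling, and far denser encodings, but the round trip digital → analog → digital is the same idea.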

Friday, 5 August 2011

ramadhan...

The month of Ramadhan is the chief of all months. Ramadhan is also a month full of blessings, in which all Muslims perform as many acts of worship as possible. Lailatulqadar is the night awaited by all Muslims who worship during the last third of the night. On the night Lailatulqadar descends, the prayers of Muslims who worship with full devotion, humility and diligence will be granted by Allah S.W.T., but we cannot know on which night it will come. Therefore, let us all together do good deeds in this noble month. Insyaallah our sins will be forgiven by Allah, and our lives blessed.

Tuesday, 3 May 2011

byte


The byte (/ˈbaɪt/) is a unit of digital information in computing and telecommunications that most commonly consists of eight bits. Historically, a byte was the number of bits used to encode a single character of text in a computer,[1][2] and for this reason it is the basic addressable element in many computer architectures.
The size of the byte has historically been hardware dependent, and no definitive standards exist that mandate the size. The de facto standard of eight bits is a convenient power of two permitting the values 0 through 255 for one byte. Many types of applications use variables representable in eight or fewer bits, and processor designers optimize for this common usage. The popularity of major commercial computing architectures has aided in the ubiquitous acceptance of the 8-bit size. The term octet was defined to explicitly denote a sequence of 8 bits because of the ambiguity associated with the term byte.
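A short Python sketch makes the 8-bit range concrete: eight bits give 2^8 = 256 distinct values, and Python's built-in `bytes` type enforces exactly that range.

```python
# Eight bits give 2**8 = 256 possible values, 0 through 255.
print(2 ** 8)                      # 256

# In Python, a bytes object is a sequence of such 8-bit values.
data = bytes([0, 127, 255])
print(list(data))                  # [0, 127, 255]

# Each value fits in exactly eight binary digits.
print(format(255, '08b'))          # 11111111

# Values outside 0-255 are rejected, reflecting the 8-bit limit.
try:
    bytes([256])
except ValueError:
    print("out of range")
```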

Sunday, 17 April 2011

Introduction Of Binary Coding

What Is ASCII
The American Standard Code for Information Interchange (ASCII, /ˈæski/) is a character-encoding scheme based on the ordering of the English alphabet. ASCII codes represent text in computers, communications equipment, and other devices that use text. Most modern character-encoding schemes are based on ASCII, though they support many more characters than ASCII did.
US-ASCII is the Internet Assigned Numbers Authority (IANA) preferred charset name for ASCII.
Historically, ASCII developed from telegraphic codes. Its first commercial use was as a seven-bit teleprinter code promoted by Bell data services. Work on ASCII formally began on October 6, 1960, with the first meeting of the American Standards Association's (ASA) X3.2 subcommittee. The first edition of the standard was published during 1963, a major revision during 1967, and the most recent update during 1986. Compared to earlier telegraph codes, the proposed Bell code and ASCII were both ordered for more convenient sorting (i.e., alphabetization) of lists, and added features for devices other than teleprinters.
ASCII includes definitions for 128 characters: 33 are non-printing control characters (now mostly obsolete) that affect how text and space are processed, 94 are printable characters, and the space is considered an invisible graphic. The most commonly used character encoding on the World Wide Web was US-ASCII until December 2007, when it was surpassed by UTF-8.
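The mapping between characters and ASCII codes is easy to explore in Python with the built-in `ord()` and `chr()` functions:

```python
# Each ASCII character maps to a code from 0 to 127.
print(ord('A'))              # 65
print(chr(97))               # a

# Upper- and lower-case letters differ by exactly 32 (one bit),
# which is part of why sorting and case conversion are convenient
# in ASCII's ordering.
print(ord('a') - ord('A'))   # 32

# ASCII text encodes to one byte per character.
print('ASCII'.encode('ascii'))        # b'ASCII'
print(list('Hi!'.encode('ascii')))    # [72, 105, 33]
```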

Character

In computer and machine-based telecommunications terminology, a character is a unit of information that roughly corresponds to a grapheme, grapheme-like unit, or symbol, such as in an alphabet or syllabary in the written form of a natural language.
Examples of characters include letters, numerical digits, and common punctuation marks (such as '.' or '-'). The concept also includes control characters, which do not correspond to symbols in a particular natural language, but rather to other bits of information used to process text in one or more languages. Examples of control characters include carriage return or tab, as well as instructions to printers or other devices that display or otherwise process text.
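Control characters can be seen directly in Python, where escape sequences such as `'\t'` and `'\r'` stand for the tab and carriage return mentioned above:

```python
# Control characters occupy the low ASCII codes; they carry
# processing instructions rather than printable symbols.
print(ord('\t'))    # 9   horizontal tab
print(ord('\n'))    # 10  line feed
print(ord('\r'))    # 13  carriage return

# An embedded tab is a single character: it renders as spacing,
# not as a visible glyph.
print('a\tb')
print(len('a\tb'))  # 3 characters, even though it prints wider
```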

Bit

A bit (contraction of binary digit) is the basic unit of information in computing and telecommunications; it is the amount of information stored by a digital device or other physical system that exists in one of two possible distinct states. These may be the two stable states of a flip-flop, two positions of an electrical switch, two distinct voltage or current levels allowed by a circuit, two distinct levels of light intensity, two directions of magnetization or polarization, etc.
In computing, a bit can also be defined as a variable or computed quantity that can have only two possible values. These two values are often interpreted as binary digits and are usually denoted by the Arabic numerical digits 0 and 1. Indeed, the term "bit" is a contraction of binary digit. The two values can also be interpreted as logical values (true/false, yes/no), algebraic signs (+/−), activation states (on/off), or any other two-valued attribute. In several popular programming languages, numeric 0 is equivalent (or convertible) to logical false, and 1 to true. The correspondence between these values and the physical states of the underlying storage or device is a matter of convention, and different assignments may be used even within the same device or program.
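Python is one of the languages where this 0/false and 1/true correspondence holds, as a small sketch shows:

```python
# 0 converts to False and 1 to True -- and back again.
print(bool(0), bool(1))        # False True
print(int(False), int(True))   # 0 1

# The same two-valued quantity can be read under different
# interpretations: binary digit, truth value, or on/off state.
bit = 1
print(bit == True)             # True: 1 and True compare equal
print("on" if bit else "off")  # on
```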
In information theory, one bit is typically defined as the uncertainty of a binary random variable that is 0 or 1 with equal probability,[1] or the information that is gained when the value of such a variable becomes known.[2]
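The information-theoretic definition can be made concrete with the standard binary entropy function H(p) = -p·log₂(p) - (1-p)·log₂(1-p), sketched here in Python:

```python
import math

def binary_entropy(p):
    """Entropy, in bits, of a variable that is 1 with probability p."""
    if p in (0.0, 1.0):
        return 0.0          # a certain outcome carries no uncertainty
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

print(binary_entropy(0.5))            # 1.0: a fair coin is worth one full bit
print(binary_entropy(1.0))            # 0.0: a known value carries no information
print(round(binary_entropy(0.9), 3))  # 0.469: a biased coin carries less than a bit
```

This matches the definition above: the uncertainty peaks at exactly one bit when the two values are equally probable.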

Tuesday, 1 March 2011

trojan horse

A Trojan horse is a destructive program that masquerades as a benign application. Trojan horses do not replicate themselves, but they can be just as destructive as viruses. A typical example is a program that claims to rid your computer of viruses but instead introduces viruses onto your computer.