Saturday, June 18, 2011

bit

A bit (a contraction of binary digit) is the basic unit of information in computing and telecommunications; it is the amount of information stored by a digital device or other physical system that exists in one of two possible distinct states. These may be the two stable states of a flip-flop, two positions of an electrical switch, two distinct voltage or current levels allowed by a circuit, two distinct levels of light intensity, two directions of magnetization or polarization, etc.
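However a bit is stored physically, software usually sees bits packed into larger units such as bytes. As an illustrative sketch (the helper name `bits` is my own, not from any standard library), the individual bits of a value can be read out with shifts and masks:

```python
def bits(value, width=8):
    # Return the bits of `value` as a list, most significant bit first,
    # by shifting each position down and masking off everything but bit 0.
    return [(value >> i) & 1 for i in reversed(range(width))]

print(bits(0b1011, 4))   # the four bits of binary 1011
print(bits(0xA5, 8))     # the eight bits of the byte 0xA5
```

Each element of the result is itself a bit: a quantity that is either 0 or 1.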
In computing, a bit can also be defined as a variable or computed quantity that can have only two possible values. These two values are often interpreted as binary digits and are usually denoted by the numerical digits 0 and 1. The two values can also be interpreted as logical values (true/false, yes/no), algebraic signs (+/−), activation states (on/off), or any other two-valued attribute. In several popular programming languages, numeric 0 is equivalent (or convertible) to logical false, and 1 to true. The correspondence between these values and the physical states of the underlying storage or device is a matter of convention, and different assignments may be used even within the same device or program.
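Python is one such language where the numeric and logical interpretations coincide: its `bool` type is a subclass of `int`, so `True` and `False` behave as the numbers 1 and 0. A minimal sketch:

```python
# In Python, bool is a subclass of int: True == 1 and False == 0.
print(int(True), int(False))    # converting logical values to 0/1
print(bool(1), bool(0))         # converting 0/1 back to logical values

# The equivalence means bits can be counted with ordinary arithmetic:
flags = [True, False, True, True]
set_bits = sum(flags)           # True counts as 1, False as 0
print(set_bits)
```

Not every language works this way; in Java, for example, `boolean` and `int` are distinct types and require an explicit conditional to convert, which is exactly the "matter of convention" the paragraph describes.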
In information theory, one bit is typically defined as the uncertainty of a binary random variable that is 0 or 1 with equal probability,[1] or the information that is gained when the value of such a variable becomes known.[2]
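This information-theoretic definition can be checked numerically. The Shannon entropy of a binary variable that is 1 with probability p is H(p) = −p·log₂(p) − (1−p)·log₂(1−p), measured in bits; it reaches exactly 1 bit when the two outcomes are equally likely. A small sketch (the function name `entropy_bits` is my own):

```python
import math

def entropy_bits(p):
    # Shannon entropy, in bits, of a binary random variable
    # that takes one value with probability p and the other with 1 - p.
    if p in (0.0, 1.0):
        return 0.0  # a certain outcome carries no information
    return -(p * math.log2(p) + (1 - p) * math.log2(1 - p))

print(entropy_bits(0.5))   # a fair coin: exactly one bit of uncertainty
print(entropy_bits(0.9))   # a biased coin carries less than one bit
```

A maximally biased variable (p = 0 or p = 1) carries no information at all, which is why learning the value of a fair binary variable is the canonical "one bit".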
In quantum computing, a quantum bit or qubit is a quantum system that can exist in superposition of two bit values, "true" and "false".
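The state of a single qubit is conventionally described by two complex amplitudes over the basis states, with the squared magnitudes giving the measurement probabilities and summing to 1. A minimal sketch of that bookkeeping (plain Python, not any particular quantum library):

```python
import math

# A qubit state as a pair of complex amplitudes (alpha, beta) over the
# basis states |0> and |1>. Normalization requires |alpha|^2 + |beta|^2 == 1.
# Here: an equal superposition, analogous to a fair coin before measurement.
alpha = complex(1 / math.sqrt(2), 0)
beta = complex(1 / math.sqrt(2), 0)

p0 = abs(alpha) ** 2   # probability of measuring the bit value 0
p1 = abs(beta) ** 2    # probability of measuring the bit value 1

print(p0, p1)                       # each outcome equally likely
print(math.isclose(p0 + p1, 1.0))   # probabilities sum to one
```

Unlike a classical bit, the qubit holds both amplitudes at once; measurement collapses it to a single classical bit value.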
The symbol for the bit, as a unit of information, is either simply "bit" (recommended by the ISO/IEC 80000-13:2008 standard) or the lowercase letter "b" (recommended by the IEEE 1541-2002 standard).