Related units of information: Bit (binary) · Nat (base e) · Ban (decimal) · Qubit (quantum)
A bit or binary digit is the basic unit of information in computing and telecommunications; it is the amount of information that can be stored by a digital device or other physical system that exists in one of two possible distinct states. These may be the two stable positions of an electrical switch, two distinct voltage or current levels allowed by a circuit, two distinct levels of light intensity, two directions of magnetization or polarization, etc.
In computing, a bit can also be defined as a variable or computed quantity that can have only two possible values. These two values are often interpreted as binary digits and are usually denoted by the numerical digits 0 and 1. Indeed, the term "bit" is a contraction of binary digit. The two values can also be interpreted as logical values (true/false, yes/no), algebraic signs (+/−), activation states (on/off), or any other two-valued attribute. In several popular programming languages, numeric 0 is equivalent (or convertible) to logical false, and 1 to true. The correspondence between these values and the physical states of the underlying storage or device is a matter of convention, and different assignments may be used even within the same device or program.
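Python is one language where this 0/false and 1/true equivalence is built in (as a sketch; other languages convert more or less strictly, and some, like Java, do not convert at all):

```python
# In Python, bool is a subclass of int, so the correspondence is exact.
assert bool(0) is False and bool(1) is True
assert int(False) == 0 and int(True) == 1

# The same stored bit can carry different interpretations by convention:
bit = 1
as_logic = bool(bit)                # logical value: True
as_sign = +1 if bit else -1         # algebraic sign: +1
as_state = "on" if bit else "off"   # activation state: "on"
print(as_logic, as_sign, as_state)  # prints: True 1 on
```

The point of the last three lines is that the interpretation is a convention layered on top of the stored value, not a property of the bit itself.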
In information theory, one bit is typically defined as the uncertainty of a binary random variable that is 0 or 1 with equal probability,^{[1]} or the information that is gained when the value of such a variable becomes known.^{[2]}
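This definition can be made concrete with the binary entropy function H(p) = −p log₂ p − (1 − p) log₂(1 − p), which gives the uncertainty, in bits, of a binary variable that is 1 with probability p. A minimal sketch (the function name `binary_entropy` is illustrative):

```python
import math

def binary_entropy(p):
    """Shannon entropy, in bits, of a binary random variable that is
    1 with probability p and 0 with probability 1 - p."""
    if p in (0.0, 1.0):
        return 0.0  # outcome is certain: no uncertainty, no information
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

print(binary_entropy(0.5))  # a fair coin carries exactly 1 bit
print(binary_entropy(0.9))  # a biased coin carries less than 1 bit
```

The maximum of 1 bit is reached only at p = 0.5, which is why the definition singles out the equiprobable case.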
In quantum computing, a quantum bit or qubit is a quantum system that can exist in a superposition of the two bit values, conventionally labeled 0 and 1.
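The measurement statistics of such a superposition can be sketched with a toy numerical model (an illustration only, not a real quantum simulation): a qubit state is a pair of complex amplitudes (a, b) with |a|² + |b|² = 1, and measuring it yields 0 with probability |a|² and 1 with probability |b|².

```python
import math
import random

def measure(a, b):
    """Simulate measuring a qubit with amplitude a for outcome 0
    and amplitude b for outcome 1 (toy model)."""
    # Amplitudes must satisfy the normalization condition.
    assert abs(abs(a) ** 2 + abs(b) ** 2 - 1) < 1e-9
    return 0 if random.random() < abs(a) ** 2 else 1

# Equal superposition: both outcomes equally likely, like a fair bit.
a = b = 1 / math.sqrt(2)
samples = [measure(a, b) for _ in range(10_000)]
print(sum(samples) / len(samples))  # fraction of 1s, close to 0.5
```

Unlike a classical bit, the qubit's state before measurement is the amplitude pair itself; the measurement collapses it to a single classical bit value.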
The symbol for bit, as a unit of information, is either simply "bit" (recommended by the ISO/IEC standard 80000-13 (2008)) or lowercase "b" (recommended by the IEEE 1541-2002 standard).
