Bits & Bytes

Bits or Bytes?

Whether written as Bits & Bytes, Bits n Bytes, or Bits and Bytes, the topic starts with the bit: the basic unit of information in computing and digital communications.

A bit can have only one of two values, and may therefore be physically implemented with a two-state device.

These values are most commonly represented as either a 0 or a 1.

The term bit is a portmanteau of binary digit.

The two values can also be interpreted as logical values (true/false, yes/no), algebraic signs (+/−), activation states (on/off), or any other two-valued attribute.
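As an illustrative sketch of these two-valued interpretations (the dictionary, convention names, and function below are invented for this example):

```python
# Illustrative only: the same bit value can be read under several
# two-valued conventions.
interpretations = {
    "logical":    {0: "false", 1: "true"},
    "answer":     {0: "no",    1: "yes"},
    "sign":       {0: "+",     1: "-"},
    "activation": {0: "off",   1: "on"},
}

def read_bit(bit, convention):
    """Map a bit (0 or 1) to its meaning under a given convention."""
    return interpretations[convention][bit]

print(read_bit(1, "logical"))     # the bit 1 read as a truth value -> "true"
print(read_bit(0, "activation"))  # the bit 0 read as an activation state -> "off"
```

The point of the sketch is the one made in the text: nothing about the bit itself fixes its meaning; the mapping is pure convention.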

The correspondence between these values and the physical states of the underlying storage or device is a matter of convention, and different assignments may be used even within the same device or program.

The length of a binary number may be referred to as its bit-length.
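In Python, for instance, integers expose their bit-length directly:

```python
n = 42                 # binary 101010
print(bin(n))          # '0b101010'
print(n.bit_length())  # 6 -- six binary digits are needed to represent 42
```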

In information theory, one bit is typically defined as the uncertainty of a binary random variable that is 0 or 1 with equal probability, or the information that is gained when the value of such a variable becomes known.
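This definition can be checked numerically with the binary entropy function, H(p) = -p log2 p - (1-p) log2(1-p), which gives exactly one bit when the two outcomes are equally likely (the function name below is ours):

```python
import math

def binary_entropy(p):
    """Shannon entropy, in bits, of a binary variable that is 1 with probability p."""
    if p in (0.0, 1.0):
        return 0.0  # no uncertainty, so no information is gained
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

print(binary_entropy(0.5))  # 1.0 -- a fair coin carries exactly one bit
print(binary_entropy(0.9))  # less than 1 bit: the outcome is partly predictable
```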

In quantum computing, a quantum bit or qubit is a quantum system that can exist in a superposition of the two bit values, 0 and 1.

Source: Wikipedia

