Convert from Bit to Byte
Unit Definition (Bit)

The bit is the basic unit of digital data. Each bit records one of two possible answers to a single question: "0" or "1," "yes" or "no," "on" or "off." When data is represented as binary (base-2) numbers, each binary digit is a single bit. In fact, the word "bit" was coined by the American statistician and computer scientist John Tukey (1915–2000) in 1946 as a contraction of "binary digit." Somewhat more generally, the bit is used as a logarithmic unit of data storage capacity, equal to the base-2 logarithm of the number of possible states of the storage device or location. For example, if a storage location holds one letter of the 26-letter English alphabet, it has 26 possible states, and its storage capacity is log₂ 26 ≈ 4.7004 bits.

Unit Definition (Byte)

The byte is a unit of digital information used in computer engineering. Technically the byte is the unit of addressable memory, and its size can vary depending on the machine or the programming language. In most contexts, however, the byte is equal to 8 bits (one octet), which means a byte has 2⁸ = 256 possible states. The unit was named by IBM engineer Werner Buchholz in 1956, and the 8-bit size was popularized shortly thereafter by IBM's System/360, a top-selling early mainframe computer. The spelling "byte" is used instead of "bite" to avoid confusion with "bit."
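As a quick worked example of these definitions, the short Python sketch below reproduces the figures quoted above: the capacity of a location in bits as the base-2 logarithm of its number of states, and the conversion from bits to 8-bit bytes. The function names capacity_in_bits and bits_to_bytes are ours, chosen for illustration only.

    import math

    def capacity_in_bits(num_states: int) -> float:
        # Storage capacity = base-2 logarithm of the number of possible states.
        return math.log2(num_states)

    def bits_to_bytes(bits: float) -> float:
        # Convert bits to bytes, assuming the common 8-bit byte (octet).
        return bits / 8

    # One letter of a 26-letter alphabet: log2(26) ≈ 4.7004 bits.
    print(f"one letter: {capacity_in_bits(26):.4f} bits")

    # One 8-bit byte distinguishes 2**8 = 256 states, i.e. log2(256) = 8 bits.
    print(f"one byte:   {capacity_in_bits(256):.0f} bits")

    # Converting a bit count to bytes: 1024 bits = 128 bytes.
    print(f"1024 bits = {bits_to_bytes(1024):.0f} bytes")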