Convert from Bit to Terabyte

Unit Definition (Bit)
The bit is the basic unit of the amount of data. Each bit records one of two possible answers to a single question: "0" or "1," "yes" or "no," "on" or "off." When data is represented as binary (base-2) numbers, each binary digit is a single bit. In fact, the word "bit" was coined in 1946 by the American statistician and computer scientist John Tukey (b. 1915) as a contraction of "binary digit." Somewhat more generally, the bit is used as a logarithmic unit of data storage capacity, equal to the base-2 logarithm of the number of possible states of the storage device or location. For example, if a storage location stores one letter of a 26-letter alphabet, then it has 26 possible states, and its storage capacity is log₂ 26 ≈ 4.7004 bits.
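To illustrate the logarithmic definition, here is a minimal Python sketch; the function name capacity_in_bits is an illustrative choice, not part of any standard library.

```python
import math

def capacity_in_bits(states: int) -> float:
    """Storage capacity, in bits, of a location that can hold any one
    of `states` distinct values: capacity = log2(states)."""
    return math.log2(states)

# A location holding one letter of a 26-letter alphabet has 26 states:
print(f"{capacity_in_bits(26):.4f} bits")   # 4.7004 bits

# A binary digit has exactly two states, so it carries exactly one bit:
print(f"{capacity_in_bits(2):.0f} bit")     # 1 bit
```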

Unit Definition (Terabyte)
A terabyte (derived from the prefix tera- and commonly abbreviated TB) is a measurement term for data storage capacity. The value of a terabyte based upon a decimal radix (base 10) is defined as one trillion (short scale) bytes, i.e. 10¹² bytes, or 1000 gigabytes. The number of bytes in a terabyte is, however, sometimes stated to be approximately 1.0995 × 10¹² (that is, 2⁴⁰ bytes). This difference arises from a conflict between the long-standing tradition of using binary prefixes and base 2 in the computer world, and the more popular decimal (SI) standard adopted widely both within and outside of the computer industry. Standards organizations such as the IEC, IEEE, and ISO recommend using the alternative term tebibyte (TiB) to signify the traditional binary measure of 1024⁴ (2⁴⁰) bytes, or 1024 gibibytes.
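To make the decimal/binary distinction concrete, the following Python sketch converts a count of bits into decimal terabytes and binary tebibytes; the constant and function names are illustrative and not taken from this site's converter. The result is rounded to 7 decimal places, matching the converter's output.

```python
BITS_PER_BYTE = 8
BYTES_PER_TB  = 10**12   # decimal terabyte: 1,000,000,000,000 bytes
BYTES_PER_TIB = 2**40    # tebibyte: 1,099,511,627,776 bytes (1024**4)

def bits_to_terabytes(bits: float) -> float:
    """Convert bits to decimal (SI) terabytes."""
    return bits / BITS_PER_BYTE / BYTES_PER_TB

def bits_to_tebibytes(bits: float) -> float:
    """Convert bits to binary tebibytes."""
    return bits / BITS_PER_BYTE / BYTES_PER_TIB

bits = 8 * 10**12        # eight trillion bits = exactly one decimal terabyte
print(round(bits_to_terabytes(bits), 7))   # 1.0
print(round(bits_to_tebibytes(bits), 7))   # 0.9094947
```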

