Convert from Megabyte (MB) to Bit

Unit Definition (Megabyte (MB))
The Megabyte is a unit of information. It is very common in the computer world, but it is poorly defined. Often it means 1 000 000 bytes, but sometimes it means 2²⁰ = 1 048 576 bytes. As if that weren't confusing enough, the 1.44 megabytes stored on "high density" floppy disks are actually megabytes of 1 024 000 bytes each. This uncertainty is a major reason for the recent decision of the International Electrotechnical Commission to establish new binary prefixes for computer science.
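
As a concrete illustration of how far these conventions diverge, here is a minimal Python sketch (an illustration only, not this converter's own code) that turns a megabyte count into bits under each of the three definitions mentioned above, assuming 8 bits per byte:

# Convert megabytes to bits under the three "megabyte" conventions above.
BYTES_PER_MB = {
    "decimal (1 000 000 bytes)": 1_000_000,   # SI-style megabyte
    "binary (1 048 576 bytes)": 1_048_576,    # 2**20 bytes, now usually called a mebibyte
    "floppy (1 024 000 bytes)": 1_024_000,    # the "1.44 MB" floppy-disk convention
}

def megabytes_to_bits(mb, convention):
    """Number of bits in `mb` megabytes for the given convention, at 8 bits per byte."""
    return mb * BYTES_PER_MB[convention] * 8

for name in BYTES_PER_MB:
    print(f"1 MB ({name}) = {megabytes_to_bits(1, name):,} bits")

For 1 MB the three readings give 8 000 000, 8 388 608, and 8 192 000 bits respectively, which is why the convention in use always needs to be stated.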

Unit Definition (Bit)
The Bit is the basic unit of data. Each bit records one of the two possible answers to a single question: "0" or "1," "yes" or "no," "on" or "off." When data is represented as binary (base-2) numbers, each binary digit is a single bit. In fact, the word "bit" was coined by the American statistician and computer scientist John Tukey (b. 1915) in 1946 as a contraction of "binary digit." Somewhat more generally, the bit is used as a logarithmic unit of data storage capacity, equal to the base-2 logarithm of the number of possible states of the storage device or location. For example, if a storage location stores one letter, then it has 26 possible states, and its storage capacity is log₂ 26 ≈ 4.7004 bits.
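
To check that figure, the following minimal Python sketch (again an illustration, not this site's code) computes capacity as the base-2 logarithm of the number of possible states:

import math

def capacity_in_bits(num_states):
    """Storage capacity, in bits, of a location with num_states possible states."""
    return math.log2(num_states)

print(capacity_in_bits(2))    # one binary digit: 1.0 bit
print(capacity_in_bits(26))   # one letter of a 26-letter alphabet: about 4.7004 bits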

