Convert from Megabyte (MB) to Terabyte
Unit Definition (Megabyte (MB))
The megabyte is a unit of information. It is very common in the computer world, but it is poorly defined. Often it means 1 000 000 bytes, but sometimes it means 2^20 = 1 048 576 bytes. As if that weren't confusing enough, the 1.44 megabytes stored on "high density" floppy disks are actually megabytes of 1 024 000 bytes each. This ambiguity is a major reason the International Electrotechnical Commission decided to establish new binary prefixes for computer science.

Unit Definition (Terabyte)
A terabyte (derived from the prefix tera- and commonly abbreviated TB) is a unit of data storage capacity. Based on a decimal radix (base 10), one terabyte is defined as one trillion (short scale) bytes, or 1000 gigabytes.
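Under these decimal definitions the conversion factor follows directly; as a quick check (assuming the SI values of both units):

```latex
1\,\text{TB} = 10^{12}\,\text{bytes} = 10^{6} \times 10^{6}\,\text{bytes} = 10^{6}\,\text{MB},
\qquad\text{so}\qquad
x\,\text{MB} = \frac{x}{10^{6}}\,\text{TB}.
```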
The number of bytes in a terabyte is sometimes stated to be approximately 1.0995 × 10^12 (that is, 2^40 = 1 099 511 627 776 bytes). This difference arises from a conflict between the long-standing tradition of using binary prefixes and base 2 in the computer world, and the more widely adopted decimal (SI) standard used both within and outside the computer industry. Standards organizations such as the IEC, IEEE and ISO recommend using the alternative term tebibyte (TiB) for the traditional binary measure of 1024^4 bytes, or 1024 gibibytes, leading to the following definitions: 1 TB = 10^12 bytes = 1 000 000 000 000 bytes, and 1 TiB = 2^40 bytes = 1 099 511 627 776 bytes.
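To make the decimal/binary difference concrete, here is a minimal Python sketch (not part of the original page; the constant and function names are purely illustrative) that converts a megabyte count to decimal terabytes and a mebibyte count to binary tebibytes:

```python
# Illustrative conversion constants (bytes per unit).
DECIMAL_MB = 10**6    # 1 MB  = 1 000 000 bytes (SI, decimal)
DECIMAL_TB = 10**12   # 1 TB  = 1 000 000 000 000 bytes (SI, decimal)
BINARY_MIB = 2**20    # 1 MiB = 1 048 576 bytes (IEC, binary)
BINARY_TIB = 2**40    # 1 TiB = 1 099 511 627 776 bytes (IEC, binary)

def mb_to_tb(megabytes: float) -> float:
    """Decimal conversion: 1 TB = 10**6 MB."""
    return megabytes * DECIMAL_MB / DECIMAL_TB

def mib_to_tib(mebibytes: float) -> float:
    """Binary conversion: 1 TiB = 2**20 MiB."""
    return mebibytes * BINARY_MIB / BINARY_TIB

if __name__ == "__main__":
    print(mb_to_tb(1_500_000))    # 1.5  (decimal terabytes)
    print(mib_to_tib(1_048_576))  # 1.0  (tebibytes)
```

Note that the same numeric count of "megabytes" yields a smaller value when interpreted with the binary units, because each binary step (2^10) is slightly larger than the decimal step (10^3).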