Relationship between bit and terabyte

October 28, 2009 8:56:41 AM

How many bits make 1 terabyte?
October 28, 2009 9:17:36 AM

The way hard drive manufacturers measure it, 1 terabyte is 1 trillion bytes (10^12). 1 byte is equal to 8 bits, so 1 terabyte is 8 trillion bits.

The way most operating systems measure it, however, 1 TB = 2^40 bytes. That is roughly 1.0995 trillion bytes, or about 8.796 trillion bits.
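The two conventions above can be checked with a quick sketch in Python (variable names are just for illustration):

```python
# Decimal (SI) terabyte, as drive manufacturers count it: 10^12 bytes
tb_decimal_bytes = 10**12

# Binary "terabyte" (properly a tebibyte, TiB), as many OSes count it: 2^40 bytes
tb_binary_bytes = 2**40

BITS_PER_BYTE = 8

print(tb_decimal_bytes * BITS_PER_BYTE)  # 8000000000000  -> 8 trillion bits
print(tb_binary_bytes * BITS_PER_BYTE)   # 8796093022208  -> ~8.796 trillion bits
```

The ~10% gap between the two figures is exactly why a "1 TB" drive shows up as roughly 931 "GB" in an operating system that counts in base 2.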
October 28, 2009 11:32:17 AM

It's really all a misnomer caused by Microsoft. Linux and Mac OS X use base-10 KB/MB/GB/TB.

Microsoft uses base 2, which should properly be labeled TiB (tebibytes).
October 28, 2009 2:58:32 PM

By the way, when it comes to RAM space and HDD space, a byte is 8 bits. (I'm ignoring some older hardware systems that actually used different values.) The figure of 10 bits per byte comes from somewhere else entirely.

The "confusion" comes from how one estimates data transmission rates using a modem, be it by phone line or ADSL, etc. Back in the telephone modem days, one of the most popular communications settings was known as "8,N,1", short for "8 Data Bits, No Parity Bit, and One Stop Bit". That framing is how each 8-bit byte of data was sent so that the receiver could detect communication errors. Counting the start bit that precedes every frame, sending 8 real data bits meant sending 10 actual bits on the wire. Besides the data bits, some non-data information was usually also sent from time to time to manage the connection between the two transceivers. The net result was that the actual average data transfer rate by modem was often closely estimated by taking the bit rate (say, 56,000 bits per second) and dividing by TEN (not eight) to get the effective data transmission rate in BYTES per second.
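That divide-by-ten rule of thumb is easy to express in code. This is a sketch with a hypothetical helper name; the 2-bit default overhead stands for the start and stop bits of an 8,N,1 frame:

```python
def effective_bytes_per_sec(bit_rate, overhead_bits_per_byte=2):
    """Estimate a modem's throughput in bytes/sec.

    With 8,N,1 framing, each 8-bit byte costs 8 data bits plus a start
    bit and a stop bit, so the line rate is divided by ~10, not 8.
    """
    return bit_rate / (8 + overhead_bits_per_byte)

print(effective_bytes_per_sec(56_000))  # 5600.0 bytes per second
```

For a nominal 56 kbit/s modem, that gives roughly 5,600 bytes per second, matching the divide-by-ten estimate in the post above.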