Relationship between bits and terabytes

The way hard drive manufacturers measure it, 1 trillion bytes (10^12) make a terabyte. 1 byte is 8 bits, so 1 terabyte is 8 trillion bits.

The way most operating systems measure it, however, 1 TB = 2^40 bytes. That is roughly 1.0995 trillion bytes, or about 8.796 trillion bits.
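If you want to check the arithmetic yourself, here is a minimal sketch in Python; the only assumption beyond the figures above is the usual 8-bit byte:

```python
# Decimal (drive-manufacturer) terabyte vs. binary (operating-system) terabyte,
# assuming 8-bit bytes.
BITS_PER_BYTE = 8

tb_decimal = 10**12   # bytes in a terabyte as drive manufacturers count it
tb_binary = 2**40     # bytes in a terabyte as most operating systems count it

print(tb_decimal * BITS_PER_BYTE)   # 8,000,000,000,000 bits  (8 trillion)
print(tb_binary)                    # 1,099,511,627,776 bytes (~1.0995 trillion)
print(tb_binary * BITS_PER_BYTE)    # 8,796,093,022,208 bits  (~8.796 trillion)
```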
 

Paperdoc

By the way, a Byte is 8 bits, not 10; the factor of ten you sometimes see comes from somewhere else. When it comes to RAM space and HDD space, 8 is the number. (I'm ignoring some older hardware systems that actually used different byte sizes.)

The "confusion" comes from how one estimates data transmission rates using a modem, be it by phone line of ADSL, etc. Back in the telephone modem days, one of the most popular communications protocols was known as "8,N,1", short for "8 Data Bits, No Parity Bit, and One Stop Bit". That package is how each 8-bit byte of data was sent so that the receiver could check for errors in communication. So sending 8 real data bits became sending 9 actual bits. Besides the actual data bits, there usually was some non-data information sent also from time to time to manage the communication connection between two transceivers. The net result was that the actual average data transfer rate by modem often was closely estimated if you took the Bit Rate (say, 56,000 bits per second) and divided by TEN (not eight) to get the effective data transmission rate in BYTES per second.