How fast is a gigabit? If you hear the prefix "giga" and think of 1,000 megabytes, you might figure that a gigabit network should deliver 1,000 megabytes per second. If that sounds like a reasonable assumption, you're not alone. Unfortunately, you're going to be fairly disappointed.
So what is a gigabit? It is 1,000 megabits, not 1,000 megabytes. There are eight bits in a single byte, so let's do the math: 1,000,000,000 bits divided by 8 bits per byte = 125,000,000 bytes. There are a million bytes in a megabyte (using decimal units), so a gigabit network should be capable of a theoretical maximum transfer rate of about 125 MB/s.
While 125 MB/s might not sound as impressive as the word "gigabit," think about it: a network running at this speed should theoretically be able to transfer a gigabyte of data in a mere eight seconds. A 10 GB archive could be moved in just a minute and 20 seconds. That is an incredible speed, and if you need a reference point, just recall how long it took the last time you moved a gigabyte of data, back before USB keys were as fast as they are today.
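The arithmetic above can be double-checked in a few lines of Python (decimal SI units throughout, as in the text):

```python
# Sanity-check the conversion from one gigabit per second to MB/s,
# and the transfer times quoted above.
GIGABIT = 1_000_000_000  # bits
BITS_PER_BYTE = 8

bytes_per_second = GIGABIT // BITS_PER_BYTE   # 125,000,000 bytes
mb_per_second = bytes_per_second / 1_000_000
print(mb_per_second)  # 125.0 MB/s theoretical ceiling

# Time to move 1 GB (1,000 MB) and 10 GB (10,000 MB) at that rate:
print(1_000 / mb_per_second)   # 8.0 seconds
print(10_000 / mb_per_second)  # 80.0 seconds, i.e. 1 minute 20 seconds
```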
Armed with this expectation, I'll move a file over my gigabit network and check the speed to see how close it comes to 125 MB/s. This isn't a network of wonder machines; it's a real-world home network built from older but decent hardware.
Copying a 4.3 GB file from one of these PCs to another five times resulted in an average of 35.8 MB/s. That is less than 30% of a gigabit network's theoretical ceiling of 125 MB/s.
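Plugging the measured average into the same units makes the gap concrete, a quick sketch using the figures above:

```python
# Compare the measured transfer rate to the theoretical gigabit ceiling.
theoretical_mb_s = 125.0  # gigabit ceiling in MB/s
measured_mb_s = 35.8      # average over five copies of the 4.3 GB file

ratio = measured_mb_s / theoretical_mb_s
print(f"{ratio:.0%}")  # 29%

# At the measured rate, the 4.3 GB (4,300 MB) file takes about two minutes:
print(4_300 / measured_mb_s)  # roughly 120 seconds
```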
What’s the problem?