Hi all, I've been doing some reading about networking and physical media, specifically the different categories of UTP cable (Cat 1-6). Something I'm having a hard time understanding, and that the book didn't explain too well, is the difference between MHz and Mbps.
For example, the book I'm reading says that Cat 5 is rated for 100MHz, and then it says that Cat5e is also rated for 100MHz but can handle Gigabit Ethernet. So Cat 5 and Cat5e are each 100MHz, but Cat5e can handle 1000Mbps while Cat 5 only does 10/100Mbps, correct?
How so if both are 100MHz?
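My tentative understanding (happy to be corrected) is that MHz measures the signalling bandwidth of the cable, while Mbps also depends on how many wire pairs are used and how many data bits each signal change carries. A rough sketch of that arithmetic, assuming the published 100BASE-TX and 1000BASE-T parameters (these numbers are from the standards, not from my book):

```python
# MHz vs Mbps sketch: data rate = pairs x symbol rate x data bits per symbol.
# Both Fast Ethernet and Gigabit Ethernet signal at 125 MBaud, which is why
# the same ~100MHz-class cable can carry both; Gigabit just uses all 4 pairs
# and packs more data bits into each symbol.

def throughput_mbps(pairs: int, symbol_rate_mbaud: float, bits_per_symbol: float) -> float:
    """Aggregate data rate in Mbps across all pairs."""
    return pairs * symbol_rate_mbaud * bits_per_symbol

# 100BASE-TX: 1 pair per direction, 125 MBaud, 4B/5B coding = 0.8 data bits/symbol
fast = throughput_mbps(pairs=1, symbol_rate_mbaud=125, bits_per_symbol=0.8)

# 1000BASE-T: all 4 pairs, 125 MBaud, PAM-5 carrying 2 data bits/symbol per pair
gig = throughput_mbps(pairs=4, symbol_rate_mbaud=125, bits_per_symbol=2)

print(fast, gig)  # 100.0 1000
```

If that's right, it would explain how two cables with the same MHz rating can support different Mbps: the tighter electrical specs on Cat5e (crosstalk, etc.) are what let Gigabit use all four pairs at once.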
Also, one other question. The book says to use Cat5e as the minimum when installing cable, because some cables are now certified for 350MHz or beyond, which allows the cable to exceed 1Gbps. So I'm wondering: is the author referring to Cat5e cable that can do 350MHz? (Because earlier, as I mentioned, he said it was rated at 100MHz.)
A quick Google found the following explanation on Cat5e:
"This is the newest version of CAT5 cables, formally called ANSI/TIA/EIA 568A-5 or simply Category 5e. CAT-5e is completely backward compatible with current Category-5 equipment, and Category 5e will operate at up to 350MHz, instead of the 100MHz of standard CAT 5 cables. The enhanced electrical performance of CAT 5e ensures that the cable will support applications that require additional bandwidth. For applications such as Gigabit Ethernet or analog video over CAT 5e cables, you should certainly consider using CAT 5e over CAT 5 cable. All aspects of performance are enhanced: capacitance, frequency, resistance, attenuation, impedance."