- Welcome To The Wi-Fi Cage Match
- Hardware And Methodology, Explained
- Hardware And Methodology, Explained (Continued)
- What Interference Looks Like
- Coverage Areas
- Benchmark Results: Close Range, No Interference
- Benchmark Results: Mid-Range, No Interference
- Benchmark Results: Mid-Range, 1 Versus 60 Clients
- Long-Range, No Interference
- Long-Range, 1 Versus 60 Clients Plus Noise
- 60 Laptops: Aggregate Performance
- Five iPad 2s: Single And Aggregate Performance
- Mid-Range, iPads And Laptops Aggregate
- Airtime Fairness Under Pressure
- Wrapping Up
In Part 1, we explained what can go wrong with Wi-Fi signals and how access points can work to improve your wireless performance. It's time for a reality check. We throw six contenders against 65 clients and some hellish interference. Who's left standing?
We took a lengthy journey through the ins and outs of Wi-Fi signals in last week's Why Your Wi-Fi Sucks And How It Can Be Helped, Part 1, examining many of the factors that can both damage and improve signal performance. This week, it's time to tie it all together in a real-world arena and let the vying wireless technologies duke it out to the death, sometimes almost literally.
As we mentioned before, prior attempts to stage this sort of test failed because the results were too variable to be reliable. We regrouped, though, and came back with a new test setup that proved far more consistent and useful. In the image below, you see a panoramic view of our test environment: an empty office space that we filled with 60 Dell notebooks and nine iPad and iPad 2 tablets. We then picked five competing access points and their respective controllers (when applicable) and tested them in various scenarios. All told, the rental bill totaled about $15,000, and a testing team put in three heavy days of benchmarking time. You simply don't see wireless interference testing done at this scale in the wild.
As we suggested in the first part of this story, we're unaware of any testing quite like this ever having been done. Our objective was to measure access point performance under heavy interference and, from those results, get a sense of how the wireless technologies we previously examined play out in the real world. If you missed our prior article, we strongly suggest reviewing it now; otherwise, the results we explain later may not make as much sense.
In the following pages, we'll introduce our access point contestants, explain how we tested, and analyze the results. To give you an early hint, there turns out to be no one-size-fits-all product; the best performer varies according to the dynamics of the access point/client arrangement. Which technologies make the most sense for your situation? Keep reading!