What And How We Tested
Here are the specifications for the aforementioned Dell/Alienware systems submitted by Qualcomm. Note that the memory and hard drive options are lower-end than Dell's original spec. However, in the context of these networking performance tests, we see no cause for concern.
| Component | Specification |
|---|---|
| Processor | Intel Core i7-2630QM @ 2.00 GHz |
| Chipset | HM67 (Sandy Bridge) |
| Memory | 8 GB Dual-Channel DDR3-1333 |
| Integrated Graphics | Mobile Intel HD Graphics |
| Discrete Graphics | AMD Radeon HD 6870M |
| Storage | Western Digital 320 GB WD3200BEKT |
| WLAN 1 | Intel Centrino Ultimate-N 6300 (633ANHMW) |
| WLAN 2 | Killer Wireless-N 1103 |
| Operating System | Microsoft Windows 7 |
We also grabbed a Linksys AE2500 Dual-Band Wireless-N USB adapter. We used this on the notebook equipped with Intel's Centrino Ultimate-N 6300, disabling the internal wireless adapter whenever the USB-based controller was inserted. Our supposition going in was that the USB adapter would yield poorer performance than both internally-mounted contestants since the Qualcomm and Intel devices had the benefit of larger antennas running around the notebooks’ screens. But you never know. Linksys' offering might yield some surprises.
The specifics of our server system are fairly irrelevant, as the network connection is easily the main bottleneck. Suffice it to say that we used a 3.4 GHz Core i7-2600K on an Intel DP67BG motherboard (including a gigabit Ethernet port) with 8 GB of Corsair Vengeance DDR3-1600 and a 240 GB Patriot Wildfire SSD loaded with Windows 7 Professional 64-bit.
We then made a direct Ethernet connection between the server and a Cisco Linksys E4200 router. We chose this router primarily based on our own positive experiences with the E-series in prior networking stories, but we also heard from Qualcomm that it has run tests with the same unit in its own facilities. Simply put, our arrangement looked like this:
With this configuration, we ran through three different tests in three locations within the author’s home.
Location 1: Ten feet separated the router and clients with direct line of sight between them. This was across opposite sides of the same room. Note that this room also had a set of active Logitech wireless speakers, as well as an unassociated Actiontec 802.11b/g router, just to make the environment good and noisy. There were also six to eight other WLANs detected by the clients at any given time. We’re not interested in how the adapters work under ideal conditions, only in real life.
Location 2: Twenty feet separated the router and clients, with one wall between them. We moved the notebooks into an adjacent room.
Location 3: Approximately sixty feet separated the router and clients. The router was located in one upstairs corner of the house, while the notebooks were in the opposite downstairs corner of the home. This location is known for its sub-par reception and represents a sort of worst-case space within the house.
Our three benchmarks were:
File transfer tests: We used two file sets here. The first was simply a single 2 GB ZIP archive. The second was a folder containing several hundred data files and documents totaling 200 MB. The purpose of the first was to ascertain a sustained throughput in order to even out any fleeting environmental RF fluctuations, while the second aimed to give a better look at the overhead impact from having to pass many files rather than only one.
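The logic of the single-archive test is simple: divide bytes moved by elapsed time, and a long sustained transfer averages out momentary RF fluctuations. As a rough illustration of that arithmetic (not the tool we actually used), here is a minimal Python sketch that times a file copy and reports throughput; the 16 MB payload is a stand-in for the 2 GB archive from our tests.

```python
import os
import shutil
import tempfile
import time

def measure_copy_throughput(src: str, dst: str) -> float:
    """Time a file copy and return throughput in MB/s."""
    size_mb = os.path.getsize(src) / (1024 * 1024)
    start = time.perf_counter()
    shutil.copyfile(src, dst)
    elapsed = time.perf_counter() - start
    return size_mb / elapsed

# Demo with a locally generated 16 MB file; over a WLAN, src would
# live on the server share and dst on the notebook (or vice versa).
with tempfile.TemporaryDirectory() as tmp:
    src = os.path.join(tmp, "payload.bin")
    dst = os.path.join(tmp, "copy.bin")
    with open(src, "wb") as f:
        f.write(os.urandom(16 * 1024 * 1024))
    print(f"{measure_copy_throughput(src, dst):.1f} MB/s")
```

The many-small-files test uses the same arithmetic, but per-file overhead (open, close, metadata) drags the effective rate well below the single-archive figure.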
PassMark PerformanceTest 7: While many people have yet to try out this thorough benchmarking suite, it’s quickly becoming one of our favorite testing Swiss Army knives. Specifically, we used the suite’s Advanced Network Test.
Each PerformanceTest run lasted 180 seconds, and we examined both TCP and UDP throughput. Moreover, we tested each case with both 4 KB and 16 KB block sizes to better assess the impact of varying data sizes on network performance.
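To show where block size enters this kind of measurement, here is a minimal loopback sketch in Python (our assumption of how a throughput test like PerformanceTest's is structured, not its actual implementation): a sink thread drains a TCP connection while the sender pushes a fixed number of bytes in blocks of a given size.

```python
import socket
import threading
import time

def sink(server: socket.socket) -> None:
    """Accept one connection and drain everything sent to it."""
    conn, _ = server.accept()
    with conn:
        while conn.recv(65536):
            pass

def measure_tcp_throughput(block_size: int, total_bytes: int) -> float:
    """Send total_bytes in block_size chunks over loopback; return Mb/s."""
    server = socket.socket()
    server.bind(("127.0.0.1", 0))  # ephemeral port
    server.listen(1)
    t = threading.Thread(target=sink, args=(server,))
    t.start()
    client = socket.create_connection(server.getsockname())
    payload = b"\0" * block_size
    start = time.perf_counter()
    sent = 0
    while sent < total_bytes:
        client.sendall(payload)
        sent += block_size
    client.close()
    elapsed = time.perf_counter() - start
    t.join()
    server.close()
    return sent * 8 / elapsed / 1e6

for bs in (4 * 1024, 16 * 1024):
    rate = measure_tcp_throughput(bs, 8 * 1024 * 1024)
    print(f"{bs // 1024} KB blocks: {rate:.0f} Mb/s")
```

Smaller blocks mean more send calls (and, on a real network, more per-packet overhead) for the same payload, which is why the two block sizes can produce different throughput figures on the same link.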
Gaming Network Efficiency (GaNE): This benchmark was both designed and supplied by Qualcomm/Bigfoot Networks as a quick but effective way to test ping times and jitter. We were especially drawn to its easy graphing capabilities. As usual with vendor-supplied software, we took a hard look at this tool, wanting to make sure it wasn't playing favorites. After many hours of testing, we're confident that GaNE is above board, in part because our early tests showed the Qualcomm adapter underperforming, suffering massive latencies. As happens all too often when we communicate with a vendor during the testing process, Qualcomm followed up with a driver update that greatly reduced these problems...and forced us to start our testing from scratch.
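For readers unfamiliar with jitter: it describes the variation in latency rather than the latency itself. One common way to compute it (we don't know GaNE's exact formula) is the mean absolute difference between consecutive ping samples, sketched below with hypothetical numbers.

```python
from statistics import mean

def jitter(latencies_ms: list[float]) -> float:
    """Mean absolute difference between consecutive ping times (ms),
    one common definition of jitter."""
    return mean(abs(b - a) for a, b in zip(latencies_ms, latencies_ms[1:]))

samples = [21.0, 23.5, 20.8, 22.1, 24.0]  # hypothetical ping times in ms
print(f"avg latency: {mean(samples):.1f} ms, jitter: {jitter(samples):.2f} ms")
```

Two adapters with identical average pings can still feel very different in a game if one delivers those pings far less consistently, which is why we track both numbers.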
Speaking of drivers, know that we set the aforementioned Killer Network Manager app to give GaNE the "highest" priority. We debated this, concerned that it might give the Killer adapter an unfair advantage. Ultimately, we decided that the Killer software is just as much a part of the product platform as the hardware, and there's nothing keeping its competitors from offering their own software optimizations. So, we let it run.