Test Setup And Methodology
Test Setup
Our test system (client PC) consists of the following components:
| Test System Configuration | |
| --- | --- |
| Processor | Intel Pentium G3258 (3MB Cache, 3.2GHz) |
| Motherboard/Platform | Gigabyte GA-Z97X-UD3H |
| Memory | 8GB DDR3-1600 (2 x 4GB) |
| Graphics | AMD Radeon HD 7870 |
| Storage | SSD: OCZ Vertex 4 256GB; HDD: Samsung F4 2000GB |
| Networking | Intel Pro/1000 PT Dual-Port Server Adapter |
| Power Supply | Seasonic X-520 |
| Cooling | Thermalright Silver Arrow SB-E Extreme |
| Case | Cooler Master HAF XB |
| Operating System | Windows 7 64-bit, Service Pack 1 |
| Network Switch | TP-Link TL-SG3216 16-port GbE managed switch (LACP and jumbo frames support) |
| Ethernet Cabling | CAT 6e, 2m |
As you can see, we use a capable client test system, and all tests are executed from a fast SSD. This helps ensure that the client side does not become a bottleneck, since this particular SSD can reach up to 560 MB/s sequential reads and 510 MB/s sequential writes.
| NAS Configuration | |
| --- | --- |
| Internal Disks | 4x Seagate ST500DM005 500GB (HD502HJ, SATA 6Gb/s, 7200 RPM, 16MB cache) |
| External Disk | OCZ Agility 2 60GB SSD in a USB 3.0 enclosure |
| Firmware | QTS 4.1.1 |
Methodology
We use three different programs to evaluate NAS performance. The first is Intel's NAS Performance Toolkit (NASPT). Our only problem with this software is that a client PC with more than 2GB of memory heavily skews the HD Video Record and File Copy to NAS tests, since they end up measuring the client's RAM buffer speed rather than network performance. We therefore limit our test PC to 2GB of RAM via the advanced boot options in msconfig. We also use the utility's batch-run function, which repeats the selected tests five times and uses the averages as the final results.
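For readers who prefer scripting the change rather than clicking through the msconfig GUI, the same 2GB cap can also be set with Windows' standard bcdedit tool. The short Python sketch below simply wraps that command; using Python (or bcdedit at all) is our illustration, not part of NASPT itself.

```python
import subprocess

# Cap usable physical memory at 2 GB. bcdedit's truncatememory option
# takes the cut-off address in bytes; the change requires a reboot.
TWO_GB = 2 * 1024 ** 3

# Must be run from an elevated (administrator) prompt.
subprocess.run(
    ["bcdedit", "/set", "{current}", "truncatememory", str(TWO_GB)],
    check=True,
)

# To remove the cap afterwards:
#   bcdedit /deletevalue {current} truncatememory
```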
The second program is custom-made. It performs 10 basic file transfer tests and measures the average speed in MB/s for each. To keep the results as accurate as possible, we run each test 10 times and use the average as the final figure.
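Our utility is not publicly available, but the sketch below illustrates the general idea behind one of its metrics: copy a file to a NAS share, time the transfer, repeat it 10 times, and report the average throughput in MB/s. The share path, file size, and function name are hypothetical, and the real tool also controls for client-side caching, which this simplified version does not.

```python
import os
import shutil
import time

RUNS = 10
SOURCE_FILE = r"C:\bench\testfile.bin"   # hypothetical local test file
DEST_DIR = r"\\NAS\share\bench"          # hypothetical NAS share

def single_copy_mbps(src: str, dst_dir: str) -> float:
    """Copy one file to the NAS share and return the speed in MB/s."""
    dst = os.path.join(dst_dir, os.path.basename(src))
    start = time.perf_counter()
    shutil.copyfile(src, dst)
    elapsed = time.perf_counter() - start
    os.remove(dst)                       # clean up between runs
    return os.path.getsize(src) / (1024 ** 2) / elapsed

speeds = [single_copy_mbps(SOURCE_FILE, DEST_DIR) for _ in range(RUNS)]
print(f"Average write speed over {RUNS} runs: {sum(speeds) / len(speeds):.1f} MB/s")
```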
We also perform multi-client tests through the same program (one server instance supports up to 10 clients). The server utility runs on the main workstation, while each client machine runs the client version. All clients are synchronized and operate in parallel; once all tests are finished, the clients report their results to the server, which sums them up and exports them to an Excel sheet used to generate the corresponding graphs.
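To make the multi-client flow more concrete, here is a deliberately simplified Python sketch of the same coordination pattern: a server waits for every client to connect, releases them all at once, and then sums the throughput figures they report back. The port number, message format, and stand-in results are hypothetical; the actual utility and its Excel export are not shown.

```python
import socket
import threading
import time

NUM_CLIENTS = 2          # the real utility supports up to 10
PORT = 50007             # hypothetical port number

def server() -> None:
    """Wait for every client, signal a synchronized start, then sum the reported results."""
    results = []
    with socket.create_server(("", PORT)) as srv:
        conns = [srv.accept()[0] for _ in range(NUM_CLIENTS)]
        for conn in conns:                   # all clients connected: release them together
            conn.sendall(b"start")
        for conn in conns:
            results.append(float(conn.recv(64).decode()))  # MB/s reported by each client
            conn.close()
    print(f"Aggregate throughput: {sum(results):.1f} MB/s")

def client(result_mbps: float) -> None:
    """Connect, wait for the start signal, run the tests, then report the result."""
    with socket.create_connection(("localhost", PORT)) as conn:
        conn.recv(64)                        # block until the server says "start"
        # ... the real client would run its file transfer tests here ...
        conn.sendall(str(result_mbps).encode())

if __name__ == "__main__":
    threading.Thread(target=server).start()
    time.sleep(0.5)                          # give the server a moment to start listening
    for fake_result in (55.0, 60.0):         # stand-in numbers, one per client
        threading.Thread(target=client, args=(fake_result,)).start()
```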
The third program we use in our test sessions is ATTO, a well-known storage benchmark. To benchmark with ATTO, we have to map a shared folder on the NAS to a local drive letter, since ATTO cannot access network devices directly.
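Mapping the share is a one-off manual step in our case, but it can also be scripted around Windows' standard net use command, as in the small sketch below; the drive letter and share path are placeholders.

```python
import subprocess

DRIVE = "Z:"                    # placeholder drive letter
SHARE = r"\\NAS\Public"         # placeholder NAS share

# Map the NAS share so ATTO can benchmark it as a local drive letter.
subprocess.run(["net", "use", DRIVE, SHARE], check=True)

# After the ATTO run, the mapping can be removed with:
#   net use Z: /delete
```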