Test Setup And Methodology
| Hardware | Details |
| --- | --- |
| Processor(s) | AMD Athlon 64 3400+ (Venice core), 2.4 GHz (overclocked to 2.55 GHz), 512 KB L2 cache |
| Platform | ASRock 939Dual-SATA2 (Socket 939), ULi M1695 chipset, BIOS version 1.6 |
| RAM | Patriot EP1x 1024 MB PC3500 (CL 2.0-3-2-5) |
| Hard Drive | Western Digital Caviar WD1200JB, 120 GB, 7,200 RPM, 8 MB cache, UltraATA/100 |
| Networking | On-board 100 Mbit Ethernet |
| Graphics Card | ATI Radeon X1900 XTX (PCI Express), 512 MB GDDR3 |
| Power Supply | Ultra X-Connect ATX, 550 W |

| System Software & Drivers | Details |
| --- | --- |
| OS | Microsoft Windows XP Professional 5.10.2600, Service Pack 2 |
| DirectX Version | 9.0c (4.09.0000.0904) |
| Platform Driver | AMD Athlon 64 processor driver |
| Graphics Driver | ATI Catalyst 6.10 |
A few notes about the test system and our benchmarking methodology:
All coolers were mounted with Arctic Silver 3 thermal compound instead of the various pastes or tapes included in their packages. This was done to keep all competitors on a level playing field.
The coolers were tested in a closed Gigabyte Aurora case with stock air cooling; the Aurora ships with two silent 120 mm fans below the power supply and one at the front bottom of the case for air intake.
Temperatures were recorded as reported by the video card's on-die temperature sensor, which was the only way to establish a consistent control. An IR thermometer would have been preferable, but the coolers' configurations varied so widely that there was no single spot from which to take a comparable reading across all products; the on-die sensor had to suffice.
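For anyone who wants to log the same kind of readings at home, here is a minimal Python sketch of establishing an idle baseline from the on-die sensor. Note that `read_core_temp()` is a hypothetical stand-in: ATI does not expose the sensor through a public API, so in practice it would wrap whatever readout your monitoring tool provides.

```python
import time

def read_core_temp():
    """Hypothetical stand-in: return the GPU core temperature in degrees C.

    The on-die sensor isn't reachable through a public API, so this
    would wrap whatever your monitoring tool (ATITool, the Catalyst
    control panel, etc.) reports."""
    raise NotImplementedError("wire this up to your sensor readout of choice")

def idle_baseline(samples=10, interval_s=5):
    """Average several readings at idle to establish a control value."""
    readings = []
    for _ in range(samples):
        readings.append(read_core_temp())
        time.sleep(interval_s)
    return sum(readings) / len(readings)
```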
Load temperatures were recorded after 10 minutes of stress-testing with the freeware ATITool utility, version 0.25 beta 14. Few benchmarks raise graphics-processor temperatures more than ATITool's "scan for artifacts" option, so actual in-game temperatures are unlikely ever to reach the load levels seen in these tests.
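The load measurement itself is just a timed sampling loop around that stress window. The sketch below assumes the stress test (ATITool's "scan for artifacts") has already been started by hand and only handles the logging; the 10-minute window matches our procedure, but the sampling interval and the steady-state definition are our own choices, not anything ATITool prescribes.

```python
import time

LOAD_DURATION_S = 10 * 60   # 10-minute stress window, as in our tests
SAMPLE_INTERVAL_S = 5       # arbitrary choice; any short interval works

def measure_load_temp(read_core_temp):
    """Sample the on-die sensor for the whole stress window and report
    both the peak and a steady-state value (average of the last minute).

    The stress itself runs externally; this loop only logs readings."""
    log = []
    start = time.time()
    while time.time() - start < LOAD_DURATION_S:
        log.append((time.time() - start, read_core_temp()))
        time.sleep(SAMPLE_INTERVAL_S)
    peak = max(temp for _, temp in log)
    last_minute = [temp for elapsed, temp in log
                   if elapsed >= LOAD_DURATION_S - 60]
    steady = sum(last_minute) / len(last_minute)
    return peak, steady
```

Averaging the final minute rather than grabbing a single reading smooths out sensor jitter, which on-die sensors of this era exhibit in abundance.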
Now with that out of the way, let's see what happened, shall we?