
With no access to base clock settings, a locked ratio multiplier, and memory data rates fixed at 1333 MT/s, tightening latencies was once again the only lever left for improving platform performance. Unlike our last effort, though, this ASRock board gives us a wealth of voltage adjustments. The memory defaulted to 1.585 V at the Auto setting, and with just a small bump to 1.6 V, we found G.Skill's value kit stable at 7-8-7-22 1T timings.
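Those timing numbers map to absolute latency in a straightforward way: CAS latency in nanoseconds is the CL value multiplied by the clock period, where DDR3's I/O clock is half its data rate. A quick sketch of the math (the 9-9-9-24 stock timings below are an illustrative assumption; only the tightened 7-8-7-22 setting is quoted above):

```python
def cas_latency_ns(cl_cycles: float, data_rate_mts: float) -> float:
    """Absolute CAS latency: CL cycles multiplied by the clock period in ns."""
    io_clock_mhz = data_rate_mts / 2          # DDR transfers twice per clock
    return cl_cycles * 1000.0 / io_clock_mhz  # 1000 / MHz = ns per cycle

# Assumed CL 9 stock timings vs. the CL 7 we reached after tightening:
print(round(cas_latency_ns(9, 1333), 2))  # ~13.5 ns
print(round(cas_latency_ns(7, 1333), 2))  # ~10.5 ns
```

Even with the data rate capped at 1333 MT/s, dropping from CL 9 to CL 7 trims roughly three nanoseconds off every column access.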
I was also eager to see how far I could take PowerColor's retail Radeon HD 7850 above its stock 860 MHz core and 1200 MHz memory clocks. AMD's Overdrive applet offers sliders up to 1050/1450 MHz. But we also know that, last quarter, we were able to use MSI's Afterburner software to boost the voltage of our GeForce GTX 560, yielding a solid 950 MHz core and 1102.5 MHz memory clock rate. Hoping for the same great scalability, we chose something a little more enthusiast-class than Overdrive.
By default, the card's GPU voltage stuck at 1.138 V under 3D loads, and then dropped to 0.90 V in 2D mode. Messing with the memory clock rate in AMD Overdrive, MSI Afterburner, or Sapphire TriXX forced our GPU to a constant 1.138 V, regardless of what the card was doing, though. Only Asus' GPU Tweak utility could augment the GPU's voltage, enable functional core clock rates above 1050 MHz, and maintain separate 2D and 3D frequencies and voltages once overclocked.
PowerColor tunes its fan to stay very quiet. Throughout testing at stock clocks, it idled at a 20% duty cycle. At 30%, it kept the Pitcairn GPU from cresting 60 degrees while looping 3DMark 11's graphics tests. It's a good thing the cooler is so effective, because the fan gets pretty loud at higher rotational speeds. At an 1100 MHz core setting, with the fan set to Auto, the GPU ran at 63 degrees with the fan at 31%. As soon as it hit 65 degrees, the fan jumped all the way to 41%. At that point, I set it to an intrusive 80%, just to find the core's maximum overclock.
Our GPU was stable up to 1130 MHz at its stock voltage, and it broke past 1200 MHz after a bump to 1.205 V. That's an astounding overclock of nearly 40%! I had no intention of pushing the memory all the way to 1450 MHz, since we know there's a point beyond which higher clocks stop translating into better performance. Even so, I was disappointed when artifacts started showing up at 1350 MHz. At the end of the day, we settled on a 1200 MHz core and 1310 MHz memory overclock.
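For the record, those overclock percentages are easy to verify from the stock and final clocks quoted above:

```python
def oc_percent(stock_mhz: float, oc_mhz: float) -> float:
    """Percentage gain of an overclocked frequency over stock."""
    return (oc_mhz - stock_mhz) / stock_mhz * 100

print(round(oc_percent(860, 1200), 1))   # core: 860 -> 1200 MHz, ~39.5%
print(round(oc_percent(1200, 1310), 1))  # memory: 1200 -> 1310 MHz, ~9.2%
```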
Side note: ignore the driver version listed in GPU-Z. We tested using the Catalyst 12.10 build, but our screen shot was captured later, after the 12.11 build was installed.

Now, we'd never test with an overclock that required a screaming-loud fixed fan speed. But I was concerned the automatic settings would respond too slowly, so I set up a user-defined fan profile in GPU Tweak to push more air any time the machine saw a 3D load. It took some tuning, but the profile worked flawlessly once everything was dialed in: idling quietly in 2D mode, ramping up to 33% shortly after entering a game, and keeping our overvolted and overclocked GPU from cresting 60 degrees at a 40 to 50% duty cycle. Some folks may consider my tuning loud, but I prefer the stability that comes from pushing more air over an overvolted GPU to PowerColor's fairly relaxed automatic settings.
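A user-defined profile like this boils down to a piecewise map from GPU temperature to fan duty cycle. Here's a minimal sketch of the behavior described above; the breakpoint temperatures are illustrative assumptions, not GPU Tweak's actual settings:

```python
def fan_duty(temp_c: float) -> int:
    """Illustrative fan curve: quiet at idle, ~33% under light 3D load,
    and 40-50% to hold an overvolted GPU below 60 degrees C."""
    if temp_c < 40:
        return 20   # 2D idle: near-silent duty cycle
    if temp_c < 50:
        return 33   # entering a game: ramp early rather than lag behind
    if temp_c < 58:
        return 40   # sustained 3D load
    return 50       # worst case: trade noise for thermal headroom

print(fan_duty(35), fan_duty(45), fan_duty(55), fan_duty(60))  # 20 33 40 50
```

The point of the aggressive middle steps is responsiveness: by jumping to 33% as soon as temperatures climb out of the idle range, the curve stays ahead of the heat instead of chasing it the way the stock Auto profile does.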
- Squeezing More Bang From The Same Buck
- CPU And Cooler
- Motherboard And Memory
- Graphics Card And Hard Drive
- Case, Power Supply, And Optical Drive
- Assembling Our Budget-Oriented Box
- Limited Overclocking Strikes Again
- Test System Configuration And Benchmarks
- Benchmark Results: Synthetics
- Benchmark Results: Battlefield 3
- Benchmark Results: The Elder Scrolls V: Skyrim
- Benchmark Results: F1 2012
- Benchmark Results: Audio And Video
- Benchmark Results: Productivity
- Power Consumption And Temperatures
- Is This Our Best $500 Gamer Ever?
Exactly. Couldn't've said it better.
Linux for a gaming desktop? I don't think so.
What about the Phenom II 965? It's only $75 at TigerDirect.
I think they'd be better off with a B75 motherboard, 4GB RAM and an i3-3220.
It's too expensive.
This was a hardware test. Your OS complaints are irrelevant, and there's no practical difference between the Home and Pro versions when it comes to simple performance tests such as these.
Several Linux distros work pretty well with most modern popular games, just FYI. Also, getting Windows for free legally is easy if you care to do it. Dreamspark has many free versions available to college students, and most people know at least one, even if by proxy. Even in the unlikely event that you don't know any, there are still the eval copies MS gives away for free on its own website.
I disagree. The current drivers for Windows 8 are pretty much on-par with the Windows 7 drivers. Heck, they're better than AMD's pre-Catalyst 12.6 drivers.
Meh, I would've preferred seeing at least an A8-5600K with a cheaper motherboard and memory kit, or the same memory kit and a cheaper case. It could have fit; IDK why Tom's didn't do it. Maybe there weren't good prices on other components at the time.
Windows Home still costs $100, which is somehow still not part of the budget.