
The best way I know of to maintain a high graphics core clock rate is to increase the card's power limit and fan speed. Doing so let me run the GPU at 1100 MHz with the DRAM pushed to GDDR5-5600. Of course, you have to remember that AMD's PowerTune technology starts scaling back frequency once the Hawaii GPU hits 94 degrees Celsius, and fan speed tops out at factory-defined levels.
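That throttling behavior amounts to a simple rule: hold the requested clock while the die stays under the thermal target, and shed frequency once it crosses it. Here is a toy model for illustration only; the 1100 MHz clock and 94-degree target come from this build, but the 50 MHz step size is an arbitrary assumption, not AMD's actual algorithm.

```python
def powertune_clock(requested_mhz, die_temp_c, target_c=94, step_mhz=50):
    """Toy model of PowerTune-style thermal throttling.

    Below the thermal target the GPU holds the requested clock;
    above it, frequency is shed in proportion to the overshoot.
    The 50 MHz step is an arbitrary assumption, not AMD's algorithm.
    """
    if die_temp_c < target_c:
        return requested_mhz
    overshoot = die_temp_c - target_c
    # Never model the clock below a 300 MHz floor
    return max(requested_mhz - step_mhz * (overshoot + 1), 300)

print(powertune_clock(1100, 90))  # holds 1100 MHz below the target
print(powertune_clock(1100, 96))  # sheds clocks above it: 950 MHz
```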

AMD tries to keep its under-built thermal solution quiet by letting the GPU get fairly hot before the fan spins up to around a 60% duty cycle. The fan only spins faster if it detects a thermal crisis. But those settings wouldn't let these GPUs run near their limits.

I chose a maximum fan speed of 100% at a target maximum of 80° Celsius, even though AMD would have us believe that's overkill for a GPU that can run at maximum performance at higher temperatures.
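In fan-control terms, that setting defines a curve that ramps to full duty well before AMD's defaults would. A minimal sketch of such a curve follows; the 80-degree target is the one used in this build, while the 40-degree ramp start and 20% idle duty are arbitrary assumptions for illustration.

```python
def fan_duty(temp_c, target_c=80, ramp_start_c=40, idle_duty=20, max_duty=100):
    """Piecewise-linear fan curve: idle duty below the ramp start,
    100% duty at the target temperature and beyond."""
    if temp_c <= ramp_start_c:
        return idle_duty
    if temp_c >= target_c:
        return max_duty
    # Linear ramp between the ramp start and the target temperature
    span = target_c - ramp_start_c
    return idle_duty + (max_duty - idle_duty) * (temp_c - ramp_start_c) / span

print(fan_duty(60))  # halfway up the ramp: 60.0% duty
print(fan_duty(80))  # at the target: 100% duty
```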
Even though my CPU's temperatures remained low enough to prevent thermal throttling, the system still lost performance after several minutes under full load. I opened the case and reached inside to find a scalding-hot sink on the motherboard's voltage regulator! Placing the left-over intake fan as a top-panel intake over the voltage regulator should have worked, but the blades whizzing past the fan grille sounded like a very loud hornet's nest. Flipping the fan over to create exhaust cut the noise, but also reduced airflow over the voltage regulator. Fortunately, I had kept a solution in reserve.

Voltage regulator sinks are typically designed to work in conjunction with a CPU fan, but liquid cooling moves those fans away from the processor interface. Even non-traditional gaming system makers like Lenovo recognize that problem and have addressed it. The X79 Extreme4's PWM cooler lacks a custom-fit solution, or even so much as screw channels to secure a fan, so Antec's universal SpotCool fits the build.

The case’s original intake fan remains mounted as exhaust above the motherboard, even though it’s no longer required. A little extra cooling never hurt.

As a result of my aggressive GPU fan settings, maximum system noise rose from 40.1 to 54.1 decibels at one meter after overclocking, illustrating Chris' notion that these cards either run too loud or can't be kept at full speed by the reference cooler. At least the stock setup proves my advice about choosing the right case to muffle noise from the graphics cards.
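For a sense of scale, decibels are logarithmic, so that 14 dB increase works out to roughly five times the sound pressure. The arithmetic, using only the two readings above:

```python
def spl_pressure_ratio(db_from, db_to):
    """Sound-pressure ratio implied by two SPL readings.

    SPL uses a 20*log10 pressure scale, so every 20 dB is a
    tenfold increase in sound pressure.
    """
    return 10 ** ((db_to - db_from) / 20)

ratio = spl_pressure_ratio(40.1, 54.1)
print(f"{ratio:.1f}x the sound pressure")  # prints "5.0x the sound pressure"
```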
- Making Tough Choices In Volatile Markets
- Graphics, Memory, And CPU
- Motherboard, CPU Cooling And Case
- Power Supply, SSD, Hard Drive, And Optical Drive
- Radiator Installation
- Finishing The Build
- Overclocking Through Firmware
- Final Touches
- Benchmarking Configurations
- Results: 3DMark And PCMark
- Results: SiSoftware Sandra
- Results: Battlefield 3 And Far Cry 3
- Results: F1 2012 And Skyrim
- Results: Audio And Video Encoding
- Results: Adobe Creative Suite
- Results: Productivity
- Results: File Compression
- Power, Heat, And Efficiency
- Value Conclusion
Seeing that it is impossible to break even doing bitcoin mining with GPUs, I expect a flood of barely used cards to hit the used market sooner rather than later.
But in the case of the Bitcoiners, if there's a better method to mine, why bother with the GPUs? Seems to me they lose out no matter how they end up.
Forum members often call a machine that burns far too much energy for the amount of useful work we get out of it a "space heater". But if you compare THIS machine to an ACTUAL space heater, you can clearly see the benefit of using THIS machine RATHER than an actual space heater to heat your workspace. Let mining pools pay a portion of this winter's heating bill!
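The back-of-the-envelope version of that argument: a resistive space heater and a mining rig both convert essentially all of their input power into heat, so any pool payout directly offsets the heating bill. Here is a sketch with entirely hypothetical figures for power draw, electricity rate, and payouts:

```python
def net_heating_cost(draw_watts, hours, price_per_kwh, mining_income):
    """Net cost of heating a room with a mining rig: electricity
    consumed (all of which ends up as room heat) minus mining payouts.
    Every input here is a hypothetical example figure."""
    energy_kwh = draw_watts / 1000 * hours
    return energy_kwh * price_per_kwh - mining_income

# e.g. 450 W for 8 h/day over 30 days at $0.12/kWh, with $10 in payouts
print(round(net_heating_cost(450, 8 * 30, 0.12, 10.00), 2))  # about 2.96
```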
I'm completely against the CONCEPT of crypto-currency mining because it produces no USEFUL data. We're producing GARBAGE data of increasing difficulty generation-by-generation and wasting all those resources to do it. It's worse than raising cattle for the leather and throwing away the meat. It's more akin to raising cattle for photographs of the cow and throwing away the cow!
These machines might actually benefit society if they were running a program like F@H, and we'd at least have a real debate weighing their cost to society against their benefit to society. Someone should have beaten the bitcoin guy to the punch and developed F@H coins.
Or take a look at cloud servers. Large companies are renting out their excess computing resources during low-traffic periods. Now look at PC-based, self-serving distributed computing platforms like Skype. The per-user cost is low but the number of users is high, so hosting the program across those same "clients" makes sense.
Why don't we have companies knocking down our doors begging for our excess data resources? Someone with a great marketing plan AND excellent technical knowledge should set up a distributed computing platform that pays individuals for their contributions. Environmentalists should praise that move as reducing the number of data centers needed world-wide, but me?
I'm just trying to reduce waste. I even collect my small bits of scrap metal (broken car parts, etc.) and give them away to scrap metal collectors, because it would cost me more to haul them in than they're worth. Those guys collect enough small batches to make it worth the 15-mile trip. And you don't need to be a tree hugger to see that everyone benefits from that type of effort.
If we're to believe what we're told and crypto-currency mining is to blame for the retail price spikes on the highest-tier AMD cards, then I expect to see AMD make some changes in its next generation of cards, especially if AMD isn't cashing in on the rush for its cards and the price hikes are solely due to merchant mark-ups. Considering AMD's business concerns over recent years, I don't expect AMD to make any such profitability mistake ever again. Instead, I think AMD will follow nVidia's example.
When nVidia capped GPGPU performance on the majority of its cards, then went on to produce the Titan and Tesla cards without such GPGPU restriction at higher prices, I was OK with that. It meant gamers could buy cards built for gaming at a reasonable price, people who used their cards for both gaming and GPGPU-related tasks could buy a card built for both for a premium, and researchers could buy cards that were fully-optimized for GPGPU use for an even higher premium. If AMD had done that with the R9-series, we'd have quite a few more gamers sporting brand new AMD cards this holiday season.
And back to the article... Heckuva build! It's an improvement over the previous build in just about every way, with the exception of its current cost.
I figure there will be a flood of used cards on the market in three months, as it gets more difficult to mine the most profitable currencies. But someone mentioned that before I responded. It would be REALLY, REALLY bad for AMD to spend six weeks increasing production volume, only to see a flood of cheap used cards knock the market out from under its new card sales. Once again, AMD is probably doing best to stick to its plans. Nobody remembers when Intel blamed overproduction by AMD for the CPU market collapse of 1999; in fact, those news articles were buried within three months. But I remember.
That said, I think this build really missed the mark. My current build would come fairly close to your BF3 numbers, and yet my system can easily be had for around $1,300. While it might not compete on some of the other compute tasks, it still does pretty damn well.
Give me $2400 and I'm sure I could smoke this rig. I'd expect more from Tom's.
There is a heat/power/efficiency section, but in the conclusion I would just like to see Qx vs. Qy dB and system temperature alongside the % performance gain, for a more big-picture view. Anyone else have ideas on this?
"When nVidia capped GPGPU performance on the majority of its cards... It meant gamers could buy cards built for gaming at a reasonable price"
Um, what? No one got a price drop; gamer cards didn't become cheaper. A reduction in power consumption? Yes. Cheaper? No.
Protein Folding Coins do exist.
I have a similar-model Seasonic PSU, an i5-3570K (stock), and a pair of Superclocked EVGA GTX 570s in an old InWin Q500 case. I've experimented with the various fan speeds once the system had been on for a while, and the PSU fan was (subjectively) the noisiest *by far*. Even setting both blowers and the case/CPU fans to max (when the system was cold) didn't generate nearly as much noise as the PSU alone...
Such mad
So Angry
Wow
Now excuse us as we head towards the moon.