Graphics, Memory, And CPU
Graphics: Two Asus R9290-4GD5 In CrossFire
The choice was easy a month ago. If AMD's $400 Radeon R9 290 could roughly match the performance of Nvidia's GeForce GTX 780, an $800 pair of 290s should outperform September's $900 trio of GTX 760s. More performance for less money was the theory, at least, before Hawaii-based boards shot up in price online, an increase widely blamed on a crypto-currency gold rush. Now we're stuck with two $530 cards that are identical to the $400 boards we purchased a month ago. Talk about buyer's remorse.
Read Customer Reviews of Asus' R9290-4GD5 Radeon R9 290
Choosing a brand wasn’t hard, since all of the cards we've seen in stock follow AMD's reference design. Since Chris’s best retail sample came from Asus, I went that route as well.
DRAM: G.Skill Ripjaws X F3-14900CL9Q-16GBXL
Several of our memory reviews have shown that the best gaming experience comes from DDR3-2133 with optimized timings. Those same articles showed some of our benchmarks slowing down when we used DDR3-2400, probably because motherboards offset higher data rates with relaxed timings that are more difficult to optimize.
The problem with my previous system was that its DDR3-1600 wasn't overclockable. In fact, I had to raise the voltage to overclocking levels just to guarantee stability at its rated settings. I'm not going to make that mistake again!
Read Customer Reviews of G.Skill's Ripjaws 16 GB DDR3 Memory Kit
G.Skill sells the same modules in various colors, with various heat spreaders (Ripjaws or Ares), and under various model numbers. I’ve been using these 4 GB DIMMs for a couple of years, and find that, while they don’t always win round-ups, at least they overclock with a fair amount of consistency. Unless there’s a problem with the motherboard or the CPU’s on-die memory controller, I expect to reach DDR3-2133 while only paying for DDR3-1866.
Formerly a good value, these modules are now 30% more expensive than they were on the day we placed our order. Ouch!
CPU: Intel Core i7-4930K
With the same 12 MB shared L3 cache as its award-winning predecessor and the advanced architecture of its flagship sibling, Intel’s Core i7-4930K is an easy choice to replace my previous machine’s Core i7-3930K.
Read Customer Reviews of Intel's Core i7-4930K
I'm also expecting a little more forgiveness from this part when it comes to overclocking, whether that comes from lower power consumption or the 22 nm manufacturing process. A 200 MHz-higher base clock rate is the only hard evidence I have to support my enthusiastic expectations.
Seeing that it's impossible to break even mining Bitcoin with GPUs, I expect a flood of barely-used cards to hit the second-hand market sooner rather than later.
And in the Bitcoiners' case, there's already a better way to mine, so why bother with GPUs at all? Seems to me they lose out no matter how they end up.
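The break-even claim above is easy to sanity-check with rough arithmetic. The sketch below uses purely hypothetical figures (card hash rate, revenue per GH/s per day, power draw, electricity price) — actual values change daily with difficulty and exchange rates — to show how GPU mining revenue can fall below the electricity bill.

```python
# Hypothetical break-even check for GPU Bitcoin mining.
# All figures are illustrative assumptions, not current market data.

def daily_profit(hashrate_ghs, revenue_per_ghs_day, power_watts, price_per_kwh):
    """Daily mining profit in dollars: pool revenue minus electricity cost."""
    revenue = hashrate_ghs * revenue_per_ghs_day
    energy_kwh = power_watts / 1000 * 24          # kWh consumed per day
    cost = energy_kwh * price_per_kwh
    return revenue - cost

# An R9 290-class card: roughly 0.8 GH/s on SHA-256 and ~300 W under
# load (assumed), against a low assumed payout per GH/s.
profit = daily_profit(hashrate_ghs=0.8,
                      revenue_per_ghs_day=0.10,   # assumed $/GH/s/day
                      power_watts=300,
                      price_per_kwh=0.12)
print(f"Daily mining profit: {profit:.2f} USD")
```

With these assumed inputs the card loses money every day it runs, which is the commenter's point: once difficulty pushes per-GH/s revenue low enough, no electricity price short of free makes a GPU pay for itself.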
Forum members often call a machine that burns far too much energy for the amount of useful work we get out of it a "space heater". But if you compare THIS machine to an ACTUAL space heater, you can clearly see the benefit of using THIS machine RATHER than an actual space heater to heat your workspace. Let mining pools pay a portion of this winter's heating bill!
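The space-heater comparison holds up because a resistive heater and a GPU rig both turn essentially all of their electrical input into room heat; the rig just earns a little back. A minimal sketch, using assumed figures for power draw, electricity price, and pool payout:

```python
# Heating-offset sketch: both devices deliver ~100% of input power as heat,
# so a day of equal heat costs the same electricity, but the mining rig
# returns some revenue. All numbers are illustrative assumptions.

POWER_KW = 0.8             # assumed rig draw, comparable to a small space heater
PRICE_PER_KWH = 0.12       # assumed electricity price, $/kWh
MINING_REVENUE_DAY = 1.50  # assumed daily payout from a mining pool, $

hours = 24
electricity_cost = POWER_KW * hours * PRICE_PER_KWH   # cost of a day of heat
net_heating_cost = electricity_cost - MINING_REVENUE_DAY

print(f"Space heater: {electricity_cost:.2f} USD/day for the same heat")
print(f"Mining rig:   {net_heating_cost:.2f} USD/day after pool payouts")
```

Under these assumptions the rig delivers the same heat for less net money, which is all the "let mining pools pay the heating bill" argument requires; it says nothing about whether mining is worthwhile once the room is already warm.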
I'm completely against the CONCEPT of crypto-currency mining because it produces no USEFUL data. We're producing GARBAGE data of ever-increasing difficulty, generation by generation, and wasting all those resources to do it. It's worse than raising cattle for the leather and throwing away the meat. It's more akin to raising cattle for photographs of the cow and throwing the cow away!
These machines might actually benefit society if they were running a program like F@H, and we'd at least have a solid argument weighing their cost to society against their benefit. Someone should have beaten the bitcoin guy to the punch and developed F@H coins.
Or take a look at cloud servers. Large companies are renting out their excess computing resources during low-traffic periods. Now look at PC-based, self-hosting distributed platforms like Skype. The per-user cost is low but the number of users is high, so hosting the service across those same "clients" makes sense.
Why don't we have companies knocking down our doors begging for our excess computing resources? Someone with a great marketing plan AND excellent technical knowledge should set up a distributed computing platform that pays individuals for their contributions. Environmentalists should praise that move for reducing the number of data centers needed worldwide. But me?
I'm just trying to reduce waste. I even collect my small bits of scrap metal (broken car parts and the like) and give them away to scrap metal collectors, because hauling them in myself would cost more than they're worth. Those guys collect enough small batches to make the 15-mile trip worthwhile. And you don't need to be a tree hugger to see that everyone benefits from that kind of effort.
If we're to believe what we're told, and crypto-currency mining really is to blame for the retail price spikes on AMD's highest-tier cards, then I expect AMD to make some changes in its next generation, especially if AMD isn't cashing in on the rush and the hikes are purely merchant mark-ups. Given AMD's business struggles over recent years, I don't expect it to make that profitability mistake again. Instead, I think AMD will follow Nvidia's example.
When Nvidia capped GPGPU performance on the majority of its cards, then went on to sell the Titan and Tesla cards without that restriction at higher prices, I was OK with it. It meant gamers could buy cards built for gaming at a reasonable price, people who used their cards for both gaming and GPGPU tasks could buy a card built for both at a premium, and researchers could buy cards fully optimized for GPGPU use at an even higher premium. If AMD had done that with the R9 series, we'd have quite a few more gamers sporting brand new AMD cards this holiday season.
And back to the article... Heckuva build! It's an improvement over the previous build in just about every way, with the exception of its current cost.
I figure there will be a flood of used cards on the market in three months, as the most profitable currencies get harder to mine. But someone mentioned that before I responded. It would be REALLY, REALLY bad for AMD to spend six weeks ramping up production volume, only to see a flood of cheap used cards knock the market out from under its new-card sales. Once again, AMD is probably doing best to stick to its plans. Nobody remembers when Intel blamed overproduction by AMD for the CPU market collapse of 1999... in fact, those news articles were buried within three months. But I remember :)