Star Swarm Stress Test
Given AMD’s use of the Star Swarm demo to show how Mantle alleviates CPU dependency, we hoped to use the DirectX-based build for the opposite purpose. But our frame rate over time graph is downright frenetic. It’s hard to know whether a 300-second sample accurately pits these platforms against each other.
To be fair, Oxide Games concedes the non-deterministic nature of its stress test. It’s the same issue we face trying to benchmark Arma 3 and Battlefield 4’s multi-player components—as soon as you involve the AI calculations needed to tax a processor, run-to-run variability starts affecting the results. Removing those calculations would shift the bottleneck back over to graphics.
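One way to quantify that variability (our sketch, not something Oxide provides) is to repeat the benchmark several times and compute the coefficient of variation of the average frame rates. The FPS numbers below are hypothetical, purely to illustrate the idea:

```python
import statistics

def run_variability(fps_runs):
    """Summarize run-to-run variability for repeated benchmark passes.

    fps_runs: list of average-FPS results, one per pass.
    Returns (mean FPS, coefficient of variation as a percent).
    """
    mean = statistics.mean(fps_runs)
    cv = statistics.stdev(fps_runs) / mean * 100
    return mean, cv

# Hypothetical Star Swarm results from five passes on the same CPU:
runs = [42.1, 47.8, 39.5, 45.2, 50.3]
mean, cv = run_variability(runs)
print(f"mean {mean:.1f} FPS, CV {cv:.1f}%")
```

A coefficient of variation of several percent between passes would be larger than the gaps separating these CPUs, which is exactly why a single 300-second sample struggles to rank them reliably.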
Thief
The Core i7-5820K shows up at the top of another gaming chart, again followed by Core i7-4790K. Not that the results in Thief are particularly telling. All of these CPUs are fast enough to keep up with a single GeForce GTX Titan.
Tomb Raider
Tomb Raider has the -4790K on top of the -5820K, though both CPUs trail Intel’s Core i7-3970X. In reality, there’s just no way you’d be able to distinguish between any of these platforms, particularly considering their low frame time variance numbers.
World of Warcraft
WoW is another game known for exaggerating platform characteristics. And you can add it to the list of titles particularly fond of Intel’s Core i7-5820K, with the -4790K not far behind. Flip through to the frame rate over time chart, and you’ll see a tight grouping through our benchmark run.
If anything, the Core i7-5960X’s lower clock rate negatively affects its frame time variance result. The same holds true in almost every other game benchmark, too.
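The article reports frame time variance without spelling out the metric. One common way to express it (an illustrative sketch, not necessarily the exact method used here) is the 95th-percentile difference between consecutive frame times, which captures stutter that average FPS hides:

```python
def frame_time_variance(frame_times_ms):
    """95th-percentile absolute difference between consecutive frame
    times, in milliseconds. Large values mean visible stutter even
    when the average frame rate looks fine."""
    deltas = [abs(b - a) for a, b in zip(frame_times_ms, frame_times_ms[1:])]
    deltas.sort()
    idx = int(0.95 * (len(deltas) - 1))
    return deltas[idx]

# Hypothetical trace: mostly smooth ~16.7 ms frames with one hitch
trace = [16.7, 16.6, 16.8, 16.7, 33.4, 16.7, 16.6, 16.7, 16.8, 16.7]
print(f"{frame_time_variance(trace):.1f} ms")
```

A lower-clocked chip like the -5960X takes longer on the occasional heavy frame, which inflates exactly this kind of consecutive-frame delta even when its average frame rate stays competitive.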
- Three New CPUs For Enthusiasts
- X99, LGA 2011-3 and DDR4: Get Ready For A Big Upgrade
- How We Tested Core i7-5960X, -5930K, And -5820K
- Synthetic Benchmarks
- Real-World Benchmarks
- Battlefield 4, Grid 2, And Metro: Last Light
- Star Swarm, Thief, Tomb Raider, And WoW
- Power, In Depth: Stock Clock Rates
- Power, In Depth: Eight and Six Cores at 3.5 GHz
- Power, In Depth: Eight and Six Cores at 4 GHz
- Power, In Depth: Eight and Six Cores at 4.5 GHz
- Power, In Depth: CPU Health at 4.8 GHz
- Measuring DDR4 Power Consumption
- Power Consumption Through Our Benchmark Suite
- Intel Keeps Enthusiasts On Its Most Modern Design With Haswell-E
$1000 is affordable to you?
Though you have a point here, the guy buying such CPUs most likely will game at above 1080p... but that would have implied using at least two GPUs in the test.
Bit disappointed not to see a comparison with the Xeon E5-1650v2 (or 1660v2), as the 2600 is a bit overkill considering prices. Some of us just need a workstation with ECC RAM and not a free-for-all (i.e. someone else is paying) Xeon 2600 fest.
I have a hunch that we will never see anything like this in the comment sections of AMD reviews. Not sure why.
Er, no. No it's not the first eight core processor. It is the first eight-core consumer or Core iN series processor though.
I also don't know of any unofficial 8-core processors either.
Intel Core i7-5960X, -5930K, And -5820K CPU Review: Haswell-E Rises
I was wondering how often you writers read the comments? Just wondering.
Gee. DDR4 saves about 5 W with four modules. And I was worried about power consumption when I overclocked my FX 8350 to 4.7 GHz.
Ya, the 5820K really stands out, especially in comparison to Intel's previous lowest SKU processors on X79. For the first time the x820 actually looks like a great option to go with. It's the same as a 3960X in clock speed and core count, except it's Haswell which seems to result in a 10-15% performance boost, and it's over $600 cheaper. The only drawback might be if you have a lot of high bandwidth PCIe cards, but I doubt that'll be an issue for most enthusiasts.
And omg that price:
http://www.microcenter.com/product/437203/Intel_Core_i7-5820k_33_GHz_LGA_2011_V3_Tray_Processor
... I love Microcenter.
The improvement in multi-threaded workloads is good. It's the biggest per-generation improvement we have seen since Gulftown.
I'm running a 780 ti and Gskill Ripjaw 1600 RAM.
How would the cost of said systems compare, assuming we could create them as equal as possible? Would the performance benefits of the 5820 justify the additional cost?
I'm still running on my old X58 i7 920, but it's starting to BSOD on CPU-intensive games (although I suspect it's my mobo that's the issue)...
I wanted to build a new system this year, but don't want to make the same mistake I did with the X58 and be left with something that simply can't be upgraded after a year or so. At the same time, I don't want to buy into old tech if that too won't last.
I have had a good run with my X58, mind, but am wary Intel may do what they did with my gen-1 i7 and change something fundamental with the platform/DDR4, meaning I'll be 'stuck' with whatever I buy now...