Test Notes
All AMD entries with "PBO" indicate an auto-overclocked configuration paired with DDR4-3600. Intel's overclocked configurations also use DDR4-3600. Our Threadripper 2970WX sample doesn't respond well to AMD's auto-overclocking PBO feature, generating incessant BSODs even after numerous hours of experimentation. That issue seems confined to our sample; our other Threadripper processors behave correctly with the feature enabled. As such, we can't provide benchmarks with an overclocked 2970WX, but the stock test results fall within our expectations.
We tested Far Cry 5 and Dawn of War III in Game Mode with the 3970X and 3960X, but we tested the remainder of the games in the standard Creator Mode (all cores/threads active). We tested the Ryzen Threadripper 2990WX and 2970WX in Game Mode for all gaming tests.
We expect these benchmark deltas to shrink at the higher resolutions more typical of the class of machines these chips power, but only because of a graphics-imposed bottleneck. As such, we stick to the standard FHD resolution for testing to expose the differences between processors.
VRMark and 3DMark
The -10980XE faces two AMD competitors in these benchmarks: the Ryzen 9 3950X and the Threadripper 2970WX. The former delivers impressive gaming performance, while the latter clearly isn't as agile and has to run in Game Mode with a portion of its threaded horsepower left unused. That mode requires a reboot and its impact varies from game to game, so your mileage will vary. In either case, incessant reboots aren't a great selling point if you game frequently. AMD has largely addressed those concerns with third-gen Threadripper, but it doesn't have a competing 3000-series chip in the -10980XE's price bracket.
The 3DMark DX12 and DX11 tests measure the amount of raw horsepower the processor exposes to game engines, but most game engines don't scale linearly with additional compute resources. These tests reward the -10980XE's overclocking prowess with large leads after tuning, but the Ryzen 9 3950X leads at stock settings. We can also see the impact of the -10980XE's heightened mid-range turbo boosts in the thread-friendly DX12 benchmark, where it surpasses the -9980XE by a decent margin.
VRMark responds well to high per-core performance, so overclocking pushes the -10980XE to the top of the heap. The chip provides slightly more performance than its predecessor at stock settings, with the extra 300 MHz of boost speed leading to a ~7 fps advantage. Again, the 3950X outpaces the stock -10980XE.
Civilization VI AI and Stockfish
The Civilization VI AI test measures AI performance in a turn-based strategy game and is heavily influenced by high clock rates and instructions per cycle (IPC) throughput.
Here we see the 3950X edge out the -10980XE at stock settings, but in what will become a recurring theme in this set of benchmarks, Intel's flagship is impressive after overclocking. However, the 3950X isn't too shabby after overclocking, either. Intel's slim advantage after tuning comes at a $230 premium and requires a more robust cooling solution, so keep that in mind as you flip through the results. Meanwhile, the 2970WX isn't competitive, notching the lowest performance of the second-gen Threadripper lineup.
The open-source Stockfish AI chess engine is the polar opposite of the Civilization VI engine. It is designed specifically for many-core chips and scales well up to 512 cores, which is music to third-gen Threadripper's ears. Those models blast to the top of the charts, but they aren't relevant in this price class. The 3950X is impressive at stock settings, while the -10980XE trails the -9980XE slightly (the two also effectively tie in the Civilization VI AI test), indicating that the new chip isn't going to be a universal step forward. However, the -10980XE does expose some additional overclocking headroom that yields big gains.
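Stockfish's parallelism is exposed through its UCI "Threads" option, which is how the engine's search is typically scaled across a chip's cores. Below is a minimal, hypothetical sketch of probing that scaling yourself; it assumes a stockfish binary on the PATH, and the depth and thread counts are arbitrary illustration values rather than our benchmark settings.

```python
import subprocess

# Hypothetical sketch: drive Stockfish's UCI interface and vary the thread count.
# Assumes a "stockfish" binary on the PATH; depth/thread values are illustrative.

def nodes_per_second(threads: int, depth: int = 22) -> int:
    engine = subprocess.Popen(
        ["stockfish"], stdin=subprocess.PIPE, stdout=subprocess.PIPE, text=True
    )

    def send(cmd: str) -> None:
        engine.stdin.write(cmd + "\n")
        engine.stdin.flush()

    send("uci")
    send(f"setoption name Threads value {threads}")  # engine searches with this many threads
    send("position startpos")
    send(f"go depth {depth}")

    nps = 0
    for line in engine.stdout:            # "info ... nps N ..." lines stream in during the search
        tokens = line.split()
        if "nps" in tokens:
            nps = int(tokens[tokens.index("nps") + 1])
        if line.startswith("bestmove"):   # search finished
            break

    send("quit")
    engine.wait()
    return nps

for n in (1, 4, 16):
    print(f"{n:>2} threads: {nodes_per_second(n):,} nodes/s")
```

On many-core parts the reported nodes-per-second figure climbs steeply as the thread count rises, which is the scaling behavior this benchmark rewards.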
AMD's explosive gen-on-gen Threadripper performance improvement, borne of a new architecture and manufacturing process, is impressive, but the 2970WX suffers at the hands of its distributed memory architecture.
Ashes of the Singularity: Escalation
Ashes of the Singularity: Escalation responds well to extra cores and threads, which benefits the Ryzen lineup. Intel's -10980XE suffers from erratic frame latency during our test; we retested multiple times and reinstalled the game and drivers in an attempt to rectify the issue, but the condition is repeatable and carries over to the overclocked configuration, too. As we can see, this results in a lower 99th percentile frame rate, and the same trend applies to the W-3175X and the -9980XE. We theorize this stems from Intel's mesh architecture, present only on Intel's HEDT and data center processors, which can negatively impact performance with unoptimized software. It's also possible the issue is exacerbated by an early firmware revision for the refreshed X299 platform, or a lack of driver/game engine tuning. We also notice the -10980XE trails the -9980XE in both stock and overclocked configurations.
Overclocking helps, but the -10980XE at 4.8 GHz trails the previous-gen -9980XE at 4.6 GHz. Threadripper 2970WX trails the pack again.
Civilization VI Graphics Test
The Civilization VI graphics test finds the stock Ryzen 9 3950X delivering excellent performance given its price point. That reminds us that these HEDT processors aren't the best fit for gamers – most enthusiasts are better served by mid-range and high-end mainstream chips.
Intel's overclocking advantage comes into play once again, with the Core i9-10980XE taking a convincing lead. The 2970WX simply isn't competitive here, even when we consider its slightly lower price point.
Dawn of War III
It isn't surprising to see the overclocked Intel HEDT chips take the top of the Dawn of War III chart due to their per-core performance advantage. Whatever issues plague the -10980XE in some titles aren't a factor here: Intel's new chip takes a big step forward over the -9980XE.
Again, AMD's Ryzen 9 3950X is impressive here. The -9900K is faster at a much lower price point, but you'll lose out on performance in threaded applications.
Far Cry 5
Intel's -9900K leads the pack, and overclocking would open an even larger divide. The -10980XE is also particularly impressive after overclocking, but trails the 3950X at stock settings. After overclocking, the 2950X uncharacteristically experiences a big jump in performance with this title, but it largely profits from its overclocked memory.
Final Fantasy XV
We run this test with the standard quality preset to sidestep the impact of a bug that causes the game engine to render off-screen objects with the higher-resolution setting.
Intel's HEDT chips flex their gaming muscle when the game engine cooperates. Here the chips take the lead across the board at both stock and overclocked settings, pushing us close to a graphics-imposed bottleneck. The Core i9-10980XE also takes a decent step forward over the -9980XE at stock settings.
The Ryzen 9 3950X trails substantially, and the 2970WX continues to be a non-factor in the gaming conversation.
Grand Theft Auto V
Grand Theft Auto V continues to be popular six long years after its release. This title favors Intel architectures and, more generally, multi-core designs with high clock rates. Intel's chips lead across the board, and we spot a few significant outliers from both Threadripper 3000 models and the Intel -10980XE that manifested as hitching during the benchmark sequence.
The Ryzen 9 3950X trails the stock -10980XE by 2.7 fps. Tuning the 3950X essentially yields a tie, but turning the dial to 4.8GHz on the -10980XE propels it into rarefied air.
Hitman 2
Once again, overclocking the -10980XE enables chart-topping performance and the chip also delivers a nice bump over its predecessor at stock settings. The 3950X isn't as impressive in this title, and engaging the auto-overclocking PBO feature doesn't deliver much extra performance.
Project Cars 2
Project Cars 2 is optimized for threading, but high clock rates pay off. As expected, that results in a win for Intel's overclocked processors. The stock -10980XE handily beats the 2970WX, but trails the 3950X.
The Core i9-9900K is also impressive, and overclocking it would hand it the unequivocal win in this title.
World of Tanks enCore
The power of Intel overclocking in gaming is apparent again as the -10980XE separates itself from the rest of the test pool.
Ful4n1t0c0sme: Game benchmarks on a non-gamer CPU make no sense. Please do compiling benchmarks and other stuff that makes sense.
And please stop using Windows to do that.
Pat Flynn: Ful4n1t0c0sme said: <snip>
While I agree that some Linux/Unix benchmarks should be present, the inclusion of gaming benchmarks helps not only prosumers, but game developers as well. It'll let them know how the CPU handles certain game engines, and whether or not they should waste tons of money on upgrading their dev teams' systems.
Re: I used to build systems for BioWare...
PaulAlcorn: Ful4n1t0c0sme said: <snip>
Intel markets these chips at gamers, so we check the goods.
9 game benchmarks
28 workstation-focused benchmarks
40 consumer-class application tests
boost testing
power/thermal testing, including efficiency metrics
overclocking testing/data.
I'm happy with that mix.
IceQueen0607: Disclaimer: I badly want to dump Intel and go AMD. But are the conditions right?
The AMD 3950X has 16 PCIe lanes, right? So for those of us who have multiple adapters such as RAID cards, USB or SATA port adapters, 10G NICs, etc, HEDT is the only way to go.
Someone once told me "No one in the world needs more than 16PCIe lanes, that's why mainstream CPUs have never gone over 16 lanes". If that were true the HEDT CPUs would not exist.
So we can say the 3950X destroys the Intel HEDT lineup, but only if you don't have anything other than ONE graphics card. As soon as you add other devices, you're blown.
The 3970X is $3199 where I am. That will drop by $100 by 2021.
The power consumption of 280w will cost me an extra $217 per year per PC. There are 3 HEDT PCs, so an extra $651 per year.
AMD: 1 PC @ 280w for 12 hours per day for 365 days at 43c per kilowatt hour = $527.74
Intel: 1 PC @ 165w for 12 hours per day for 365 days at 43c per kilowatt hour = $310.76
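For reference, those annual figures follow from a simple energy-times-rate calculation; here is a minimal sketch of that arithmetic under the comment's own assumptions (TDP treated as a constant draw, a 12-hour daily duty cycle, 43c/kWh). As a reply below notes, real-world draw varies with load, so these are worst-case-style estimates.

```python
# Minimal sketch of the annual power-cost arithmetic above. It treats the rated
# TDP as a constant draw for 12 hours/day at 43c/kWh -- a simplification, since
# actual consumption varies with load.

def annual_cost(watts: float, hours_per_day: float = 12.0, rate_per_kwh: float = 0.43) -> float:
    kwh_per_year = watts / 1000.0 * hours_per_day * 365
    return kwh_per_year * rate_per_kwh

amd = annual_cost(280)    # ~$527
intel = annual_cost(165)  # ~$311
print(f"AMD @ 280W:   ${amd:.2f}/yr")
print(f"Intel @ 165W: ${intel:.2f}/yr")
print(f"Delta: ${amd - intel:.2f}/yr per PC; x3 PCs = ${3 * (amd - intel):.2f}/yr")
```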
My 7900X is overclocked to 4.5GHZ all cores. Can I do that with any AMD HEDT CPU?
In summer the ambient temp here is 38-40 degrees Celsius. With a 280mm cooler and 11 case fans, my system runs 10 degrees over ambient at idle, so 50c is not uncommon during the afternoons. Put the system under load and it easily sits at 80c and is very loud.
With a 280w CPU, how can I cool that? The article says that "Intel still can't deal with heat". Errr... Isn't 280w going to produce more heat than 165w? And isn't 165w much easier to cool? Am I missing something?
I'm going to have to replace motherboard and RAM too. That's another $2000 - $3000. With Intel my current memory will work and a new motherboard will set me back $900.
Like I said, I really want to go AMD, but I think the heat, energy and changeover costs are going to be prohibitive. PCIe4 is a big draw for AMD as it means I don't have to replace again when Intel finally gets with the program, but the other factors I fear are just too overwhelming to make AMD viable at this stage.
Darn it, Intel is way cheaper when looked at from this perspective.
redgarl: Ful4n1t0c0sme said: <snip>
It's over, pal... done. There isn't even a single way to look at it in a bright way; the 3950X is making the whole Intel HEDT offering a joke.
I would have given this chip 2 stars, but we know Tom's and their double standards. The only time they can't do it is when the data is just plain impossible to contest... like AnandTech described, it is a bloodbath.
I don't believe Intel will come back from this anytime soon.
PaulAlcorn: IceQueen0607 said: <snip>
The AMD 3950X has 16 PCIe lanes, right? So for those of us who have multiple adapters such as RAID cards, USB or SATA port adapters, 10G NICs, etc, HEDT is the only way to go.
<snip>
The article says that "Intel still can't deal with heat".
<snip>
I agree with the first point here, which is why we point out that Intel has an advantage there for users that need the I/O.
On the second point, can you point me to where it says that in the article? I must've missed it. Taken in context, it says that Intel can't deal with the heat of adding more 14nm cores in the same physical package, which is accurate if it wants to maintain a decent clock rate.
ezst036: I'm surprised nobody caught this from the second paragraph of the article.
Intel's price cuts come as a byproduct of AMD's third-gen Ryzen and Threadripper processors, with the former bringing HEDT-class levels of performance to mainstream 400- and 500-series motherboards, while the latter lineup is so powerful that Intel, for once, doesn't even have a response.
For twice? This recalls the olden days of the first-gen Slot A Athlon processors. Now, I'm not well-versed in Tom's Hardware articles circa 1999, but this was not hard to find at all:
Coppermine's architecture is still based on the architecture of Pentium Pro. This architecture won't be good enough to catch up with Athlon. It will be very hard for Intel to get Coppermine to clock frequencies of 700 and above and the P6-architecture may not benefit too much from even higher core clocks anymore. Athlon however is already faster than a Pentium III at the same clock speed, which will hardly change with Coppermine, and Athlon is designed to go way higher than 600 MHz. This design screams for higher clock speeds! AMD is probably for the first time in the very situation that Intel used to enjoy for such a long time. AMD might already be able to supply Athlons at even higher clock rates right now (650 MHz is currently the fastest Athlon), but there is no reason to do so.
https://www.tomshardware.com/reviews/athlon-processor,121-16.html
Intel didn't have a response back then either.
bigpinkdragon286: IceQueen0607 said: <snip>
TDP is the wrong way to directly compare an Intel CPU with an AMD CPU. Neither vendor measures TDP in the same fashion, so you should not compare them directly. On the most recent platforms, you get more work done per watt consumed on the new AMD platform, plus most users don't have their chips running at max power 24/7, so why would you calculate your power usage against TDP even if it were comparable across brands?
Also, needing all of your cores clocked to a particular, arbitrarily chosen speed is a less-than-ideal metric if speed doesn't directly correlate to completed work, which, after all, is essentially what we want from a CPU.
If you really need to get so much work done that your CPU runs at its highest power usage perpetually, the higher cost of the power consumption is hardly going to be your biggest concern.
How about idle and average power consumption, or completed work per watt, or even overall completed work in a given time frame, which make a better case for AMD's current level of competitiveness?
Crashman: ezst036 said: <snip>
Fun times. The Tualatin was based on Coppermine and went to 1.4 GHz, outclassing Willamette at 1.8 GHz by a wide margin. Northwood came out and beat it, but at the same time Intel was developing Pentium M based on... guess what? Tualatin.
And then Core came out of Pentium M, etc etc etc and it wasn't long before AMD couldn't keep up.
Ten years we waited for AMD to settle the score, and now it's time to enjoy their moment in the sun.
IceQueen0607: PaulAlcorn said: <snip>
Yes, sorry, my interpretation was not worded accurately.
Intel simply doesn't have room to add more cores, let alone deal with the increased heat, within the same package.
My point was that Intel is still going to be easier to cool, producing only 165w vs AMD's 280w.
How do you calculate the watts, or heat for an overclocked CPU? I'm assuming the Intel is still more over-clockable than the AMD, so given the 10980XE's base clock of 3.00ghz, I wonder if I could still overclock it over 4.00ghz. How much heat would it produce then compared to the AMD?
Not that I can afford to spend $6000 to upgrade to the 3970X or $5000 to upgrade to the 3960X... And the 3950X is out because of PCIe lane limitations.
It looks like I'm stuck with Intel, unless I save my coins to go AMD. Makes me sick to the pit of my stomach :)