AMD Ryzen Threadripper 1920X Review

VRMark, 3DMark & AotS: Escalation

Test Notes

Intel recently released new microcode for its Skylake-X processors, which reduces performance in some titles and lowers the AVX offset by two bins. We also noticed far lower Turbo Boost activation thresholds, though that could also be the result of MSI's newest BIOS. The changes likely come in response to some of the power and thermal issues we encountered during our extended testing, so we retested both Skylake-X processors with the newest microcode.

AMD's Ryzen gaming performance is also a moving target, as it continues to improve over time. Today's story reflects all processors re-tested with the latest chipset drivers, BIOS revisions, GPU drivers, and game patches. We continue seeking out the best possible performance, so in today's review we dial in AMD's Game mode for our game benchmarks.

VRMark & 3DMark

We aren't big fans of using synthetic benchmarks to measure game performance, but 3DMark's DX11 and DX12 CPU tests provide useful insight into the amount of horsepower available to game engines.

Futuremark's VRMark test lets you gauge your system's suitability for use with the HTC Vive or Oculus Rift, even if you don't currently own an HMD. The Orange Room test is based on the suggested system requirements for current-generation HTC Vive and Oculus Rift HMDs. Futuremark defines a passing score as anything above 109 FPS.

VRMark responds well to high IPC throughput and frequency, so it's not surprising to see the 1920X benefit from a clock rate advantage over the 1950X at stock settings. Extra overclocking headroom gives the 1920X a bigger boost once we start tuning.

The Intel processors lead, but it's possible that Ryzen-specific optimizations could improve Threadripper's results.

We also tested the 1920X and 1950X in Creator mode for the threaded 3DMark DX11 and DX12 tests. Big core counts propel Threadripper to the top of our charts: both overclocked Threadripper models in Creator mode deliver the best performance. Game mode, however, halves the number of available threads, leading to lackluster results.

The Vulkan API responds exceedingly well to Threadripper's architecture, and the tuned 1920X delivers excellent DX11 single-threaded performance, particularly in Game mode.

Ashes of the Singularity: Escalation

Switching the 1920X into Game mode creates a 6C/12T configuration that maximizes memory locality and eliminates die-to-die latency. But that comes at the expense of performance in parallel workloads. As such, Threadripper falls below its 8C/16T Ryzen 7 1800X counterpart in this heavily-threaded game.

We include an additional slide with test results using various settings. These numbers highlight that Game mode has a positive impact on lightly-threaded and otherwise incompatible titles, but can be a hindrance for more taxing workloads. A bit of tuning (and switching to Creator mode) pushes the Threadripper models above Intel's Core i9-7900X.

MORE: Best CPUs

MORE: Intel & AMD Processor Hierarchy

MORE: All CPUs Content

Comments
  • Aldain
    Great review as always, but on the power consumption front: given that the 1950X has six more cores than the 7900X and the 1920X has two more, they are more power efficient than the 7900X in every regard, especially relative to Threadripper's high stock clocks.
  • derekullo
    "Ryzen Threadripper 1920X comes arms with 12 physical cores and SMT"

    ...

    Judging from the Threadripper 1950X versus the Threadripper 1900X (both rated at 180W, with the 8C/16T 1900X's base clock 400 MHz higher than the 16C/32T 1950X's), we can infer that a difference of 400 MHz is worth the TDP of 16 whole threads.

    I never realized HT/SMT was that efficient. Or is AMD holding something back with the Threadripper 1900X?
  • jeremyj_83
    Your sister site AnandTech did a retest of Threadripper a while back and found that their original form of Game mode was more effective than the one supplied by AMD. What they did was disable SMT, yielding a 16C/16T CPU instead of the 8C/16T that AMD's Game mode provides. http://www.anandtech.com/show/11726/retesting-amd-ryzen-threadrippers-game-mode-halving-cores-for-more-performance/16
  • Wisecracker
    Hats off to AMD and Intel. The quantity (and quality) of processing power is simply amazing these days. Long gone are the times of taking days off (literally) for "rasterizing and rendering" workflows.

    Quote:
    ...or is AMD holding something back with the Threadripper 1900X?

    I think the better question is, "Where is AMD going from here?"

    The first revision Socket SP3r2/TR4 mobos are simply amazing, and AMD has traditionally maintained (and improved!) their high-end stuff. I can't wait to see how they use those 4094 landings and massive bandwidth over the next few years. The next iteration of the 'Ripper already has me salivating :ouch:

    I'll take 4X Summit Ridge 'glued' together, please !!
  • RomeoReject
    This was a great article. While there's no way in hell I'll ever be able to afford something this high-end, it's cool to see AMD trading punches once again.
  • ibjeepr
    I'm confused.
    "We maintained a 4.1 GHz overclock"
    Per chart "Threadripper 1920X - Boost Frequency (GHz) 4.0 (4.2 XFR)"

    So you couldn't get the XFR to 4.2?
    If I understand correctly, manually overclocking disables XFR.
    So your chip was just a lotto loser at 4.1, or am I missing something?

    EDIT: Oh, you mean 4.1 All core OC I bet.
  • sion126
    Actually, the view should be that you can't afford not to go this way. You save a lot of time with gear like this; my two 1950X rigs are killing my workload like there's no tomorrow... pretty impressive. For just gaming, maybe not... but then again, it's a solid investment that will run a long time.
  • redgarl
    Now this at 7nm...
  • AgentLozen
    redgarl said:
    Now this at 7nm...


    A big die shrink like that would be helpful but I think that Ryzen suffers from other architectural limitations.

    Ryzen has a clock speed ceiling of roughly 4.2 GHz. It's difficult to get it past there regardless of your cooling method.
    Also, Ryzen experiences nasty latency when data is shared over the Infinity Fabric. Highly threaded workloads are artificially limited when passing between dies.
    Lastly, Ryzen's IPC lags a little behind Intel's. Coupled with the relatively low clock speed ceiling, Ryzen isn't the most ideal CPU for gaming (though it holds up well at higher resolutions, to be fair).

    Threadripper and Ryzen only look as good as they do because Intel hasn't focused on improving its desktop chips in the last few years. Imagine if Ivy Bridge hadn't been a minor upgrade. If Haswell, Broadwell, Skylake, and Kaby Lake hadn't been tiny 5% improvements. What if Skylake-X weren't a concentrated fiery inferno? Zen wouldn't be a big deal if all of Intel's latest chips were as impressive as the Core 2 Duo was back in 2006.

    AMD has done an amazing job transitioning from crappy Bulldozer to Zen. They're in a position to really put the hurt on Intel but they can't lose the momentum they've built. If AMD were to address all of these problems in their next architecture update, they would really have a monster on their hands.
  • redgarl
    Sure, Billy Gates, at 1080p with an $800 CPU and an $800 GPU made by a competitor... sure...

    At 1440p and 2160p the gaming performance is the same; your multi-threaded performance, however, is still better than the overpriced Intel chips.
  • Ditt44
    <Moderator edit for rudeness>

    Who buys this class of CPU and bases that choice on how well it "games"? The performance is not "mediocre" either; it's competitive. And as for Ryzen, you do get what you pay for: exceptional value for the performance. Back under the bridge with you, please.
  • evarty
    Thank you for the review; however, I hate to be that guy, but it's 2017 and still no 1440p or 2160p benchmarks? I get that a large portion of people are still on 1080p, and I appreciate those results, but a large portion of people are also not going to be using Threadripper yet, and yet you bring us those benchmarks.

    I would think, by the same logic, that 1440p (for sure) and 2160p (maybe) benchmarks should be included, to say the least.
  • Solarion
    "Of course, switching into Game mode might enable higher performance in some situations, but we don't think professional users will tolerate constant reboots to toggle back and forth."

    A professional would simply find a more palatable solution. For instance, one could use a program like Bitsum's Process Lasso to force a troublesome application to run on specific cores.
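    The same core restriction can even be scripted; here's a minimal sketch, assuming the cross-platform psutil Python package (the process name and the CPU numbering are assumptions, not anything from the review):

    ```python
    # Sketch: pin an already-running game to logical CPUs 0-11, i.e. half of
    # the 1920X's 24 threads, approximating Game mode without a reboot.
    # "game.exe" is a hypothetical process name; whether CPUs 0-11 map to a
    # single die depends on how the OS enumerates cores.
    import psutil

    for proc in psutil.process_iter(["name"]):
        if proc.info["name"] == "game.exe":
            proc.cpu_affinity(list(range(12)))  # restrict to CPUs 0-11
            print(f"Pinned PID {proc.pid} to CPUs 0-11")
    ```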
  • sion126
    Well, I would challenge anyone who says multi-threading is limited on Threadripper. I compile a lot of code, and the more cores and threads I can spread the work over, the faster my stuff gets done. Even the Ryzen 1800X I was using before I moved to Threadripper was doing better than any i7 or i9 I tested.

    There are, for sure, some annoying latency issues that seem to freeze out everything else while a job loads up the CPU and executes, but for me it is much more effective at multi-core, multi-threaded loads simply because I have more cores and more threads.

    And the thermals are much better than Intel's; I had MAJOR issues with the i9 packages I tried before I went this way.

    I am sure for most people's average use cases Intel will be better, but for mine it is not, and I get a price bonus to boot!! :)

    I mitigated the latency issues by leaving one core free for the rest of the system to process system requests; that seems to be working well for me.
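    In the same spirit, a minimal Python sketch of the leave-one-core-free idea (the workload function is a hypothetical stand-in for a compile job, not anything from this thread):

    ```python
    # Sketch, assuming a CPU-bound batch of jobs: size the worker pool to
    # all logical CPUs minus one, leaving headroom for the rest of the
    # system, as described above.
    import os
    from multiprocessing import Pool

    def crunch(n):
        # Hypothetical CPU-bound job standing in for a compile task.
        return sum(i * i for i in range(n))

    if __name__ == "__main__":
        workers = max(1, os.cpu_count() - 1)  # e.g. 23 on a 24-thread 1950X
        with Pool(processes=workers) as pool:
            results = pool.map(crunch, [5_000_000] * 48)
        print(f"Finished {len(results)} jobs on {workers} workers")
    ```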
  • spdragoo
    580794 said:
    Thank you for the review; however, I hate to be that guy, but it's 2017 and still no 1440p or 2160p benchmarks? I get that a large portion of people are still on 1080p, and I appreciate those results, but a large portion of people are also not going to be using Threadripper yet, and yet you bring us those benchmarks. I would think, by the same logic, that 1440p (for sure) and 2160p (maybe) benchmarks should be included, to say the least.


    I know it's been said multiple times before, to multiple people, but I guess it needs to be said again: reviewers test CPUs at 1080p because a low resolution keeps the GPU from being the bottleneck, so differences between the CPUs themselves actually show up. At higher resolutions the GPU becomes the limiting factor, and the results tell you very little about the CPU.

    If you want proof, consider what happened when [H]ardOCP tested the Ryzen 7 and a Kaby Lake Core i7 against an old Sandy Bridge Core i7 with a GTX 1080 Ti (https://www.hardocp.com/article/2017/05/26/definitive_amd_ryzen_7_realworld_gaming_guide/1). For the games they tested, while they were able to see small differences in performance at 4K between the three CPUs, the key word was "small": we're talking 1-5 FPS margins in average FPS for most of them, and (except for maybe one game) percentage margins of only 5%. Maybe some people might call that a "definitive" difference; me, I call it too close to call a real winner. And their margins at 1080p were the same or lower, because instead of using that powerful GTX 1080 Ti to remove any potential GPU bottleneck, they dropped down to the GTX 1060... a fine GPU for 1080p gaming, but with it you can't tell whether it's your CPU or your GPU that's limiting performance.
  • yyk71200
    "Who pays that much for a CPU only for it to be a one-trick-pony?"
    Answer: Intel fanboys who do nothing but game.
  • littleleo
    Love this stuff and the review was decent too.
  • caustin582
    So unless you're mainly concerned with Blender, Handbrake, or advanced math/science programs, just get a 7700K.
  • mitch074
    300537 said:
    So unless you're mainly concerned with Blender, Handbrake, or advanced math/science programs, just get a 7700K.


    Pretty much all current games are optimized for four-core Intel chips from Sandy Bridge and up (if recent benchmarks such as this one on GamersNexus are any indication). So, what to get?

    As for me, I'm waiting for RAM prices to come back down to replace my Haswell + 16 GB; maybe by then Zen+ will be out, and if not I'll just grab an R7 1700 (provided Intel doesn't come up with a full-featured, cheaper, better-performing, cooler CPU by then... yeah, right).
  • D3M1G0D
    @Billy Gates

    Anybody who buys a 12-core/24-thread CPU just for playing games is an idiot. Hell, I don't even have Steam installed on my 1950X, nor do I ever intend to. I use my Threadripper system for mining and grid computing, and it is putting up dominating performance in World Community Grid. My Ryzen 7 system is for gaming.
  • rwinches
    There is no real-world case for loading up the CPU for gaming when simply moving up to a higher resolution will keep the graphics load on the GPU, where it belongs. 1080p testing is nice-to-know info but basically irrelevant; maybe if you included more multi-monitor testing.

    Prime testing is also nice to know, but it does not reflect real use.

    Those price-efficiency graphs are dubious in value, as they aggregate individual tests along with tests that simulate simultaneous operations. A 4C/8T CPU can never be a better buy in this type of use case; it is simply out of its class.
  • mitch074
    2539578 said:
    why cant amd make a cpu for gaming? why all this junk thats useless to me?


    Well, considering both the Xbox One and PS4 contain an AMD CPU, my guess is that AMD DOES make CPUs for gaming; game makers just target Intel CPUs on the PC.
    The 15-30% performance increase in existing games that got patched after Ryzen came out seems to indicate that said game makers are revising their judgment.
  • Lieutenant Tofu
    "...and a Legacy setting to disable one CCX, solving compatibility issues."
    Shouldn't this read "disable one die", since each die is composed of 2 CCXes itself?
    Thanks for the great article!
  • Wisecracker
    Quote:
    why cant amd make a cpu for gaming? why all this junk thats useless to me?

    I'll take 4X Raven Ridge APUs 'glued' together on Socket TR4, too ...
    :pt1cable: