AMD Ryzen Threadripper 1920X Review

Final Analysis

Ryzen Threadripper 1920X shares most of the features touted on Threadripper 1950X. It clearly offers strong performance in threaded applications, and it also comes with higher base clock rates and more overclocking headroom than any Ryzen model we've tested. Compared to the 1950X, you give up four cores and eight threads in exchange for a $200 savings. In return, you gain higher performance in many lightly threaded productivity applications.

AMD positions Threadripper as a solution for content creators, heavy multi-taskers, and gamers who stream to services like Twitch. It also says the processors are ideal for gaming at high resolutions (a most logical pairing, given the likely specs of a desktop with an $800 CPU). The 1920X isn't intended for low-resolution gaming, particularly with lightly threaded titles. Still, we test at lower resolutions to unearth the differences between competing architectures, rather than be bound by graphics performance.

The following gaming price efficiency charts use a geometric mean of the 99th percentile frame times (a good indicator of smoothness), which we convert into an FPS measurement and plot against price. Our suite includes six games released in 2016 and five older titles that launched in 2014/2015. Threadripper’s extra cores could enable more performance in the future as software evolves to utilize them better, so we also include a chart with newer games that exploit host processing resources more thoroughly.
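
To illustrate the conversion (a minimal sketch in Python; the frame times below are hypothetical placeholders, not our measured data):

    import math

    def suite_fps_score(frame_times_ms):
        # Convert each game's 99th-percentile frame time (ms) to FPS,
        fps = [1000.0 / t for t in frame_times_ms]
        # then combine the per-game results with a geometric mean,
        # which keeps one outlier title from skewing the overall score.
        return math.exp(sum(math.log(f) for f in fps) / len(fps))

    # Six hypothetical 99th-percentile frame times, one per game
    print(round(suite_fps_score([11.2, 13.5, 9.8, 16.1, 12.4, 14.7]), 1))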

Ryzen Threadripper 1920X drops into the gap between Intel's $600 Core i7-7820X and $1,000 Core i9-7900X. It offers less performance than the Intel processors in both new and older games, even after a substantial overclock, though those deltas will shrink at higher resolutions. The 1920X's performance is fairly comparable to the higher-end 1950X, although AMD's flagship holds a small lead in both stock and overclocked configurations.

Threadripper's true value registers in more intense workloads, such as heavy multitasking while gaming. Moreover, its hefty allotment of 60 available PCIe lanes allows for plenty of expansion. The X399 motherboards are already quite stable, and performance-enhancing firmware continues to trickle out from several vendors. We've already seen much higher gaming performance from the 1950X in Game mode, which is promising. Ryzen-specific optimizations for current titles continue surfacing as well, and we expect most new games to include similar optimizations. Gaming on Ryzen should only improve with time.

Of course, we still recommend sticking with mainstream processors like Ryzen 7/5 or Core i7/i5 for the best gaming value. That recommendation applies equally to Intel's and AMD's high-end CPUs.

Focusing more on Threadripper's core competency, the 1920X offers great performance in a few of our less demanding productivity tasks, such as the Adobe suite. Notably, the 1920X's extremely high score in Adobe Illustrator feels like an outlier, so we provide charts both with and without that test. In either case, the 1920X's frequency advantage provides more performance than Core i9-7900X in this and some other lightly threaded tasks, like decompression.

The 1920X excels in encoding and compression workloads, often matching or outstripping Intel's Core i9-7900X. The 1920X isn't as dominant in the Blender and LuxRender tests, but it delivers incredibly competitive performance, especially in light of its lower price point. It also fares well in many of our HPC and scientific workloads, highlighting its diverse capabilities.

The Threadripper processors are a solid choice for highly parallelized or simultaneous workloads. Intel still enjoys an advantage in most lightly threaded tasks. But overall, the 1920X is more competitive in these applications than the lower-frequency 1950X. Of course, switching into Game mode might enable higher performance in some situations, but we don't think professional users will tolerate constant reboots to toggle back and forth.

Intel's X299 and AMD's X399 platform costs are similar, at least by early indications. Several TR4-specific coolers have already come to market, and we expect more in the future. Surprisingly, the bundled Asetek bracket, which provides poor IHS coverage, is sufficient to attain substantial overclocks (at least by Ryzen standards). We used the bracket and a standard Thermaltake 360mm radiator to achieve a rock-solid 4.1 GHz, so cooling isn't as much of a worry here as it was with Skylake-X. Take note, Intel: solder pays off.

Intel's Skylake-X models are still trickling out, so the company will have faster options soon, but they'll launch at hideous price points. Meanwhile, the 1920X slots into the $400 chasm between Core i9-7900X and i7-7820X, and it doesn't appear that Intel will have a competing Skylake-X processor at that price any time soon. This is a tremendous opportunity for AMD, and it's great news for anyone seeking no-compromise connectivity, competitive responsiveness in everyday apps, and superior performance per dollar in threaded software.

MORE: Best CPUs

MORE: Intel & AMD Processor Hierarchy

MORE: All CPUs Content

Comments
  • Aldain
    Great review as always, but on the power consumption front: given that the 1950X has six more cores and the 1920X has two more, they are more power efficient than the 7900X in every regard, especially relative to Threadripper's high stock clocks.
  • derekullo
    "Ryzen Threadripper 1920X comes arms with 12 physical cores and SMT"

    ...

    Judging from the Threadripper 1950X versus the Threadripper 1900X, we can infer that a difference of 400 MHz is worth the TDP of 16 whole threads.

    I never realized HT/SMT was that efficient, or is AMD holding something back with the Threadripper 1900X?
  • jeremyj_83
    Your sister site AnandTech did a retest of Threadripper a while back and found that their original form of Game mode was more effective than the one supplied by AMD. What they did was disable SMT, leaving a 16C/16T CPU instead of the 8C/16T that AMD's Game mode provides. http://www.anandtech.com/show/11726/retesting-amd-ryzen-threadrippers-game-mode-halving-cores-for-more-performance/16
  • Wisecracker
    Hats off to AMD and Intel. The quantity (and quality) of processing power is simply amazing these days. Long gone are the times of taking days off (literally) for "rasterizing and rendering" workflows.

    Quote:
    ...or is AMD holding something back with the Threadripper 1900x?

    I think the better question is, "Where is AMD going from here?"

    The first revision Socket SP3r2/TR4 mobos are simply amazing, and AMD has traditionally maintained (and improved!) their high-end stuff. I can't wait to see how they use those 4094 landings and massive bandwidth over the next few years. The next iteration of the 'Ripper already has me salivating :ouch:

    I'll take 4X Summit Ridge 'glued' together, please !!
  • RomeoReject
    This was a great article. While there's no way in hell I'll ever be able to afford something this high-end, it's cool to see AMD trading punches once again.
  • ibjeepr
    I'm confused.
    "We maintained a 4.1 GHz overclock"
    Per chart "Threadripper 1920X - Boost Frequency (GHz) 4.0 (4.2 XFR)"

    So you couldn't get the XFR to 4.2?
    If I understand correctly manually overclocking disables XFR.
    So your chip was just a lotto loser at 4.1 or am I missing something?

    EDIT: Oh, you mean 4.1 All core OC I bet.
  • sion126
    Actually, the view should be that you cannot afford not to go this way. You save a lot of time with gear like this; my two 1950X rigs are killing my workload like no tomorrow... pretty impressive. For just gaming, maybe... but then again... it's a solid investment that will run a long time...
  • redgarl
    Now this at 7nm...
  • AgentLozen
    redgarl said:
    Now this at 7nm...


    A big die shrink like that would be helpful but I think that Ryzen suffers from other architectural limitations.

    Ryzen has a clock speed ceiling of roughly 4.2 GHz. It's difficult to get it past there regardless of your cooling method.
    Also, Ryzen experiences nasty latency when data is shared over the Infinity Fabric. Highly threaded workloads are artificially limited when passing between dies.
    Lastly, Ryzen's IPC lags behind Intel's a little bit. Coupled with the relatively low clock speed ceiling, Ryzen isn't the most ideal CPU for gaming (it holds up well at higher resolutions, to be fair).

    Threadripper and Ryzen only look as good as they do because Intel hasn't focused on improving their desktop chips in the last few years. Imagine if Ivy Bridge wasn't a minor upgrade. If Haswell, Broadwell, Skylake, and Kaby Lake weren't tiny 5% improvements. What if Skylake-X wasn't a concentrated fiery inferno? Zen wouldn't be a big deal if all of Intel's latest chips were as impressive as the Core 2 Duo was back in 2006.

    AMD has done an amazing job transitioning from crappy Bulldozer to Zen. They're in a position to really put the hurt on Intel but they can't lose the momentum they've built. If AMD were to address all of these problems in their next architecture update, they would really have a monster on their hands.
  • redgarl
    Sure, Billy Gates, at 1080p with an $800 CPU and an $800 GPU made by a competitor... sure...

    At 1440p and 2160p the gaming performance is the same, yet your multi-threaded performance is still better than the overpriced Intel chips.
  • Ditt44
    <Moderator edit for rudeness>

    Who buys this class of CPU and bases that choice on how well it 'games'? The performance is not 'mediocre' either; it's competitive. And as for Ryzen, you do get what you pay for: exceptional value for the performance. Back under the bridge with you, please.
  • evarty
    Thank you for the review. However, I hate to be that guy, but it's 2017 and still no 1440p and 2160p benchmarks? I get that a large portion of people are still on 1080p, and I appreciate those results, but a large portion of people are also not going to be using Threadripper yet, and yet you bring us those benchmarks.

    I would think by that same logic that 1440p (for sure) and 2160p (maybe) benchmarks should be included, to say the least.
  • Solarion
    "Of course, switching into Game mode might enable higher performance in some situations, but we don't think professional users will tolerate constant reboots to toggle back and forth."

    A professional would simply find a more palatable solution. For instance, one could use a program like Bitsum's Process Lasso to force a troublesome application to run on specific cores.
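
    That kind of pinning can even be scripted; here is a minimal sketch using Python's psutil library (the PID and core list are hypothetical placeholders):

        import psutil

        # Restrict a hypothetical process (PID 1234) to cores 0-5,
        # keeping it on a single die to sidestep cross-die latency
        psutil.Process(1234).cpu_affinity([0, 1, 2, 3, 4, 5])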
  • sion126
    Well, I would challenge anyone who says multi-threading is limited on Threadripper. I compile a lot of code, and the more cores and threads I can spread the work over, the faster my stuff gets done. Even the Ryzen 1800X I was using before I moved to Threadripper was doing better than any i7 or i9 I tested with.

    There are for sure some annoying latency issues that seem to freeze out anything else while it loads the CPU and executes a job, but for me it is much more effective at multi-core, multi-thread loads simply because I have more cores and more threads.

    And the thermal stuff is much better than Intel's; I had MAJOR issues with the i9 packages I tried before I went this way.

    I am sure that for most people's average use cases Intel will be better, but for mine it is not, and I get a price bonus to boot!! :)

    I mitigated the latency by leaving one core free for the rest of the system to process system requests; it seems to be working well for me.
  • spdragoo
    580794 said:
    Thank you for the review. However, I hate to be that guy, but it's 2017 and still no 1440p and 2160p benchmarks? I get that a large portion of people are still on 1080p, and I appreciate those results, but a large portion of people are also not going to be using Threadripper yet, and yet you bring us those benchmarks. I would think by that same logic that 1440p (for sure) and 2160p (maybe) benchmarks should be included, to say the least.


    I know it's been said multiple times before, to multiple people, but I guess it needs to be said again:

    If you'll consider it as proof, consider what happened when [H]ardOCP tested the Ryzen 7 and a Kaby Lake Core i7 against an old Sandy Bridge Core i7 with a GTX 1080 Ti (https://www.hardocp.com/article/2017/05/26/definitive_amd_ryzen_7_realworld_gaming_guide/1). For the games they tested, while they were able to see small differences in performance at 4K between the 3 CPUs, the key word was "small"; we're talking 1-5 FPS margins on average FPS for most of them, and (except for maybe 1 game) percentage margins of only 5%. Maybe some people might call those "definitive" differences; me, I call that "the performance is too close to call a real winner." And their margins at 1080p were the same or lower, because instead of using that powerful GTX 1080 Ti to minimize any potential GPU bottleneck, they dropped down to the GTX 1060 instead... a fine GPU for 1080p gaming, but you're unable to tell with it whether it's your CPU or GPU that's limiting your performance.
  • yyk71200
    "Who pays that much for a CPU only for it to be a one-trick-pony?"
    Answer: Intel fanboys who do nothing but gaming.
  • littleleo
    Love this stuff and the review was decent too.
  • caustin582
    So unless you're mainly concerned with Blender, Handbrake, or advanced math/science programs, just get a 7700K.
  • mitch074
    300537 said:
    So unless you're mainly concerned with Blender, Handbrake, or advanced math/science programs, just get a 7700K.


    Pretty much all current games are optimized for 4-core Intel chips from Sandy Bridge and up (if recent benchmarks such as this one on GamersNexus are any indication). So, what to get?

    As for me, I'm waiting for RAM prices to come back down to replace my Haswell + 16 GB; maybe by then Zen+ will be out, and if not I'll just grab an R7 1700 (provided Intel doesn't come up with a full-featured, cheaper, better-performing, cooler CPU by then - yeah, right).
  • D3M1G0D
    @Billy Gates

    Anybody who buys a 12-core/24-thread CPU just for playing games is an idiot. Hell, I don't even have Steam installed on my 1950X, nor do I ever intend to. I use my Threadripper system for mining and grid computing, and it is putting up dominating performance in World Community Grid. My Ryzen 7 system is for gaming.
  • rwinches
    There is no real-world case for loading up the CPU for gaming when simply moving up to a higher resolution will keep the graphics load on the GPU, where it belongs. 1080p testing is nice-to-know info but basically irrelevant; maybe if you included more multi-monitor testing.

    Prime testing is also nice to know, but it does not reflect real use.

    Those Price Efficiency graphs are dubious in value, as they aggregate individual tests along with tests that simulate simultaneous operations. A 4C/8T CPU can never be a better buy in this type of use case, ever; it is simply out of its class.
  • mitch074
    2539578 said:
    why cant amd make a cpu for gaming? why all this junk thats useless to me?


    Well, considering both the Xbox One and PS4 contain an AMD CPU, my guess is that AMD DOES make CPUs for gaming - but game makers target Intel CPUs only on PC.
    The 15-30% increase in performance for existing games that got patched after Ryzen came out seems to indicate that said game makers are revising their judgement.
  • Lieutenant Tofu
    "...and a Legacy setting to disable one CCX, solving compatibility issues."
    Shouldn't this read "disable one die", since each die is composed of 2 CCXes itself?
    Thanks for the great article!
  • Wisecracker
    Quote:
    why cant amd make a cpu for gaming? why all this junk thats useless to me?

    I'll take 4X Raven Ridge APUs 'glued' together on Socket TR4, too ...
    :pt1cable: