AMD Ryzen Threadripper 1920X Review

Power Consumption

We measure the package's power consumption using a special sensor loop, so our values represent the exact amount of power that goes into the CPU and reemerges as waste heat dissipated by the cooling subsystem. We verify our sensor readings using shunts and by measuring overall power consumption directly at the EPS connector (with a current probe and direct voltage measurement).
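
The arithmetic behind that EPS cross-check is simple; here is a minimal sketch in Python, assuming a hypothetical CSV log of voltage and current samples captured at the connector (the column names and file layout are our own placeholders):

```python
# Minimal sketch: derive average package power from logged EPS12V readings.
# The CSV layout and column names here are hypothetical placeholders.
import csv

def average_package_power(log_path):
    watts = []
    with open(log_path, newline="") as f:
        for row in csv.DictReader(f):
            volts = float(row["eps_voltage_v"])  # direct voltage measurement
            amps = float(row["eps_current_a"])   # current probe reading
            watts.append(volts * amps)           # instantaneous power P = V * I
    return sum(watts) / len(watts)

print(f"Average package power: {average_package_power('eps_log.csv'):.1f} W")
```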

AMD's Threadripper CPUs use different partial voltages for the SoC and SMU rails at different clock rates, and those frequency-dependent voltages influence the package's power consumption. AMD recommended that we use the profile included with its DDR4-3200 kit, but if we instead use the standard SPD values for DDR4-2133, our power measurement is 15W lower!

Both of AMD's CPUs are designed for a maximum power ceiling of 180W at their default settings. If the memory is overclocked, the cores have 15 fewer watts to work with, which could affect performance in workloads that utilize all cores and, consequently, push right up against the limit.
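
To make the resulting headroom explicit, here is a toy illustration of the budget math using the figures above:

```python
# Toy illustration of the power-budget math described above.
PACKAGE_LIMIT_W = 180    # stock package power ceiling for both Threadripper CPUs
MEMORY_OC_DELTA_W = 15   # extra draw we measured at DDR4-3200 vs. DDR4-2133

core_budget_w = PACKAGE_LIMIT_W - MEMORY_OC_DELTA_W
print(f"Power left for the cores with DDR4-3200: {core_budget_w} W")  # 165 W
```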

Idle Power Consumption

Threadripper's idle power consumption is roughly twice that of the Ryzen 7 models. However, Threadripper hosts two dies instead of one, and it hits higher clock rates under sporadic loads. The overclocked configuration requires higher voltages as well, and memory also plays a role in power consumption. For instance, dropping to DDR4-2133 pulls the 1920X's idle power use down to 32W.

CAD Workload Power Consumption

AutoCAD 2016 rarely uses more than two or three cores; in fact, most of the time it's limited to a single core. Thus, it's not surprising that the CAD workload adds a maximum of only 15W to the idle power numbers. The two overclocked configurations add another 14W, making for an almost 30W difference compared to our idle results.

Gaming Power Consumption

When it comes to gaming, Threadripper's MCM design causes its many cores to get in each other's way, so the frame rates we report end up lower than those of competing processors. But its power consumption ends up similar to that of Intel's Core i9-7900X, even though Skylake-X offers much more performance.

Stress Test & Maximum Power Consumption

Power consumption goes through the roof during our stress test, especially for the overclocked configurations.

The motherboard is partially to blame for the stock Intel Core i9-7900X's excessively high numbers. It doesn't obey the standard Turbo Boost limits, instead boosting aggressively and staying in those boost states longer than specified. For more details, see our article about the power and thermal issues we encountered during our extended testing.

Threadripper doesn't have those kinds of issues. The Asus X399 ROG Zenith Extreme limits power consumption to exactly 180W at stock settings, just as it should.

At a respectable 1.425V, the Ryzen Threadripper 1920X reaches 4.1 GHz. The higher-end 1950X needs 1.35V to achieve 3.9 GHz. Once overclocked, AMD’s new processors join Intel's Core i9-7900X overclocked to 4.5 GHz in the stratosphere beyond 300W.

In the end, Threadripper's two dies sometimes consume more power than other processors’ single dies, depending on the task. We succeeded in breaking the 4 GHz barrier by overclocking the 1920X to 4.1 GHz. At that speed, all 24 threads were fully functional and at our disposal. The high power consumption is acceptable if it's accompanied by comparably elevated application performance. For Threadripper, that requires highly parallelized workloads (and perhaps optimized software).

Unfortunately, Threadripper's efficiency during gaming turns out to be significantly worse than Intel's. Threadripper draws an additional ~15W at idle due to the memory; subtracting that 15W from AMD's gaming power consumption changes the picture, bringing the numbers in line with the lower gaming performance.

MORE: Best CPUs

MORE: Intel & AMD Processor Hierarchy

MORE: All CPUs Content

  • Aldain
    Great review as always, but on the power consumption front, given that the 1950X has six more cores and the 1920X has two more, they are more power efficient than the 7900X in every regard relative to the high stock clocks of the TR.
  • derekullo
    "Ryzen Threadripper 1920X comes arms with 12 physical cores and SMT"

    ...

    Judging from the Threadripper 1950X versus the Threadripper 1900X, we can infer that a difference of 400MHz is worth the TDP of 16 whole threads.

    I never realized HT / SMT was that efficient... or is AMD holding something back with the Threadripper 1900X?
  • jeremyj_83
    Your sister site AnandTech did a retest of Threadripper a while back and found that their original form of Game Mode was more effective than the one supplied by AMD. What they did was disable SMT to get a 16c/16t CPU instead of the 8c/16t that AMD's Game Mode provides. http://www.anandtech.com/show/11726/retesting-amd-ryzen-threadrippers-game-mode-halving-cores-for-more-performance/16
  • Wisecracker
    Hats off to AMD and Intel. The quantity (and quality) of processing power is simply amazing these days. Long gone are the times of taking days off (literally) for "rasterizing and rendering" workflows.

    Quote:
    ...or is AMD holding something back with the Threadripper 1900x?

    I think the better question is, "Where is AMD going from here?"

    The first revision Socket SP3r2/TR4 mobos are simply amazing, and AMD has traditionally maintained (and improved!) their high-end stuff. I can't wait to see how they use those 4094 landings and massive bandwidth over the next few years. The next iteration of the 'Ripper already has me salivating :ouch:

    I'll take 4X Summit Ridge 'glued' together, please !!
  • RomeoReject
    This was a great article. While there's no way in hell I'll ever be able to afford something this high-end, it's cool to see AMD trading punches once again.
  • ibjeepr
    I'm confused.
    "We maintained a 4.1 GHz overclock"
    Per chart "Threadripper 1920X - Boost Frequency (GHz) 4.0 (4.2 XFR)"

    So you couldn't get the XFR to 4.2?
    If I understand correctly manually overclocking disables XFR.
    So your chip was just a lotto loser at 4.1 or am I missing something?

    EDIT: Oh, you mean 4.1 All core OC I bet.
  • sion126
    Actually, the view should be that you cannot afford not to go this way. You save a lot of time with gear like this; my two 1950X rigs are killing my workload like no tomorrow... pretty impressive. For just gaming, maybe... but then again, it's a solid investment that will run a long time...
  • redgarl
    Now this at 7nm...
  • AgentLozen
    redgarl said:
    Now this at 7nm...


    A big die shrink like that would be helpful but I think that Ryzen suffers from other architectural limitations.

    Ryzen has a clock speed ceiling of roughly 4.2GHz; it's difficult to get it past that point regardless of your cooling method.
    Also, Ryzen experiences nasty latency when data is shared over the Infinity Fabric, so highly threaded workloads are artificially limited when passing between dies.
    Lastly, Ryzen's IPC lags behind Intel's a little bit. Coupled with the relatively low clock speed ceiling, Ryzen isn't the most ideal CPU for gaming (though it holds up well at higher resolutions, to be fair).

    Threadripper and Ryzen only look as good as they do because Intel hasn't focused on improving its desktop chips in the last few years. Imagine if Ivy Bridge weren't a minor upgrade. If Haswell, Broadwell, Skylake, and Kaby Lake weren't tiny 5% improvements. What if Skylake-X weren't a concentrated fiery inferno? Zen wouldn't be a big deal if all of Intel's latest chips were as impressive as the Core 2 Duo was back in 2006.

    AMD has done an amazing job transitioning from crappy Bulldozer to Zen. They're in a position to really put the hurt on Intel but they can't lose the momentum they've built. If AMD were to address all of these problems in their next architecture update, they would really have a monster on their hands.
  • redgarl
    Sure, Billy Gates, at 1080p with an $800 CPU and an $800 GPU made by a competitor... sure...

    At 1440p and 2160p the gaming performance is the same; however, your multi-threaded performance is still better than the overpriced Intel chips.
  • Ditt44
    <Moderator edit for rudeness>

    Who buys this class of CPU and bases that choice on how well it 'games'? The performance is not 'mediocre' either. It's competitive. And as for Ryzen, you do get what you pay for: Exceptional value to performance. Back under the bridge with you, please.
  • evarty
    Thank you for the review; however, I hate to be that guy, but it's 2017 and still no 1440p and 2160p benchmarks? I get that a large portion of people are still on 1080p, and I appreciate those results, but a large portion of people are also not gonna be using Threadripper yet, yet you're bringing us those benchmarks.

    I would think by that same logic that 1440p (for sure) and 2160p (maybe) benchmarks should be included, to say the least.
  • Solarion
    "Of course, switching into Game mode might enable higher performance in some situations, but we don't think professional users will tolerate constant reboots to toggle back and forth."

    A professional would simply find a more palatable solution. For instance, one could use a program like Bitsum's Process Lasso to force a troublesome application to run on specific cores.
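
    The same idea can also be scripted; here's a minimal sketch with Python's psutil (the process name below is just a placeholder - Process Lasso does this through a GUI):

    ```python
    # Minimal sketch: pin a process to the first eight logical CPUs with psutil.
    # "app.exe" is a placeholder name; works on Windows and Linux.
    import psutil

    for proc in psutil.process_iter(["name"]):
        if proc.info["name"] == "app.exe":     # placeholder target application
            proc.cpu_affinity(list(range(8)))  # restrict to logical CPUs 0-7
            print(f"Pinned PID {proc.pid} to cores {proc.cpu_affinity()}")
    ```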
  • sion126
    Well, I would challenge anyone who says multi-threading is limited on Threadripper. I compile a lot of code, and the more cores and threads I can spread over the CPU, the faster my stuff gets done. Even the Ryzen 1800X I was using before I moved to Threadripper was doing better than any i7 or i9 I tested with.

    There are for sure some annoying latency issues that seem to freeze out anything else while it loads the CPU and executes a job, but for me it is much more effective at multi core multi thread loads simply because I have more cores and more threads.

    And the thermal stuff is much better than Intel, I had MAJOR issues with the i9 packages I tried before I went this way.

    I am sure that for most people's average use cases Intel will be better, but for mine it is not, and I get a price bonus to boot!! :)

    I mitigated this by leaving one core free for the rest of the system to process its requests; that seems to be working well for me.
  • spdragoo
    580794 said:
    Thank you for the review; however, I hate to be that guy, but it's 2017 and still no 1440p and 2160p benchmarks? I get that a large portion of people are still on 1080p, and I appreciate those results, but a large portion of people are also not gonna be using Threadripper yet, yet you're bringing us those benchmarks. I would think by that same logic that 1440p (for sure) and 2160p (maybe) benchmarks should be included, to say the least.


    I know it's been said multiple times before, to multiple people, but I guess it needs to be said again:

    If you'll consider it as proof, consider what happened when [H]ardOCP tested the Ryzen 7 & a Kaby Lake Core i7 against an old Sandy Bridge Core i7 with a GTX 1080 Ti (https://www.hardocp.com/article/2017/05/26/definitive_amd_ryzen_7_realworld_gaming_guide/1). For the games they tested, while they were able to see small differences in performance at 4K between the 3 CPUs, the key word was "small"; we're talking 1-5 FPS margins on average FPS for most of them, and (except for maybe one game) percentage margins of only 5%. Maybe some people might call those "definitive" differences; me, I call that "too close to call a real winner." And their margins at 1080p were the same or lower, because instead of using that powerful GTX 1080 Ti to rule out any potential GPU bottleneck, they dropped down to the GTX 1060 instead... a fine GPU for 1080p gaming, but with it you can't tell whether it's your CPU or your GPU that's limiting performance.
  • yyk71200
    "Who pays that much for a CPU only for it to be a one-trick-pony?"
    Answer: Intel fanboys who do nothing but gaming.
  • littleleo
    Love this stuff and the review was decent too.
  • caustin582
    So unless you're mainly concerned with Blender, Handbrake, or advanced math/science programs, just get a 7700K.
  • mitch074
    300537 said:
    So unless you're mainly concerned with Blender, Handbrake, or advanced math/science programs, just get a 7700K.


    Pretty much all current games are optimized for 4-core Intel chips from Sandy Bridge and up (if recent benchmarks such as this one on GamersNexus are any indication). So, what to get?

    As for me, I'm waiting for RAM prices to come back down to replace my Haswell + 16GB: maybe by then Zen+ will be out; if not, I'll just grab an R7 1700 (provided Intel doesn't come up with a full-featured, cheaper, better-performing, cooler CPU by then - yeah, right).
  • D3M1G0D
    @Billy Gates

    Anybody who buys a 12-core/24-thread CPU just for playing games is an idiot. Hell, I don't even have Steam installed on my 1950X, nor do I ever intend to. I use my Threadripper system for mining and grid computing, and it is putting up dominating performance in World Community Grid. My Ryzen 7 system is for gaming.
  • rwinches
    There is no real-world case for loading up the CPU for gaming when simply moving up to a higher resolution will keep the graphics load on the GPU where it belongs. 1080p testing is nice-to-know info but basically irrelevant; maybe it would matter more if you included multi-monitor testing.

    Prime testing is also nice to know but does not reflect real use.

    Those Price Efficiency graphs are dubious in value, as they aggregate individual tests along with tests that simulate simultaneous operations. A 4c8t CPU can never be a better buy in this type of use case; it is simply out of its class.
  • mitch074
    2539578 said:
    why cant amd make a cpu for gaming? why all this junk thats useless to me?


    Well, considering both the Xbox One and PS4 contain an AMD CPU, my guess is that AMD DOES make CPUs for gaming - but game makers target Intel CPUs only on PC.
    The 15-30% performance increase for existing games that were patched after Ryzen came out seems to indicate that said game makers are revising their judgement.
  • Lieutenant Tofu
    "...and a Legacy setting to disable one CCX, solving compatibility issues."
    Shouldn't this read "disable one die", since each die is composed of 2 CCXes itself?
    Thanks for the great article!
  • Wisecracker
    Quote:
    why cant amd make a cpu for gaming? why all this junk thats useless to me?

    I'll take 4X Raven Ridge APUs 'glued' together on Socket TR4, too ...
    :pt1cable: