AMD Ryzen 5 2600X Review: Spectre Patches Weigh In

AMD's 2000-series Ryzen CPUs are already available, challenging the Coffee Lake-based Core line-up from Intel. As we found in our Ryzen 7 2700X review, a host of improvements made possible by 12nm manufacturing, such as higher frequencies and Precision Boost 2, add more performance in threaded apps. Meanwhile, lower system memory and cache latencies augment AMD's showing in lightly-threaded apps like games. Unlocked multipliers, backward compatibility with older Socket AM4 motherboards, a beefy bundled cooler, and a $329 price tag combine to leave us impressed. The Ryzen 7 2700X offers a great alternative to Intel's Core i7-8700K, which costs more, doesn't come with a thermal solution, and drops into more expensive motherboards (at least if you want to overclock).

Similarly, Ryzen 5 2600X targets Intel's enthusiast-oriented Core i5-8600K, leveraging similar advancements and a more attractive $230 price tag. As we'll see, it's even faster than the first-gen flagship Ryzen 7 1800X in many workloads.

But First, Spectre Variant 2

Unfortunately, AMD didn't tell us that it had rolled its Spectre Variant 2 patch into shipping X470 platforms. As a result, our Ryzen 7 2700X launch-day coverage didn't include Intel CPUs tested with their corresponding patches. Today's review does, however, feature results generated on Intel-based systems with the latest Spectre microcode updates.

Ryzen 5 2600X

Ryzen 2000-series processors, otherwise known by their Pinnacle Ridge code name, are based on the same basic Zen core design as previous-gen models (though AMD now uses Zen+ nomenclature to reference the architecture's various improvements). The CPUs still utilize a dual-CCX configuration, tied together with Infinity Fabric, yielding eight physical cores. The flagship Ryzen 7 2700X comes with all eight of its cores active. For Ryzen 5 2600X, AMD turns two off, creating a six-core, 12-thread configuration with an unlocked ratio multiplier.

As mentioned, Ryzen 5 2600X sells for $229, replacing the $219 Ryzen 5 1600X. It slots into the gap between the Core i5-8600K and Core i5-8400, forcing the chip to contend with Intel's recently-announced Core i5-8600. While we don't have that model in our lab yet, we do have the two nearest Coffee Lake-based competitors in today's benchmark charts.

What do you get, performance-wise, for the extra $10? Ryzen 5 2600X sports the same 3.6 GHz base clock rate as the 1600X and a slightly higher 4.2 GHz Precision Boost 2 frequency (+200 MHz). That might seem minor, but as our benchmarks show, the gains are quite pronounced in threaded workloads. Like its predecessor, the 2600X also features 16MB of L3 cache and a 95W TDP.


| Processor | MSRP | Cores/Threads | TDP | Base Freq. (GHz) | Precision Boost Freq. (GHz) | Cache (L3) | Unlocked Multiplier | Cooler |
| --- | --- | --- | --- | --- | --- | --- | --- | --- |
| AMD Ryzen 7 2700X | $329 | 8/16 | 105W | 3.7 | 4.3 | 16MB | Yes | 105W Wraith Prism (LED) |
| AMD Ryzen 7 1800X | $349 | 8/16 | 95W | 3.6 | 4.1 | 16MB | Yes | - |
| AMD Ryzen 7 2700 | $299 | 8/16 | 65W | 3.2 | 4.1 | 16MB | Yes | 95W Wraith Spire (LED) |
| AMD Ryzen 5 1600X | $219 | 6/12 | 95W | 3.6 | 4.0 | 16MB | Yes | - |
| AMD Ryzen 5 1600 | $189 | 6/12 | 65W | 3.2 | 3.6 | 16MB | Yes | 95W Wraith Spire |
| AMD Ryzen 5 2600X | $229 | 6/12 | 95W | 3.6 | 4.2 | 16MB | Yes | 95W Wraith Spire |
| AMD Ryzen 5 2600 | $199 | 6/12 | 65W | 3.4 | 3.9 | 16MB | Yes | 65W Wraith Stealth |
| Intel Core i5-8600K | $257 | 6/6 | 95W | 3.6 | 4.3 | 9MB | Yes | - |
| Intel Core i5-8600 | $224 | 6/6 | 65W | 3.1 | 4.3 | 9MB | No | Intel |
| Intel Core i5-8400 | $182 | 6/6 | 65W | 2.8 | 4.0 | 9MB | No | Intel |

Although AMD didn't include thermal solutions with its original Ryzen X-series processors, the company does bundle coolers with its pricier models now. On one hand, it's nice that the 95W Wraith Spire cooler neatly matches the 2600X's thermal design power. On the other, we're not expecting much overclocking headroom from the combination.

Ryzen 5 2600X can drop into either new X470 or older 300-series motherboards. As usual, AMD allows you to overclock on value-minded B-series boards, too. And even though 400-series B-models aren't available yet, they'll undoubtedly offer a lower-priced alternative for overclocking.

Officially, the Ryzen 5 2600X supports up to DDR4-2933 memory, just like Ryzen 7 2700X. This trumps Coffee Lake's Intel-specified DDR4-2666 ceiling (with a few caveats). AMD also sticks with indium solder between Ryzen 5's die and heat spreader, improving thermal transfer performance. And as we mentioned in our Ryzen 7 2700X review, these new CPUs also include StoreMI technology, a software-based tiering solution that blends the low price and high capacity of a hard drive with the speed of an SSD, 3D XPoint (including Intel's Optane parts), or even up to 2GB of RAM.
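
AMD hasn't published StoreMI's internals (the software is licensed from Enmotus), but the tiering idea itself is easy to sketch: track which blocks are accessed most often and keep the hottest ones on the fast device. The toy Python below is purely illustrative; the block names and one-block fast tier are invented, and real tiering migrates block ranges at the driver level.

```python
# Toy illustration of the storage-tiering idea behind StoreMI.
# NOT AMD's implementation: real tiering migrates block ranges at the
# driver level. Here, the most frequently accessed "block" is promoted
# to the fast (SSD) tier, and everything else stays on the HDD.
from collections import Counter

FAST_TIER_CAPACITY = 1          # pretend the SSD tier holds one hot block

access_counts = Counter()

def read_block(block_id: str) -> str:
    """Record an access, re-rank the tiers, and report where the block lives."""
    access_counts[block_id] += 1
    hottest = {b for b, _ in access_counts.most_common(FAST_TIER_CAPACITY)}
    return "SSD" if block_id in hottest else "HDD"

for _ in range(5):
    read_block("game.dat")       # frequently read data heats up...
print(read_block("game.dat"))    # ...and now resolves to the SSD tier
print(read_block("archive.zip")) # a cold block stays on the HDD
```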

Precision Boost 2 and XFR2

In a nutshell, AMD is leveraging GlobalFoundries' 12nm process to enhance its design, rather than shrink it. The enhancements offer higher performance or lower power consumption at any given frequency, giving AMD headroom for other improvements.

The company's previous-gen Ryzen processors have Precision Boost, which is similar to Intel's Turbo Boost technology, and eXtended Frequency Range (XFR), capable of delivering a frequency uplift when your cooling solution has thermal headroom to spare.

The new Precision Boost 2 (PB2) and XFR2 algorithms improve performance in threaded workloads by raising the frequency of any number of cores. AMD doesn't share a list of specific multi-core Precision Boost 2 and XFR2 bins because the opportunistic algorithms accelerate to different clock rates based on temperature, current, and load.
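
AMD doesn't publish the algorithm itself, but the behavior described above is easy to sketch: opportunistically pick the highest clock that fits within the current temperature, current, and active-core headroom. The Python below is purely illustrative; the limits and headroom scaling are invented stand-ins, with only the 2600X's base/boost clocks and Precision Boost's fine-grained 25 MHz steps taken from AMD's specs.

```python
# Illustrative sketch of an opportunistic boost governor in the spirit of
# Precision Boost 2. NOT AMD's algorithm (which isn't public); the limits
# and headroom scaling below are invented for demonstration.

BASE_MHZ = 3600        # Ryzen 5 2600X base clock
MAX_BOOST_MHZ = 4200   # Ryzen 5 2600X Precision Boost 2 ceiling
STEP_MHZ = 25          # boost moves in fine-grained 25 MHz bins

def pick_boost(temp_c, current_a, active_cores,
               temp_limit=95.0, current_limit=95.0, total_cores=6):
    """Pick a clock (MHz) from whichever constraint has the least headroom."""
    thermal_headroom = max(0.0, 1.0 - temp_c / temp_limit)
    current_headroom = max(0.0, 1.0 - current_a / current_limit)
    load_factor = 1.0 - 0.5 * (active_cores / total_cores)  # more cores, less uplift
    headroom = min(thermal_headroom, current_headroom) * load_factor
    uplift = (MAX_BOOST_MHZ - BASE_MHZ) * headroom
    return BASE_MHZ + STEP_MHZ * round(uplift / STEP_MHZ)

print(pick_boost(temp_c=50, current_a=30, active_cores=1))  # cool, light load: 3850
print(pick_boost(temp_c=80, current_a=80, active_cores=6))  # hot, all cores: 3650
```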

AMD gave us a graph of the PB2 frequencies for Ryzen 7 2700X, but we followed up with our own measurements to compare the current and previous-gen Ryzen 5 models. As you can see, Ryzen 5 2600X offers more robust multi-core frequencies than its predecessor, and our Ryzen 7 2700X measurements largely mirror AMD's. We tested both CPUs with AMD's Precision Boost Overdrive active. The Ryzen 7 2700X does have a higher TDP rating that some older motherboards may struggle with, so PB2 performance will vary based upon the power delivery subsystem.

MORE: Best CPUs

MORE: Intel & AMD Processor Hierarchy

MORE: All CPUs Content

Comment from the forums
  • bbertram99
    HPET.....
  • nitrium
    Anonymous said:
    HPET.....

    ??? Yes it destroys Intel's performance (not AMD's), but it's off by default in Windows and there is no reason to force it on.
  • bbertram99
    I was wondering if reviews will now state whether they checked if it was on or not. I think they should, since we don't always know when it's being forced. It's not evident until you look. If they don't state it's not forced on, then we are left wondering.
  • toyo
    What's the point of this? Where's the GTX 1080ti? The 1080 simply results in every CPU being able to feed it enough data, so scores are almost similar. Hence anomalies like the 8400 or 8600k often being better than the 8700K, which should be impossible considering the higher clocks. I mean, this CPU performed the best for 3-4 months, even after Meltdown/Spectre patches/BIOS, and now it suddenly has issues competing with its own family of CPUs that are half that price, really?
    Then there's the gaming suite chosen. Old Far Cry? The 5th is out. Where's AC: Origins, notoriously CPU hungry? Overwatch? FFXV?
    Hell, even the older Deus Ex or Kingdom Come: Deliverance would have made more sense to test CPUs.
    But yes, this shows that for anything below 1080ti, you're good with pretty much all of these CPUs. Yet it doesn't tell the whole story, and soon a new GPU generation will be released, probably introducing many here to GTX 1080ti levels of performance, so testing with it does make sense.
  • PaulAlcorn
    Anonymous said:
    I was wondering if reviews will now state whether they checked if it was on or not. I think they should, since we don't always know when it's being forced. It's not evident until you look. If they don't state it's not forced on, then we are left wondering.


    HPET has been disabled by default in Windows for a decade or so now. The OS can call on HPET if it needs it. The performance overhead of HPET is a known quantity, which is why the OS doesn't use it if possible.

    We test without forcing HPET, and our test scripts ensure it stays that way.
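
For reference, forcing HPET on Windows is done through the `useplatformclock` boot flag. Below is a minimal sketch of that kind of pre-run sanity check; it assumes Windows' bcdedit tool and an elevated prompt, and it is not the actual test script referenced above.

```python
# Minimal sketch of a pre-benchmark HPET sanity check on Windows.
# Illustrative only, not the test script referenced above. It inspects the
# `useplatformclock` boot flag (the one forced on in the Anandtech episode).
# Requires an elevated prompt; bcdedit is a standard Windows tool.
import subprocess
import sys

def hpet_forced() -> bool:
    """True if `useplatformclock` is set to yes in the current boot entry."""
    out = subprocess.run(["bcdedit", "/enum", "{current}"],
                         capture_output=True, text=True, check=True).stdout
    for line in out.lower().splitlines():
        if "useplatformclock" in line:
            return line.split()[-1] == "yes"
    return False  # flag absent: OS default behavior, HPET not forced

if hpet_forced():
    sys.exit("HPET is forced on; clear it with "
             "`bcdedit /deletevalue useplatformclock` and reboot.")
print("HPET not forced; OS default timer behavior in effect.")
```
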
  • nitrium
    Anonymous said:
    Yet it doesn't tell the whole story, and soon a new GPU generation will be released, probably introducing many here to GTX 1080ti levels of performance, so testing with it does make sense.

    Agreed. I'm still using an i5 760 (@3.4GHz) which was released in July 2010. I have had multiple GPU upgrades over the years (as of this moment I'm on an R9 390), so I also would very much like to know if a new CPU is as "future proof" as possible with regards to GPU upgrades.
  • kilgor98
    Would the 8700k cost what it does today without Ryzen? They would still be feeding us quad cores on the same 14nm process.
  • bbertram99
    Don't see how to quote you PAULALCORN.

    Considering Anandtech got caught by the HPET bug, you never see it mentioned in any reviews until now. So now I question each review I have seen and will see unless it's mentioned. The credibility of all benchmarks is in question unless it's clear HPET is disabled. Good thing your script handles that; thank you for letting me know.

    Keep on providing great content, I've loved Tom's reviews for a LONG time.
  • btmedic04
    So can we expect updated benchmarks in the 2700X's review, given that the results are skewed by the lack of Spectre patches on the Intel processors?
  • PaulAlcorn
    Anonymous said:
    Don't see how to quote you PAULALCORN.

    Considering Anandtech got caught by the HPET bug, you never see it mentioned in any reviews until now. So now I question each review I have seen and will see unless it's mentioned. The credibility of all benchmarks is in question unless it's clear HPET is disabled. Good thing your script handles that; thank you for letting me know.

    Keep on providing great content, I've loved Tom's reviews for a LONG time.


    No one mentions HPET because it is disabled by default in the OS. If we listed every single feature that we leave alone and do not alter...that would be a long list :)
  • ingtar33
    Except you're wrong Paulalcorn.

    HPET is ON by default in the BIOS, and it's treated on a case-by-case basis by the OS. Generally it's off, but if it's available and software asks for it, it's allowed. What bit Anandtech was that they KNEW of the OS's unpredictable treatment of HPET, and in order to "standardize" their results they were forcing HPET ON for everything at the OS level. This is good scientific process: you have a variable (a feature like HPET) which is used by the OS in a non-controlled way and has been shown in the past to negatively (or positively) affect performance, so you simply turn it into a constant.

    The problem is that HPET, since the Spectre 2 patch, is now HAMMERING Intel performance in a wide range of benches. I don't agree with ANANDTECH's decision to no longer "force" HPET. Not because it helps INTEL, but because it was good science to FORCE it on, because of how the OS and software treat it and apply it on a sort of case-by-case basis. Anandtech and all other reviewers should either test with it forced off for everything or force it on; and the reviews should make it clear which way their system was tested. Leaving it on but not forced only makes their results less consistent, and is probably the worst way to address the problem.
  • kmazurek
    Why aren't you checking the Ryzen 7 2700? I'm very interested in the gap between the 2700 and 2700X. If you plan a quieter setup, how much do you lose with this change? I'm missing the 2700 on those charts.
  • toyo
    Anonymous said:
    I don't agree with ANANDTECH's decision to no longer "force" HPET. Not because it helps INTEL, but because it was good science to FORCE it on, because of how the OS and software treat it and apply it on a sort of case-by-case basis. Anandtech and all other reviewers should either test with it forced off for everything or force it on; and the reviews should make it clear which way their system was tested. Leaving it on but not forced only makes their results less consistent, and is probably the worst way to address the problem.

    What "case by case"? Give me ONE example where HPET is enabled by the OS automatically depending on an application. Just one. You know it takes a reboot for that to happen? How is that normal OS behavior when you require reboots to enable/disable features depending on user activity? It's not and it never happens.

    I've personally had HPET on in the BIOS for 6 months now, but it was never active in Windows. I had to force it to see its effects, and I'll be damned, it's a disaster for my 8700K. And NO, Spectre/Meltdown patches do not matter; this was true even before they were applied.

    It's incredibly dumb to force a Windows flag that is by default Always Off, especially when it has a dramatic performance impact. How many Intel users in this whole world force HPET on when it makes games unplayable? Not even an exaggeration.

    So why in the world would a reviewer force it, when it causes VISIBLE framerate drops and hitching in games? Why would you tank performance intentionally for half your CPUs?

    Yeah, I thought so.
  • msroadkill612
    I love scatter plots, and wow, the 2600X sure dominates the excellent ones on the last page.

    It sits down there in the corner in ~every category, with little to the right of it at any price above.

    What a bargain.

    It has the makings of a popular classic.

    It's no longer version 1.0 of the radical Zen Ryzen architecture and platform. 12nm, meh; it's a great chance for a good tidy-up and a harvest of low-hanging fruit.

    AMD will always be a bit hungry for those extra 2 cores, so 6-core parts will remain relatively cheap and attractive.

    Big picture/strategically, they show a seriously troubled Intel.

    For me it's moot. AM4 is a better platform imo. Vitally, most Intel platforms with a proper dGPU give you exactly zero spare I/O bandwidth beyond the 4GB/s shared by the chipset, with its overheads on top.

    A single NVMe drive (a 960 Pro, e.g.) could saturate it, and other chipset-connected resources like SATA/network/USB ... would be choked.

    AMD is stingy too, but adds another 4GB/s, or as much as the entire chipset again.

    A strategically utilised NVMe drive on those extra AMD lanes relieves the chipset of the major drain on its limited bandwidth: the main system drive.

    An AM4 system w/ 16GB DDR4 = ~40GB/s (40,000MB/s) of memory bandwidth; that's the max the system can offer.

    A 16-lane PCIe 3.0 GPU link = 16GB/s in theory.

    Then we have still-mainstream storage like:

    SATA SSD = ~450MB/s (550MB/s in theory)

    SATA HDD = ~50-150MB/s

    Quite a gap, no? Storage is the slowest important resource by a huge margin, inevitably hindering system performance in many subtle ways.

    Yet when affordable mainstream NVMe offers a chance to strengthen this grave weakness (like 3000MB/s sequential reads), folks buy Intel & effectively turn their backs on NVMe's real power.

    It's surreal. Folks are offered an affordable option for their slowest component that's ~6x faster, & they not only pass, they close off future options for want of bandwidth and lanes on their Intel platform.
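
The bandwidth figures in this comment are easy to sanity-check with back-of-the-envelope arithmetic. The sketch below assumes dual-channel DDR4-2666, ~0.985GB/s per PCIe 3.0 lane after 128b/130b encoding, and SATA III's 6Gb/s link with 8b/10b encoding:

```python
# Rough sanity check of the bandwidth figures quoted above.
# Assumptions: dual-channel DDR4-2666 (two 64-bit channels), PCIe 3.0 at
# ~0.985 GB/s per lane after 128b/130b encoding, SATA III at 6 Gb/s.

ddr4_2666 = 2 * 8 * 2666 / 1000    # channels * bytes/transfer * MT/s -> GB/s
pcie3_x16 = 16 * 0.985             # lanes * GB/s per lane
pcie3_x4  = 4 * 0.985              # a typical NVMe link
sata3     = 6 * (8 / 10) / 8       # 6 Gb/s with 8b/10b encoding -> GB/s

print(f"DDR4-2666 dual channel: ~{ddr4_2666:.1f} GB/s")  # ~42.7 (comment says ~40)
print(f"PCIe 3.0 x16 (GPU):     ~{pcie3_x16:.1f} GB/s")  # ~15.8 (comment says 16)
print(f"PCIe 3.0 x4 (NVMe):     ~{pcie3_x4:.1f} GB/s")   # headroom for ~3000MB/s drives
print(f"SATA III ceiling:       ~{sata3:.2f} GB/s")      # ~0.60 -> the ~550MB/s SSD wall
```
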
  • msroadkill612
    "mitigation
    mɪtɪˈɡeɪʃ(ə)n
    noun
    the action of reducing the severity, seriousness, or painfulness of something."

    It doesn't actually mean "fix" :)
  • msroadkill612
    Superficially it seems OT, but the comparison would reveal a lot if it included AMD's Zen Vega APU products, which deviate from the Zen norm of paired CCXes and the associated inter-CCX latency.

    It's AMD's only true Zen equivalent to Intel's 4-core CPUs, afaik.
  • ITFT
    Can anyone share which Windows KB is needed to patch Spectre/Meltdown on Win 7 - 64 Bit? Thank you.
  • randomizer
    Anonymous said:
    Anandtech and all other reviewers should either test with it forced off for everything or force it on; and the reviews should make it clear which way their system was tested. Leaving it on but not forced only makes their results less consistent, and is probably the worst way to address the problem.


    Forcing the HPET on will give you results that are consistent and accurate but completely meaningless. Forcing it off will in all likelihood do nothing at all, except perhaps cause an application to behave incorrectly in the rare instances where it needs the HPET. It's the better of the two options, but I still don't think it's a good option. Leaving it available on demand is the only way to have results that represent reality.
  • dalauder
    First gen Ryzens had acceptable performance with an awesome upgrade path, so people could justify it to help make the industry competitive. This doesn't need any argument to support a purchase other than benchmark performance, which is pretty cool, considering I've been waiting a decade.

    Is it as cool as getting HL3 would be? Not quite...but it measures on that scale.
  • 10tacle
    Anonymous said:
    First gen Ryzens had acceptable performance with an awesome upgrade path, so people could justify it to help make the industry competitive. This doesn't need any argument to support a purchase other than benchmark performance, which is pretty cool, considering I've been waiting a decade.

    Is it as cool as getting HL3 would be? Not quite...but it measures on that scale.


    Yep, there's no reason under a pure gaming scenario that I need to upgrade from my nearly four-year-old i5 4690K/Z97 Haswell build pushing a 1440p monitor, since it's mostly on the GPU. However, as I get more involved with video rendering (Handbrake, Vegas Studio), I need more CPU power and threading. That's where Zen wins over Intel.

    AMD also wins in supporting their chipsets for longer. Since my Haswell/Z97, we've seen three new generations of chipsets from Intel, and only two of those three have been compatible with each other. And most early 7th-gen Kaby Lake motherboards required a BIOS flash with a 6th-gen Skylake chip.

    Regarding Half-Life 3, that's the Area 51 of the gaming community. UFOs may or may not be real, but we can always dream. IMO it's the biggest lost opportunity in the gaming industry to this day. Such a shame.