Intel Core i7-7820X Skylake-X Review

What we've seen of Intel's newest HEDT platform so far hasn't inspired much excitement. First, there were complaints of high temperatures and limited overclocking, addressed in The Skylake-X Mess Explored: Thermal Paste And Runaway Power. Then we collectively scratched our heads, wondering what the company was thinking in our Intel Core i7-7740X Kaby Lake-X Review.

But there are plenty of options between the quad-core Core i5 and Intel's 36-thread Core i9 flagship. In fact, the X-series includes nine models this time around, more than any other HEDT family to date. And it's the mid-range Core i7s that we expect to be most popular due to their tamer price points.

In yet another sign of a renewed fighting spirit, Intel's $600 Core i7-7820X slots in below the $1000 Core i9-7900X. That big $400 step down from the 10-core model is uncharacteristic for Intel; its older eight-core Core i7-6900K bore a shocking $1100 price tag. No doubt, Intel is looking to stave off AMD's Ryzen 7 models. The $500 "savings" versus the previous generation is certainly nice, but Intel continues to struggle against AMD's disruptive pricing and looser approach to segmentation.

The Core i7-7820X has eight Hyper-Threaded cores, so comparisons to AMD's Ryzen 7 models are inevitable. The $600 -7820X does battle against the $500 Ryzen 7 1800X. And as a result of unlocked multipliers up and down AMD's portfolio, even the $330 Ryzen 7 1700 is a viable competitor. Intel continues to enjoy an advantage in most lightly-threaded workloads, but the company just can't match Ryzen 7's value, particularly in workloads able to exercise all eight cores. It also helps that AM4-based motherboards are a lot less expensive.

Of course, Core i7-7820X drops into an LGA 2066 interface on motherboards with the X299 "Basin Falls" platform controller hub. We already discussed how processor choice can severely limit this chipset's connectivity in our Core i7-7740X review. Fortunately, Core i7-7820X doesn't suffer nearly as much as Kaby Lake-X. However, you do have to tolerate a "mere" 28 lanes of PCIe 3.0. Last generation, Core i7-6850K in roughly the same price range gave you 40 lanes, so we consider the drop to 28 a regression. Granted, AMD only exposes 16 lanes with Ryzen 7, so Intel does end the PCIe comparison ahead.
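
To put that 28-lane budget in concrete terms, here's a quick back-of-the-envelope tally (a minimal Python sketch; the slot loadout is a hypothetical of ours, not any specific motherboard). Two graphics cards and a single NVMe drive already exhaust the CPU's lanes, where a 40-lane part would still have 12 to spare:

    # Hypothetical loadout against the Core i7-7820X's 28 CPU-attached PCIe 3.0 lanes
    CPU_LANES = 28
    devices = {
        "graphics card #1 (x16)": 16,
        "graphics card #2 (x8)": 8,
        "NVMe SSD (x4)": 4,
    }
    used = sum(devices.values())
    print(f"{used} of {CPU_LANES} lanes used, {CPU_LANES - used} to spare")
    # -> 28 of 28 lanes used, 0 to spare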

Core i7-7820X features a 3.6 GHz base clock that boosts up to 4.3 GHz across two cores in lightly threaded workloads. That's a marked increase over what the Broadwell-E-based Core i7-6900K could do. Further, -7820X supports Turbo Boost Max Technology 3.0, which can push the CPU's two "best" cores up to 4.5 GHz using a piece of installed software. In theory, that should allow Skylake-X to dominate single- and multi-threaded benchmarks alike.
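
Intel only publishes a handful of those clock points, so any model of the chip's behavior has to stay coarse. The sketch below is a minimal illustration using just the figures above (3.6 GHz base, 4.3 GHz dual-core Turbo Boost 2.0, 4.5 GHz Turbo Boost Max 3.0); the intermediate turbo bins are deliberately left out because Intel doesn't document them:

    # Documented Core i7-7820X clock points only; intermediate turbo bins aren't published
    def expected_peak_ghz(active_cores: int, tbm3_enabled: bool = True) -> float:
        if active_cores <= 2:
            # Turbo Boost Max 3.0 steers light threads onto the two "best" cores
            return 4.5 if tbm3_enabled else 4.3
        return 3.6  # guaranteed base; the real all-core turbo lands somewhere above

    print(expected_peak_ghz(1))  # 4.5
    print(expected_peak_ghz(8))  # 3.6 (worst case)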

Intel also officially supports up to DDR4-2666 across the -7820X's quad-channel memory controller. Compared to Ryzen 7's dual-channel design, Skylake-X can theoretically move a lot more data, which is useful in certain prosumer applications.
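
That theoretical edge is simple arithmetic: peak bandwidth scales with channel count and transfer rate. A minimal sketch, assuming the standard 64-bit (8-byte) DDR4 channel and each platform's officially supported data rate:

    # Peak theoretical DRAM bandwidth: channels * MT/s * 8 bytes per 64-bit transfer
    def peak_gb_per_s(channels: int, mega_transfers: int) -> float:
        return channels * mega_transfers * 8 / 1000  # MB/s -> GB/s

    print(f"Skylake-X, quad-channel DDR4-2666: {peak_gb_per_s(4, 2666):.1f} GB/s")  # ~85.3
    print(f"Ryzen 7, dual-channel DDR4-2666:   {peak_gb_per_s(2, 2666):.1f} GB/s")  # ~42.7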

Similar to the Core i9-7900X we already reviewed, -7820X is rated for up to 140W. If you're curious about what that number means for power consumption, heat, and overclocking headroom, check out the aforementioned deep-dive (The Skylake-X Mess Explored: Thermal Paste And Runaway Power) for more.

And if you'd like some more background on Intel's 14nm Skylake-X architecture, we'd encourage you to read through Intel Core i9-7900X Review: Meet Skylake-X, where we introduce the new mesh topology, cache hierarchy (-7820X boasts 8MB of L2 and 11MB of L3), and fresh ISA extensions (unfortunately, -7820X loses one AVX-512-capable unit per core compared to -7900X). 
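
Those cache totals fall straight out of Skylake-X's rebalanced hierarchy, which trades Broadwell-E's large shared L3 for a 1MB private L2 per core plus a 1.375MB L3 slice per core; multiply by the -7820X's eight cores and you land exactly on the figures above:

    # Skylake-X cache scales per core: 1 MB private L2, 1.375 MB shared-L3 slice
    CORES = 8  # Core i7-7820X
    print(f"L2 total: {CORES * 1.0:.0f} MB")    # 8 MB
    print(f"L3 total: {CORES * 1.375:.0f} MB")  # 11 MB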

Speed Shift, which allows the processor to handle power-state transitions autonomously, also makes its debut on the high-end desktop. Moving that control into hardware cuts out latency-prone operating system handoffs and speeds resumption from lower power states, which equates to a snappier experience. Intel also includes support for the vROC (Virtual RAID on CPU) feature, which allows you to coalesce up to 20 SSDs into a single bootable volume, though you'll have to buy an upgrade key to unlock it. Intel remains curiously silent on pricing, and keys aren't available yet.
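
Circling back to Speed Shift: if you want to verify that the hardware really is in control, a recent Linux kernel makes it visible. This is a minimal sketch assuming the intel_pstate driver, which exposes the sysfs nodes below; with Speed Shift (Intel's HWP) engaged, the driver runs in "active" mode and publishes a per-CPU energy/performance preference knob:

    from pathlib import Path

    # intel_pstate reports "active" when hardware P-state (HWP) control is engaged
    status = Path("/sys/devices/system/cpu/intel_pstate/status")
    # The per-CPU energy/performance preference knob only appears with HWP active
    epp = Path("/sys/devices/system/cpu/cpu0/cpufreq/energy_performance_preference")

    if status.exists():
        print("intel_pstate mode:", status.read_text().strip())
    if epp.exists():
        print("Speed Shift preference:", epp.read_text().strip())
    else:
        print("No HWP preference knob found; Speed Shift likely not engaged")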

MORE: Best CPUs

MORE: Intel & AMD Processor Hierarchy

MORE: All CPUs Content

Comments from the forums
  • cknobman
    Just want to say the Ryzen 7 1800X isn't $500 anymore and has not been for weeks now.

    The processor is selling for $420 or less. Heck, I bought mine yesterday from Fry's for $393.
  • artk2219
    cknobman said:
    Just want to say the Ryzen 7 1800X isn't $500 anymore and has not been for weeks now.

    The processor is selling for $420 or less. Heck, I bought mine yesterday from Fry's for $393.


    Not to mention the fact that you can find the 1700 for even less, and more than likely be able to bump the clocks to at least match the 1800X. Microcenter was selling them for $269.99 last week.
  • Ne0Wolf7
    At least they've done something, but it's still too expensive to sway me.
    Perhaps full-blown professionals who need something a bit better than what Ryzen offers right now but can't go for an i9 would appreciate this, but even then they would probably wait to see what Threadripper has to offer.
  • Scorpionking20
    After so many years, I can't wrap my head around this. Competition in the CPU space? WTH is this?
  • Houston_83
    I think the article has some incorrect information on the first page.

    "However, you do have to tolerate a "mere" 28 lanes of PCIe 3.0. Last generation, Core i7-6850K in roughly the same price range gave you 40 lanes, so we consider the drop to 28 a regression. Granted, AMD only exposes 16 lanes with Ryzen 7, so Intel does end the PCIe comparison ahead."

    Doesn't Ryzen have 24 lanes? Still under Intel, but I'm pretty sure there are more than 16 lanes.
  • artk2219
    Houston_83 said:
    I think the article has some incorrect information on the first page.

    "However, you do have to tolerate a "mere" 28 lanes of PCIe 3.0. Last generation, Core i7-6850K in roughly the same price range gave you 40 lanes, so we consider the drop to 28 a regression. Granted, AMD only exposes 16 lanes with Ryzen 7, so Intel does end the PCIe comparison ahead."

    Doesn't Ryzen have 24 lanes? Still under Intel, but I'm pretty sure there are more than 16 lanes.


    Ryzen does have 24 lanes, but only 16 are usable; 8 are dedicated to chipset and storage needs.
  • JimmiG
    artk2219 said:
    Houston_83 said:
    I think the article has some incorrect information on the first page.

    "However, you do have to tolerate a "mere" 28 lanes of PCIe 3.0. Last generation, Core i7-6850K in roughly the same price range gave you 40 lanes, so we consider the drop to 28 a regression. Granted, AMD only exposes 16 lanes with Ryzen 7, so Intel does end the PCIe comparison ahead."

    Doesn't Ryzen have 24 lanes? Still under Intel, but I'm pretty sure there are more than 16 lanes.


    It does, but only 16 are usable; 8 are used for chipset and storage needs.


    x16 available for graphics as 1x16 or 2x8
    x4 dedicated to M.2
    x4 to the chipset, which the X370 splits into 8x PCIe 2.0 lanes and allocates dynamically, IIRC
  • Zifm0nster
    Would love to give this chip a spin... but availability has been zero... even a month after release.

    I actually do have a work application which can utilize the multiple cores.
  • Math Geek
    does look like intel was caught off guard by amd this time around.

    will take em a couple quarters to figure out what to do. but i'm loving the price/performance amd has brought to the table and know intel will have no choice but to cut prices.

    this is always good for the buyers :D
  • Amdlova
    Why do we have overclocked CPUs on the bench but no power consumption numbers? Is this review biased toward Intel again!? Is Tom's Hardware fake news?
  • JamesSneed
    Amdlova said:
    Why do we have overclocked CPUs on the bench but no power consumption numbers? Is this review biased toward Intel again!? Is Tom's Hardware fake news?


    I agree they should have completed the power consumption testing on the OC'ed chips from AMD and Intel. What I don't agree with is the bias; did you read the summary?
  • Amdlova
    Yes, I read it =) but if you put one side, you need the other side. An overclocked Intel CPU can eat more than 500W; that's more than a 96-core AMD "EPYC" system. When I build a system, power is the first thing and cooling comes second. A chip that eats more power can't use an air cooler; it needs water cooling to stay cool, and that costs more watts too: 20W for fans, 18-20W for the pump. My last system was a 4770K at 4.3GHz on a passive Thermalright Ultra-120 eXtreme Rev. C cooler.
  • AgentLozen
    Good review.

    When I read the 7900X review, I was really disappointed with Skylake-X. Today's chip seems to make a lot more sense. It performs nearly as well as the 7900X in many situations. It eats less power. Generates less heat. Is priced more competitively. And it typically beats the Ryzen 1800X, too.

    If money weren't a concern, I would say Intel's 7820X is a better CPU than AMD's 1800X. It's a different story when you look at the price tags, though.

    Putting the 7700K in the benchmarks was a really interesting choice. It holds up well in a lot of situations despite having half the cores. The price is right too. Kaby Lake is still VERY relevant even in the HEDT realm.
  • redgarl
    ROFL, 1080p benches with a GTX 1080...

    You guys are aware that you are doing benches for not even 1% of the users?

    If you want to do 1080p benches so much, use a 480/580 or 1050/1060.

    TH: "But we want to avoid a CPU bottleneck in a world where you cannot avoid it anymore... so we can show how insignificant our benches are!"

    Me: "I want to know what this system can do compared to others at ALL popular resolutions so I can make my choice when it comes to my hardware, not slam a fanboy war over pointless rhetoric."
  • PaulAlcorn
    cknobman said:
    Just want to say the Ryzen 7 1800X isn't $500 anymore and has not been for weeks now.

    The processor is selling for $420 or less. Heck, I bought mine yesterday from Fry's for $393.


    Perhaps you missed this line in the conclusion.
    Quote:

    Intel should probably feel lucky that Core i7-7820X won't be going up against AMD's Threadripper, since the cheapest model will sell for $800. As it stands, this $600 CPU has a hard time justifying its premium over Ryzen 7 1800X, which currently sells for as little as $420
  • 10tacle
    redgarl said:
    ROFL, 1080p benches with a GTX 1080... You guys are aware that you are doing benches for not even 1% of the users? If you want to do 1080p benches so much, use a 480/580 or 1050/1060.


    Seeketh, and yee shall findeth. While these benches do not reflect the 7820X in Tom's review, they do show where the 7700K stacks up against the 1800X shown on Tom's, for extrapolation:

    http://images.anandtech.com/graphs/graph11549/87722.png
    http://images.anandtech.com/graphs/graph11549/87692.png
    http://images.anandtech.com/graphs/graph11549/87716.png
    http://images.anandtech.com/graphs/graph11549/87686.png
  • spdragoo
    redgarl said:
    ROFL, 1080p benches with a GTX 1080...

    You guys are aware that you are doing benches for not even 1% of the users?

    If you want to do 1080p benches so much, use a 480/580 or 1050/1060.

    TH: "But we want to avoid a CPU bottleneck in a world where you cannot avoid it anymore... so we can show how insignificant our benches are!"

    Me: "I want to know what this system can do compared to others at ALL popular resolutions so I can make my choice when it comes to my hardware, not slam a fanboy war over pointless rhetoric."


    Actually, you have it wrong. They want to avoid GPU bottlenecks when they test a CPU. At 4K resolutions, there are maybe a handful of games where a GTX 1080Ti or the latest Titan X won't stagger & fall on the ground in despair, & you'd see very little difference in performance between an old Phenom II X3 CPU & an i7-7700K, let alone between the most recent Ryzen CPUs & these Kaby Lake-X CPUs.

    At 1080p resolutions, however, a GTX 1080Ti is going to yawn & only has to idle along at maybe 30-40% utilization on Ultra settings...which means the primary bottleneck will be with the CPU. Hence why CPU testing is performed at 1080p resolutions with the best available GPU at that moment (or sometimes the 2nd-best, depending on what's available).
  • AgentLozen
    That's exactly right spdragoo. This review has to follow basic scientific methodology. By using an overpowered GPU, the author narrows down the CPU variable. It makes sense.

    I think redgarl is just expecting something else and that's why they're disappointed.
  • PaulAlcorn
    spdragoo said:
    redgarl said:
    ROFL, 1080p benches with a GTX 1080...

    You guys are aware that you are doing benches for not even 1% of the users?

    If you want to do 1080p benches so much, use a 480/580 or 1050/1060.

    TH: "But we want to avoid a CPU bottleneck in a world where you cannot avoid it anymore... so we can show how insignificant our benches are!"

    Me: "I want to know what this system can do compared to others at ALL popular resolutions so I can make my choice when it comes to my hardware, not slam a fanboy war over pointless rhetoric."


    Actually, you have it wrong. They want to avoid GPU bottlenecks when they test a CPU. At 4K resolutions, there are maybe a handful of games where a GTX 1080Ti or the latest Titan X won't stagger & fall on the ground in despair, & you'd see very little difference in performance between an old Phenom II X3 CPU & an i7-7700K, let alone between the most recent Ryzen CPUs & these Kaby Lake-X CPUs.

    At 1080p resolutions, however, a GTX 1080Ti is going to yawn & only has to idle along at maybe 30-40% utilization on Ultra settings...which means the primary bottleneck will be with the CPU. Hence why CPU testing is performed at 1080p resolutions with the best available GPU at that moment (or sometimes the 2nd-best, depending on what's available).


    Agreed. We also get lots of complaints that we don't test CPUs at 4K, but... relevance. Due to time constraints, we want our results to be applicable to as many people as possible. Here's a good idea of what is relevant.


    Going by Steam's metrics, 3840x2160 is applicable to 0.86% of users. Granted, those are likely HEDT guys/gals.
    According to Steam, 93.34% of gamers are at 1920x1080 or below.

    2.19% are at 2560x1440.

    So, 1920x1080 is relevant. And since we are testing CPUs, we test with the most powerful GPU available to reduce bottlenecks on components that aren't under review.


    http://store.steampowered.com/hwsurvey/
  • 10tacle
    AgentLozen said:
    I think redgarl is just expecting something else and that's why they're disappointed.


    Wouldn't be the first time that individual has been snarky, either.

    PaulAlcorn said:
    So, 1920x1080 is relevant. And since we are testing CPUs, we test with the most powerful GPU available to reduce bottlenecks on components that aren't under review.


    It is such a pet peeve of mine seeing people complain "well, why not use realistic GPUs and resolutions people game with" and other nonsense. I've been on Tom's, Anandtech, and other hardware sites since the late '90s, and testing CPUs in gaming benchmarks at below-optimal settings is nothing new (and over 90% of Steam gamers run 1080p or lower resolution monitors).

    What they also fail to understand is that quite a few 1080p gamers are running 144-165Hz monitors, where high-end cards make a difference. But back to historic reviews; case in point, a 2010 flashback review of an i7-950 using a 1680x1050 gaming resolution with a high-end HD 5870 GPU:

    https://www.bit-tech.net/reviews/tech/cpus/intel-core-i7-950-review/6/

    Quote:
    We use 1,680 x 1,050 with no AA and no AF to give a reasonably real-world test without the risk that the graphics card will be a limiting factor to CPU performance.


    Why people still do not understand the logic behind ^^this I will never understand.