Intel Core i7-7820X Skylake-X Review

Workstation & HPC Performance

2D Benchmarks: DirectX and GDI/GDI+

If you want to know more about our HPC benchmarks, check out the AMD Ryzen 7 1800X CPU Review. We didn't just copy results from that story, though. Rather, after a number of BIOS updates and software configuration changes, we retested everything. This gives us a more up-to-date picture, reflecting improvements of up to 15% that AMD worked hard to enable.

Intel's Core i7-7820X outstrips the -7900X in our AutoCAD 2D workload thanks to its clock rate advantage. The Core i9-7900X wins in the GDI/GDI+ benchmarks, though. Both processors provide more performance than the Broadwell-E-based Core i7-6900K.

2D Benchmarks: Adobe Creative Cloud

Per-cycle performance plays a role in these lightly-threaded applications, giving the -7820X an advantage in several tests. Both Skylake-X models suffer lower performance than we'd expect in the Photoshop Heavy and InDesign workloads. Hopefully Adobe is planning an update that'll address this anomaly.

3D Benchmarks: DirectX and OpenGL

The Core i7-7700K vigorously cuts through most of these workloads, indicating that they prefer high clock rates, all else being equal.

Both Skylake-X-based chips trade places through several of the tests; the distance between them remains small, though.

CPU Performance: Workstation

Broadwell-E leads the Skylake-X-based processors in a few of these workloads, reminding us that Intel's mesh topology may lead to performance regressions in some cases.

The Ryzen 7 1800X is incredibly competitive during this round of testing.

CPU Performance: Photorealistic Rendering

Rendering benefits from brute-force parallelism, so the 10-core Core i9-7900X naturally provides the best performance.

The workload utilizes all cores fully, so it also provides a good multi-threaded comparison between the eight-core -7820X and Ryzen 7 1800X. Intel's processor takes the lead due to its per-cycle performance advantage, but Ryzen is surprisingly competitive given its lower price and value-oriented platform. It also doesn't require a custom water-cooling loop to reach its potential, whereas Skylake-X does.
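The scaling behavior described above can be sketched with a toy example (illustrative only: the kernel, tile counts, and timings below are stand-ins, not the review's actual rendering workload). Because each tile of a render can be computed independently, the work divides cleanly across cores, so throughput grows with both core count and per-core speed:

```python
import multiprocessing as mp
import time

def render_tile(tile_id):
    # Stand-in for an embarrassingly parallel rendering kernel:
    # each tile is computed independently of every other tile,
    # so the work divides cleanly across available cores.
    total = 0
    for i in range(200_000):
        total += (tile_id * i) % 7
    return total

if __name__ == "__main__":
    tiles = list(range(64))

    # Serial baseline: one core does every tile.
    t0 = time.perf_counter()
    serial = [render_tile(t) for t in tiles]
    t_serial = time.perf_counter() - t0

    # Parallel: the same tiles spread across all available cores.
    t0 = time.perf_counter()
    with mp.Pool() as pool:
        parallel = pool.map(render_tile, tiles)
    t_parallel = time.perf_counter() - t0

    assert serial == parallel  # same result, just computed faster
    print(f"serial: {t_serial:.2f}s  parallel: {t_parallel:.2f}s  "
          f"on {mp.cpu_count()} cores")
```

This is why a 10-core chip wins outright here, and why the eight-core -7820X vs. Ryzen 7 1800X matchup comes down to per-core throughput.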

CPU Performance: Encoding & Compression/Decompression

The -7820X falls into a predictable place during our threaded encoding workload. Nipping at its heels is AMD's persistent (and much less expensive) Ryzen 7 1800X.

Core i7-7820X struggles mightily with our lightly-threaded decompression workload. Its place in the chart is much lower than we'd expect, given the way Intel implements its Turbo Boost technology.

High Performance Computing (HPC)

Complex HPC applications largely benefit from the -7820X's high clock rate and beefy core count. But aside from the SRMP workload, AMD's Ryzen 7 1800X again proves to be the fly in Intel's high-priced ointment.

MORE: Best CPUs

MORE: Intel & AMD Processor Hierarchy

MORE: All CPUs Content

Comments from the forums
  • cknobman
    Just want to say the Ryzen 7 1800x isnt $500 anymore and has not been for weeks now.

    The processor is selling for $420 or less. Heck I bought mine yesterday from Fry's for $393
  • artk2219
    Anonymous said:
    Just want to say the Ryzen 7 1800x isnt $500 anymore and has not been for weeks now.

    The processor is selling for $420 or less. Heck I bought mine yesterday from Fry's for $393


    Not to mention the fact that you can find the 1700 for even less, and more than likely be able to bump the clocks to atleast match the 1800x. Microcenter was selling them for 269.99 last week.
  • Ne0Wolf7
    At least they've done something, but it's still too expensive to sway me.
    Perhaps full-blown professionals who need something a bit better than what Ryzen has right now but can go for an i9 would appreciate this, but even then they would probably wait to see what Threadripper has to offer.
  • Scorpionking20
    So many years past, I can't wrap my head around this. Competition in the CPU space? WTH is this?
  • Houston_83
    I think the article has some incorrect information on the first page.

    "However, you do have to tolerate a "mere" 28 lanes of PCIe 3.0. Last generation, Core i7-6850K in roughly the same price range gave you 40 lanes, so we consider the drop to 28 a regression. Granted, AMD only exposes 16 lanes with Ryzen 7, so Intel does end the PCIe comparison ahead."

    Doesn't Ryzen have 24 lanes? Still under intel but I'm pretty sure there's more than 16 lanes.
  • artk2219
    Anonymous said:
    I think the article has some incorrect information on the first page.

    "However, you do have to tolerate a "mere" 28 lanes of PCIe 3.0. Last generation, Core i7-6850K in roughly the same price range gave you 40 lanes, so we consider the drop to 28 a regression. Granted, AMD only exposes 16 lanes with Ryzen 7, so Intel does end the PCIe comparison ahead."

    Doesn't Ryzen have 24 lanes? Still under intel but I'm pretty sure there's more than 16 lanes.


    Ryzen does have 24 lanes, but only 16 are usable, 8 are dedicated to chipset and storage needs.
  • JimmiG
    Anonymous said:
    Anonymous said:
    I think the article has some incorrect information on the first page.

    "However, you do have to tolerate a "mere" 28 lanes of PCIe 3.0. Last generation, Core i7-6850K in roughly the same price range gave you 40 lanes, so we consider the drop to 28 a regression. Granted, AMD only exposes 16 lanes with Ryzen 7, so Intel does end the PCIe comparison ahead."

    Doesn't Ryzen have 24 lanes? Still under intel but I'm pretty sure there's more than 16 lanes.


    It does, but only 16 are usable, 8 are used for chipset and storage needs.


    16X are available for graphics as 1x16 or 2x8.
    4X dedicated for M.2
    4X for the chipset that's split into 8x PCI-E v2 by the X370 and allocated dynamically IIRC
  • Zifm0nster
    Would love to give this chip a spin... but availability has been zero... even a month after release.

    I actually do have a work application which can utilize multiple cores.
  • Math Geek
    does look like intel was caught off guard by amd this time around.

    will take em a couple quarters to figure out what to do. but i'm loving the price/performance amd has brought to the table and know intel will have no choice but to cut prices.

    this is always good for the buyers :D
  • Amdlova
    Why we have overclocked cpus ons bench but dont have power compsumation! this review is biased to intel again !? are tomshardware fake news ?
  • JamesSneed
    Anonymous said:
    Why we have overclocked cpus ons bench but dont have power compsumation! this review is biased to intel again !? are tomshardware fake news ?


    I agree they should have completed the power consumption testing on the OC'ed chips from AMD and Intel. What I don't agree with is the bias, did you read the summary?
  • Amdlova
    Yes, I read it =) but if you put one side, you need the other side. An Intel CPU can eat more than 500W when overclocked; that's more than a 96-core AMD "EPYC" system. When I build a system, power is the first consideration and cooling comes second. A chip that draws more power can't use an air cooler; it needs forced water cooling to stay cool, and that costs even more watts: 20W for fans, 18-20W for the pump. My last system was a 4770K at 4.3GHz with a passive Thermalright Ultra 120 Extreme Rev C cooler.
  • AgentLozen
    Good review.

    When I read the 7900k review, I was really disappointed with Skylake X. Today's chip seems to make a lot more sense. It performs nearly as well as the 7900k in many situations. It eats less power. Generates less heat. Is priced more competitively. Typically beats the Ryzen 1800x in most situations also.

    If money wasn't a concern, I would say Intel's 7820x is a better CPU than AMD's 1800x. It's a different story when you look at the price tags though.

    Putting the 7700k in the benchmarks was a really interesting choice. It holds up well in a lot of situations despite having half the cores. The price is right too. Kaby Lake is still VERY relevant even in the HEDT realm.
  • redgarl
    ROFL 1080p benches with a 1080 GTX...

    You guys are conscious that you are doing benches for not even 1% of the users?

    If you want to do 1080p benches so much, use 480/580 1050/1060.

    TH: "But we want to avoid CPU bottleneck in a world that you cannot avoid it anymore... so we can show how insignificant our benches are!"

    Me: "I want to know what this system can do compared to others at ALL popular resolutions so I can make my choice when it comes to my hardware and not slam a fanboy war over pointless rethorics."
  • PaulAlcorn
    Anonymous said:
    Just want to say the Ryzen 7 1800x isnt $500 anymore and has not been for weeks now.

    The processor is selling for $420 or less. Heck I bought mine yesterday from Fry's for $393


    Perhaps you missed this line in the conclusion.
    Quote:

    Intel should probably feel lucky that Core i7-7820X won't be going up against AMD's Threadripper, since the cheapest model will sell for $800. As it stands, this $600 CPU has a hard time justifying its premium over Ryzen 7 1800X, which currently sells for as little as $420
  • 10tacle
    Anonymous said:
    ROFL 1080p benches with a 1080 GTX...You guys are conscious that you are doing benches for not even 1% of the users? If you want to do 1080p benches so much, use 480/580 1050/1060.


    Seeketh, and yee shall findeth. While these benches do not reflect the 7820x in Tom's review, it does show where the 7700K stacks up against the 1800x shown on Tom's for extrapolation:

    http://images.anandtech.com/graphs/graph11549/87722.png
    http://images.anandtech.com/graphs/graph11549/87692.png
    http://images.anandtech.com/graphs/graph11549/87716.png
    http://images.anandtech.com/graphs/graph11549/87686.png
  • spdragoo
    Anonymous said:
    ROFL 1080p benches with a 1080 GTX...

    You guys are conscious that you are doing benches for not even 1% of the users?

    If you want to do 1080p benches so much, use 480/580 1050/1060.

    TH: "But we want to avoid CPU bottleneck in a world that you cannot avoid it anymore... so we can show how insignificant our benches are!"

    Me: "I want to know what this system can do compared to others at ALL popular resolutions so I can make my choice when it comes to my hardware and not slam a fanboy war over pointless rethorics."


    Actually, you have it wrong. They want to avoid GPU bottlenecks when they test a CPU. At 4K resolutions, there are maybe a handful of games where a GTX 1080Ti or the latest Titan X won't stagger & fall on the ground in despair, & you'd see very little difference in performance between an old Phenom II X3 CPU & an i7-7700K, let alone between the most recent Ryzen CPUs & these Kaby Lake-X CPUs.

    At 1080p resolutions, however, a GTX 1080Ti is going to yawn & only has to idle along at maybe 30-40% utilization on Ultra settings...which means the primary bottleneck will be with the CPU. Hence why CPU testing is performed at 1080p resolutions with the best available GPU at that moment (or sometimes the 2nd-best, depending on what's available).
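spdragoo's explanation boils down to a simple bottleneck model (the per-frame times below are hypothetical, chosen only to illustrate the point, not measurements from this review): frame rate is limited by whichever of the CPU or GPU takes longer per frame, so a fast GPU at 1080p exposes CPU differences, while 4K hides them behind the GPU.

```python
def fps(cpu_frame_ms, gpu_frame_ms):
    # Simplified model: CPU and GPU work overlap, so the frame rate
    # is limited by whichever component takes longer per frame.
    return 1000.0 / max(cpu_frame_ms, gpu_frame_ms)

# Hypothetical per-frame costs (illustrative, not benchmark data):
fast_cpu, slow_cpu = 5.0, 8.0    # ms of CPU work per frame
gpu_1080p, gpu_4k = 4.0, 16.0    # ms of GPU work per frame

# At 1080p the GPU is quick, so the CPU difference shows up...
print(fps(fast_cpu, gpu_1080p), fps(slow_cpu, gpu_1080p))  # 200.0 125.0

# ...while at 4K the GPU dominates and both CPUs look identical.
print(fps(fast_cpu, gpu_4k), fps(slow_cpu, gpu_4k))        # 62.5 62.5
```

Under this model, testing at 4K would compress every CPU in the chart toward the same number, which is exactly what a CPU review needs to avoid.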
  • AgentLozen
    That's exactly right spdragoo. This review has to follow basic scientific methodology. By using an overpowered GPU, the author narrows down the CPU variable. It makes sense.

    I think redgarl is just expecting something else and that's why they're disappointed.
  • PaulAlcorn
    Anonymous said:
    Anonymous said:
    ROFL 1080p benches with a 1080 GTX...

    You guys are conscious that you are doing benches for not even 1% of the users?

    If you want to do 1080p benches so much, use 480/580 1050/1060.

    TH: "But we want to avoid CPU bottleneck in a world that you cannot avoid it anymore... so we can show how insignificant our benches are!"

    Me: "I want to know what this system can do compared to others at ALL popular resolutions so I can make my choice when it comes to my hardware and not slam a fanboy war over pointless rethorics."


    Actually, you have it wrong. They want to avoid GPU bottlenecks when they test a CPU. At 4K resolutions, there are maybe a handful of games where a GTX 1080Ti or the latest Titan X won't stagger & fall on the ground in despair, & you'd see very little difference in performance between an old Phenom II X3 CPU & an i7-7700K, let alone between the most recent Ryzen CPUs & these Kaby Lake-X CPUs.

    At 1080p resolutions, however, a GTX 1080Ti is going to yawn & only has to idle along at maybe 30-40% utilization on Ultra settings...which means the primary bottleneck will be with the CPU. Hence why CPU testing is performed at 1080p resolutions with the best available GPU at that moment (or sometimes the 2nd-best, depending on what's available).


    Agreed. We also get lots of complaints that we don't test CPUs @ 4k, but....relevance. Due to time constraints, we want our results to be applicable to as many people as possible. Here's a good idea of what is relevant.


    Going by Steam's metrics, 3840X2160 is applicable to 0.86% of users. Granted, those are likely HEDT guys/gals.
    According to Steam, 93.34% of gamers are at 1920x1080 or below.

    2.19% are at 2560X1440.

    So, 1920x1080 is relevant. And since we are testing CPUs, we test with the most powerful GPU available to reduce bottlenecks on components that aren't under review.


    http://store.steampowered.com/hwsurvey/
  • 10tacle
    Anonymous said:
    I think redgarl is just expecting something else and that's why they're disappointed.


    Wouldn't be the first time either from that individual being snarky.

    Anonymous said:
    So, 1920x1080 is relevant. And since we are testing CPUs, we test with the most powerful GPU available to reduce bottlenecks on components that aren't under review.


    It is such a pet peeve of mine seeing people complain "well why not use realistic GPUs and resolutions people game with" and other nonsense. I've been on Tom's, Anandtech, and other hardware sites since the late '90s and it is not anything new to be testing CPUs in gaming benchmarks at below-optimal GPU resolutions (and over 90% of Steam gamers run 1080p or lower resolution monitors).

    What they also fail to understand is that quite a few 1080p gamers are running 144-165Hz monitors where high end cards make a difference. But back to historic reviews, case in point, a 2010 flashback review of an i7 950 using a 1680x1050 gaming resolution with a high end HD 5870 GPU:

    https://www.bit-tech.net/reviews/tech/cpus/intel-core-i7-950-review/6/

    Quote:
    We use 1,680 x 1,050 with no AA and no AF to give a reasonably real-world test without the risk that the graphics card will be a limiting factor to CPU performance.


    Why people still do not understand the logic behind ^^this I will never understand.