Intel Core i7-7820X Skylake-X Review

Civilization VI & Battlefield 1

Civilization VI AI Test

Core i9-7900X suffered tremendously during our Civilization VI graphics test, but it clings to a leading spot in the more CPU-focused AI benchmark. This is surprising considering the metric's preference for quad-core processors.

Even more surprising, Core i7-7820X falls to the bottom of our field. We retested several times with varying parameters, but found no clear explanation for the -7820X's lackluster performance. Overclocking helps the -7820X overtake a stock Ryzen 7 1800X, but it embarrassingly trails the tuned six-core Ryzen 5 1600X.

Civilization VI Graphics Test

Core i7-7820X joins Core i9-7900X at the bottom of this chart, while the 8-core Broadwell-E-based -6900K takes pole position. The Ryzen models are very competitive; they outperform both Skylake-X processors.

Intel suggested to us that some programs might require optimizations to accommodate its new mesh topology. Civilization VI appears to be among them.  
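
Intel didn't detail which access patterns suffer, but core-to-core communication latency is the usual suspect when an interconnect changes. For readers who want to probe their own chip, here is a minimal sketch of a pinned ping-pong test; it assumes a Linux system, the core IDs 0 and 1 are arbitrary placeholders, and pipe overhead dominates the measurement, so treat the numbers as relative rather than absolute.

    import os
    import time
    from multiprocessing import Pipe, Process

    def pong(core: int, conn) -> None:
        # Pin the child process to a single core, then echo until told to stop.
        os.sched_setaffinity(0, {core})
        while conn.recv() is not None:
            conn.send(b"pong")

    def ping(core_a: int, core_b: int, iters: int = 10_000) -> float:
        parent, child = Pipe()
        worker = Process(target=pong, args=(core_b, child))
        worker.start()
        os.sched_setaffinity(0, {core_a})  # pin the parent to the other core
        start = time.perf_counter()
        for _ in range(iters):
            parent.send(b"ping")
            parent.recv()
        elapsed = time.perf_counter() - start
        parent.send(None)                  # signal the child to exit
        worker.join()
        return elapsed / iters * 1e6       # mean round trip in microseconds

    if __name__ == "__main__":
        print(f"core 0 <-> core 1: {ping(0, 1):.1f} us per round trip")

Comparing adjacent cores against distant ones gives a rough feel for how much the topology matters to latency-sensitive code.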

Battlefield 1 (DX11)

The Core i7-7820X lands ahead of Core i7-6900K on our chart, though the difference is tiny. And although the Ryzen CPUs again take the bottom positions, we're talking about a slight deficit in average frame rate compared to much more expensive processors. Without question, our Battlefield 1 benchmark is mostly graphics-bound.


Comments from the forums
  • cknobman
    Just want to say the Ryzen 7 1800X isn't $500 anymore, and hasn't been for weeks now.

    The processor is selling for $420 or less. Heck, I bought mine yesterday from Fry's for $393.
  • artk2219
    Anonymous said:
    Just want to say the Ryzen 7 1800X isn't $500 anymore, and hasn't been for weeks now.

    The processor is selling for $420 or less. Heck, I bought mine yesterday from Fry's for $393.


    Not to mention the fact that you can find the 1700 for even less, and you'll more than likely be able to bump the clocks to at least match the 1800X. Microcenter was selling them for $269.99 last week.
  • Ne0Wolf7
    At least they've done something, but it's still too expensive to sway me.
    Perhaps full-blown professionals who need something a bit better than what Ryzen has right now, but can't go for an i9, would appreciate this. Even then, though, they would probably wait to see what Threadripper has to offer.
  • Scorpionking20
    So many years have passed, I can't wrap my head around this. Competition in the CPU space? WTH is this?
  • Houston_83
    I think the article has some incorrect information on the first page.

    "However, you do have to tolerate a "mere" 28 lanes of PCIe 3.0. Last generation, Core i7-6850K in roughly the same price range gave you 40 lanes, so we consider the drop to 28 a regression. Granted, AMD only exposes 16 lanes with Ryzen 7, so Intel does end the PCIe comparison ahead."

    Doesn't Ryzen have 24 lanes? Still under Intel, but I'm pretty sure there's more than 16 lanes.
  • artk2219
    Anonymous said:
    I think the article has some incorrect information on the first page.

    "However, you do have to tolerate a "mere" 28 lanes of PCIe 3.0. Last generation, Core i7-6850K in roughly the same price range gave you 40 lanes, so we consider the drop to 28 a regression. Granted, AMD only exposes 16 lanes with Ryzen 7, so Intel does end the PCIe comparison ahead."

    Doesn't Ryzen have 24 lanes? Still under Intel, but I'm pretty sure there's more than 16 lanes.


    Ryzen does have 24 lanes, but only 16 are usable for graphics; the other 8 are dedicated to chipset and storage needs.
  • JimmiG
    Anonymous said:
    Anonymous said:
    I think the article has some incorrect information on the first page.

    "However, you do have to tolerate a "mere" 28 lanes of PCIe 3.0. Last generation, Core i7-6850K in roughly the same price range gave you 40 lanes, so we consider the drop to 28 a regression. Granted, AMD only exposes 16 lanes with Ryzen 7, so Intel does end the PCIe comparison ahead."

    Doesn't Ryzen have 24 lanes? Still under Intel, but I'm pretty sure there's more than 16 lanes.


    It does, but only 16 are usable for graphics; the other 8 are used for chipset and storage needs.


    16 lanes are available for graphics, as 1x16 or 2x8.
    4 lanes are dedicated to M.2 storage.
    4 lanes go to the chipset, which the X370 splits into 8 PCIe 2.0 lanes and allocates dynamically, IIRC.
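
If you want to see how those lanes were actually negotiated on your own machine, Linux exposes the link configuration through sysfs. A minimal sketch (it assumes the standard current_link_width/current_link_speed attributes, which not every device exposes):

    import os

    SYSFS_PCI = "/sys/bus/pci/devices"

    # Print the negotiated link width and speed for every PCI device that
    # exposes the standard PCIe sysfs attributes (legacy devices won't).
    for dev in sorted(os.listdir(SYSFS_PCI)):
        try:
            with open(os.path.join(SYSFS_PCI, dev, "current_link_width")) as f:
                width = f.read().strip()
            with open(os.path.join(SYSFS_PCI, dev, "current_link_speed")) as f:
                speed = f.read().strip()
        except OSError:
            continue
        print(f"{dev}: x{width} @ {speed}")

A graphics card reporting x8 instead of x16, for instance, tells you a second device is sharing the CPU's sixteen graphics lanes.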
  • Zifm0nster
    Would love to give this chip a spin... but availability has been zero, even a month after release.

    I actually do have a work application that can utilize the extra cores.
  • Math Geek
    Does look like Intel was caught off guard by AMD this time around.

    It will take them a couple of quarters to figure out what to do, but I'm loving the price/performance AMD has brought to the table, and I know Intel will have no choice but to cut prices.

    This is always good for buyers :D
  • Amdlova
    Why do we have overclocked CPUs on the bench but no power consumption numbers? Is this review biased toward Intel again? Is Tom's Hardware fake news?
  • JamesSneed
    Anonymous said:
    Why do we have overclocked CPUs on the bench but no power consumption numbers? Is this review biased toward Intel again? Is Tom's Hardware fake news?


    I agree they should have completed the power consumption testing on the overclocked chips from AMD and Intel. What I don't agree with is the bias claim; did you read the summary?
  • Amdlova
    Yes, I read it =) But if you show one side, you need to show the other. An overclocked Intel CPU can eat more than 500W, which is more than a 96-core AMD EPYC system. When I build a system, power is the first consideration and cooling comes second. A chip that draws that much power can't use an air cooler; it needs water cooling to stay cool, and that adds even more wattage: about 20W for fans and 18-20W for the pump. My last system was a 4770K at 4.3GHz with a passive Thermalright Ultra 120 Extreme Rev. C cooler.
  • AgentLozen
    Good review.

    When I read the 7900X review, I was really disappointed with Skylake-X. Today's chip seems to make a lot more sense. It performs nearly as well as the 7900X in many situations, eats less power, generates less heat, and is priced more competitively. It also beats the Ryzen 7 1800X in most situations.

    If money weren't a concern, I would say Intel's 7820X is a better CPU than AMD's 1800X. It's a different story when you look at the price tags, though.

    Putting the 7700K in the benchmarks was a really interesting choice. It holds up well in a lot of situations despite having half the cores. The price is right, too. Kaby Lake is still VERY relevant, even in the HEDT realm.
  • redgarl
    ROFL, 1080p benches with a GTX 1080...

    You guys are aware that you are doing benches for not even 1% of users?

    If you want to do 1080p benches so much, use a 480/580 or 1050/1060.

    TH: "But we want to avoid a CPU bottleneck in a world where you cannot avoid it anymore... so we can show how insignificant our benches are!"

    Me: "I want to know what this system can do compared to others at ALL popular resolutions so I can make my choice when it comes to my hardware, not start a fanboy war over pointless rhetoric."
  • PaulAlcorn
    Anonymous said:
    Just want to say the Ryzen 7 1800X isn't $500 anymore, and hasn't been for weeks now.

    The processor is selling for $420 or less. Heck, I bought mine yesterday from Fry's for $393.


    Perhaps you missed this line in the conclusion.
    Quote:

    Intel should probably feel lucky that Core i7-7820X won't be going up against AMD's Threadripper, since the cheapest model will sell for $800. As it stands, this $600 CPU has a hard time justifying its premium over Ryzen 7 1800X, which currently sells for as little as $420.
  • 10tacle
    Anonymous said:
    ROFL, 1080p benches with a GTX 1080... You guys are aware that you are doing benches for not even 1% of users? If you want to do 1080p benches so much, use a 480/580 or 1050/1060.


    Seeketh, and ye shall findeth. While these benches do not include the 7820X from Tom's review, they do show where the 7700K stacks up against the 1800X featured on Tom's, for extrapolation:

    http://images.anandtech.com/graphs/graph11549/87722.png
    http://images.anandtech.com/graphs/graph11549/87692.png
    http://images.anandtech.com/graphs/graph11549/87716.png
    http://images.anandtech.com/graphs/graph11549/87686.png
  • spdragoo
    Anonymous said:
    ROFL, 1080p benches with a GTX 1080...

    You guys are aware that you are doing benches for not even 1% of users?

    If you want to do 1080p benches so much, use a 480/580 or 1050/1060.

    TH: "But we want to avoid a CPU bottleneck in a world where you cannot avoid it anymore... so we can show how insignificant our benches are!"

    Me: "I want to know what this system can do compared to others at ALL popular resolutions so I can make my choice when it comes to my hardware, not start a fanboy war over pointless rhetoric."


    Actually, you have it wrong. They want to avoid GPU bottlenecks when they test a CPU. At 4K resolutions, there are maybe a handful of games where a GTX 1080 Ti or the latest Titan X won't stagger & fall on the ground in despair, & you'd see very little difference in performance between an old Phenom II X3 CPU & an i7-7700K, let alone between the most recent Ryzen CPUs & these Skylake-X CPUs.

    At 1080p resolutions, however, a GTX 1080 Ti is going to yawn & only has to idle along at maybe 30-40% utilization on Ultra settings... which means the primary bottleneck will be the CPU. Hence, CPU testing is performed at 1080p resolutions with the best available GPU at that moment (or sometimes the 2nd-best, depending on what's available).
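
The utilization argument is easy to check yourself. Here is a minimal sketch that polls GPU load while a benchmark runs; it assumes an NVIDIA card with nvidia-smi on the PATH, and the 30-second window is an arbitrary choice:

    import subprocess
    import time

    def gpu_utilization() -> int:
        # Ask nvidia-smi for the current GPU load as a bare integer percentage.
        out = subprocess.run(
            ["nvidia-smi", "--query-gpu=utilization.gpu",
             "--format=csv,noheader,nounits"],
            capture_output=True, text=True, check=True,
        ).stdout
        return int(out.splitlines()[0])

    samples = []
    for _ in range(30):  # sample once per second for ~30 seconds
        samples.append(gpu_utilization())
        time.sleep(1)

    mean = sum(samples) / len(samples)
    print(f"mean GPU load: {mean:.0f}%")

An average pinned near 100% means the GPU is the bottleneck and the CPUs will bunch together; an average well below that leaves room for the CPU differences under review to show up.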
  • AgentLozen
    That's exactly right, spdragoo. This review has to follow basic scientific methodology. By using an overpowered GPU, the author isolates the CPU variable. It makes sense.

    I think redgarl is just expecting something else and that's why they're disappointed.
  • PaulAlcorn
    Anonymous said:
    Anonymous said:
    ROFL, 1080p benches with a GTX 1080...

    You guys are aware that you are doing benches for not even 1% of users?

    If you want to do 1080p benches so much, use a 480/580 or 1050/1060.

    TH: "But we want to avoid a CPU bottleneck in a world where you cannot avoid it anymore... so we can show how insignificant our benches are!"

    Me: "I want to know what this system can do compared to others at ALL popular resolutions so I can make my choice when it comes to my hardware, not start a fanboy war over pointless rhetoric."


    Actually, you have it wrong. They want to avoid GPU bottlenecks when they test a CPU. At 4K resolutions, there are maybe a handful of games where a GTX 1080 Ti or the latest Titan X won't stagger & fall on the ground in despair, & you'd see very little difference in performance between an old Phenom II X3 CPU & an i7-7700K, let alone between the most recent Ryzen CPUs & these Skylake-X CPUs.

    At 1080p resolutions, however, a GTX 1080 Ti is going to yawn & only has to idle along at maybe 30-40% utilization on Ultra settings... which means the primary bottleneck will be the CPU. Hence, CPU testing is performed at 1080p resolutions with the best available GPU at that moment (or sometimes the 2nd-best, depending on what's available).


    Agreed. We also get lots of complaints that we don't test CPUs at 4K, but... relevance. Given our time constraints, we want our results to be applicable to as many people as possible. Here's a good idea of what is relevant.

    Going by Steam's metrics, 3840x2160 is applicable to 0.86% of users. Granted, those are likely HEDT guys/gals.
    According to Steam, 93.34% of gamers are at 1920x1080 or below.

    2.19% are at 2560x1440.

    So, 1920x1080 is relevant. And since we are testing CPUs, we test with the most powerful GPU available to reduce bottlenecks on components that aren't under review.


    http://store.steampowered.com/hwsurvey/
  • 10tacle
    Anonymous said:
    I think redgarl is just expecting something else and that's why they're disappointed.


    Wouldn't be the first time that individual has been snarky, either.

    Anonymous said:
    So, 1920x1080 is relevant. And since we are testing CPUs, we test with the most powerful GPU available to reduce bottlenecks on components that aren't under review.


    It is such a pet peeve of mine seeing people complain, "Well, why not use realistic GPUs and resolutions people game with?" and other nonsense. I've been on Tom's, AnandTech, and other hardware sites since the late '90s, and testing CPUs in gaming benchmarks at below-optimal GPU resolutions is nothing new (and over 90% of Steam gamers run 1080p or lower resolution monitors).

    What they also fail to understand is that quite a few 1080p gamers are running 144-165Hz monitors, where high-end cards make a difference. But back to historical reviews; case in point: a 2010 flashback review of an i7-950 using a 1680x1050 gaming resolution with a high-end HD 5870 GPU:

    https://www.bit-tech.net/reviews/tech/cpus/intel-core-i7-950-review/6/

    Quote:
    We use 1,680 x 1,050 with no AA and no AF to give a reasonably real-world test without the risk that the graphics card will be a limiting factor to CPU performance.


    Why people still do not grasp the logic behind ^^this, I will never understand.