Nvidia Says Core i7 Isn't Worth It

The Intel Core i7 chip is an awesome CPU – this we know. If we were to build a gaming rig, we’d want to have one of these inside it. But Nvidia is telling everyone that the CPU isn’t everything.

Intel claims that gaming performance goes up by 80 percent when you use a Core i7 chip. This impressed Nvidia’s technical marketing director Tom Petersen, who decided to take a closer look at Intel’s claim.

“I was impressed by that claim, and I was trying to figure out how they could possibly say such a thing, and it turns out that Intel is basing that claim on only 3DMark Vantage’s CPU test.”

Of course, a CPU test is just that: a test of the CPU. Petersen goes on to explain his view: “…it doesn’t actually measure gameplay, it doesn’t actually measure anything about game performance. Sure enough, if you do that test you will see Core i7 running faster, but I think it’s a little disingenuous to call that game performance.”

Petersen then moved to an example meant to further his case that the Core i7 isn’t the clearly superior choice for a gaming PC. He compared two systems, calling the Core i7 965-based machine a “Hummer” and likening the one built around a Core 2 Duo E8400 to a BMW.

Nvidia showed benchmark graphs of various systems running Crysis Warhead, Fallout 3, Call of Duty: World at War and Far Cry 2 at 1920 x 1200 (no AA or AF). According to bit-tech.net, the Core 2 Duo E8400 paired with a GeForce GTS 250 scored an average of 41.6 fps. The frame rate rose only slightly, to 42.4 fps, after upgrading to a Core i7 965, but jumped to 59.4 fps after upgrading to a GeForce GTX 260 (216 stream processors) SLI setup.
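
To put those averages in perspective, the CPU swap works out to roughly a 2 percent gain, while the GPU upgrade delivers about 43 percent. A quick back-of-the-envelope sketch in Python (using only the average figures quoted above; the per-game numbers from bit-tech.net will differ) shows the math:

    # Rough illustration based on the quoted averages; per-game results will vary.
    baseline = 41.6      # Core 2 Duo E8400 + GeForce GTS 250 (avg fps)
    cpu_upgrade = 42.4   # Core i7 965 + GeForce GTS 250
    gpu_upgrade = 59.4   # GeForce GTX 260 (216 stream processors) SLI

    def gain(fps: float) -> float:
        # Percentage improvement over the baseline average.
        return (fps / baseline - 1) * 100

    print(f"CPU upgrade: +{gain(cpu_upgrade):.1f}%")  # ~ +1.9%
    print(f"GPU upgrade: +{gain(gpu_upgrade):.1f}%")  # ~ +42.8%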

Here we have a case where games running at 1920 x 1200 are fillrate-bound rather than CPU-bound. A faster CPU did little because the graphics card remained the bottleneck, while upgrading to a significantly stronger 3D acceleration setup opened up the headroom for more frames.

Petersen’s takeaway is that at high resolutions, money is better spent on more fillrate: “…it is a fact, that when you’re gaming and you’re running at resolutions of 1920 x 1200 or better, the Core 2 Duo is perfect for running all of today’s games. In real gaming, there’s no difference between a Core i7 and a Core 2 Duo.”

Marcus Yam
Marcus Yam served as Tom's Hardware News Director during 2008-2014. He entered tech media in the late 90s and fondly remembers the days when an overclocked Celeron 300A and Voodoo2 SLI comprised a gaming rig with the ultimate street cred.
  • Eggrenade
    Wow, answering B.S. stats with more B.S. stats. Of course you can "prove" your point if you run a very limited scenario. This is why we come here, for a lot less B.S. and some real scenarios.
  • 1971Rhino
    I had no idea I had a "BMW" in my case.......awesome!
  • ricin
    As a game engine developer, I grow tired of Nvidia rhetoric. First off, to make such a broad generalization is sickening. What they *are* saying, even if they attempt to mean something else, is that good game engines today are *GPU* bound and not CPU bound, meaning that the GPUs are so slow that the CPUs spend a ton of time waiting on them.

    Idiots. It doesn't even make sense.
  • JTP709
    It sounds to me like Nvidia did in fact answer Intel's BS stats with a real world scenario: running Far Cry, Crysis, and other games on their processors along with different Nvidia cards, delivering actual frame rates in what we would actually see if we bought that system. Once again reaffirming my belief that the i7 is overkill for gaming.
  • Hatecrime69
    Quoting Eggrenade: "Wow, answering B.S. stats with more B.S. stats. Of course you can 'prove' your point if you run a very limited scenario. This is why we come here, for a lot less B.S. and some real scenarios."
    Well, Intel's stats were BS, and Nvidia could probably have picked a benchmark less weighted in their favor, but they are telling the truth: in many game situations a Core 2 Duo/Quad is just a few frames slower than the much more expensive i7. Of course, RTS and flight sim games may show more of a CPU difference, but at least Nvidia is showing an example of what holds true for the most part in many popular titles.
  • thepinkpanther
    Why didn't Nvidia just use AMD chips and see if performance went up by 80 percent? It should have been a Phenom II + GTS 250, then a GTX 285, then the same test with an i7. I guess Intel has to say BS until their Larrabee comes out; then they will say the GPU is just as, if not more, important.
  • burnley14
    What a shock: a GPU manufacturing company tells you to spend more on GPUs and less on CPUs. That sounds completely unbiased . . .
  • afrobacon
    So the i7 is a Hummer?
    I disagree with this statement. The Hummer of today is slow, wastes gas, and is pretty much a glorified minivan. The i7, on the other hand, is fast, lightweight, and outperforms its competition, though it's still pricey and not quite worth getting yet. My opinion at least.
  • burnley14
    What a shock, a GPU-producing company is encouraging you to spend less on CPUs and more on GPUs. That doesn't seem biased at all . . .
  • burnley14
    Damn double post. Sorry.