IvyBridge vs Haswell for Desktop Gaming?

I'm planning on upgrading from an old C2D, but I can't decide whether to upgrade to Ivy Bridge now or wait a few more months and upgrade to Haswell when it comes out.

I've read a lot about Haswell improving integrated GPU performance, but since I'm running a dedicated graphics card, that's not very exciting for me. Has anyone heard whether there will be much benefit for people who aren't too concerned about power usage and integrated graphics?

Opinions?
  1. We'll let you know as soon as someone can actually run benchmarks on a Haswell chip.
  2. Haswell. Unless you are itching to play something right now, I would just wait. I have a C2Q, so Haswell will be a big upgrade for me too. It's just speculation, but it will probably be worth the wait unless Intel decides not to really push the CPUs to the next level.
  3. I know it's mostly speculation atm since no benchmarks have been released, but does anyone know of any press articles mentioning desktop performance? Everything I can find relates to low-power integrated devices, like tablets.
  4. Estimates put Haswell at 10-15% faster than Ivy Bridge at the same clock speed.

    I wouldn't wait for Haswell if you're coming from a Core 2 Duo.
  5. It should be a decent improvement, it's a "tock" in their release cycle after all. But it's not like Haswell will completely leave Ivy Bridge in the dust, so if you need the upgrade now there's nothing wrong with getting an Ivy Bridge CPU now.
    Edit: And pricing may not be particularly attractive at launch anyway.
  6. Gaming has become much more GPU dependent and that trend isn't changing. A quad-core IB will be plenty for years to come for games. If you get a k-series you can squeeze some more longevity out of it once it starts showing its age. I'd go Ivy now and skip Haswell/Broadwell especially if your primary use is gaming. If you are suffering with your current system, then the wait isn't worth it.

    I think ultimately your situation with your current C2D system should dictate what you decide. If it's a drag, then get an Ivy now. If it's enough for now then wait (and save up for a monster GPU for the new system ; )
  7. Assuming we are talking about i5 chips, an i5-3570K is roughly 3x the speed of the fastest Core 2 Duo. If Haswell adds a further 10% on top of that, you'd be at roughly 3.3x over the C2D.

    If you had a lower-end C2D, you would be closer to a 4x improvement by going to an i5-3570.

    So you can get a 3-4x speed boost right now, or limp along for a 3.3-4.4x speed boost around the end of next summer. Both are going to be so much faster than what you have that it seems kind of silly to wait. Especially for gaming, where Ivy Bridge just doesn't really bottleneck.
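    Purely as a sanity check on the multiplier arithmetic above (the 3x and 10% figures are rough estimates, not benchmarks), the compounding works like this:

    ```python
    # Back-of-envelope speed-up compounding; multipliers are rough estimates.
    ivy_over_c2d = 3.0    # ~3x: an i5-3570K vs. the fastest Core 2 Duo
    haswell_gain = 0.10   # rumored ~10% on top of Ivy Bridge

    haswell_over_c2d = ivy_over_c2d * (1 + haswell_gain)
    print(round(haswell_over_c2d, 2))  # 3.3 -> "3.3x" over the old C2D
    ```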
  8. Besides the world could end on 12/21/12. Might as well enjoy that new PC. ;)
  9. twelve25 said:
    Assuming we are talking about i5 chips, an i5-3570K is roughly 3x the speed of the fastest Core 2 Duo. If Haswell adds a further 10% on top of that, you'd be at roughly 3.3x over the C2D.

    If you had a lower-end C2D, you would be closer to a 4x improvement by going to an i5-3570.

    So you can get a 3-4x speed boost right now, or limp along for a 3.3-4.4x speed boost around the end of next summer. Both are going to be so much faster than what you have that it seems kind of silly to wait. Especially for gaming, where Ivy Bridge just doesn't really bottleneck.



    Those figures are overstated. Yes, in certain tasks the i5-3570K can be 3x faster than a C2D E8600, like video encoding with the x264 codec, but that is not the norm. Encoding video with the DivX codec is less than a 2x performance increase. Playing Dragon Age: Origins @ 1680x1050 gives more than 2x the frame rate, but in Dawn of War II @ 1680x1050 the difference is only 33%.

    See the link below for benchmark comparisons. Note that a few are merely synthetic benchmarks, as opposed to real-world benchmarks like 7-Zip and the Microsoft 2007 benchmark.

    http://www.anandtech.com/bench/Product/54?vs=701
  10. How does that disprove my point?
  11. jaguarskx said:
    Those figures are overstated. Yes, in certain tasks the i5-3570K can be 3x faster than a C2D E8600, like video encoding with the x264 codec, but that is not the norm. Encoding video with the DivX codec is less than a 2x performance increase. Playing Dragon Age: Origins @ 1680x1050 gives more than 2x the frame rate, but in Dawn of War II @ 1680x1050 the difference is only 33%.

    See the link below for benchmark comparisons. Note that a few are merely synthetic benchmarks, as opposed to real-world benchmarks like 7-Zip and the Microsoft 2007 benchmark.

    http://www.anandtech.com/bench/Product/54?vs=701

    That may be because the games become GPU limited instead of CPU limited, though.
  12. twelve25 said:
    How does that disprove my point?


    The way you stated your reply implies that everything will get a 3x-4x boost in performance, which is not the case. While cherry-picked benchmarks for Ivy Bridge can show 3x the performance of a Core 2 Duo / Quad (like video encoding using the x264 codec, but not other video codecs), in reality it varies by program, and on average the performance increase is much less.

    Most games are bound by the graphics card rather than the CPU. While a much more powerful CPU can help increase performance to some extent, the increase will be relatively small or non-existent. For example, Crysis 2 is a game that does not care how fast the CPU is as long as it does not bottleneck the GPU. I recall one benchmark that included several CPUs, among them a Core i3-2100 and a Core i5-2500K @ 4.5GHz; the i5-2500K might have gotten 1 extra frame.

    On the other hand, there are some games that clearly do benefit from faster CPUs, like Skyrim, but going from a Core 2 Duo to a Core i5-3570K is not going to increase performance by 3x. Nowhere even close to that. I'll also add that Skyrim only utilizes two cores.
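    To make the bottleneck point concrete, here is a toy model (the frame-rate caps below are made up for illustration, not measured): the game runs at whichever rate is lower, the CPU's or the GPU's.

    ```python
    # Toy GPU-bound model: effective FPS is capped by the slower component.
    def effective_fps(cpu_fps_cap: float, gpu_fps_cap: float) -> float:
        """Frame rate is limited by whichever of CPU or GPU is slower."""
        return min(cpu_fps_cap, gpu_fps_cap)

    # With a GPU capped at 60 fps, even a 3x faster CPU changes nothing:
    print(effective_fps(cpu_fps_cap=90, gpu_fps_cap=60))   # 60
    print(effective_fps(cpu_fps_cap=270, gpu_fps_cap=60))  # 60
    ```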