Part 2: How Many CPU Cores Do You Need?

Time To Follow-Up

A few months ago, we looked into the effectiveness of using different numbers of CPU cores with various types of software. We received a lot of good feedback from that article, and there were some interesting suggestions from the community that we've taken to heart in this follow-up.

Primarily, there was a concern that part one might have been technically flawed, as the Core 2 Quad Q6600 we used in our testing does not share all 8 MB of its L2 cache among its four CPU cores. Instead, the Q6600 has two separate 4 MB caches, each shared between one pair of cores. This means the quad- and triple-core configurations had access to the full 8 MB of cache, while the dual- and single-core configurations likely benefited from only 4 MB. The benchmarks may therefore have reflected differences in L2 cache availability as much as the performance attributable to the enabled processing cores.

To remedy this, we are using a different CPU this time around: AMD's Phenom II X4 955 BE. There are a number of reasons why the Phenom II is ideal for these tests. First of all, its 6 MB of L3 cache is shared between all four CPU cores, so the cache's impact on results will be kept to a minimum. Secondly, since there are now X2, X3, and X4 versions of the Phenom II CPU based on the same die, we will have the opportunity to test the validity of the method we use to simulate fewer CPU cores. By comparing simulated results to an actual retail CPU with fewer CPU cores, we will know more definitively whether disabling CPU cores in the operating system is a truly legitimate test.
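The idea of restricting a system to fewer cores from within the operating system can be sketched with processor affinity. This is only an illustration under stated assumptions, not the exact method used in the article: `os.sched_setaffinity` is Linux-specific and limits a single process rather than disabling cores system-wide.

```python
import os

def limit_to_cores(n):
    """Pin the current process to at most n of its available logical CPUs.

    A rough stand-in for 'disabling cores in the OS': the scheduler will
    only run this process (and its children) on the chosen cores.
    """
    available = sorted(os.sched_getaffinity(0))  # cores we may run on now
    subset = set(available[:n])                  # keep the first n of them
    os.sched_setaffinity(0, subset)              # apply the restriction
    return subset

if __name__ == "__main__":
    before = len(os.sched_getaffinity(0))
    limit_to_cores(2)                            # simulate a dual-core CPU
    after = len(os.sched_getaffinity(0))
    print(f"visible cores before: {before}, after: {after}")
```

A benchmark launched from such a restricted process inherits the affinity mask, which is why per-process pinning can approximate a CPU with fewer cores, though it does not change shared-cache behavior the way a genuine X2 or X3 die would.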

At the end of these tests, we will be able to compare the Phenom II X4 results with the ones achieved by Intel's Core 2 Quad Q6600 to see if the impact of shared CPU cache is dramatic or minimal.

A few readers were also interested in simulating a scenario where multiple applications are running at the same time, in order to gauge the benefit of additional CPU cores while multitasking. We therefore ran a new test to analyze this type of scenario, too.

  • erdinger
Very interesting article, now I'm even happier I bought a Phenom II 720 for my gaming rig!
    Reply
  • icepick314
    "In any case, there are two lessons to be learned here: first, try to avoid a virus scan during your gaming sessions."

    what kind of PC gamer does virus scanning while running a game?
    Reply
  • KyleSTL
Why no power consumption testing? I was a little curious what disabling cores in the OS would do to power consumption under load. A little let down, but otherwise a good article. It's good to see a scaling article at least yearly, since people refer to the dual/quad debate so often, and the tests in the articles they reference are often out of date and irrelevant.
    Reply
  • Onus
    Good article, and very interesting.
    Now I really hope I can unlock the 4th core when my 720BE arrives (hopefully later this afternoon), but I won't sweat it.
    Did you happen to test if it made a difference what scan priority was set in AVG? I'd really like to see those numbers.
    Reply
  • So, how did you manage to get an Nvidia-based graphics card (Gigabyte GV-N250ZL-1GI 1 GB DDR3 PCIe) up and running with the ATI Catalyst 9.6 drivers?! ;-)

    Besides that bit of confusion, thanks for the benchmarks!
    Reply
  • 1word
Very happy with my 720 BE. I constantly check the activity on the cores, and many, many apps use all three cores, or multitasking uses all three. Some activities, like defrag, use only two cores. Image editing software and general applications like browsers and office apps use all three cores, especially when multitasking.

I'm very happy with the AMD 720 BE.
    Reply
  • jcknouse
KyleSTL wrote: "Why no power consumption testing? I was a little curious what disabling cores in the OS would do to power consumption under load. [...]"
I liked the article, but I also found myself asking, "Which was more power efficient: the Phenom II X2 550 BE or the X4 955 BE?"

Would love to know, even if it was just that you guys happened to glance at a P3 Kill A Watt or some other meter you had inline during testing.

    Thanks for great work, guys :)
    Reply
  • erichlund
It's true that an application like iTunes does not benefit from multiple cores when run without any other apps. However, it also doesn't compete for more than one core when multiple apps are running, so single-threaded apps benefit from multiple cores too when users are multitasking.

What one really needs to know about iTunes and its competing applications is: which one competes most efficiently in a multiprocessing environment? In other words, which uses the least resources while performing its essential tasks, leaving the most for the other tasks being performed? To put it most clearly: which applications play well with others while multitasking, and which hog resources, making it harder to multitask?

    That's not really the point of this test, but it may lead to some interesting future evaluations.
    Reply
  • Onus
    ^Yes, that's why it would be interesting to see if (and how much) the impact varied if AVG was set to slow, normal, or fast for its scan priority.
    Reply
  • paranoidmage
You shouldn't test the games at 1024x768 at low details. These benchmarks are supposed to simulate actual usage, and no one will actually run games at that resolution and detail level unless their computer is a dinosaur. If you want to remove bottlenecks, use a better GPU like a 4890.

How do I know if multiple cores will actually help me? I run games at 1920x1200 with medium-high details.
    Reply