I'm curious: how do you tell if a CPU will limit your graphics card?
I want to purchase a new CPU and card, and I was thinking about a Core i5-750 paired with an Nvidia GTX 460. The goal is to use the computer for WoW at 1680x1050 with Ultra settings enabled. I just don't know how to tell if the CPU will hold the card back.
Intel Core i7 CPU Scaling with HD 6850 in Crossfire
When you really begin to break it down, though, the times we hit a CPU limitation we’re already getting fantastic performance in games. Resident Evil 5 is a prime example. At 1680 x 1050 with our CPU at 3.06GHz we get an average of 178 FPS, while at 4.2GHz we get an average of 223 FPS.
Sure, that’s a great gain, but that extra performance isn’t going to make the game any better for you. Even in a situation where you’re using a 120Hz monitor, you’re at a number that’s well and truly comfortable when at 3.06GHz.
I don’t want to give the impression that buying a faster CPU is a waste of money, as we know that there’s more to the purchase of a CPU than just gaming. But it’s important to know that when your video detail is high, an added speed bump of not just a few hundred MHz, but more than a GHz at a time, can yield almost no increase in your actual FPS. Of course, as you climb up to higher-end video card setups costing in excess of $1,000, the extra speed will be appreciated. That’s something we may look at when the HD 6900 series begins to arrive.
I'd guess WoW wouldn't be more CPU-intensive than Resident Evil 5, so at your preferred resolution anything above 120fps would probably be wasted. See this thread for the RE5 chart.
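If you want a rough rule of thumb for reading numbers like the ones quoted above, you can compare how much the FPS scaled versus how much the CPU clock scaled: if FPS rises nearly 1:1 with clock speed you're CPU-limited, and if it barely moves you're GPU-limited. Here's a quick sketch (my own illustration, not from the article) using the RE5 figures:

```python
# Rough bottleneck check: what fraction of a CPU clock increase actually
# showed up as extra FPS? Near 100% -> CPU-limited; near 0% -> GPU-limited.

def cpu_scaling_ratio(clock_low, fps_low, clock_high, fps_high):
    """Fraction of the relative clock-speed gain that became an FPS gain."""
    clock_gain = clock_high / clock_low - 1.0
    fps_gain = fps_high / fps_low - 1.0
    return fps_gain / clock_gain

# Figures quoted above: Resident Evil 5 at 1680x1050,
# 3.06 GHz -> 178 FPS average, 4.2 GHz -> 223 FPS average.
ratio = cpu_scaling_ratio(3.06, 178, 4.2, 223)
print(f"{ratio:.0%} of the clock increase became extra FPS")
# -> 68% of the clock increase became extra FPS
```

A ratio of about 68% says RE5 at that resolution was still partly CPU-bound, but as the article notes, the "extra" frames are all above what a 120Hz monitor can display anyway.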