Results: Low Quality, 1680x1050
Using the same settings as last time, we kick the resolution up to 1680x1050. The Normal image quality setting is still upscaling the on-screen image from a lower-resolution render target, though, so average frames per second are barely lower than the 1280x720 results.
Once again, AMD's Radeon HD 6570 DDR3 is the lowest you'd want to go for playability.
The first frame time variance chart shows us that latency typically isn't an issue, though graphing 300 frames shows some spikes on a few of the cards we're testing.
It is funny to see this, as CoD1 and CoD2 were originally PC games. CoD2 was the first to be ported to the 360, but CoD3 was the first multi-console entry in the series, with no release on the PC.
I loved 1 and 2, and 4 was pretty good, but now CoD is just the same thing every year. It's just a cash cow at this point with no innovation, while 1 & 2 were very innovative (CoD1 was the first to use real recorded sounds for every gun in the game).
I haven't played a CoD since 2. It's too bad, as it could have been a great series if it hadn't become console- and money-centric.
Also, on page 9 the chart for the FPS says Battlefield 4......
BF is much better (personal opinion): 64 players on a huge map with vehicles and destruction. Better than CoD.
This game is horribly optimized and buggy. People on the Steam forums have been complaining about game-breaking bugs since day one, and there are still issues that haven't been addressed yet. Like the one in Squad Mode where you can't use any of your squad members in a game except the first one. Or the earlier bug where people couldn't even create their first soldier because they didn't have the 3 squad points to unlock it, locking them out of multiplayer entirely.
Skip this game. Infinity Ward obviously doesn't care about the PC market, and this horrible release just further solidifies that fact. Spend your money on a multiplayer shooter that doesn't insult its audience.
Quake or Unreal Tournament, anyone?
I get that you're trying to phrase that as an AMD fanboy taking a shot at Nvidia, but frame variance is all over the place in this review. There's AMD hardware all over those charts too, not just clustered at the low end.
These frame variance numbers often aren't even logical: the HD 7990 showing lower frame variance than a single HD 7950? A GTX 690 doing better than a single 670? I think it's clear that the quality of Infinity Ward's PC port is a factor here, and maybe that's more important than pouncing on Nvidia's mistakes.
A mediocre CPU with a top-end GPU and too much RAM? I FOUND YOUR PROBLEM!