Call Of Duty: Ghosts Graphics Performance: 17 Cards, Tested
It's already a commercial blockbuster. But does Call of Duty: Ghosts improve the first-person shooter genre, or simply rehash it? We look at this series' newest installment and test to see what kind of hardware you'll need for smooth play on the PC.
Game Engine, Image Quality, And Settings
Call of Duty: Ghosts is built on the IW6 engine, a modified and updated version of the technology behind Call of Duty: Modern Warfare 3. Improvements include Pixar's SubD surfaces, which increase model detail as you get closer, real-time HDR lighting, Iris Adjust technology (which mimics how eyes react to changing lighting conditions), new animation systems, fluid dynamics, interactive smoke, displacement mapping, and dynamic multiplayer maps.
Like most recent Call of Duty games, it looks quite good at first glance, but breaks down under scrutiny. Far too many objects and characters lack shadows, even at the highest detail settings, and especially when you zoom in with a scope. Crysis and Battlefield are both a solid step above what Ghosts offers.
One of my pet peeves on the PC is that Call of Duty: Ghosts does not natively support multi-display gaming. The developers hide behind the excuse that wider fields of view give certain players an unfair advantage. But if that's the case, why not enable technologies like Eyefinity or Surround in single-player mode? Why not allow competitive leagues to opt in or out when it comes to more expansive views? This issue might be more complicated than I'm giving it credit for, but it seems shameful for a high-profile title to lack multi-monitor support in 2013.
From a PC enthusiast's perspective, the Image Quality setting is perhaps the most irritating. You're given a choice between Very Low, Low, Normal, High, and Extra. The problem is that the name is misleading; the setting doesn't control game effects like shadows. Instead, it manipulates render quality: every option except Extra renders to a target lower than the resolution you choose. For example:
On my high-end Core i7, Radeon R9 280X-equipped PC, Call of Duty: Ghosts automatically chooses the High option, which renders to a target lower than my selected output, resulting in terrible blurriness. That may be a necessity in the console world, but I'm on a PC for a reason. In my opinion, two changes are needed: first, call the setting what it is, a render scale; second, don't auto-select anything except Extra on a PC. There's no surer sign of a poor port than a legacy switch needed for acceptable performance on a fixed platform.
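To make the render-scale behavior concrete, here is a minimal sketch of how a quality setting like this could map to an internal render target. The scale factors below are illustrative assumptions, not Infinity Ward's actual values; the point is that any factor below 1.0 means the engine renders below your chosen resolution and upscales, which is where the blurriness comes from.

```python
# Hypothetical render-scale mapping. These factors are assumptions for
# illustration only; IW6's real internal scales are not documented.
SCALE = {
    "Very Low": 0.50,
    "Low": 0.65,
    "Normal": 0.80,
    "High": 0.90,
    "Extra": 1.00,  # only Extra renders at the full selected resolution
}

def render_target(width, height, quality):
    """Return the internal render resolution for a chosen output resolution."""
    s = SCALE[quality]
    return round(width * s), round(height * s)

# At 1920x1080 with "High", the engine renders below native resolution
# and upscales the result to fit the display.
print(render_target(1920, 1080, "High"))   # (1728, 972)
print(render_target(1920, 1080, "Extra"))  # (1920, 1080)
```

Under these assumed factors, only Extra produces a one-to-one match between render target and display resolution, which is why anything else looks soft on a PC monitor.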
We tested low, high, and ultra presets that cover a wide range of PC graphics hardware. Our low preset applies minimum detail levels across the board, except for image quality (set to Normal) and textures (set to Auto). Our high preset sets image quality to Extra, enables depth of field and distortion, sets SSAO to Low, anisotropic filtering to Normal, anti-aliasing to FXAA, and textures to High, with terrain detail and motion blur disabled. Our ultra preset maxes out every setting, with anti-aliasing set to SMAA.
Our test system, detailed on the next page, hosts 8 GB of system RAM. At release, Call of Duty: Ghosts required at least 6 GB, but it has since been patched to operate with less memory.
jimmysmitty I think it's safe to say that Call of Duty defined, and then refined, the console-based first-person shooter experience
It's funny to see this, as CoD 1 and CoD 2 were originally PC games. CoD 2 was the first to be ported to the Xbox 360, but CoD 3 was the first multi-console entry in the series, with no PC release.
I loved 1 and 2, and 4 was pretty good, but now CoD is just the same thing every year. It's a cash cow with no innovation, while 1 and 2 were very innovative (CoD 1 was the first to use real recorded sounds for every gun in the game).
I haven't bought a CoD since 2. It's too bad, as it could have been a great series if it hadn't become console- and money-centric.
Also, on page 9, the chart for the FPS says Battlefield 4.
lunyone If you have a PhII x4 965 BE, you can just OC it to get a bit more FPS if you like, so there is that option. Obviously you want more CPU, but not all of us have the $ to do so.
Cons29 My last CoD was MW2, which I stopped playing due to the lack of dedicated servers. The last one I enjoyed was CoD 4.
BF is much better (personal opinion): 64 players on a huge map with vehicles and destruction. Better than CoD.
Frank Zigfreed Loving these game graphics performance reviews! Keep them coming, Tom's Hardware!
animeman59 I've been playing this game on PC since its release, and I gotta say, this is probably one of the worst-performing games I've ever seen. I'm running an FX-8350, a GTX 780, and 32 GB of RAM, and this game will still dip below 45 fps. I don't care what anyone says; CoD and IW6 should run with no issues on a rig like that. It's a little suspicious when I can get a consistent 60 fps in a game like Battlefield 4 at max settings, but CoD: Ghosts stutters like Porky Pig. Even Metro: Last Light runs better than CoD: Ghosts!
This game is horribly optimized and buggy. People on the Steam forums have been complaining about game-breaking bugs since day one, and there are still issues that haven't been addressed. Like the one in Squad Mode where you can't use any of your squad members in a game except the first one. Or the earlier bug where people couldn't even create their first soldier, because they didn't have the 3 squad points to unlock it, locking them out of multiplayer.
Skip this game. Infinity Ward obviously doesn't care about the PC market, and this horrible release just further solidifies that fact. Spend your money on a multiplayer shooter that doesn't insult its audience.
lunyone 12095017 said: Skip out on this game. Infinity Ward obviously doesn't care about the PC market... Spend your money on a MP shooter that doesn't insult its audience.
Quake or Unreal Tournament, anyone?
oxiide 12095151 said:LOL @ NVidia frame variance
I get that you're trying to phrase that as an AMD fanboy taking a shot at Nvidia, but frame variance is all over the place in this review. There's AMD hardware all over those charts too, not just clustered at the low end.
These frame variance numbers often aren't even logical: the HD 7990 with lower frame variance than a single HD 7950? A GTX 690 doing better than a single 670? I think it's clear that the quality of Infinity Ward's PC port is a factor here, and maybe that's more important than pouncing on Nvidia's mistakes.
bemused_fred 12095017 said: I'm running an FX-8350, a GTX 780, and 32 GB of RAM
A mediocre CPU with a top-end GPU and too much RAM? I FOUND YOUR PROBLEM!