Call Of Duty: Ghosts Graphics Performance: 17 Cards, Tested
1. Duty Calls: Welcome To The Ghosts, Son

I think it's safe to say that Call of Duty defined, and then refined, the console-based first-person shooter experience. The series is so prolific that its popularity may even suffer from its own success: today, it's fashionable to beat the franchise up, as often happens with anything that over-saturates pop culture. Regardless, Activision claimed over $1 billion in sales on launch day. Love it or hate it, Call of Duty clearly has a devoted following.

Does anything change in its most recent installment? Not really; the formula remains the same. That's not to say it's a bad recipe. You get high production value, excellent voice acting, solid first-person shooter mechanics, and a Hollywood story. But if you were hoping that Infinity Ward would redefine its genre with Ghosts...well, that didn't happen.

The company does tweak the formula and add a few features, though. Ghosts introduces a Squads mode that lets you create and customize a team of computer-controlled soldiers. It's not part of the single-player campaign, but it can be played offline or against other players. As usual, there are new multiplayer modes, too, such as Search and Rescue, Kill Confirmed, Infected, and Blitz. A four-player co-op mode called Extinction has players defend a base from alien invaders. Make no mistake, there's a lot to keep you busy once you're done with Call of Duty: Ghosts' campaign. It's just that none of it pushes the envelope of what we've come to expect. Enemy AI is just as dumb as it was back in Modern Warfare 3.

The single-player story is the element that strays furthest from previous Call of Duty games. Infinity Ward must have guessed that gamers are tired of fighting enemies from Germany, Russia, Asia, and the Middle East. So, this time around, the bad guys are South American. Yes, our equatorial neighbors turned the U.S.' doomsday weapon against itself, crippling the country and forcing it into a 10-year-long defensive campaign against the evil (and technically superior) South American Federation. Oh, and the U.S. put up a 100-foot concrete wall along the border to protect what's left of the country, so illegal immigration is no longer an issue.

Yes, the premise is utterly ridiculous. On the plus side, it gives you an opportunity to defend decimated, destroyed, and decaying urban American environments from invading enemy forces, which is cool (although at times it seems derived from Crysis 3). True to Call of Duty's formulaic approach, standard first-person shooter fare is mixed with mini-game-like tasks, such as controlling drones for airstrikes or manning robotic turrets. There's also Riley, the loyal German Shepherd that you can send skulking through the grass until you need him to rip out the Adam's apple of Federation baddies straying too far from their lines. The dog mechanic isn't particularly compelling. But that doesn't matter, because humans are predisposed to bonding with canines, right? Despite my cynicism, I can't help but love the damn simulated dog.

There's not much else to add. Yes, it's an old recipe. Yes, it can get tiresome. And yes, it's often more enjoyable than I care to admit. I guess that's why Call of Duty sells so well. It plays to the lowest common denominator in all of us, like a wrestling match or a Michael Bay film.

2. Game Engine, Image Quality, And Settings

Call of Duty: Ghosts is built on the IW6 engine, a modified and updated version of the technology used in Call of Duty: Modern Warfare 3. The improvements include Pixar's SubD surfaces (which increase model detail as you get closer), real-time HDR lighting, Iris Adjust technology (which mimics how eyes react to changing lighting conditions), new animation systems, fluid dynamics, interactive smoke, displacement mapping, and dynamic multiplayer maps.

Like most recent Call of Duty games, it looks quite good, but then breaks down under scrutiny. There are far too many objects and characters that lack shadows, even at the highest detail settings, and especially when you zoom in with a scope. Crysis and Battlefield are both a solid step above what Ghosts offers.

One of my pet peeves on the PC is that Call of Duty: Ghosts does not natively support multi-display gaming. The developers hide behind the excuse that wider fields of view give certain players an unfair advantage. But if that's the case, why not enable technologies like Eyefinity or Surround in single-player mode? Why not allow competitive leagues to opt in or out when it comes to more expansive views? This issue might be more complicated than I'm giving it credit for, but it seems shameful for a high-profile title to lack multi-monitor support in 2013.

From a PC enthusiast's perspective, the Image Quality setting is perhaps most irritating. You're given the choice between Very Low, Low, Normal, High, and Extra. The problem is that the name is misleading; the setting doesn't control game effects like shadows. Instead, it manipulates render resolution: every option except Extra renders to a target lower than the resolution you choose. For example:

On my high-end Core i7, Radeon R9 280X-equipped PC, Call of Duty: Ghosts automatically chooses the High option, which renders to a target lower than my selected output, resulting in terrible blurriness. This may fly as a necessity in the console world, but I'm on a PC for a reason. In my opinion, two things need to happen: first, call the setting what it is, a render scale; second, don't auto-select anything except Extra on a PC. There's no better way to give away a poor port than a legacy switch needed for suitable performance on a fixed platform.
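To make that behavior concrete, here's a minimal sketch of how a render-scale setting like this maps an output resolution to a smaller internal render target. The scale factors below are hypothetical placeholders; Infinity Ward doesn't publish the actual values.

```python
# Hypothetical render-scale illustration; the factors below are assumptions,
# not values published for Call of Duty: Ghosts. Only Extra renders at the
# full output resolution.
SCALE_FACTORS = {
    "Very Low": 0.25,
    "Low": 0.50,
    "Normal": 0.75,
    "High": 0.90,
    "Extra": 1.00,
}

def internal_resolution(output_w, output_h, quality):
    """Return the internal render-target size for a given Image Quality setting."""
    scale = SCALE_FACTORS[quality]
    return int(output_w * scale), int(output_h * scale)

# At 1920x1080 on the High setting, the game would render below the display
# resolution and upscale the result, which is where the blurriness comes from.
print(internal_resolution(1920, 1080, "High"))  # (1728, 972) with these factors
```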

We tested low, medium, and high-end settings that will work on a wide range of PC graphics hardware. Our low preset uses minimum detail levels across the board, except for image quality (set to Normal) and textures (set to Auto). Our high preset sets image quality to Extra, enables depth of field and distortion, sets SSAO to Low, anisotropic filtering to Normal, anti-aliasing to FXAA, and textures to High, with terrain detail and motion blur disabled. Our ultra preset pushes every setting to its maximum detail option, with anti-aliasing set to SMAA.
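For quick reference, here are those same three presets expressed as a simple configuration dictionary. The values come straight from the description above; the key names are our own shorthand, not the game's internal option identifiers.

```python
# Our three test presets, transcribed from the settings described above.
# Key names are shorthand, not the game's internal option identifiers.
PRESETS = {
    "low": {
        "image_quality": "Normal",  # everything else at its minimum
        "textures": "Auto",
    },
    "high": {
        "image_quality": "Extra",
        "depth_of_field": True,
        "ssao": "Low",
        "anisotropic_filtering": "Normal",
        "distortion": True,
        "anti_aliasing": "FXAA",
        "textures": "High",
        "terrain_detail": False,
        "motion_blur": False,
    },
    "ultra": {
        "everything_else": "Maximum",  # every slider at its highest position
        "anti_aliasing": "SMAA",
    },
}
```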

Our test system, detailed on the next page, hosts 8 GB of system RAM. At release, Call of Duty: Ghosts required at least 6 GB, but it has since been patched to operate with less memory.

3. Test Hardware: Graphics Cards And Platform

As always, we strive to represent game performance across a wide range of graphics hardware. We include cards ranging from the low-end Radeon HD 6450 and GeForce 210 to the powerful Radeon R9 290X, HD 7990, GeForce GTX Titan, and 690.

We had a couple of openings in our hardware line-up, but graphics card manufacturer XFX came to the rescue and supplied a few samples for this review:

XFX Radeon R9 290X Core Edition

Currently, all Radeon R9 290X cards bear AMD's reference thermal solution, so this is the same board XFX sells under its own branding. Armed with 4 GB of 1250 MHz GDDR5 memory on a 512-bit bus, it also features 2816 shaders, 176 texture units, 64 ROPs, and greatly improved geometry processing capabilities compared to its predecessor.

XFX Radeon R9 280X Double Edition

This Radeon R9 280X is equipped with XFX's Double Dissipation cooling solution, known for quiet and efficient operation thanks to dual 100 mm fans. It sports 3 GB of GDDR5 on a 384-bit bus and AMD's Tahiti GPU. While it's not as fast as the R9 290 family, it's still as capable as the Radeon HD 7970 we know so well.

XFX Radeon HD 7990 Core Edition

It might be considered a previous-generation card, but the Radeon HD 7990 remains AMD's fastest dual-slot graphics card. Essentially two Radeon HD 7970s on a single PCB, this beast's dual Tahiti GPU setup adds up to 4096 shaders, 256 texture units, and 64 ROPs.

We all know that graphics cards like the Radeon HD 7990 require a substantial amount of power, so XFX sent along its PRO850W 80 PLUS Bronze-certified power supply. This modular PSU employs a single +12 V rail rated for 70 A. XFX claims that this unit provides 850 W of continuous power (not peak) at 50 degrees Celsius (notably higher than the inside of most enclosures).

We've almost exclusively eliminated mechanical disks in the lab, preferring solid-state storage to avoid I/O-related bottlenecks. Samsung sent all of our labs 256 GB 840 Pros, so we standardized on these exceptional SSDs.



CPU: Intel Core i5-2550K (Sandy Bridge), overclocked to 4.2 GHz @ 1.3 V
Motherboard: Asus P8Z77-V LX, LGA 1155, Intel Z77 Express chipset
Networking: On-board gigabit LAN controller
Memory: AMD Gamer Series, 2 x 4 GB, 1866 MT/s, CL 9-9-9-24-1T

Graphics (Nvidia):
GeForce 210 1 GB DDR3
GeForce GT 630 512 MB GDDR5
GeForce GTX 650 Ti 1 GB GDDR5
GeForce GTX 660 2 GB GDDR5
GeForce GTX 670 2 GB GDDR5
GeForce GTX 770 2 GB GDDR5
GeForce GTX Titan 6 GB GDDR5
GeForce GTX 690 4 GB GDDR5

Graphics (AMD):
Radeon HD 6450 512 MB GDDR5
Radeon HD 6670 512 MB DDR3
Radeon HD 7770 1 GB GDDR5
Radeon R7 260X 1 GB GDDR5
Radeon R9 270 2 GB GDDR5
Radeon HD 7950 Boost 3 GB GDDR5
Radeon R9 280X 3 GB GDDR5
Radeon R9 290X 4 GB GDDR5
Radeon HD 7990 6 GB GDDR5

Hard Drive: Samsung 840 Pro, 256 GB SSD, SATA 6Gb/s
Power: XFX PRO850W, ATX12V, EPS12V

Software and Drivers
Operating System: Microsoft Windows 8 Pro x64
DirectX: DirectX 11
Graphics Drivers: AMD Catalyst 13.11 Beta 9.2, Nvidia GeForce 331.65 WHQL

Benchmark
Call of Duty: Ghosts: Custom THG benchmark, 60-second Fraps run, Campaign: Homecoming
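Since every chart that follows is built from 60-second Fraps runs, here's a minimal sketch of how figures like average and minimum FPS can be pulled from such a log. It assumes Fraps' per-frame "frametimes" CSV output, which records the cumulative end time of each frame in milliseconds; the file name and exact column layout here are assumptions, not a description of our internal tools.

```python
# A sketch, not our actual analysis pipeline: derive FPS statistics from a
# Fraps "frametimes" log, assumed to hold cumulative frame end times in ms.
import csv

def load_frame_times(path):
    """Return per-frame durations (ms) from a Fraps frametimes CSV."""
    with open(path) as f:
        reader = csv.reader(f)
        next(reader)  # skip the header row
        stamps = [float(row[1]) for row in reader]
    # Consecutive cumulative timestamps become individual frame durations.
    return [b - a for a, b in zip(stamps, stamps[1:])]

def fps_stats(frame_ms, window_ms=1000.0):
    """Average and minimum FPS measured over consecutive one-second windows."""
    per_second, elapsed, count = [], 0.0, 0
    for ms in frame_ms:
        elapsed += ms
        count += 1
        if elapsed >= window_ms:
            per_second.append(count * 1000.0 / elapsed)
            elapsed, count = 0.0, 0
    return sum(per_second) / len(per_second), min(per_second)

frames = load_frame_times("ghosts_frametimes.csv")  # hypothetical file name
average, minimum = fps_stats(frames)
print(f"average: {average:.1f} FPS, minimum: {minimum:.1f} FPS")
```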
4. Results: Low Quality, 1280x720

As always, we begin with low-end graphics hardware to gauge what Call of Duty: Ghosts' minimum requirements really are. These tests are performed with detail levels down as far as they can go (except for the texture setting at Auto and image quality set to Normal). Of course, even then, the Normal image quality preset still means this game is upscaling from a lower resolution.

Nvidia's GeForce 210 is missing from this chart because we couldn't get it to launch the game. Based on Radeon HD 6450 performance, though, the entry-level GeForce card wouldn't have been fast enough anyway. Even at 1280x720, the GeForce GT 630 equipped with GDDR5 can't sustain 30 frames per second. It takes a Radeon HD 6670 DDR3, at least, to deliver ample speed.

The GeForce GTX 650 Ti is overkill at this resolution and combination of settings, so a GeForce GT 640 would probably match up well against the Radeon HD 6670.

Observed frame time variance is relatively low across the board, although the GeForce GTX 650 Ti encounters a few spikes during the course of the benchmark, yielding a comparatively poor result.
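A quick note on what we mean by frame time variance: it captures how much consecutive frames differ in rendering time, since a high average FPS can still feel choppy if individual frames arrive unevenly. As a rough sketch (continuing from the Fraps example earlier, and assuming a simple consecutive-delta definition rather than any particular published formula), the metric can be computed like this:

```python
# A sketch of a frame time variance metric, assuming a simple
# consecutive-frame-delta definition; 'frames' comes from the earlier example.
def frame_time_variance(frame_ms):
    """Absolute change in duration between consecutive frames (ms)."""
    return [abs(b - a) for a, b in zip(frame_ms, frame_ms[1:])]

def percentile(values, pct):
    """Nearest-rank percentile of a list of values."""
    ordered = sorted(values)
    rank = max(1, min(len(ordered), round(pct / 100.0 * len(ordered))))
    return ordered[rank - 1]

deltas = frame_time_variance(frames)
print(f"75th percentile: {percentile(deltas, 75):.2f} ms, "
      f"worst case: {max(deltas):.2f} ms")
```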

5. Results: Low Quality, 1680x1050

Using the same settings as last time, we kick the resolution up to 1680x1050. The Normal image quality setting is still upscaling what we see on-screen from a lower render target, though. Because of this, average frames per second are barely any lower than the 1280x720 results.

Once again, AMD's Radeon HD 6670 DDR3 is the lowest you'd want to go for playability.

The first frame time variance chart shows us that latency typically isn't an issue, though graphing 300 frames shows some spikes on a few of the cards we're testing.

6. Results: High Quality, 1680x1050

Let's kick the details up a notch with settings that deliver a more attractive output. First, image quality is set to Extra, ensuring that the game is rendering at full resolution, and not upscaling. Depth of field, distortion, and shadows are turned on, while screen space ambient occlusion is enabled at the Low setting. We also turn on FXAA to get rid of the jaggies.

Even the lowest-end cards we're testing manage around 30 FPS at minimum and 40 FPS on average. So, we'll call the Radeon HD 7770 and GeForce GTX 650 Ti good baseline boards for 1680x1050 at this detail level.

One of the settings we turned on exacts a big latency penalty compared to the previous page. Faster boards like the Radeon R9 270 tend to be alright, but there's still something going on that affects performance consistency.

7. Results: High Quality, 1920x1080

Now we apply those same taxing settings to a more popular enthusiast-oriented resolution: 1920x1080. Because we're actually rendering to a higher target, roughly 2.07 million pixels per frame versus 1.76 million at 1680x1050, performance is expected to drop.

The Radeon HD 7770 and GeForce GTX 650 Ti don't cut it anymore. To achieve at least 30 frames per second, we need at least a Radeon R7 260X (also known as a Radeon HD 7790) or GeForce GTX 650 Ti Boost.

Although most of the frame time latencies aren't bad, worst-case results spike well over 10 ms. Only the Radeon HD 7950 Boost and rebranded Radeon R9 270 demonstrate reasonably low variance.

8. Results: Ultra Quality, 1920x1080

Truth be told, there's not a ton of visual difference between the High settings we just used and the maxed-out configuration on this page. Then again, it's nice to know how much graphics hardware you need in order to push every slider to its highest position.

Even the lowest-end cards we used for this test, the Radeon HD 7950 Boost and GeForce GTX 670, maintain frame rates in excess of 30 FPS. This means the game is at least playable across the eight cards we picked for this resolution and these detail settings.

The frame time variance numbers are a little less promising, though. The GeForce GTX 690 delivers a pretty good result, and the Radeon R9 290X really shines. Some of the slower options run into fairly high worst-case latency spikes, however.

9. Results: Ultra Quality, 2560x1600

Since multi-monitor resolutions aren't an option in Ghosts, we'll wrap up the graphics card testing with our highest possible single-screen resolution of 2560x1600.

Driving more than 4 million pixels per frame pushes the Radeon HD 7950 Boost to a 25 FPS minimum, while the GeForce GTX 670 dips to 29 FPS. AMD's Radeon R9 280X and Nvidia's GeForce GTX 770 end up a couple of frames per second higher, and it takes a Radeon R9 290X or GeForce GTX Titan to carve out some headroom above our identified minimum.

The Radeon HD 7990's results are particularly interesting. Its average frame rates are pretty good, but they drop under 30 FPS in the most demanding parts of our benchmark. That result is reflected in real-world game play, too; the card definitely feels choppy.

Although the worst-case figures don't look particularly bad for AMD's dual-GPU flagship, its average and seventy-fifth percentile results are higher than we'd expect them to be. In contrast, the GeForce GTX 770, Titan, 690, and Radeon R9 290X look relatively good. Sampling 300 frames over time shows that consistency is somewhat of an issue.

10. CPU Benchmarks

The Call of Duty games aren't known to be particularly processor-bound. So, in an effort to identify bottlenecks, we apply the highest possible graphics settings and drop the resolution to 1680x1050.

The $110 FX-4170 manages to maintain minimum frame rates in excess of 35 FPS, and all other CPUs are around the 40 FPS mark or higher, except for AMD's Phenom II X4 965. That's a bit of a surprise, since the Phenom II X4 usually beats or matches the FX-4170 in games.

However, there's also quite a bit of scaling going on, with averages ranging from 43.7 FPS up to nearly 80 FPS using the same graphics card. Pay special attention to this: if you overspend on graphics and skimp on host processing, there's a good chance you'll artificially limit performance.

There are some spikes in the frame time variance chart. However, based on what we saw after applying Call of Duty's highest-end settings previously, they're most likely related to interplay between one of the game's features and our GeForce GTX Titan.

11. Call Of Duty: Ghosts: Good With A $150 GPU And $110 CPU

Call Of Duty: Ghosts is graphically interesting. And while it doesn't do anything innovative in terms of game play, it's at least as fun as its predecessors. The question is, what sort of hardware do you need in order to enjoy the title?

When it comes to your graphics card, we wouldn't bother playing the game at its lowest settings without a Radeon HD 6670 DDR3 or GeForce GT 640, though that hardware won't be satisfactory above 1680x1050. If you want to get the most from Call of Duty with more visual realism, you want at least a Radeon R7 260X (a rebranded Radeon HD 7790) or GeForce GTX 650 Ti Boost. Those cards should be good for 1920x1080 with a minimum frame rate above 30 FPS.

If you're the type of enthusiast who feels compelled to turn every setting up as high as it'll go, you need a Radeon HD 7950 Boost or GeForce GTX 670 (or its equivalent, a 760) to play the game at 1080p. At 2560x1600, it takes a Radeon R9 280X (rebranded Radeon HD 7970) or GeForce GTX 770 (close to a rebranded GeForce GTX 680) to maintain fluid frame rates.

What about your host processor? The good news is that every CPU we tested managed frame rates above 30 FPS, except for AMD's Phenom II X4 965, and even that chip was paired with a GeForce GTX Titan. There will come a point where a lower-end graphics card becomes the bottleneck and performance drops regardless of CPU. It appears that a combination of high core count and aggressive clock rates helps the FX's case. However, Intel's architecture maintains its dominance, as the two-generation-old Core i5-2500K still finishes first in our chart. Naturally, Ivy Bridge- and Haswell-based chips are going to fare even better at the same or higher clock rates.

In the end, it takes about $110 of CPU and $150 of graphics card to run Call of Duty: Ghosts smoothly at 1080p with High details enabled. That's not a particularly high bar, but it's more than we were expecting from such a mainstream title. Were the developers concentrating their efforts on next-gen consoles this time around? Perhaps. That would have certainly made sense, given the Xbox One and PlayStation 4 release frenzy. If you're playing on the PC, though, make sure your Image Quality option is set to Extra. Otherwise you're going to get upscaled blurriness.