Imagine Grand Theft Auto playing out inside Edward Snowden's worst nightmare, and you have a good grasp of Watch Dogs' premise. The game is set in Chicago, in a not-too-distant future where the entire city is run by a single operating system that knows everything about everyone, all of the time. It's not the stereotypical dystopian vision of a heavy-handed Big Brother scenario; the Chicago operating system isn't sentient, nor is it malevolent. For the most part, it's a tool that provides a great deal of convenience for the city's citizens. Lights at intersections are intelligently controlled, so traffic jams are a thing of the past. People synchronize their schedules with smart software that genuinely makes their lives better. And understandably, criminals have a difficult time conducting business in a city that's aware of the location of all of its citizens. On the surface, such an existence sounds compelling, right?

Of course, the price everyone pays for this convenience is privacy. And no matter how altruistic the intentions of its visionary creators, the operating system offers incalculable potential for abuse. There's an obvious parallel to what's currently happening in our own world.

The player explores this universe through the eyes of Aiden Pearce, a cyber-thief whose mentor hacks the wrong network during a heist and draws the unwanted attention of the bad guys behind the curtain. Naturally, Aiden is targeted too: his niece is tragically killed in an attack meant to scare him, and he subsequently becomes a vigilante.
He puts his talents to work by coding software that uses the Chicago operating system to predict violent crime based on the location and attitude of the city's citizens (leading us to ask the obvious question: why doesn't the game's police department do this?). Aiden has to wait until a crime is actually being perpetrated before stepping in; otherwise, the criminal is scared off before doing anything wrong. Minority Report, anyone?

But enough about the plot. How is the gameplay? It's a sandbox-style title obviously influenced by the GTA series, but with a few Deus Ex-style twists. Aiden's smartphone is his key to controlling the Chicago OS. Among other things, it can change traffic lights, stop trains, hack other phones, control security cameras, raise bridges, engage barricades, and simply blow things up. It's definitely a cool dynamic.
In addition, the profiler app is always searching for potential crimes, and it'll direct you to different locations. There are many side missions, including a feature similar to Far Cry 3's radio towers. If you're a fan of sandbox games, I can almost guarantee you'll find something you like.

My favorite character is Chicago itself. The metropolis' digital incarnation has a great flavor, and the city-themed songs on the game's radio channels make for welcome background noise. Automotive enthusiasts like me will spot many vehicles that pay blatant homage to specific models. For instance, there's a car with an '80s Pontiac Firebird body and a '70s Firebird front end. From modern Dodge Chargers, Cadillacs, and Lamborghinis to the Volkswagen Rabbit and Honda Civic, a seemingly endless variety of vehicles in the game were plucked from reality and given subtle changes. The developer clearly chose to create its own designs rather than pay royalties to the automakers, yet I haven't seen better adaptations of real-world cars since Burnout Paradise.

What didn't I like? Well, for a vigilante fighting on behalf of the people, Aiden has surprisingly little conscience when it comes to stealing the common man's property. I suppose you could make a case that the character is complicated, but it doesn't feel plausible that a guy who cares enough about human beings to risk his life saving them from gun-toting thugs is also totally cool with absconding with their savings or partnering with killers to achieve his goals. The game also makes it a challenge to engage in a heated car chase without mowing down innocent pedestrians. You'd think Aiden would be emotionally crippled by hitting a mother at a bus stop, given his original motivations, so I would have appreciated it if bystanders were harder to hit. Maybe I'm just a terrible driver, but if you can finish the game without the blood of innocents on your hands, you have my respect.

I didn't play through as much of the story as I wanted, since my primary purpose was finding a taxing and consistent benchmark run. In the end, I chose a pre-planned path through the outskirts of the city, driving for 90 seconds per test. Thus, the results are quite repeatable, despite the many variables this game introduces.
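For readers who want to reproduce this kind of testing, the numbers we chart can be derived from a Fraps frametimes log, which records one cumulative timestamp (in milliseconds) per rendered frame. The sketch below is a minimal example of that post-processing, not our exact tooling; the file name is illustrative.

```python
# Minimal sketch: derive average FPS and the slowest frame from a Fraps
# frametimes log, which stores one cumulative timestamp (ms) per frame.
# The file name is illustrative; this mirrors our charts, not our exact tools.

def fps_stats(path):
    with open(path) as f:
        # Skip the header row; keep the last (timestamp) column.
        stamps = [float(line.split(',')[-1]) for line in f
                  if line.strip()[:1].isdigit()]

    # Per-frame render times are the deltas between consecutive timestamps.
    frame_times = [b - a for a, b in zip(stamps, stamps[1:])]

    avg_fps = 1000.0 * len(frame_times) / (stamps[-1] - stamps[0])
    min_fps = 1000.0 / max(frame_times)  # slowest single frame, as FPS
    return avg_fps, min_fps

avg, worst = fps_stats('frametimes.csv')
print(f"Average: {avg:.1f} FPS, worst frame: {worst:.1f} FPS")
```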
We'll get to the performance results in a couple of pages. First, let's look at the game engine and its settings.
Watch Dogs is built on the new Disrupt engine, which evolved from pieces of Assassin's Creed's AnvilNext engine (for city mechanics) and Far Cry 3's Dunia engine (for vegetation and AI). It's a good-looking game even at low detail settings, and much more impressive with all of the bells and whistles enabled. But it's no CryEngine when it comes to pushing the visual envelope.

The benchmarks will show that this game taxes PC hardware. Even the most entry-level settings require respectable components to run smoothly, and high frame rates can be disrupted by hiccups now and again, which is distracting. We tested using more advanced graphics processors than anything you'll find in a console, so I have to guess that the PC version isn't as finely optimized as the eighth-gen console builds have to be.

Given such strenuous demands on hardware at the higher detail settings, I chose to test with post-process anti-aliasing, selecting FXAA instead of the game's more advanced SMAA setting. While SMAA improved aliasing on objects, it didn't do as good a job with artifacts on textures, which were particularly distracting on the road while driving.

Now that we know how Watch Dogs looks, let's see how the game performs on a variety of graphics cards and processors.
I wanted to test Watch Dogs in Surround and Eyefinity configurations, using three monitors, but wasn't able to get the aspect ratio correct using either GeForce or Radeon cards. Hopefully this is something the developers address in a patch. For now, we'll focus on single-monitor performance.

As always, we strive to represent results across a wide range of graphics hardware. Unfortunately, I don't have a GeForce GTX 780 Ti in our Canadian lab, so I did the next best thing and overclocked a GeForce GTX Titan to 780 Ti-like performance levels. This took a 1750 MHz memory clock rate (equal to the 780 Ti's) and a 994 MHz GPU Boost clock to compensate for the Titan's CUDA core deficit. I also set the power envelope to its maximum 106% setting in Afterburner. The resulting performance should come close to a GeForce GTX 780 Ti's.
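The arithmetic behind that 994 MHz figure is easy to check. Going by the published shader counts (2,688 CUDA cores on the Titan versus 2,880 on the 780 Ti, which carries a 928 MHz rated boost clock), the snippet below runs the back-of-the-envelope math; treat it as a rough throughput-parity estimate, since GPU Boost behavior and memory subsystems muddy any exact equivalence.

```python
# Rough check: what Titan boost clock matches a 780 Ti's shader throughput?
# Published specs (assumed here): Titan has 2688 CUDA cores; the 780 Ti has
# 2880 cores and a 928 MHz rated GPU Boost clock.
titan_cores = 2688
ti_cores = 2880
ti_boost_mhz = 928

required_mhz = ti_cores * ti_boost_mhz / titan_cores
print(f"Titan boost clock needed for parity: {required_mhz:.0f} MHz")  # ~994 MHz
```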
Drivers are a huge deal when a game first comes out, and we made it a point to test using the very latest software from both companies. Nvidia gave us early access to its 337.88 build, while AMD sent over Catalyst 14.6 Beta, both including optimizations for Watch Dogs. Indeed, AMD's performance went up quite a bit under the new driver; it was fairly dismal with 14.4.
Also, the texture detail setting is kept at Medium across our benchmarks, ensuring graphics memory doesn't affect our results too severely. The texture detail option is separate from the game's detail presets, so it's adjustable on its own.

Graphics cards like the Radeon R9 290X require a substantial amount of power, so XFX sent us its PRO850W 80 PLUS Bronze-certified power supply. This modular PSU employs a single +12 V rail rated for 70 A. XFX claims continuous (not peak) output of up to 850 W at 50 degrees Celsius.

We've almost entirely eliminated mechanical disks in the lab, preferring solid-state storage to alleviate I/O-related bottlenecks. Samsung sent all of our labs 256 GB 840 Pros, so we standardized on these exceptional SSDs.
| Test System | |
|---|---|
| CPU | Intel Core i7-3960X (Sandy Bridge-E), 3.3 GHz, Six Cores, LGA 2011, 15 MB Shared L3 Cache, Hyper-Threading enabled |
| Motherboard | ASRock X79 Extreme9 (LGA 2011), Intel X79 Express chipset |
| Networking | On-board gigabit LAN controller |
| Memory | Corsair Vengeance LP PC3-16000, 4 x 4 GB, 1600 MT/s, CL 8-8-8-24-2T |
| Graphics | GeForce GT 630 512 MB GDDR5; GeForce GTX 650 2 GB GDDR5; GeForce GTX 750 Ti 2 GB GDDR5; GeForce GTX 660 2 GB GDDR5; GeForce GTX 760 2 GB GDDR5; GeForce GTX 780 Ti 3 GB GDDR5; Radeon HD 6450 512 MB GDDR5; Radeon R7 240 1 GB DDR3; Radeon R7 250X 1 GB GDDR5; Radeon R7 260X 1 GB GDDR5; Radeon R9 270 2 GB GDDR5; Radeon R9 280 3 GB GDDR5; Radeon R9 290X 4 GB GDDR5 |
| SSD | Samsung 840 Pro, 256 GB, SATA 6Gb/s |
| Power | XFX PRO850W, ATX12V, EPS12V |
| **Software and Drivers** | |
| Operating System | Microsoft Windows 8 Pro x64 |
| DirectX | DirectX 11 |
| Graphics Drivers | AMD Catalyst 14.6 Beta, Nvidia GeForce 337.88 WHQL |
| Benchmarks | |
|---|---|
| Watch Dogs | Custom Tom's Hardware benchmark, 90-second Fraps run, driving |
Our first tests were run on low-end graphics cards from the Radeon HD 6450 and GeForce GT 630 GDDR5 to the Radeon R7 250X and GeForce GTX 650. We chose the minimum detail settings coupled with FXAA anti-aliasing at 1280x720.

The Radeon HD 6450 is unplayable, but the GeForce GT 630 GDDR5 and Radeon R7 240 squeeze out passable performance with 27- and 30-FPS minimums, respectively. AMD's Radeon R7 250X leads the budget pack, never dropping below 70 FPS.

Frame rates over time are fairly consistent with this class of graphics card. Nvidia's GeForce GTX 650 barely drops below the 60 FPS threshold, and the Radeon R7 250X remains above this chart's upper bound.


There's a significant amount of frame time variance, which manifests as occasional stutters on the affected cards. AMD's Radeon HD 6450 is affected most, though the Radeon R7 240 also suffers compared to the competition.
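A quick note on what "frame time variance" means here: it's the difference in render time between neighboring frames, which is what you actually perceive as a stutter; a run can average 60 FPS and still feel rough if individual frames occasionally take three or four times longer than their neighbors. A minimal sketch of such a metric follows; the 95th-percentile threshold is illustrative, not our exact methodology.

```python
# Illustrative metric: the Nth-percentile difference in render time
# between consecutive frames, from a list of frame times in milliseconds.

def frame_time_variance(frame_times, percentile=95):
    deltas = sorted(abs(b - a) for a, b in zip(frame_times, frame_times[1:]))
    idx = min(int(len(deltas) * percentile / 100), len(deltas) - 1)
    return deltas[idx]

# A steady 60 FPS cadence (16.7 ms) with a 50 ms hitch every tenth frame:
times = ([16.7] * 9 + [50.0]) * 10
print(f"95th-percentile variance: {frame_time_variance(times):.1f} ms")  # 33.3
```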
We retain the same entry-level settings used on the previous page, but increase the resolution from 720p to 1080p. Also, the Radeon R7 260X and GeForce GTX 750 Ti are added to the list of tested cards, since their greater graphics processing potential is a better match for the higher resolution.


The Radeon R7 240 and GeForce GT 630 GDDR5 cannot cope with a more demanding resolution. Everything else stays above my 30 FPS minimum target, though.


As before, the Radeon R7 240 suffers severe frame time variance spikes, though that doesn't really matter, since its minimum frame rate is too low anyway. Generally speaking, the spikes get more pronounced for all cards, which again manifests as occasional stuttering, regardless of average frame rate.
This time around we keep our 1920x1080 resolution, but increase graphics detail to the Ultra preset. Quality improves noticeably; however, graphics hardware is also pushed much more aggressively.


It takes a Radeon R9 270 or GeForce GTX 760 to render at least 30 FPS at these settings.
Check out the Radeon R7 260X and GeForce GTX 660, both of which encounter significant dips as they progress through our benchmark run. I generated multiple data sets to confirm this behavior, and it's indeed repeatable. Only our overclocked GeForce GTX Titan (approximating GeForce GTX 780 Ti performance) achieves well over a 35-FPS minimum rate.


Once again, there are a lot of latency spikes across many of the cards we're testing. Nvidia's GeForce GTX Titan is the only board exempt from the punishment.
Without the ability to test triple-monitor resolutions, I had to settle for the largest single resolution at my disposal, a QHD screen's native 2560x1440.


The number of cards worth testing shrinks at this taxing combination of high resolution and Ultra graphics details. Only AMD's Radeon R9 280 and 290X keep their noses above a 30 FPS minimum, while Nvidia's overclocked GeForce GTX Titan delivers a 15-FPS-higher result. It's also the only board capable of approaching a 60 FPS average.
The GeForce GTX 760 would have done better, but it encounters a single valley in its frame rate trace, hammering its minimum result. I tested and retested, observing the same dip each time, so the result stands. Similarly, the Radeon R9 270 and GeForce GTX 660 get hit by rogue dips that push their minimums further into unplayable territory.


Frame time variance is a mess in this chart, though if you focus only on the higher-performance cards, it's not as bad. In short, Nvidia's overclocked GeForce GTX Titan is the only option approaching a truly smooth result.
In order to explore CPU scaling, I used the overclocked GeForce GTX Titan on multiple platforms and the Ultra detail preset at 1080p.


Intel and, to a lesser extent, AMD will like this: even pushing the Ultra detail preset, Watch Dogs is a CPU-bound game. Ubisoft Montreal developed the PC version in concert with its console efforts, and optimizations for those platforms seem apparent. Programming for the PlayStation's and Xbox's host processors involves getting the most out of fairly lightweight hardware. Naturally, then, well-threaded desktop CPUs benefit strongly.
When it comes to gaming, we rarely see the FX-8350 outperform a Core i3. But AMD's flagship walks away with a clear win in that match-up. Based on what we're seeing, serious fans of the game will want an FX-6000-series chip at least, but an FX-8000 or Core i5 should be much better.


Low-end CPUs suffer most, as we'd expect, when frame time variance is measured. Both the FX-8350 and Core i7 exhibit a couple of spikes too, but they demonstrate much less variance on average.

Watch Dogs is a surprisingly demanding game, particularly when you consider the console hardware it's also running on. But at its most entry-level detail settings and a 1280x720 resolution, you can get by with a Radeon R7 240 or GeForce GT 630 GDDR5. At 1920x1080, you want a GeForce GTX 650 or Radeon R7 250X to keep your nose above 30 FPS. But a Radeon R7 260X or GeForce GTX 750 Ti is going to save you from a lot of the stuttering we observed.

Step up to the highest detail levels, though, and you'll want a Radeon R9 270 or GeForce GTX 760 to run at 1080p. At any resolution higher than that, shoot for high-end hardware like the Radeon R9 290 or GeForce GTX 780. Indeed, the best experience we had was with Nvidia's GeForce GTX Titan overclocked to achieve performance similar to a GeForce GTX 780 Ti.

Even with the Ultra detail preset enabled, which you'd think would shift the workload toward graphics, a strong host processor is surprisingly critical. While the Core i3-3220 and FX-4170 only mustered a 24 FPS minimum, the FX-6300 almost hit 30. The FX-8350 and Core i5-3550 managed more tolerable minimums of 37 FPS or better, and Intel's Core i7-3960X led by not dropping under 51 FPS.
Of course, you can mitigate the performance hit by lowering your detail settings, bringing frame rates back up, but that somewhat defeats the purpose of gaming on a PC. Make sure you have an FX-6000-series CPU at minimum to enjoy Watch Dogs at higher graphics quality settings, but a Core i5 or FX-8000 would be much better. The publisher recommends a Core i7 for the best possible performance, and we have to agree.

As for the game itself, Watch Dogs is a little bit of GTA with a hearty helping of Deus Ex and a dash of Far Cry 3. I wouldn't go so far as to say it's any better than those titles, but if you're into the sandbox genre, I'm sure you'll find something to enjoy. The PC build sells for $60 on Amazon, and Nvidia bundles it with certain GeForce cards.