
Watch Dogs is a surprisingly demanding game, particularly when you consider the console hardware it's also running on. But at its most entry-level detail settings and a 1280x720 resolution, you can get by with a Radeon R7 240 or GeForce GT 630 GDDR5. At 1920x1080, you want a GeForce GTX 650 or Radeon R7 250X to keep your nose above 30 FPS. But a Radeon R7 260X or GeForce GTX 750 Ti is going to save you from a lot of the stuttering we observed.

Step up to the highest detail levels, though, and you'll want a Radeon R9 270 or GeForce GTX 760 to run at 1080p. At any resolution higher than that, shoot for high-end hardware like the Radeon R9 290 or GeForce GTX 780. Indeed, the best experience we had was with Nvidia's GeForce GTX Titan overclocked to achieve performance similar to a GeForce GTX 780 Ti.

Even with the Ultra detail preset enabled, which you'd think would shift the workload toward graphics, a strong host processor is surprisingly critical. While the Core i3-3220 and FX-4170 only mustered a 24 FPS minimum, the FX-6300 almost hit 30. The FX-8350 and Core i5-3550 managed a more tolerable 37 and 38 FPS, and Intel's Core i7-3960X led by never dropping under 51 FPS.
Of course, you can mitigate the performance hit by lowering your detail settings, bringing frame rates back up, but that somewhat defeats the purpose of gaming on a PC. Make sure you have an FX-6000-series CPU at minimum to enjoy Watch Dogs at higher graphics quality settings, though a Core i5 or FX-8000-series chip would be much better. The publisher recommends a Core i7 for the best possible performance, and we have to agree.

As for the game itself, Watch Dogs is a little bit of GTA with a hearty helping of Deus Ex and a dash of Far Cry 3. I wouldn't go so far as to say it's any better than those titles, but if you're into the sandbox genre, I'm sure you'll find something to enjoy. The PC build sells for $60 on Amazon, and Nvidia bundles it with certain GeForce cards.
Could you please include tests at 4K resolution, using a real 780 Ti and a 295X2? Could you ask another lab to do it, or get one shipped to you?
+1 on what @Patrick Tobin said.
I can appreciate that you might've spent a lot of time on this review, and we'd really appreciate you adding this final bit. I know that not a lot of gamers currently game at 4K, but I am definitely interested in it.
Thank you!
Does the game utilize SLI or CrossFire setups on PC?
TechSpot did include those, and there was no difference between the i5 and i7, not even the LGA 2011 hexa-core!
The game looks beautiful.
And then the FX-8350 against a freaking i7-3960X and NO OTHER Intel CPU. [edited for language]
For freak's sake, I am really trying to follow you as a serious tech site without bias; please do not make it any freaking harder for me.
Edit: Actually, someone did some nice tests for CPUs:
CPU performance with GTX 780
CPU performance with R9 290X
I hope there is a follow-up article focusing on some specific details. These include VRAM limitations, and more tweaking to see which setting changes most affect not only raw FPS but also smoothness (a rough way to measure that is sketched below). It looks like some settings lead to a very distracting experience, and it would be nice to know which those are.
Edit: Thanks, Don, for adding the FX-6300 and i5-3550; those are useful numbers to have. Here is one title where the FX clearly beats the i3, so core count must matter.
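
For anyone who wants to poke at smoothness themselves, here is a minimal Python sketch of one way to quantify it from a frame-time log. It assumes a one-column CSV of per-frame render times in milliseconds (adapt the parsing to whatever FRAPS or your logger of choice actually emits); the file name "frametimes.csv" and the 99th-percentile cutoff are illustrative choices, not anything from the article.

import csv
import statistics

def summarize(path):
    # Read a one-column CSV of frame times in milliseconds (assumed format).
    with open(path, newline="") as f:
        times_ms = [float(row[0]) for row in csv.reader(f) if row]

    avg_ms = statistics.mean(times_ms)
    # The 99th-percentile frame time is a common stutter proxy: a handful
    # of long frames hurts perceived smoothness more than the average does.
    times_sorted = sorted(times_ms)
    p99_ms = times_sorted[int(0.99 * (len(times_sorted) - 1))]

    print(f"Average FPS:         {1000.0 / avg_ms:.1f}")
    print(f"99th pct frame time: {p99_ms:.1f} ms (~{1000.0 / p99_ms:.1f} FPS)")

summarize("frametimes.csv")

Comparing the average FPS against the 99th-percentile figure before and after each settings change is a quick way to see which options trade raw frame rate for consistency.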
Claiming something wins at settings where 98% of us NEVER play is ridiculous; you want to know who wins in 98% of users' cases. Those frame rates are too low for me anyway, as barely breaking a 30 FPS minimum is not enough, and you will see dips even on AMD while playing. They're only showing a snapshot here. HardOCP dropped textures to High (in their second test) and NV won. So yeah, if you push things to where we probably wouldn't enjoy it, AMD wins. Yay. But if you play at 1080p, the links above show NV winning, and I think FAR more people are worried about 1080p. Having said that, this game would laugh at my PC... ROFL.