Watch Dogs is built on the new Disrupt engine, which evolved from pieces of Assassin's Creed's AnvilNext engine (for city mechanics) and Far Cry 3's Dunia engine (for vegetation and AI). It's a good-looking game even at low detail settings, and much more impressive with all of the bells and whistles enabled. But it's no CryEngine when it comes to pushing the visual envelope.

The benchmarks will show that this game taxes PC hardware. Even the most entry-level settings require respectable components to run smoothly, and high frame rates can be disrupted by hiccups now and again, which is distracting. We tested using more advanced graphics processors than anything you'll find in a console, so I have to guess that the PC version isn't as finely optimized as the eighth-gen console builds have to be.
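
To put those hiccups in concrete terms: smoothness depends on frame-time consistency, not just average FPS, and two cards with the same average can feel very different if one of them stalls periodically. Below is a minimal sketch of how you might quantify that from a per-frame log; the file name, format, and spike threshold are illustrative assumptions, not part of our test methodology.

```python
# Minimal sketch: quantify frame-time "hiccups" from a per-frame log.
# Assumes frametimes.csv holds one frame time per line, in milliseconds
# (hypothetical file; frame-time capture tools can export similar data).

def load_frame_times(path):
    with open(path) as f:
        return [float(line.strip()) for line in f if line.strip()]

def hiccup_stats(frame_times_ms, spike_factor=2.5):
    """Report average FPS, 99th-percentile frame time, and spike count.

    A "hiccup" here is any frame that takes spike_factor times longer
    than the median frame time -- an arbitrary threshold for illustration.
    """
    times = sorted(frame_times_ms)
    median = times[len(times) // 2]
    p99 = times[int(len(times) * 0.99)]
    avg_fps = 1000.0 / (sum(times) / len(times))
    spikes = sum(1 for t in frame_times_ms if t > spike_factor * median)
    return avg_fps, p99, spikes

if __name__ == "__main__":
    frame_times = load_frame_times("frametimes.csv")
    fps, p99, spikes = hiccup_stats(frame_times)
    print(f"avg FPS: {fps:.1f}, 99th pct frame time: {p99:.1f} ms, "
          f"hiccups: {spikes}")
```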

Given such strenuous demands on hardware at the higher detail settings, I chose to test with post-process anti-aliasing, selecting FXAA instead of the game's more advanced SMAA setting. While SMAA improved aliasing on objects, it didn't do as good a job with artifacts on textures, which were particularly distracting on the road while driving.
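
That trade-off makes sense once you remember that FXAA and SMAA are both post-process filters: they run on the finished frame and infer edges purely from pixel contrast, so high-contrast texture detail like road markings can be treated the same as a jagged polygon edge. The toy NumPy sketch below illustrates the core idea on a grayscale image; the contrast threshold and blend weight are arbitrary stand-ins, not either algorithm's actual parameters.

```python
import numpy as np

def postprocess_aa(luma, contrast_threshold=0.15):
    """Toy post-process AA on a 2-D array of luma values in [0, 1].

    Like FXAA, it has no scene information: it finds "edges" wherever
    local contrast is high and blends those pixels with their neighbors.
    That is also why real texture detail can get smoothed (or shimmer)
    as if it were an aliased geometry edge.
    """
    out = luma.copy()
    # Contrast against the 4-neighborhood (interior pixels only).
    up, down = luma[:-2, 1:-1], luma[2:, 1:-1]
    left, right = luma[1:-1, :-2], luma[1:-1, 2:]
    center = luma[1:-1, 1:-1]
    local_max = np.maximum.reduce([up, down, left, right, center])
    local_min = np.minimum.reduce([up, down, left, right, center])
    edge = (local_max - local_min) > contrast_threshold
    # Blend detected edge pixels toward the neighborhood average.
    blurred = (up + down + left + right + center) / 5.0
    out[1:-1, 1:-1] = np.where(edge, 0.5 * center + 0.5 * blurred, center)
    return out
```

Because the filter sees only contrast, it can't tell a sharp road line from geometric aliasing, which lines up with the texture artifacts described above.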

Now that we know how Watch Dogs looks, let's see how the game performs on a variety of graphics cards and processors.
Could you please include tests at 4K resolution, using a real 780 Ti and a 295X2? If you can't run them in-house, could you ask another lab to do it, or get the cards shipped to you?
+1 to what @Patrick Tobin said.
I can appreciate that you've already spent a lot of time on this review, and we'd really appreciate you doing this final bit. I know that not a lot of gamers currently play at 4K, but I am definitely interested in it.
Thank you!
Does the game utilize SLI or CrossFire setups on PC?
TechSpot did include those, and there was no difference between the i5 and i7, not even the LGA 2011 hexa-core!
The game looks beautiful.
And then the FX-8350 against a freaking i7-3960X and NO OTHER Intel CPU. [edited for language]
For freak's sake, I am really trying to follow you as a serious tech site without bias,
please do not make it any freaking harder for me.
Edit: Actually, someone did some nice tests for CPUs:
CPU performance with GTX 780
CPU performance with R9 290X
I hope there is a follow-up article focusing on some specific details. These include VRAM limitations, and more tweaking to see which setting changes most affect not only raw FPS but also smoothness. It looks like some settings lead to a very distracting experience, and it would be nice to know which ones.
Edit: Thanks, Don, for adding the FX-6300 and i5-3550; those are useful numbers to have. Here is one title where the FX clearly beats the i3, so core count must matter.
Claiming something wins where 98% of us NEVER play is ridiculous. You want to know who wins in 98% of users' cases. Those fps are too low for me anyway; barely breaking 30 fps at the minimum is not enough, and you will see dips even on AMD while playing. They're only showing a snapshot here. HardOCP dropped textures to High (their second test) and NV won. So yeah, if you want to push things to where we probably wouldn't enjoy it anyway, AMD wins. Yay. But if you play at 1080p, the links above show NV winning, and I think FAR more people are worried about 1080p. Having said that, this game would laugh at my PC...ROFL.