I wanted to test Watch Dogs in Surround and Eyefinity configurations, using three monitors, but wasn't able to get the aspect ratio correct using either GeForce or Radeon cards. Hopefully this is something the developers address in a patch. For now, we'll focus on single-monitor performance.

As always, we strive to represent results across a wide range of graphics hardware. Unfortunately, I don't have a GeForce GTX 780 Ti in our Canadian lab, so I did the next best thing and overclocked a GeForce GTX Titan to 780 Ti-like performance levels. That meant a 1750 MHz memory clock (equal to the 780 Ti's) and a 994 MHz GPU Boost clock to compensate for the Titan's CUDA core deficit. I also set the power envelope to its maximum 106% setting in Afterburner. The resulting performance should come close to that of a GeForce GTX 780 Ti.
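The 994 MHz figure falls out of simple throughput arithmetic: shader throughput scales with cores × clock, so the Titan's clock must rise by the ratio of the two cards' core counts. The sketch below assumes published specs not stated in the article (2688 CUDA cores for the GTX Titan, 2880 cores and a 928 MHz rated boost clock for the GTX 780 Ti):

```python
# Hedged sketch: estimate the GPU Boost clock a GTX Titan needs so that
# cores * clock matches a GTX 780 Ti. Core counts and the 780 Ti's rated
# boost clock are published specs, assumed here rather than taken from
# the article above.
TITAN_CORES = 2688       # GeForce GTX Titan CUDA cores
GTX_780_TI_CORES = 2880  # GeForce GTX 780 Ti CUDA cores
GTX_780_TI_BOOST = 928   # GeForce GTX 780 Ti rated boost clock, MHz

# Clock needed on the Titan to equalize cores * clock:
required_boost = GTX_780_TI_BOOST * GTX_780_TI_CORES / TITAN_CORES
print(round(required_boost))  # ≈ 994 MHz, the clock used in testing
```

Memory bandwidth needs no such scaling, since both cards use a 384-bit bus; matching the 1750 MHz memory clock equalizes it directly.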
Drivers are a huge deal when a game first comes out, and we made it a point to test using the very latest software from both companies. Nvidia gave us early access to its 337.88 build, while AMD sent over Catalyst 14.6 Beta, both including optimizations for Watch Dogs. Indeed, AMD's performance went up quite a bit under the new driver; it was fairly dismal with 14.4.
Also, the texture detail setting is kept at Medium across our benchmarks, ensuring graphics memory doesn't affect our results too severely. The texture detail option is separate from the game's detail presets, so it's adjustable on its own.

Graphics cards like the Radeon R9 290X require a substantial amount of power, so XFX sent us its PRO850W 80 PLUS Bronze-certified power supply. This modular PSU employs a single +12 V rail rated for 70 A. XFX claims continuous (not peak) output of up to 850 W at 50 degrees Celsius.
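The rail rating is easy to sanity-check against the total: a single +12 V rail at 70 A works out to 840 W, leaving the remaining headroom of the 850 W envelope for the minor rails. A minimal check:

```python
# Sanity check on the XFX PRO850W's +12 V rail rating.
rail_volts = 12.0
rail_amps = 70.0          # single-rail rating claimed by XFX
rail_watts = rail_volts * rail_amps

total_watts = 850.0       # continuous output claimed at 50 degrees Celsius
print(rail_watts)         # 840.0 W on the +12 V rail
print(rail_watts <= total_watts)  # True: within the unit's total budget
```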

We've almost entirely eliminated mechanical disks from the lab, preferring solid-state storage to alleviate I/O-related bottlenecks. Samsung sent all of our labs 256 GB 840 Pros, so we standardize on these exceptional SSDs.
| Test System | |
|---|---|
| CPU | Intel Core i7-3960X (Sandy Bridge-E), 3.3 GHz, Six Cores, LGA 2011, 15 MB Shared L3 Cache, Hyper-Threading enabled |
| Motherboard | ASRock X79 Extreme9 (LGA 2011), Intel X79 Express chipset |
| Networking | On-board gigabit LAN controller |
| Memory | Corsair Vengeance LP PC3-16000, 4 x 4 GB, 1600 MT/s, CL 8-8-8-24-2T |
| Graphics | GeForce GT 630 512 MB GDDR5<br>GeForce GTX 650 2 GB GDDR5<br>GeForce GTX 750 Ti 2 GB GDDR5<br>GeForce GTX 660 2 GB GDDR5<br>GeForce GTX 760 2 GB GDDR5<br>GeForce GTX 780 Ti 3 GB GDDR5 (simulated with an overclocked GeForce GTX Titan)<br>Radeon HD 6450 512 MB GDDR5<br>Radeon R7 240 1 GB DDR3<br>Radeon R7 250X 1 GB GDDR5<br>Radeon R7 260X 1 GB GDDR5<br>Radeon R9 270 2 GB GDDR5<br>Radeon R9 280 3 GB GDDR5<br>Radeon R9 290X 4 GB GDDR5 |
| SSD | Samsung 840 Pro, 256 GB, SATA 6Gb/s |
| Power | XFX PRO850W, ATX12V, EPS12V |

| Software and Drivers | |
|---|---|
| Operating System | Microsoft Windows 8 Pro x64 |
| DirectX | DirectX 11 |
| Graphics Drivers | AMD Catalyst 14.6 Beta, Nvidia GeForce 337.88 WHQL |
| Benchmarks | |
|---|---|
| Watch Dogs | Custom Tom's Hardware benchmark, 90-second Fraps run, driving |
Could you please include tests at 4K resolution, and also use a real 780 Ti and a 295X2? Could you ask another lab to do it, or get one shipped to you?
+1 also on what @Patrick Tobin said.
I can appreciate that you might've spent a lot of time on this review, and we'd really appreciate you doing this final bit. I know that not many gamers currently play at 4K, but I'm definitely interested in it.
Thank you!
Does the game utilize SLI or CrossFire setups on PC?
TechSpot did include those, and there was no difference between the i5 and i7, not even the LGA 2011 hexa-core!
The game looks beautiful.
And then the FX-8350 against a freaking i7-3960X and NO OTHER Intel CPU. [edited for language]
For freak's sake, I am really trying to follow you as a serious tech site without bias,
please do not make it any freaking harder for me.
Edit: Actually, someone did some nice tests for CPUs:
CPU performance with GTX 780
CPU performance with R9 290X
I hope there is a followup article, focusing on some specific details. These include VRAM limitations, and more tweaking to see which settings changes most affect not only raw FPS but also smoothness. It looks like some settings lead to a very distracting experience, and it would be nice to know what those are.
Edit: Thanks, Don, for adding the FX-6300 and i5-3550; those are useful numbers to have. Here is one title where the FX clearly beats the i3, so core count must matter.
Claiming something wins where 98% of us NEVER play is ridiculous. You want to know who wins in 98% of users' cases. Those frame rates are too low for me anyway; barely breaking a 30 FPS minimum is not enough, and you will see dips even on AMD while playing. They're only showing a snapshot here. HardOCP dropped textures to High in their second test, and Nvidia won. So yeah, if you push things to where we probably wouldn't enjoy it, AMD wins. Yay. But if you play at 1080p, the links above show Nvidia winning. I think FAR more people are worried about 1080p. Having said that, this game would laugh at my PC... ROFL.