Benchmarked: How Well Does Watch Dogs Run On Your PC?

How We Tested Watch Dogs

I wanted to test Watch Dogs in Surround and Eyefinity configurations, using three monitors, but wasn't able to get the aspect ratio correct using either GeForce or Radeon cards. Hopefully this is something the developers address in a patch. For now, we'll focus on single-monitor performance.

As always, we strive to represent results across a wide range of graphics hardware. Unfortunately, I don't have a GeForce GTX 780 Ti in our Canadian lab, so I did the next best thing and overclocked a GeForce GTX Titan to 780 Ti-like performance levels. This took a 1750 MHz memory clock rate (equal to the 780 Ti) and a 994 MHz GPU Boost clock to compensate for the CUDA core deficiency. I also set the power envelope to its maximum 106% setting in Afterburner. Resulting performance should come close to a GeForce GTX 780 Ti.
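To see why those clocks land near 780 Ti territory, the math can be sketched out. Note the specs below are assumed reference figures, not taken from this article: a GeForce GTX Titan carries 2688 CUDA cores, while the GTX 780 Ti carries 2880 at a 928 MHz reference GPU Boost clock.

```python
# Sketch: solving for the Titan clock that matches 780 Ti shader throughput.
# Assumed reference specs (not from the article):
titan_cores = 2688   # GeForce GTX Titan CUDA cores
ti_cores = 2880      # GeForce GTX 780 Ti CUDA cores
ti_boost_mhz = 928   # 780 Ti reference GPU Boost clock

# Shader throughput scales roughly with cores x clock, so the Titan needs
# a proportionally higher clock to cover its core deficit.
required_titan_mhz = ti_cores * ti_boost_mhz / titan_cores
print(round(required_titan_mhz))
```

Under those assumptions the result comes out right around the 994 MHz figure used in testing, which is why the overclocked Titan should land close to a 780 Ti in shader-bound workloads (memory capacity aside).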

Drivers are a huge deal when a game first comes out, and we made it a point to test using the very latest software from both companies. Nvidia gave us early access to its 337.88 build, while AMD sent over Catalyst 14.6 Beta, both including optimizations for Watch Dogs. Indeed, AMD's performance went up quite a bit under the new driver; it was fairly dismal with 14.4.

Also, the texture detail setting is kept at Medium across our benchmarks, ensuring graphics memory doesn't affect our results too severely. The texture detail option is separate from the game's detail presets, so it's adjustable on its own.

Graphics cards like the Radeon R9 290X require a substantial amount of power, so XFX sent us its PRO850W 80 PLUS Bronze-certified power supply. This modular PSU employs a single +12 V rail rated for 70 A. XFX claims continuous (not peak) output of up to 850 W at 50 degrees Celsius.

We've almost entirely eliminated mechanical disks from the lab, preferring solid-state storage to alleviate I/O-related bottlenecks. Samsung sent all of our labs 256 GB 840 Pros, so we standardize on these exceptional SSDs.

Test System
CPU: Intel Core i7-3960X (Sandy Bridge-E), 3.3 GHz, Six Cores, LGA 2011, 15 MB Shared L3 Cache, Hyper-Threading enabled
Motherboard: ASRock X79 Extreme9 (LGA 2011), Intel X79 Express chipset
Networking: On-board gigabit LAN controller
Memory: Corsair Vengeance LP PC3-16000, 4 x 4 GB, 1600 MT/s, CL 8-8-8-24-2T
Graphics: GeForce GT 630 512 MB GDDR5; GeForce GTX 650 2 GB GDDR5; GeForce GTX 750 Ti 2 GB GDDR5; GeForce GTX 660 2 GB GDDR5; GeForce GTX 760 2 GB GDDR5; GeForce GTX 780 Ti 3 GB GDDR5; Radeon HD 6450 512 MB GDDR5; Radeon R7 240 1 GB DDR3; Radeon R7 250X 1 GB GDDR5; Radeon R7 260X 1 GB GDDR5; Radeon R9 270 2 GB GDDR5; Radeon R9 280 3 GB GDDR5; Radeon R9 290X 4 GB GDDR5
SSD: Samsung 840 Pro, 256 GB, SATA 6Gb/s
Power: XFX PRO850W, ATX12V, EPS12V

Software and Drivers
Operating System: Microsoft Windows 8 Pro x64
DirectX: DirectX 11
Graphics Drivers: AMD Catalyst 14.6 Beta, Nvidia GeForce 337.88 WHQL
Benchmarks
Watch Dogs: Custom Tom's Hardware Benchmark, 90-second Fraps run, Driving
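For readers who want to reduce their own Fraps runs the same way, here is a minimal sketch of turning a Fraps frametimes log into average and worst-case FPS figures. It assumes the usual "Frame, Time (ms)" CSV layout with cumulative timestamps; the function name and file path are ours, not part of Fraps or this benchmark.

```python
import csv

def fps_stats(path):
    """Compute (average FPS, minimum FPS) from a Fraps frametimes CSV.

    Assumes the standard Fraps layout: a header row, then one row per
    frame with a cumulative timestamp in milliseconds in column 2.
    """
    with open(path, newline="") as f:
        reader = csv.reader(f)
        next(reader)  # skip the "Frame, Time (ms)" header row
        times = [float(row[1]) for row in reader]

    # Per-frame durations in ms, from consecutive cumulative timestamps.
    deltas = [b - a for a, b in zip(times, times[1:])]

    avg_fps = 1000.0 * len(deltas) / (times[-1] - times[0])
    min_fps = 1000.0 / max(deltas)  # slowest single frame in the run
    return avg_fps, min_fps
```

Deriving minimum FPS from the single slowest frame is deliberately pessimistic; it surfaces the kind of momentary hitches a plain average hides, which matters in a game like Watch Dogs where VRAM-related stutter is a known complaint.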
  • coolcole01
Running on my system at Ultra with the highest settings and FXAA, it's pretty steady at 60-70 FPS, with weird random drops almost perfectly to 30 and then back up to 60, almost like adaptive sync is on. Currently playing with textures at High, HBAO+, and SMAA, and it's a pretty rock-steady 60 FPS with vsync, still with the random drops.
    Reply
  • coolcole01
Definitely does not like to run up the VRAM.
    Reply
  • edwinjr
Why no Core i5-3570K in the CPU benchmark section?
    It's the most popular gaming CPU in the world.
    Reply
  • chimera201
So a Core i5 is enough, compared to Ubisoft's recommended system requirement of a Core i7-3770.
    Reply
  • jonnyapps
What speed is that FX-8350 tested at? Seems silly not to test it overclocked, as anyone here with an 8350 will have it at at least 4.6 GHz.
    Reply
  • Patrick Tobin
Most 780 Ti cards come with 3 GB of RAM; the Titan has 6 GB. This is an unfair comparison, as the Titan has more than ample VRAM. Get a real 780 Ti or do not label it as such. HardOCP just ran the same tests, and the 290X destroyed the 780 because FSAA + Ultra textures started causing swapping once usage pushed past 3 GB.
    Reply
  • tomfreak
If you don't have a 780 Ti or 780, just show us stock Titan speed. Why would you show us overclocked Titan speed rather than stock Titan speed, and all without showing overclocked 290X speed? In fact, an overclocked Titan does not represent a 780 Ti, because it has 6 GB of VRAM. VRAM is a big deal in Watch Dogs. So your overclocked Titan looks like neither a 780 Ti nor a real Titan.
    Reply
  • AndrewJacksonZA
    Hi Don

    Please could you include tests at 4K resolution, and also please use a real 780Ti and also a 295X2? Can you not ask another lab to do it, or get one shipped to you please?

    +1 also on what @Patrick Tobin said.

I can appreciate that you might've spent a lot of time on this review, and we'd really appreciate you adding this final bit. I know that not a lot of gamers currently game at 4K, but I'm definitely interested in it.

    Thank you!
    Reply
  • Lee Yong Quan
Why don't you have the High detail setting? And would a 7790 1 GB perform the same as a 260X 2 GB with Medium textures? If not, which is better?
    Reply
  • chimera201
    We need more variety of CPUs
    Reply