Watch Dogs 2 Performance Benchmarks

Here's a fact: Watch Dogs 2 doesn’t run that well without decent hardware. Our focus today isn't on the story. Rather, we're trying to make the game look as attractive as possible. We knew this project would involve multiple Tom's Hardware labs, so we didn't want to simply run the same benchmarks as everyone else. Hopefully you find our extra effort worth the wait.

Once you complete the lengthy and completely linear intro level, it’s time to enter Watch Dogs 2’s open world. That first excursion, with its far-reaching views from the Golden Gate Bridge, is both beautiful and a real challenge for your hardware.

We actually played through the introduction twice, working from different saved games. Multi-player adds variance that can't be controlled for, so our tests are run exclusively in offline mode. The benchmarks also benefit from a God Mode tool; consistent numbers are a lot harder to achieve if you keep getting killed, after all.

Preset or Custom Settings?

Watch Dogs 2 has a ton of graphics settings to experiment with. If you love to tweak every last detail by playing with individual options and restarting the game a million times, then you’ll have a blast in the menu system.

Then again, not every player goes to the trouble of finding a perfect balance between performance and graphics quality for their specific hardware configuration. The game accommodates this as well by offering no fewer than five presets, in addition to the usual range of resolutions. A sixth preset is reserved for custom settings.

The Low and Medium presets really don’t look good. Use them only if every other combination of detail options is unplayable. Visual fidelity improves from the High preset up: the game world comes to life, with Very High and Ultra providing only minor additional quality improvements. The latter two modes also demand a lot more from your hardware.

Image gallery: Low, Medium, and High settings

Image gallery: Very High, Ultra, and manually maxed-out settings

Ultimately, your best bet is choosing the preset that yields frame rates you're comfortable with, and then fine-tuning from there. Everybody has their own preferences when it comes to graphics, and Watch Dogs 2 provides the tools to experiment and adjust everything to your specific tastes.

MORE: All Gaming Content

Anti-Aliasing & Performance

Beauty comes at a price. Watch Dogs 2’s open world looks fantastic and offers a lot of variety, but the detailed environments and high-resolution textures also suffer from flickering jagged edges that are on full display as soon as you start moving around. Fortunately, there are several anti-aliasing settings, some of which can be combined, to alleviate this issue.

Gamers with lower-end hardware or a relatively small amount of graphics memory won’t get past SMAA or FXAA post-processing. Temporal filtering can be activated as well, which renders the scene at a lower internal resolution and upscales the result. The output is fairly muddy, but this setting is much easier on your graphics card and it does help combat aliasing artifacts. Just don't expect the best visual fidelity.
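To put rough numbers on that trade-off, consider how many pixels actually get shaded each frame. The little sketch below assumes, purely for illustration, a 75% internal render scale per axis; the game doesn't expose the exact resolution its temporal filtering renders at.

    # Back-of-the-envelope look at why rendering below native resolution
    # and upscaling eases GPU load. The 0.75 per-axis scale is an assumed,
    # illustrative value, not a documented Watch Dogs 2 figure.
    NATIVE = (1920, 1080)
    SCALE = 0.75  # assumed internal render scale per axis

    internal = (int(NATIVE[0] * SCALE), int(NATIVE[1] * SCALE))
    native_pixels = NATIVE[0] * NATIVE[1]
    internal_pixels = internal[0] * internal[1]

    print(f"Native render:   {native_pixels:,} pixels per frame")
    print(f"Internal render: {internal_pixels:,} pixels per frame")
    print(f"Shading work saved: {1 - internal_pixels / native_pixels:.0%}")
    # At a 0.75 scale, roughly 44% fewer pixels are shaded per frame,
    # which is where the performance headroom (and the muddiness) comes from.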

GeForce owners can use TXAA (2x/4x/8x), which practically eliminates the flickering. There’s a drawback though: the picture becomes a bit blurry. As usual, finding the right compromise is key. Dial in the lowest TXAA setting and adjust the sharpness to taste. Powerful hardware helps a lot, of course.

Another option is sticking with the old-school approach: MSAA (2x to 8x), though that comes with its own set of downsides. Really, there's no perfect solution. Perhaps enabling SMAA and calling it a day is preferable, which brings us back to the beginning of our discussion.

Shadows & Ambient Occlusion

Watch Dogs 2 makes liberal use of light sources, which cast a lot of shadows. And as is often the case with new games, Nvidia's GameWorks libraries aren't far behind. No matter how you feel about them, Watch Dogs 2 implements not one, but two GameWorks shadow features.

The first is called Hybrid Frustum Traced Shadows (HFTS), and it's Nvidia-exclusive. It combines sharp and soft shadows in a more realistic manner, keeping shadows crisp near their casters and softening them with distance. Be forewarned: it's a hardware-killer, and it's only available on second-generation Maxwell- and Pascal-based graphics cards.

With Percentage-Closer Soft Shadows (PCSS), the experience of seeing your GPU run out of steam isn't limited to GeForce owners. Anyone with a Radeon card can join in the fun as well. At least you get a more accurate shadow for your troubles, particularly when the geometries have different levels of softness. The feature isn’t perfect, which is to say that there are still errors, but it certainly does look nice.

If you don't feel the need to take realism to its extreme, stick with the Ultra quality preset. In some cases, it's actually better-looking since the picture quality can be sharper.

Image gallery: HFTS (Nvidia exclusive), PCSS (GameWorks), and Ultra shadows

Image gallery: High, Medium, and Low shadows

Watch Dogs 2’s shadow quality calls for ambient occlusion that’s as realistic as possible; it makes the picture much more vivid and three-dimensional. As usual, this option should be turned on. The game offers several implementations. Apart from its own proprietary ones, there’s HBAO+, which runs the same on Nvidia and AMD hardware but is a bit too strong for our taste. We prefer SSBC or HMSSAO, depending on your hardware and detail preferences.

Original Textures & Texture DLC

We decided to skip past the high-resolution texture DLC in our benchmarks for two reasons. First, it requires at least 6GB of graphics memory, which excludes a lot of very relevant cards. Second, we wanted to test a wide swathe of the graphics market, and including the DLC as an additional option would have added a ton of time to the benchmarking.

The original textures are decent, but they do get blurry up close. They lack a certain crispness. Just think twice about downloading the high-res texture pack if you aren't rocking at least an upper-mid-range GPU. Filling up the on-card GDDR5/HBM will cut into the quality options you can enable at playable frame rates.

The texture settings each preset dials in are well-chosen even without the DLC. Adjust them only if you have headroom to spare.

Memory Requirements

Add up all of the settings’ memory requirements, and you get Watch Dogs 2’s overall footprint. If you max out all of the options, which is to say you set the game to its Ultra preset and then manually crank up the settings not already maxed out, you technically exceed the capacity of even Nvidia's mighty GeForce GTX 1080 with 8GB of GDDR5X.
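A minimal sketch of that arithmetic, using invented per-setting costs (the real figures are the game's own, shown in the table below):

    # Toy model of how individual options' VRAM costs add up to the game's
    # overall footprint. Every number here is hypothetical, chosen only to
    # illustrate how a maxed-out configuration can spill past 8GB.
    vram_cost_mb = {
        "base allocation":         1500,
        "high-res textures":       2200,
        "shadows / HFTS":           900,
        "MSAA at a high factor":   1400,
        "remaining maxed options": 2300,
    }

    CARD_CAPACITY_MB = 8 * 1024  # e.g., GeForce GTX 1080 (8GB GDDR5X)

    total = sum(vram_cost_mb.values())
    print(f"Estimated footprint: {total} MB of {CARD_CAPACITY_MB} MB")
    if total > CARD_CAPACITY_MB:
        print("Over budget: expect texture streaming stalls or paging.")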

The table below is based on Watch Dogs 2’s own calculations. Initially, these seemed somewhat high compared to the results of several measurement tools, but it turned out that some of those tools underestimate usage over lengthy play sessions. These numbers give a good estimate of how much memory a graphics card should have.

Two Averaged Benchmarks Make for Optimal Results

We created and tested two benchmark sequences. Each takes a while to complete, deliberately smoothing out small fluctuations. We learned to use this kind of methodology for open-world games from our experience with GTA V. It also makes sense statistically, since variations tend to balance out over time. We tested a number of different benchmark durations and found that anything under 1:30 wasn't consistent and couldn't be reproduced reliably.
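One way to make "reproducible" concrete: repeat a candidate sequence several times and check how tightly the per-run averages cluster. A minimal sketch, with invented FPS numbers standing in for real logs:

    # Reproducibility check for a benchmark sequence: repeat it, then look
    # at the spread of the average frame rates. All FPS values below are
    # hypothetical placeholders, not measured data.
    from statistics import mean, stdev

    runs_short = [62.1, 55.4, 66.8, 58.9]  # e.g., a sub-1:30 sequence
    runs_long  = [60.2, 59.8, 60.5, 60.1]  # e.g., a 1:50 sequence

    for label, runs in (("short", runs_short), ("long", runs_long)):
        cv = stdev(runs) / mean(runs)  # coefficient of variation
        print(f"{label}: mean {mean(runs):.1f} FPS, run-to-run spread {cv:.1%}")
    # A spread of a percent or two means fluctuations have averaged out;
    # a wide spread means the sequence is too short to be reliable.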

Our first benchmark run consists of a 1:50-minute sprint through the countryside. It's harder on the graphics card than we originally anticipated; the load is actually similar to city driving. There are barely any NPCs or vehicles on this route, making the results easy to reproduce. CPU utilization is significantly lower than in densely populated areas or heavy traffic, which means the graphics card is the sole bottleneck.

Benchmark 1 - GPU Related

We're using a 1:40-minute bike ride for our second benchmark run in order to gauge how the interaction between GPU, graphics driver, and CPU changes the outcome. We chose a bike instead of a car so that we’d have an easier time getting through traffic without any collisions. The CPU load was significantly higher than it was on the Golden Gate Bridge. It was also easy to determine which graphics drivers utilize CPU resources well.

Benchmark 2 - CPU Related

For the mean results, we simply averaged the two runs' frame rates. The minimum represents the lower of the two runs (without exception, that was the high-traffic test).
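Expressed as a quick sketch (with placeholder frame rates, not our measured data), the aggregation works like this:

    # How the two runs combine into the reported numbers. The per-run
    # results below are placeholders for illustration only.
    run_gpu = {"avg": 58.3, "min": 44.1}  # countryside sprint (GPU-bound)
    run_cpu = {"avg": 52.7, "min": 38.6}  # city bike ride (CPU-heavy)

    reported_avg = (run_gpu["avg"] + run_cpu["avg"]) / 2
    reported_min = min(run_gpu["min"], run_cpu["min"])  # the city run, in practice

    print(f"Mean: {reported_avg:.1f} FPS | Minimum: {reported_min:.1f} FPS")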

MORE: Best Graphics Cards

MORE: Desktop GPU Performance Hierarchy Table

MORE: All Graphics Content

Comments
  • amk-aka-Phantom
    Great and thorough testing, thank you very much. But since you said yourself that High seems to be the sweet spot whereas Very High and Ultra barely look better yet demand a lot more from your hardware, why don't we get 1920x1080/High benchmarks then? I'd love to see how my 970 does. I can sort of deduce that it'll be alright considering it does 35-44 on Very High and VRAM consumption is within its limits, but a confirmation would be nice.
  • cwolf78
    I agree. Please add the High preset benchmark including the 970. This is the setting I'd probably use if I were to purchase it.
  • Vaibhav_5
    1080p High for the R9 270X, please. On the verge of buying in the Steam sale.
  • Elysian890
    Getting 60 FPS with the texture DLC on the High preset, 290X Devil 13.

    Stable 60 without the DLC on Very High to Ultra, 1080p, V-Sync enabled.
  • cknobman
    It's a rather poorly optimized game, TBH.
    Digital Foundry had big-time issues getting this game to play well at high settings, even using a Titan X.

    You can also see that this game relies on graphics memory more than anything by the RX 480 with 8GB of RAM outpacing more powerful cards with less memory (like the Fury X).
  • firefoxx04
    Who cares. Didn't they screw people with the original game? Fool me once...
  • IceMyth
    I don't understand one point: how do the Ultra settings get better FPS than Very High?
  • freak777power
    Go test under Windows 7; you'll get much better results.
  • FormatC
    Anonymous said:
    I don't understand one point: how do the Ultra settings get better FPS than Very High?
    Take a look at the resolution ;)

    Anonymous said:
    Go test under Windows 7; you'll get much better results.

    Windows 7 is dead. Testing on two different systems for just a few games makes no sense. This review alone was a benchmark session of over 19 working hours, and I don't test just one run, but many, depending on the possible tolerance. I write around eight reviews per month, from CPU, GPU, and workstation pieces to PC audio and some investigative ones.
  • freak777power
    Windows 7 is not dead, as it performs better in gaming and supports everything. Also, Windows 10 holds only 22% of the Windows market share; the rest is mostly Windows 7. I think it is rather unprofessional not to include Windows 7 in the test, and just to get that bullshit argument out of the way saying Win10 is better for gaming. It would be fair to show people that it is not what MS claims, as many f'ed up their systems upgrading to Win10 based on false advertising.
  • freak777power
    In those 19 hours you could have done the same against both OSes, Win10 and Win7, since you can run the tests at the same time.
  • 10tacle
    Anonymous said:
    I think it is rather unprofessional not to include Windows 7 in the test, and just to get that [deleted] argument out of the way saying Win10 is better for gaming. It would be fair to show people that it is not what MS claims, as many f'ed up their systems upgrading to Win10 based on false advertising.

    In those 19 hours you could have done the same against both OSes, Win10 and Win7, since you can run the tests at the same time.


    First of all, clean up your language. Second, if you think it's so easy and non-time-consuming to test every real-world combination of hardware and OS, then start your own tech website and do it. :pfff:
  • freak777power
    My language was just fine. Yes, it is easy if it's done right.
  • freak777power
    To prove what I was saying, it would be enough to take just one setup running an Nvidia 1080, high-end hardware generally speaking, and run it against Win7 and Win10. You will see a rather interesting result, from which a conclusion can be projected onto other hardware configurations. That is called a smart approach.
  • yyk71200
    Anonymous said:
    Windows 7 is not dead, as it performs better in gaming and supports everything. Also, Windows 10 holds only 22% of the Windows market share; the rest is mostly Windows 7. I think it is rather unprofessional not to include Windows 7 in the test, and just to get that bullshit argument out of the way saying Win10 is better for gaming. It would be fair to show people that it is not what MS claims, as many f'ed up their systems upgrading to Win10 based on false advertising.


    Hardly better in gaming: http://www.techspot.com/review/1042-windows-10-vs-windows-8-vs-windows-7/page6.html
  • 10tacle
    ^^Uhm, no... saying b******t is NOT "just fine" on Tom's Hardware. Maybe a Moderator will help you better understand.

    Now back to the subject: I'm seeing a very troubling trend of newer games struggling to average 60 FPS (with ~40 FPS minimum dips) at 1440p on ultra settings with the fastest single GPU (GTX 1080). The Deus Ex reboot is another. If this keeps up, I will start buying more new games for the PS4 than the PC, which would be a first in my 20 years of PC building and gaming.

    Just as a memory refresher from May 2014, here's what WD1 did on Ultra with a 780 Ti at 1440p (a 970 is about equal to a 780 Ti at 1440p):

    http://www.techspot.com/articles-info/827/bench/Ultra_03.png

    A GTX 1060 is about 10% faster than a 970 and only averages 35 FPS in WD2 on Ultra at 1440p. A 40% drop in relative GPU performance in just two and a half years between game series is significant. I don't recall that level of performance drop between Crysis 2-3, Far Cry 3-4, or Battlefield 3-4.
  • FormatC
    Anonymous said:
    To prove what I was saying, it would be enough to take just one setup running an Nvidia 1080, high-end hardware generally speaking, and run it against Win7 and Win10. You will see a rather interesting result, from which a conclusion can be projected onto other hardware configurations. That is called a smart approach.

    Have you ever worked with legal software from Ubisoft? It is very tricky to start the system with different cards each time and NOT LOSE your license. After every fifth change you normally get stopped and kicked out. I can work around it by starting Uplay each time with the same primary VGA card and switching to another card after initialization. But it is NOT POSSIBLE to run the same game on two systems, Mr. Perfect.

    The next point is that the benchmark runs MANUALLY, and I don't have four hands to run it twice at the same time. The difference between W10 and W7 is, in the end, so marginal that it's a pure waste of time. I mentioned that a lot of games run better on W10 (multi-tasking) if the setup is done well and the spyware isn't running (I deactivate a lot of services, like the update service, during benchmarks). That can be scripted very easily with the Task Scheduler.

    And between you and me: please use the edit function and spare us so many double posts and bad words ;)
  • 10tacle
    Anonymous said:
    Hardly better in gaming: http://www.techspot.com/review/1042-windows-10-vs-windows-8-vs-windows-7/page6.html

    ^^A link from August 10, 2015... seriously? You are providing a link from nearly a year and a half ago, when drivers were still not mature for Win10, to show there's little improvement of Win10 over Win7? That doesn't even take into consideration that many new games require Win10 to get everything the game has to offer (eye-candy options like DX12).
  • yyk71200
    Anonymous said:
    ^^A link from August 10, 2015... seriously? You are providing a link from nearly a year and a half ago, when drivers were still not mature for Win10, to show there's little improvement of Win10 over Win7?

    That's not what I'm saying. I'm saying that Windows 7 is not better for gaming than Windows 10.

    EDIT: Anyway, nobody bothers with Windows 7 benchmarks anymore.
  • FormatC
    Anonymous said:
    EDIT: Anyway, nobody bothers with Windows 7 benchmarks anymore.

    This is the reason why I wrote that W7 is dead. It is the same as with XP a few years ago.
    Call it déjà vu; the next generation is simply doing the same thing: wailing. :D

    If I find the time, I'll compare the OSes again. And BTW: I had a similar problem with all my workstation tests and transferring the licenses back to W7/W8. Almost all pro apps perform better on W10, especially if you can use 10 cores with up to 20 threads...