Benchmarked: How Well Does Watch Dogs Run On your PC?

Tags:
  • Gaming
  • Graphics Cards
  • CPUs
  • Software
May 27, 2014 11:38:26 PM

Watch Dogs is one of the most anticipated games of 2014. So, we're testing it across a range of CPUs and graphics cards. By the end of today's story, you'll know what you need for playable performance. Spoiler: this game is surprisingly demanding!



******* EDIT: More CPU Results On The Way *******

1- I'm seeing a lot of requests for more CPUs. I totally understand.
Truth be told, I would have liked to add a lot more configs for publication but we were really under the gun here to get something out as quick as possible. I just didn't have the time.

I will try to run FX-6300 and Core i5-3550 benchmarks today and add the results to the charts, though. Stay tuned. :) 

[UPDATE May 28] FX-6300 and Core i5-3550 added to CPU benchmark charts [/UPDATE]


2- As for the simulated 780 Ti, I simply don't have a real one here; I asked Nvidia, but they ignored the request. Still, it's worlds more relevant than the $1,000 Titan I have on hand. I'd rather have an extremely close approximation of the 780 Ti than nothing at all. As for the VRAM, I was clear in the test setup that I benched the medium texture setting to keep VRAM out as a variable. The game makes it clear which texture setting should be used with the amount of VRAM you have.

3- 4K results would be nice, and we're working on sourcing monitors for our labs. Having said that, 4K adoption is less than a tenth of one percent. It's not an important inclusion yet, but it'll get there. We're working on it.

- Don



May 28, 2014 12:03:22 AM

Running on my system at ultra/highest settings with FXAA, it's pretty steady at 60-70 fps, with weird random drops almost perfectly to 30 and then back up to 60, almost as if adaptive vsync were on. Currently playing with textures at high, HBAO+, and SMAA, and it's a pretty rock-steady 60 fps with vsync, still with the random drops.
May 28, 2014 12:03:45 AM

Definitely does not like to run up the VRAM.
May 28, 2014 1:15:57 AM

Why no Core i5-3570K in the CPU benchmark section?
It's the most popular gaming CPU in the world.
May 28, 2014 1:31:16 AM

So a Core i5 is enough, compared to Ubisoft's recommended system requirement of an i7-3770?
May 28, 2014 1:32:30 AM

What speed was that 8350 tested at? Seems silly not to test it overclocked, as anyone on here with an 8350 will have it at at least 4.6 GHz.
May 28, 2014 2:25:39 AM

Most 780 Ti cards come with 3GB of RAM; the Titan has 6GB. This is an unfair comparison, as the Titan has more than ample VRAM. Get a real 780 Ti or do not label it as such. HardOCP just ran the same tests, and the 290X destroyed the 780 once FSAA + ultra textures pushed usage past 3GB and started causing swapping.
May 28, 2014 2:35:07 AM

If you don't have a 780 Ti or 780, just show us stock Titan speeds. Why would you rather show us overclocked Titan results than stock Titan results, and all without showing 290X overclocked results? In fact, an overclocked Titan does not represent a 780 Ti, because it has 6GB of VRAM. VRAM is a big deal in Watch Dogs. So your overclocked Titan looks like neither a 780 Ti nor a real Titan.
May 28, 2014 2:43:19 AM

Hi Don

Could you please include tests at 4K resolution, and also use a real 780 Ti and a 295X2? Can you not ask another lab to do it, or get one shipped to you?

+1 also on what @Patrick Tobin said.

I can appreciate that you've already spent a lot of time on this review, and we'd really appreciate you doing this final bit. I know that not a lot of gamers currently game at 4K, but I am definitely interested in it.

Thank you!
May 28, 2014 2:45:11 AM

Why don't you have the high detail setting? And would a 7790 1GB perform the same as a 260X 2GB with medium textures? If not, which is better?
May 28, 2014 3:10:26 AM

We need more variety of CPUs
May 28, 2014 3:34:39 AM

Anyone know if Watch Dogs has an SLI profile?

Does the game utilize SLI or CrossFire setups on PC?
May 28, 2014 3:53:58 AM

I usually use Tom's as my definitive site for performance benchmarks, but the lack of variety in both CPUs and GPUs here is really disappointing, especially with this being the first "next gen" game.
May 28, 2014 3:55:55 AM

This needs to be redone to see if it actually needs an 8-thread-capable CPU, so i5s and i7s on LGA 1150/1155 should have been included!
Techspot did include those and found no difference between the i5 and i7, not even the LGA 2011 hexa-core!
May 28, 2014 4:02:41 AM

2GB or more of VRAM is required when running MSAA x8 (this requirement appears when you turn on MSAA).
The game looks beautiful.
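The VRAM hit from MSAA is easy to ballpark. A rough sketch, assuming RGBA8 color plus D24S8 depth-stencil (8 bytes per sample total); textures and any extra render targets come on top of this:

```python
# Rough estimate of multisampled framebuffer memory at 1920x1080.
# Assumes 4 bytes/sample of color (RGBA8) + 4 bytes/sample of
# depth-stencil (D24S8); real engines allocate more buffers than this.

def framebuffer_mib(width, height, samples, bytes_per_sample=8):
    """Size in MiB of a single multisampled color+depth framebuffer."""
    return width * height * samples * bytes_per_sample / 2**20

for samples in (1, 2, 4, 8):
    print(f"MSAA x{samples}: {framebuffer_mib(1920, 1080, samples):6.1f} MiB")
```

At x8 the framebuffer alone is roughly 127 MiB; stack high-resolution textures on top and the game's 2GB+ requirement looks plausible.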
May 28, 2014 4:06:21 AM

You guys are being a bit harsh regarding the inclusion of the Titan OC'd to simulate the 780 Ti; he simply used what he HAD. The choice of medium textures renders 6GB vs. 3GB of VRAM mostly moot. This was just meant to give us an indication; why do people have to get so darned technical all the time? Try to wrap your head around the number of scenarios to be tested and the time that takes before you give the authors grief about "limited this and limited that". The game looks good. Thanks for the brief review.
May 28, 2014 4:12:58 AM

First you put the R9 290X (Catalyst 14.6?) without OC against the Titan with OC,
and then the FX-8350 against a freaking i7-3960X and NO OTHER Intel CPU. [edited for language]
For freak's sake, I am really trying to follow you as a serious tech site without bias;
please do not make it any harder for me.
May 28, 2014 4:51:42 AM

I wanted to know how a 770/760 4GB edition performs in this; the 760 already performs great at 1080p, though.
May 28, 2014 4:54:33 AM

I would have liked to have seen more CPUs tested, in particular three that are widely discussed and recommended in the forums, the i5-4670K (or i5-3570K), FX-6300, and 760K.
I hope there is a followup article, focusing on some specific details. These include VRAM limitations, and more tweaking to see which settings changes most affect not only raw FPS but also smoothness. It looks like some settings lead to a very distracting experience, and it would be nice to know what those are.
Edit: Thanks, Don, for adding the FX-6300 and i5-3550; those are useful numbers to have. Here is one title where the FX clearly beats the i3, so core count must matter.

May 28, 2014 5:27:19 AM

Take a look at the links the OP gave with the 780 vs. 290X: the 290X lost. Not sure what all the whining is about. The 290X has more RAM than a 780, right? Who cares about results above this res when only 2% are using over 1080p?

Claiming something wins at settings where 98% of us NEVER play is ridiculous. You want to know who wins in 98% of users' cases. Those fps are too low for me anyway; barely breaking a 30 fps minimum is not enough, and you will see dips even on AMD while playing. They're only showing a snapshot here. HardOCP dropped textures to high (their second test) and NV won. So yeah, if you push things to where we probably wouldn't enjoy it anyway, AMD wins. Yay. But if you play at 1080p, the links above show NV winning, and I think FAR more people are worried about 1080p. Having said that, this game would laugh at my PC... ROFL.
May 28, 2014 5:34:54 AM

Interesting how Don recommends a minimum of an i5 or FX-6300 but did not include those in the CPU scaling benchmarks. There should be an LGA115x i7 in there too, for a smooth progression from 2C/4T to 6C/12T; the Extreme CPU has over 3X the i3's raw performance but only manages twice the score. It would have been interesting to see a smooth progression showing how much benefit the game gets out of extra threads vs. extra cores.
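That plateau (3X the raw performance, only 2X the frame rate) is what you'd expect if part of each frame is serial. A quick Amdahl's-law sketch; the 25% serial fraction below is an illustrative assumption, not a measured figure:

```python
# Amdahl's-law sketch: a fixed serial fraction of each frame caps how much
# extra cores/threads can help. With 25% of the frame serial (assumed),
# 3x the raw CPU throughput yields only 2x the frame rate.

def frame_speedup(raw_speedup, serial_fraction=0.25):
    """Overall speedup when only the parallel part of the frame scales."""
    return 1.0 / (serial_fraction + (1.0 - serial_fraction) / raw_speedup)

for raw in (1, 2, 3, 6):
    print(f"{raw}x raw throughput -> {frame_speedup(raw):.2f}x frame rate")
```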
May 28, 2014 5:40:02 AM

I must say, I much prefer Tom's video game reviews to what HardOCP does. As much as I enjoy HardOCP's PSU reviews and believe they are well-done, their video game reviews seem devoted strictly to [near] top of the line hardware running at UltraMaxOhWOW! settings that are absolutely irrelevant to the average gamer. I want to see a lot more data points than that. Yes, more would be nice, but at least here we do get enough data for some interpolation/extrapolation to alternate hardware.
May 28, 2014 5:49:33 AM

How are 4-core Intel processors, both with and without Hyper-Threading, NOT in the CPU benchmarks? It has been claimed that Hyper-Threading is required for Ultra, but this wasn't even tested. For all we know, the high-end Intel processor you did test performs no better than an i5.

Considering this game is CPU-bound, how is there not more comprehensive CPU benchmarking being done? What a waste.
May 28, 2014 6:12:22 AM

I'm on AMD, but there are a LOT of gaming rigs and gamers out there using i5 3570k/4670k and i7 3770k/4770k. I was surprised not to see these tested. That would have been some very useful information for a lot of people. :( 

Is there any way the benchmark charts could be updated to show results for these parts?
May 28, 2014 6:35:31 AM

UHD?
May 28, 2014 7:01:36 AM

It's pretty well optimized. On an i5-3350P with a GT 640 and 8GB of 1600MHz RAM, I get 30-40 fps at medium-low, 1440x900. Only the GPU overheats, but that's normal. Just want to ask: how well would it perform with MSAA 4x on an i5-3350P with an MSI GTX 770 Twin Frozr?
May 28, 2014 7:04:59 AM

No 770 or 780?
May 28, 2014 7:08:43 AM

The closest thing to a true quad-core in this CPU benchmark is an FX-4170. You guys didn't have anything better to test with than that? Nothing at all between the FX-8350 and i7-3960X?

Okay, so maybe you don't have the hardware available in the Canadian office. So why not at least give us a couple of separate graphs for clock speed scaling and core count scaling? By your own admission, this is a CPU-intensive game, and one that (according to Ubisoft's system requirements) can use a lot of cores. The sole graph provided tells us very, very little except that the FX-8350 and i7-3960X "do okay."
May 28, 2014 7:20:15 AM

We want the game benchmarked on a wider variety of CPUs. And where are the R9 280X and GTX 770? It also seems like this game is poorly optimized. Well, it's from Ubisoft; what can we expect?
May 28, 2014 7:44:46 AM

I'm seeing a lot of requests for more CPUs. I totally understand.
Truth be told, I would have liked to add a lot more configs for publication but we were really under the gun here to get something out as quick as possible. I just didn't have the time.

Today I added the Core i5-3550 and FX-6300 to the benchmark charts in an update. :) 
May 28, 2014 7:51:53 AM

Thanks! Those numbers are interesting; the FX-6300 clearly beats the i3, so core count must make a pretty big difference in this game.
May 28, 2014 8:05:42 AM

xD
May 28, 2014 8:08:56 AM

redgarl said:
UHD?

With high-end GPUs only managing 70ish FPS at QHD, I would expect only 30ish FPS at UHD resolutions, which would not feel particularly pleasant to most enthusiasts. Multi-GPU setups would be required, but either Don did not have any on hand for this review or a separate article covering that may be coming later.
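The 30ish-FPS guess follows from simple fill-rate scaling: if a game is purely pixel-bound, frame rate falls roughly in proportion to pixel count. A naive sketch (real scaling is rarely this clean, since CPU and memory limits intervene):

```python
# Naive pixel-bound extrapolation: fps scales with the inverse of pixel
# count. 70 fps at QHD (2560x1440) projects to ~31 fps at UHD (3840x2160).

def scaled_fps(fps, from_res, to_res):
    """Project fps from one resolution to another, assuming pure pixel-bound scaling."""
    return fps * (from_res[0] * from_res[1]) / (to_res[0] * to_res[1])

print(f"{scaled_fps(70, (2560, 1440), (3840, 2160)):.1f} fps at UHD")
```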
May 28, 2014 8:15:02 AM

I'm running at ultra 1080p with two SLI'd 670s and a 3770K at 60 fps; when I switch to 5760x1080 I get about 40-50 fps on high.
May 28, 2014 8:17:26 AM

Great to see the added CPUs in there, and an i5 actually competing with the FX-8350 with similar results, proving that the recommended system specs were overstated.
May 28, 2014 8:22:14 AM

Can't we get a test with a normal PC spec and settings, like an i5-4670 and a 780? No AA craziness, and a 1080p framebuffer; probably 1GB of VRAM usage on that.
May 28, 2014 8:34:24 AM

harly2 said:
another Tom's article that gets different results from the rest of the tech community, and tends to have Nvidia leanings


Your comment is too general for me to help you with.

What do you mean by "Nvidia leaning", exactly?

What specific result are you talking about that the 'rest of the tech community' has produced conflicting data with?

May 28, 2014 8:37:31 AM

Plusthinking Iq said:
no aa crap and framebuffer at 1080p, probably 1gb vram usage on that.


FXAA is a real-time post-process filter. From what I understand, it doesn't use VRAM the way MSAA does.
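The storage difference is easy to see on paper. A sketch comparing per-frame framebuffer cost at 1080p, assuming 8 bytes per sample for color plus depth-stencil; FXAA runs as a shader pass over the already-resolved single-sample image, so it adds no multisampled storage:

```python
# FXAA vs. MSAA framebuffer cost at 1080p (assumed 8 bytes/sample:
# RGBA8 color + D24S8 depth-stencil). FXAA operates on the resolved
# 1x image; MSAA x4 stores 4 color+depth samples per pixel.

W, H, BYTES_PER_SAMPLE = 1920, 1080, 8

fxaa_mib = W * H * 1 * BYTES_PER_SAMPLE / 2**20   # single-sample buffer
msaa4_mib = W * H * 4 * BYTES_PER_SAMPLE / 2**20  # 4 samples per pixel

print(f"FXAA (1x): {fxaa_mib:.1f} MiB")
print(f"MSAA x4:   {msaa4_mib:.1f} MiB")
```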

May 28, 2014 8:47:15 AM

This game runs like absolute POO on PC, and I have a GTX 780 and a 3570K @ 4.7GHz with 16GB of 2133 RAM and a 500GB EVO SSD.

It's a stuttering mess. Vsync doesn't even work, so you get tons of screen tearing too.
May 28, 2014 8:47:18 AM

And so this is the glorious victory of multi-core scaling that AMD fanboys promised us! Where devs would finally optimize for "MOAR CORES" and let the FX-8350 unleash its full potential and power! Its power to... match a lower-clocked Ivy Bridge i5. Well done, I guess?
May 28, 2014 9:00:22 AM

Thanks for the fast response! Adding two popular budget CPUs was essential. I would still like to see an LGA115x i7, though...
Anyway, keep up the good work ^^.
May 28, 2014 9:13:50 AM

You do not need Hyper-Threading to play at Ultra settings.

I have an i5-4670K with a GTX 780 and 16GB of RAM. I run Ultra at 2560x1440 and it is smooth, with no lag or stuttering.
May 28, 2014 9:14:02 AM

ericjohn004 said:
This game runs like absolute POO on PC. And I have a GTX780 and a 3570k@4.7Ghz with 16GB of 2133 RAM and a 500GB EVO SSD.

It's a stuttering mess. The Vsync doesn't even work so you get tons of screen tearing too.

You get low fps with this system? FFS dude, that's a freaking 780 and 16GB of 2133MHz RAM, lol. That's the first 2133 I've seen in this forum. Damn, I can't believe it.
May 28, 2014 9:17:07 AM

Onus said:
Thanks! Those numbers are interesting; the FX-6300 clearly beats the i3, so core count must make a pretty big difference in this game.


Agreed... which is why this game just laughs at my dual-core. C'mon Intel, get the dang Broadwells out the door; my PC has been crying for long enough... LOL. ;) 
May 28, 2014 9:20:51 AM

harly2 said:

Oh god, you again... Other tech sites consistently get better results from AMD hardware. I'm not linking it for you; simply go to one of the multitude of other tech sites, pick an AMD product they have reviewed, look at the benchmarks, and see how it compares to Tom's reviews.


"Oh god you again"...same old Harley.

Still stirring the pot with baseless accusations and completely unable to back it up with inconvenient stuff like facts, huh?

How's that working out for ya? :D 

May 28, 2014 9:24:49 AM

harly2 said:
Other tech sites consistently get better results from AMD hardware. I'm not linking it for you; simply go to one of the multitude of other tech sites, pick an AMD product they have reviewed, look at the benchmarks, and see how it compares to Tom's reviews. Tom's has a negative tone with AMD products and also misrepresents them on CPU and GPU charts. This is just another example.


Where did you see this?
Guru3D has lower results than Tom's, and Techspot even used factory-overclocked HIS AMD Radeons; the results were barely better than Tom's...
May 28, 2014 9:30:39 AM

harly2 said:
cleeve said:
harly2 said:
another Tom's article that gets different results from the rest of the tech community, and tends to have Nvidia leanings


Your comment is too general for me to help you with.

What do you mean by "Nvidia leaning", exactly?

What specific result are you talking about that the 'rest of the tech community' has produced conflicting data with?



Oh god, you again... Other tech sites consistently get better results from AMD hardware. I'm not linking it for you; simply go to one of the multitude of other tech sites, pick an AMD product they have reviewed, look at the benchmarks, and see how it compares to Tom's reviews. Tom's has a negative tone with AMD products and also misrepresents them on CPU and GPU charts. This is just another example.



"Oh god you again"? He is an editor here, so basically you attacked him too, and he writes a ton of articles here. You didn't expect him to respond to you trashing their data? When you do that, you need to provide something; saying "go find it yourself" is NOT sufficient.

https://en.wikipedia.org/wiki/Graham%27s_Hierarchy_of_D...
Land in the top 3, then you've done your job. Landing in the bottom 4 (as you've done) is a complete failure to make your point. Nice try though ;) 
May 28, 2014 10:00:51 AM

I'm kind of disappointed there are no benchmarks based on the same CPU at different frequencies, from both Intel and AMD, to see how the game scales with clock speed.

Also testing both types of CPUs to see how the game scales with different numbers of cores.
May 28, 2014 10:03:43 AM

Intel's Xeon E5 v2 range has 8-, 10-, and 12-core models. It would be nice if Tom's could test the game with them. :D 
The E7 v2 goes up to 15 cores, but it uses an entirely different socket, unlike the E5, which uses the same Socket 2011 as the Core i7-3960X.