FX 6300 to i7 for Watch Dogs

SycoSight

Honorable
I have been getting an average of 45ish fps (60 in some areas) with my GTX 770 2GB and a stock FX 6300. I run everything on ultra (including textures, even though those are recommended for 3GB and higher cards) except for depth of field (high). But I get some SERIOUS fps drops sometimes, and I'm pretty sure it's not from using ultra textures. I realize that many people are having problems with Watch Dogs on PC right now, but can I expect a noticeable performance boost once I swap out my CPU for the i7 4770K? (Should have gone with an i5 and gotten a 780, but hindsight is 20/20 -_-). Thanks!
 

Deus Gladiorum

Distinguished


Ahaha, we have the exact same setup it seems, though I don't have Watch Dogs and I've OC'ed my FX-6300 to 4.5 GHz. It's good to get an idea of what my rig will output once I start playing the game. Anyway, as for your fps drops, there are two possibilities:

A) It's the CPU. Large render distances destroy an FX-6300 -- even my OC'ed FX-6300 will never get a consistent 60 fps in any game that's large, open, and explorable. If you switch to an i7 (though realistically an i5 will serve you just as well as an i7 in gaming for $100 less), you should get much closer to 60 and much more consistent fps, though this may not solve your serious fps drops if the cause is possibility B.

B) It's the VRAM. This is much more likely the issue. From the benchmarks I've seen, ultra textures really do go way above 2 GB of VRAM, so that's quite likely the culprit. If you'd like to confirm, download MSI Afterburner, turn on memory usage in the built-in OSD, and check where it sits in game. Alternatively, just turn textures to the minimum setting and see if the fps drops still continue.
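If you want a second way to read that number outside of Afterburner, a rough sketch like the one below can pull VRAM usage from NVIDIA's nvidia-smi command-line tool (this is just my illustration in Python -- it assumes an NVIDIA card with nvidia-smi installed alongside the driver and on the PATH, and is not anything the game or Afterburner provides):

# One-shot VRAM check via nvidia-smi (sketch; assumes nvidia-smi is on the PATH).
import subprocess

def vram_usage_mib():
    """Return (used, total) in MiB for the first GPU, as reported by nvidia-smi."""
    out = subprocess.check_output(
        ["nvidia-smi",
         "--query-gpu=memory.used,memory.total",
         "--format=csv,noheader,nounits"],
        text=True,
    )
    used, total = out.strip().splitlines()[0].split(", ")
    return int(used), int(total)

if __name__ == "__main__":
    used, total = vram_usage_mib()
    print(f"VRAM: {used} MiB used of {total} MiB")

Run it while the game is loaded; on a 2GB card, a reading hovering near 2048 MiB means the textures are filling the card.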
 

SycoSight

Honorable


I'll do the MSI Afterburner thing while in game. I saw on the NVIDIA website that it is possible to play ultra on a 2GB card, but it is recommended to use features that don't take up a ton of VRAM, like SMAA or FXAA. It also recommended using an SSD to boot up the game (I guess for faster load times?)
 

frillybob101

Honorable
I also have an FX 6300, at 4.3 GHz. If there is a lot of stuff going on, the fps tanks. However, if there is a lot of graphically intensive stuff going on in a small area, it holds a constant 60 fps.

I'm going to Micro Center tomorrow to pick up an i5 4670K and a new motherboard. Right now at Micro Center, until June 1st, the i5 4670K is $189.99!
 

SycoSight

Honorable


Awesome! I should have gone with the i5, but oh well.
 
The SSD is recommended to minimize in-game texture loading from the drive to video memory for people choosing Ultra settings who don't have enough VRAM (3GB recommended).

As said, the game is heavy on both the CPU and VRAM.

I don't recommend running ULTRA with 2GB. The lack of video RAM alone is going to cause major stutter due to texture loading from your drive. It's lessened with an SSD, but still there.

NVIDIA has a guide:
http://www.geforce.com/whats-new/guides/watch-dogs-graphics-performance-and-tweaking-guide

People always seem to choose ULTRA settings regardless of how much a game may dip. I prefer to tweak for the SMOOTHEST experience. For me, I'd force on Adaptive VSync, then use roughly HIGH (not Ultra) and optimize so that I stay at 60 FPS about 90% of the time, with 10% dips below.
 

Deus Gladiorum

Distinguished


Yeah, I assume the primary reason for the SSD recommendation is faster load times, considering just how large these textures are (after all, they allegedly take up 3 GB -- something that not even my modded Skyrim can come close to). Please do the MSI Afterburner thing and let me know what it's like. Also, make sure to turn on other things in the OSD, such as frame rate, and perhaps clock speed and temperature.

By the way, I totally feel you on the whole hindsight thing. When you spend $400 on a GPU, you expect your frame rates to be 60 fps all the time. Unfortunately, I was nowhere near as good with computers and up-to-date with the latest in parts a year ago when I made my purchase as I am now, and I was highly disappointed to find out that only half my games could achieve 60 fps consistently on ultra settings while the rest jumped back and forth between 30 and 60.

That being said, these "serious fps drops" could be more of your imagination than you might think, especially if you haven't been using a frame counter to monitor them. I don't doubt that your frame drops do exist, but if the problem isn't related to your VRAM then you might just be dropping to 30 fps so much that it conflicts with the smoothness you feel when you get 45 or 60 fps in game. At least, that's what used to happen to me. 30 fps used to feel unbearable to me, but once I just told myself to deal with it, I was hardly bothered by the drops. Of course, if these are drops to well below 30 and they happen really often, then that's a whole other issue.
 

Deus Gladiorum

Distinguished


That's strange. From Google results, it doesn't seem to be one of those games that MSI Afterburner's OSD is incompatible with. I guess the best option then is to just keep MSI Afterburner open as you play the game and alt-tab to it once you see a frame drop. Then just look at the graph for memory usage and you'll have your answer.
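If alt-tabbing mid-game turns out to be a hassle, another option (again just a sketch of mine, assuming an NVIDIA card with the nvidia-smi tool available -- not a feature of Afterburner) is to log the VRAM reading to a file once a second and check it after the session:

# Log VRAM usage once a second so it can be reviewed after playing
# (sketch; the file name and interval are arbitrary choices).
import datetime
import subprocess
import time

LOG_FILE = "vram_log.csv"   # hypothetical output file name
INTERVAL_SECONDS = 1

with open(LOG_FILE, "w") as log:
    log.write("time,used_mib\n")
    while True:
        used = subprocess.check_output(
            ["nvidia-smi", "--query-gpu=memory.used",
             "--format=csv,noheader,nounits"],
            text=True,
        ).strip().splitlines()[0]
        log.write(f"{datetime.datetime.now().isoformat()},{used}\n")
        log.flush()
        time.sleep(INTERVAL_SECONDS)

Leave it running in the background while you play, then open vram_log.csv and look at the readings around the times the drops happened.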
 

SycoSight

Honorable


I'll try that.
 

SycoSight

Honorable
So far the memory usage has gone up to a tad over 2000 MB (2017, I think), but it tends to stick around 1950... Not sure if this means the GPU can't handle ultra or not :/... When I get my SSD and i7, and when a patch comes out, I'll find out if the 2GB of VRAM is the real bottleneck with ultra textures.