GPU-related bottlenecking

ChillaxedUpgrader

Honorable
Nov 13, 2013
Hi all.

If my CPU is running games paired with a relatively more powerful GPU...

1) Does it cause the CPU to run at or near full capacity?
2) Does it increase wear on the CPU?

Thanks in advance for any help with this.
 

Supahos

Expert
Ambassador
This won't hurt anything. Parts are made to spend large chunks of their life at 100% usage, so it will be fine; just make sure temps are in good shape. If the bottleneck is the GPU, it would be more likely to hurt the GPU than the CPU, by the way... (neither will be harmed if temps are fine).
 
Running the CPU at max speed at stock timings won't cause any additional wear and tear, so it won't shorten its life the way overclocking with high voltages does.
As for the GPU, it really depends. If it's massively overpowered, or reasonably powered but displaying on an undersized screen, it will cause some games to choke up the CPU. You could, for instance, run StarCraft 2 on a dual-core Core i3 with a GTX 780, max out the game and get 200 fps no problem; but try to run BF4 on the same setup and the CPU would choke and hold the GPU to under 50% utilisation, resulting in fps bouncing between 5 and 50.

Building a balanced system is essential if you're a gamer, and a simple rule of thumb for a budget build is to pay at most 2x for the GPU what you paid for the CPU. But the more you spend on the CPU, the higher you can go on the GPU.
 
Solution
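[Editor's note, not part of the thread: here's a quick way to check for the CPU bottleneck described above on your own machine. Log per-core CPU load next to GPU utilisation and temperature while a game is running; if one or two cores sit near 100% while the GPU hovers well below ~95%, the CPU is the limiting part. This is only a rough sketch and makes assumptions the thread doesn't: an NVIDIA card with nvidia-smi on the PATH and the third-party psutil package installed.

import subprocess

import psutil


def gpu_stats():
    """Return (GPU utilisation %, GPU temperature C) as reported by nvidia-smi."""
    out = subprocess.check_output(
        ["nvidia-smi",
         "--query-gpu=utilization.gpu,temperature.gpu",
         "--format=csv,noheader,nounits"],
        text=True,
    )
    util, temp = out.strip().split(", ")
    return int(util), int(temp)


while True:
    # psutil samples CPU load over the 1-second interval, per logical core
    per_core = psutil.cpu_percent(interval=1.0, percpu=True)
    gpu_util, gpu_temp = gpu_stats()
    busiest = max(per_core)
    tag = "<- looks CPU-bound" if busiest > 95 and gpu_util < 80 else ""
    print(f"CPU busiest core {busiest:5.1f}% | GPU {gpu_util:3d}% @ {gpu_temp}C {tag}")

Run it in a second window while the game is up; the pattern described above (GPU pinned at 50% or less while one core is maxed) shows up within a few seconds.]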

ChillaxedUpgrader

Honorable
Nov 13, 2013


Thanks Hexit, good info. The whole idea of "the GPU should be at most x% more than the CPU" is interesting. I've seen some people say 150%, others 200%. Thing is, with AMD processors being so much cheaper than comparable Intel ones... I'm not sure how this can apply. Unless Intel processors are inherently better at working with GPUs, regardless of their power... Hmmmm. Think I might start a new thread on this, but what are your thoughts on it, Hexit?

Thanks again. D
 

ChillaxedUpgrader

Honorable
Nov 13, 2013


Hey Supahos, thanks for this info. Much appreciated.
 
It's not a written-in-stone rule mate, just a rule of thumb for budget builds.
I have a few of 'em... this is gonna get a bit rantish, but here we go anyway...

A lot of guys here won't like what I say next, because it goes against the general consensus.

Things like SLI 780 Tis are just bragging rights. You lose image quality to get max fps; if you do want image quality you have to turn vsync on to get rid of the tearing and microstutter. That limits the GPUs to 60 fps on a 60 Hz screen, which pretty much negates the second 780 Ti... like I said, some won't like this... but it's a choice: fps or image quality...

120 Hz, I hear you cry... Well, Crysis 3 and some other ultra-demanding titles won't give a 60 fps minimum, which means when you stick in the second card you won't get 120 fps, which means microstutter. Turn vsync on to get rid of it and you bounce between 60 and 120 fps (yes, adaptive vsync is on the way, but you need a specific kind of monitor to use it, and again it currently only works at 1080p 60 Hz).

Another fact is that most people don't overclock; maybe only 1-2% of PC owners do. Even so, the gains are minimal for the most part when they do OC. Crysis 3 will give maybe 3-5 fps extra on a CPU running 700 MHz over stock. That's roughly a 20% increase in CPU speed for a 2-3% gain in fps. So the reality is that overclocking is a waste of time and electricity for gaming... great if you're doing stuff like 3ds Max, where you get a substantial gain, but in games it's minimal. The results are even more dramatic when you start overclocking the GPU... especially for the power-consumption-to-gains ratio.
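[Editor's aside, not from the poster: the ratio quoted above is easy to work through. The stock clock and baseline frame rate below are assumed figures used only to illustrate the maths; the post gives neither.

# Rough check of the overclocking ratio quoted above. Baseline figures are
# assumed: a 3.5 GHz stock clock (+700 MHz OC) and ~150 fps average in a
# GPU-heavy title; only the percentages matter.
stock_clock, oc_clock = 3.5, 4.2      # GHz
base_fps, oc_fps = 150.0, 154.0       # ~4 fps gained

clock_gain = (oc_clock - stock_clock) / stock_clock * 100
fps_gain = (oc_fps - base_fps) / base_fps * 100
print(f"clock +{clock_gain:.0f}%  ->  fps +{fps_gain:.1f}%")
# prints: clock +20%  ->  fps +2.7%  (the game is GPU-limited, so extra CPU clock is mostly wasted)
]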

As I see it, if you have a CPU that runs at 2.6 GHz and is an Intel quad-core, or 3.2 GHz for an AMD quad-module (8-core), you don't need to OC anything.
Yes, I can hear a lot of oohs and ahs, but hey, all they gotta do is test it themselves and they'll see I'm not lying.

The reason some manufacturers shout OC, OC, OC is to shorten the life of the CPUs/GPUs so we buy more. The faster you run them, the higher the average temps and the quicker they develop electron leakage, which in turn means you have to keep increasing the volts over the life of the CPU/GPU to keep it stable, till one day it dies. Preferably sooner rather than later in the manufacturers' eyes.

Like I said, I know this is controversial, but it's the state of play as I see it.

You guys feel free to give a rebuttal; just keep it clean and, where possible, give your proof.
 

ChillaxedUpgrader

Honorable
Nov 13, 2013


I'm inclined to agree with you. I expect top-end owners can provide plenty of reasons to spend more on one graphics card than I would on an entire gaming PC, but we mid-rangers see their gains as so minimal that we write it all off as bragging rights, which, let's face it, is fair play too! If I had a top-notch card from a top maker I'd be showing it off. That's what they're for, really.

Your points about overclocking are very interesting and an angle I hadn't considered. Would love to hear anyone try to refute this; I don't really think they can with conviction. Ultimately overclocking is a legitimate way to increase performance, but it must be cherry pie to the manufacturers because it can shorten lifespans. Although I have read that, as long as it's done sensibly with proper cooling and ventilation, it shouldn't shorten a part's useful life beyond the point where you'd normally consider upgrading anyway. That's just something I read, though, so shoot me down if it isn't true.

Haaanyway, food for thought, innit. Think I shall start some new threads (only after searching for existing solutions, of course).