HEXiT :
It's not a written-in-stone rule, mate, just a rule of thumb for budget builds.
I have a few of 'em... this is gonna get a bit rantish, but here we go anyway...
A lot of guys here won't like what I say next, because it goes against the general consensus.
Things like SLI'd 780 Tis are just bragging rights. You lose image quality to get max fps; if you do want image quality, you have to turn vsync on to get rid of the tearing and microstutter. That limits the GPU to 60 fps on a 60 Hz screen, which pretty much negates the second 780 Ti... like I said, some won't like this... but it's a choice: fps or image quality...
120 Hz, I hear you cry... well, Crysis 3 and some other ultra-demanding titles won't give a 60 fps minimum, which means when you stick in the second card you won't get 120 fps, which means microstutter. Turn vsync on to get rid of it and you bounce between 60 and 120 fps. (Yes, adaptive sync is on the way, but you need a specific kind of monitor to use it, and it's currently only working at 1080p 60 Hz.)
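That "bouncing" between 60 and 120 fps can be sketched with a simplified model of double-buffered vsync: a finished frame has to wait for the next screen refresh, so the displayed rate snaps to whole-number divisors of the refresh rate. The render times below are hypothetical, just to illustrate the point:

```python
import math

def vsynced_fps(render_ms: float, refresh_hz: float) -> float:
    """With vsync on, each frame is shown for a whole number of refresh
    intervals, so the effective rate snaps to refresh_hz / n."""
    interval_ms = 1000.0 / refresh_hz
    intervals_held = math.ceil(render_ms / interval_ms)
    return refresh_hz / intervals_held

# On a 120 Hz panel, an 8 ms frame makes the next refresh (120 fps),
# but a 10 ms frame misses it and waits a full extra interval (60 fps).
print(vsynced_fps(8.0, 120))   # 120.0
print(vsynced_fps(10.0, 120))  # 60.0
print(vsynced_fps(10.0, 60))   # 60.0 -- a 60 Hz panel caps at 60 either way
```

So if the game hovers around the refresh interval, the displayed rate flips between 120 and 60 rather than sitting in between, which is exactly the stutter being described.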
Another fact is that most people don't overclock; maybe only 1-2% of PC owners do. Even so, the gains are minimal for the most part when they do OC. Crysis 3 will give maybe 3-5 fps extra on a CPU running 700 MHz over stock. That's a 20% increase in CPU speed for a 2-3% gain in fps, so the reality is that overclocking is a waste of time and electricity for gaming... great if you're doing stuff like 3ds Max, where you get a substantial gain, but in games it's minimal. The results are even more dramatic when you start overclocking the GPU, especially for power-consumption-to-gains ratios.
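The arithmetic behind that claim can be written out as a quick sketch. The stock clock isn't stated, so I'm assuming ~3.5 GHz (which makes +700 MHz a 20% bump), and a ~150 fps baseline, which is roughly what a 3-5 fps / 2-3% figure implies; both numbers are assumptions, not measurements:

```python
def pct_gain(new: float, old: float) -> float:
    """Percent improvement of new over old."""
    return (new - old) / old * 100

# Hypothetical figures: stock 3.5 GHz pushed to 4.2 GHz (+700 MHz),
# and an assumed ~150 fps baseline rising to ~154 fps.
clock_gain = pct_gain(4200, 3500)  # 20.0% more CPU speed
fps_gain = pct_gain(154, 150)      # ~2.7% more frames

print(f"{clock_gain:.1f}% clock for {fps_gain:.1f}% fps")
```

The point being made: in a GPU-bound game, frame rate scales far worse than linearly with CPU clock, so the clock/fps ratio looks terrible.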
As I see it, if you have an Intel quad core running at 2.6 GHz, or an AMD quad-module (8-core) at 3.2 GHz, you don't need to OC anything.
Yes, I can hear a lot of oohs and ahs, but hey, all they gotta do is test it themselves and they'll see I'm not lying.
The reason some manufacturers shout OC OC OC is to shorten the life of the CPUs/GPUs so we buy more. The faster you run them, the higher the average temps and the quicker they degrade (electromigration), which in turn means you have to keep increasing the volts over the life of the CPU/GPU to keep it stable, till one day it dies. Preferably sooner rather than later, in the manufacturer's eyes.
Like I said, I know this is controversial, but it's the state of play as I see it.
You guys feel free to give a rebuttal; just keep it clean and, where possible, give your proof.
I'm inclined to agree with you. I expect top-end owners can provide plenty of reasons to spend more on one graphics card than I would on an entire gaming PC - but we mid-rangers consider their gains so minimal that it all looks like bragging rights, which, let's face it, is fair play too! If I had a top-notch card from a top maker I'd be showing it off. That's what they're for, really.
Your points about overclocking are very interesting, and this is an angle I hadn't considered. I'd love to hear anyone try to refute it; I don't really think they can with conviction. Ultimately, overclocking is a legitimate way to increase performance, but it must be cherry pie to the manufacturers because it can shorten lifespans. Although I have read that, as long as it's done sensibly with proper cooling and ventilation, it shouldn't shorten the lifespan beyond the point where you'd normally consider upgrading anyway. That's just something I read, though, so shoot me down if it isn't true.
Haaanyway, food for thought, innit. Think I shall start some new threads (only after searching for solutions, of course).