Hardware Bottleneck is a Myth

Status
Not open for further replies.

nitros85

Distinguished
Dec 10, 2008
90
0
18,630
CPUs never bottleneck a GPU. Sure, either the CPU or the GPU can bottleneck a game's performance, but the CPU does NOT bottleneck the GPU itself. There is no bottlenecking relationship between a CPU and a GPU. This is a myth created by people who just repeat what they hear (sheeple).

Really, when it comes down to it... it's all about [shitty] programming in the SOFTWARE.

Take Everquest 2, for example. Despite its intense graphics, it's primarily CPU-driven and largely ignores the GPU.
If you add a second graphics card (SLI/CrossFire), you will LOSE performance. This is a big deal in 2009.

Take SupCom, for example. It's a wonderful game and performs fine, but its actual scope goes far beyond any CPU's capacity to handle all the calculations, with no programming to divert some of that work to the very powerful and often underutilized GPU. This isn't as big a deal, but the developers could have capped the game at [x] units/calculations, or optimized so performance stays at a realistic level.

When writing software, the developers of video games need to take performance into consideration. If you want to make a game that can go from a small skirmish to epic scale, you need to program with that kind of foresight into performance. If you want to make a video game at all, you need to utilize modern technology instead of ignoring it.
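A toy sketch of what that foresight can mean in practice (everything here is hypothetical, not from any actual game): the same "which units are near each other" rule can be implemented as a naive all-pairs scan or with a spatial grid, and only one of them survives the jump from skirmish to epic scale.

```python
import random
from collections import defaultdict

# Toy RTS example: find every pair of units close enough to "see" each
# other. The naive all-pairs scan is O(n^2); bucketing units into a
# spatial grid only compares neighbouring cells. Same game rule, very
# different scaling -- a purely software-side bottleneck.

RADIUS = 5.0  # units closer than this interact

def nearby_pairs_naive(units):
    """Check every unit against every other unit: O(n^2)."""
    pairs = set()
    for i in range(len(units)):
        x1, y1 = units[i]
        for j in range(i + 1, len(units)):
            x2, y2 = units[j]
            if (x1 - x2) ** 2 + (y1 - y2) ** 2 <= RADIUS ** 2:
                pairs.add((i, j))
    return pairs

def nearby_pairs_grid(units):
    """Bucket units into RADIUS-sized cells; compare only neighbours."""
    grid = defaultdict(list)
    for i, (x, y) in enumerate(units):
        grid[(int(x // RADIUS), int(y // RADIUS))].append(i)
    pairs = set()
    for (cx, cy), members in grid.items():
        # Candidates live in this cell or one of the 8 surrounding cells.
        candidates = []
        for dx in (-1, 0, 1):
            for dy in (-1, 0, 1):
                candidates.extend(grid.get((cx + dx, cy + dy), []))
        for i in members:
            for j in candidates:
                if i < j:
                    x1, y1 = units[i]
                    x2, y2 = units[j]
                    if (x1 - x2) ** 2 + (y1 - y2) ** 2 <= RADIUS ** 2:
                        pairs.add((i, j))
    return pairs

# Both give identical answers; only the amount of work differs.
random.seed(42)
units = [(random.uniform(0, 300), random.uniform(0, 300)) for _ in range(800)]
```

With 100 units the difference is invisible; with tens of thousands, the naive version is exactly the kind of "epic scale" wall the OP is complaining about.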

I say "Bottleneck is a Myth" to reiterate the simple fact that the problem is rarely the hardware, but the lack of foresight (or too much foresight spent "future-proofing" a game) in the actual SOFTWARE.

As anyone can see, there is software made with absolutely wonderful performance, and software made with horrid performance.
There are video games that run smoothly on crappy systems while looking stunning, and video games that run horribly on even the best systems while still having shitty graphics.

The bottleneck is in the software, infinitely more than the hardware. So grab your pitchforks and torches, tell your game developers (actually... their producers and advertisers $$$$$$$$...), and get some better performance out of your games.
 

MiamiU

Distinguished
Jul 8, 2009
58
0
18,630


You only see what you want to see... not only do you contradict yourself, but you do it in a way that will confuse people and makes you look like the one "just repeating what you hear on the internet". You do bring up a good point about shitty programming in some games, but like you said, bottlenecks do happen, whether it's crappy programming or crappy hardware.

Hardware bottleneck is not a myth :non:
 

crosko42

Distinguished
Sep 20, 2009
121
0
18,690
I guarantee you that if you stick a GTX 295 or a 5870 with an Intel Pentium 4 from 5 years ago (or however long ago, just grasping for a time frame), it would perform on par with most cards from 2-3 years ago. All current games require a certain amount of CPU and GPU power.

If you cannot meet one of these but far exceed the other, it does not make up for your lack of power on the other front. Your gaming performance is only as good as your weakest link. Look at the recent article TH put out comparing the newer i7 CPUs with the older Core 2 Duos: paired with the slower CPU, both a GTX 295 and a 4870 X2 performed worse than slower cards, only showing their power once paired with a top-end CPU.
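The weakest-link rule boils down to a one-liner. A rough sketch (the millisecond figures below are invented for illustration, not measured from any benchmark):

```python
# Toy model of the weakest-link rule: each frame needs CPU time to
# prepare work and GPU time to render it, so the achievable frame rate
# is set by the slower stage. All millisecond figures are made up.

def effective_fps(cpu_ms_per_frame, gpu_ms_per_frame):
    """Frame rate is limited by the slower of the two stages."""
    return 1000.0 / max(cpu_ms_per_frame, gpu_ms_per_frame)

# A fast GPU (4 ms/frame) paired with a slow CPU (20 ms/frame):
cpu_bound = effective_fps(20.0, 4.0)  # 50 fps: CPU-bound, GPU mostly idle
# The same GPU with a faster CPU (5 ms/frame):
balanced = effective_fps(5.0, 4.0)    # 200 fps: now the GPU is the limit
```

In the first case the GPU could do four times the work it's being given; upgrading the GPU further would change nothing, which is exactly what the TH article showed.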

There are in fact hardware bottlenecks.

There is also shitty programming.

There is a difference and both do in fact exist.
 
Hardware bottlenecking occurs whether the OP realises it or not: whatever the weakest link in the system is, that is the FIRST bottleneck.

Just because a Bugatti Veyron can go faster than almost any other production car doesn't mean it isn't being held back by its aerodynamics, engine design/power, transmission, tires, etc.

Since the CPU must FEED the GPU and read back from it, the CPU very easily CAN bottleneck the rest of a system, especially a GPU, which is one of the few components that can keep up with or surpass a modern CPU's resources.
 

jonpaul37

Distinguished
May 29, 2008
2,481
0
19,960
My life is now complete...

In other news, I had a Pentium D in a machine paired up with an ATI 4850; at MAX settings, World of Warcraft turned out to be a slide show (and this isn't even a demanding game).

Two weeks later, I threw in an E4500 and OC'd it to the SAME exact speed I had the Pentium D running at. Using ALL the same hardware except the Pentium D, I am now able to MAX settings completely on a 22-inch monitor, and my FPS went up substantially. 'Splain that one to me please...
 

cyberkuberiah

Distinguished
May 5, 2009
812
0
19,010
Out of all the cards in that recent article, the GTX 295 scales nearly linearly with better processors/cores, whereas the 4870 X2 not so much. Overall, it seems that SLI scales better than CrossFire, but depends on a powerful CPU to unleash itself.

As a personal thought, I would prefer Nvidia's better scaling to make full use of GPU horsepower rather than saving some dollars on the CPU. But it looks like single-GPU cards (GTX 285/260) do not depend on multiple cores to give their best. With dual cores, there were many situations where GTX 295 = GTX 285!!! This is a hardware bottleneck and a disadvantage for those with dual cores.

Also, it remains to be seen how developers will actually use DX11's multi-threaded rendering: how it will affect both single cards and SLI/CrossFire, and how well it scales with more cores.
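For flavor, here is a rough Python stand-in for the idea behind DX11's deferred contexts (the function names are invented; this is not the Direct3D API): worker threads each record a command list for their slice of the scene, and the main thread submits the lists in a fixed order.

```python
# Sketch of multi-threaded command recording, loosely modelled on the
# DX11 deferred-context idea. All names here are hypothetical stand-ins,
# not real Direct3D calls.
from concurrent.futures import ThreadPoolExecutor

def record_command_list(draw_calls):
    """Stand-in for one worker thread recording its command list."""
    return [f"draw({d})" for d in draw_calls]

def render_frame(scene, workers=4):
    # Split the scene into one chunk per worker thread.
    chunks = [scene[i::workers] for i in range(workers)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        command_lists = list(pool.map(record_command_list, chunks))
    # The main thread submits the recorded lists in a fixed order,
    # the way the immediate context would execute them.
    return [cmd for command_list in command_lists for cmd in command_list]

frame = render_frame(list(range(8)), workers=4)
```

The point is that the CPU-side cost of building a frame gets spread over cores, which is exactly where dual-core systems were choking the GTX 295 in that article.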

Overall, you have to have a quad core for multiple GPUs if you want to make the most of them. Or else get a single card with more shaders; the 4850 beats the 3870 X2, for example.

These days, a good budget build = dual core + single GPU (not even X2s).
FTW build = quad core with multi-GPU.
 

cyberkuberiah

Distinguished
May 5, 2009
812
0
19,010
Coming to software, there are many reasons why a game is never optimized for YOUR system:

1. They need to maintain compatibility across a variety of platforms. This alone is the biggest reason.
2. You could write the best-quality code, test it, and optimize it, but there is a limit on budget vs. time.
 

cyberkuberiah

Distinguished
May 5, 2009
812
0
19,010
Or better, the Fermi :D But going from Nvidia's pricing history:

6800 Ultra: I could be wrong, but >$500 for some versions.
8800 GTX: $600
GTX 280: $650
Fermi: let's keep our fingers crossed and pray! It will have all the goodness of single cards. But where the hell are the leaked benchmarks? :)
 

brockh

Distinguished
Oct 5, 2007
513
0
19,010


Uh, it's not really that cut and dried. Are you only looking at the 1280x1024 charts? I think the one that shows the most potential with any of the configurations is the better performer, really. The GPU companies shouldn't back users into a corner with expensive components that they aren't making a profit off of. :)



For all intents and purposes, for people who care about not spending thousands of dollars on a computer, the dual-GPU solution isn't worth it when you can get roughly 10-15% less performance for 50% of the price. :/ It makes it even worse that even if you do spring for the money, you may get less than or on-par performance with much less expensive cards!



Well, it has only been out in its final form for, like, a month? It'll probably increase speed on both fronts, but I wouldn't be surprised if it didn't particularly benefit dual-PCB cards.



Not to mention the change in architecture and generational differences... The HD 4850 is a lot newer; it's beating the 3850 X2 simply because of its advancements in that time frame, and the 4850 X2 was still better than the 4850, so it's a bit unfair to compare the two. You're right that a more powerful CPU is probably necessary for dual-PCB cards, but I don't think you're going to have as much of a problem (based on the articles) with ATi's solutions, as the 4870 X2 seems to hold its own better with a less expensive configuration than the GTX 295. Hopefully we can advance to a point where you won't have to buy the two most expensive things to get the most out of them some day, but we can only hope. :p



I don't think a dual PCB GPU was ever in the budget builds. ;)
 

cyberkuberiah

Distinguished
May 5, 2009
812
0
19,010
Yes, single-threaded performance always pays off, be it through OC'ing or through better instructions per clock (Nehalem vs. Core 2). And look at how a lower-clocked Nehalem is comparable to an AMD Phenom at higher clocks: for example, the i5 750 and the Phenom 965 "BE".
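The IPC point is just back-of-the-envelope arithmetic. A sketch with invented numbers (the IPC and clock figures below are illustrative, not measured values for any real chip):

```python
# Rough single-thread model: throughput scales with instructions-per-
# clock times clock speed. Figures are illustrative, not benchmarks.

def relative_perf(ipc, ghz):
    """Relative single-thread throughput under this simple model."""
    return ipc * ghz

higher_ipc_chip = relative_perf(1.25, 2.66)    # lower clock, better IPC
higher_clock_chip = relative_perf(0.95, 3.40)  # higher clock, lower IPC
# ~3.33 vs ~3.23: the higher-IPC chip keeps up despite the clock deficit.
```

That's the whole Nehalem-vs-Phenom story in one multiplication: clock speed alone tells you nothing until you factor in work done per clock.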

I am looking forward to this kind of IPC improvement in Sandy Bridge, which should hopefully succeed Bloomfield, and I will sell my system (I don't even dream about Sandy Bridge being compatible with X58, in spite of having it, considering Intel's compatibility history).

Looking at the past: when I built my 1366 system in 2008, I thought I was hurrying by not waiting for dropped prices, but they did not drop anyway, be it mobos or CPUs; only DDR3 has gone down.
 

daedalus685

Distinguished
Nov 11, 2008
1,558
1
19,810
Aww, I'm late to this thread... I always love explaining bottlenecks!

Anyway.. OP should stay in school.. Jay et al have the bases covered..

A bottleneck is a fundamental of system design. It is one of the first things you will ever learn about if you take any courses related to it, in architecture, computer science, engineering, physics... and so on.

Sure, software can be a bottleneck too (multi-threading is a fine example of this), but you would have to know nothing about how a computer, or any complex system, works to think there could be no other limiting factors..

But to be fair... I have to work with people who believe the LHC is being sabotaged by time travelers... not believing in bottlenecks is a ways down the "wow, where are you hiding the acid?" list. I think I'm going to have to publish that list, in fact... Now, should alien abductions be higher or lower than religion?
 

cyberkuberiah

Distinguished
May 5, 2009
812
0
19,010
@ brockh

Look at Far Cry 2 at any resolution; this kind of scaling is utopian! Every dual-card owner would like to see it. The i7 gives that ooze over even the Q9550; that's why the i5 750 with the $110 Gigabyte UD2 is attractive. It would have been even more jazzy if the i5 750 was around 160 dollars.

But of course, X2s and CrossFire setups are never part of budget builds anyway. One needs a better PSU, mobo, etc. etc. :D

Unrelated, but it's scary that HD 4000-series prices on Newegg have shot up. No more $100 4850s to be found, perhaps due to a lack of supply of GT200 chips. And the HD 5000 series is way overpriced now for the performance it gives. Nvidia, what are you doing?
 