
Hardware Bottleneck is a Myth

Tags:
  • Graphics Cards
  • GPUs
  • Bottleneck
  • CPUs
  • Performance
  • Graphics
  • Product
November 11, 2009 1:17:27 PM

CPUs never bottleneck a GPU. Sure, either the CPU or the GPU can bottleneck a game's performance, but the CPU does NOT bottleneck the GPU. There is no bottlenecking relationship between a CPU and a GPU. This is a myth created by people who just repeat what they hear (sheeple).

Really, when it comes down to it... it's all about [shitty] programming in the SOFTWARE.

Take EverQuest 2, for example. Despite its intense graphics, it's primarily CPU-driven and largely ignores the GPU.
If you add a second graphics card (SLI/CrossFire) you will LOSE performance. That's a big deal in 2009.

Take SupCom, for example. It's a wonderful game that performs fine, but its actual scope goes far beyond any CPU's capacity to handle all the calculations, and nothing in the code diverts any of that work to the very powerful and often underutilized GPU. This isn't a huge deal, but the developers could have either capped the game or kept performance at a realistic level at [x] number of units/calculations.
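
To sketch what "diverting tasking to the GPU" might look like, here's a minimal C++ sketch; the gpu_* helpers are hypothetical stand-ins for a real GPGPU API (a CUDA kernel launch or a compute shader), stubbed out so the sketch compiles:

```cpp
#include <cstdio>
#include <vector>

struct Unit { float x, y, vx, vy; };

// Hypothetical stand-ins for a GPGPU dispatch. Stubbed out here so the
// sketch compiles; the point is the structure, not a real driver call.
void gpu_upload(const std::vector<Unit>&) {}
void gpu_step_units(float /*dt*/) { /* per-unit math runs in parallel on the GPU */ }
void gpu_readback(std::vector<Unit>&) {}

int main() {
    std::vector<Unit> units(50000, Unit{0.0f, 0.0f, 1.0f, 1.0f});

    // CPU-only path: one core grinds through every unit, every frame.
    for (auto& u : units) { u.x += u.vx * 0.016f; u.y += u.vy * 0.016f; }

    // The path the OP wants: ship the bulk per-unit math to the GPU and
    // read the results back, freeing the CPU for AI, pathfinding, etc.
    gpu_upload(units);
    gpu_step_units(0.016f);
    gpu_readback(units);

    std::printf("stepped %zu units\n", units.size());
}
```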

When writing software, video game developers need to take performance into consideration. If you want to make a game that can scale from a small skirmish to an epic battle, you need to program with that kind of foresight into performance. If you want to make a video game at all, you need to utilize modern technology instead of ignoring it.

I say "Bottleneck is a Myth" to reiterate the simple fact that the problem is rarely the hardware, but the lack of foresight (or too much foresight spent "future-proofing" a game) in the actual software.

As anyone can see, there is software with absolutely wonderful performance, and software with horrid performance.
There are video games that look stunning and run smoothly on crappy systems, and video games that run horribly on even the best systems while still having shitty graphics.

The bottleneck is in the software, infinitely more than the hardware. So grab your pitchforks and torches, tell your game developers (actually... their producers and advertisers $$$$$$$$...), and get some better performance out of your games.


November 11, 2009 2:55:27 PM

nitros85 said:

I say "Bottleneck is a Myth" to reiterate the simple fact that the problem is rarely the hardware, but the lack of foresight (or too much foresight spent "future-proofing" a game) in the actual software.

The bottleneck is in the software, infinitely more than the hardware. So grab your pitchforks and torches, tell your game developers (actually... their producers and advertisers $$$$$$$$...), and get some better performance out of your games.


You only see what you want to see... not only do you contradict yourself, but you do it in a way that will confuse people and makes you look like the one "just repeating what you hear on the internet". You do bring up a good point about shitty programming in some games, but like you said, bottlenecks do happen, whether it's crappy programming or crappy hardware.

Hardware bottleneck is not a myth.
November 11, 2009 3:12:28 PM

When you see several tiers of GPUs all producing the same FPS, explain to us what that means, please.
November 11, 2009 3:23:15 PM

I guarantee you that if you stick a GTX 295 or a 5870 with an Intel Pentium 4 from 5 years ago (or however long ago; I'm just grasping for a time frame), it would perform on par with most cards from 2-3 years ago. All current games require a certain amount of CPU and GPU power.

If you fall short on one of these but far exceed on the other, the excess does not make up for the shortfall. Your gaming performance is only as good as your weakest link. Look at the recent article TH put out comparing the newer i7 CPUs with the older Core 2 Duos: paired with the slower CPU, both a GTX 295 and a 4870 X2 performed worse than slower cards, only showing their power once paired with a top-end CPU.
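
To put rough numbers on the weakest-link point, here's a minimal sketch with made-up per-frame timings (not figures from the TH article): the sustained frame rate is set by whichever stage is slower, so behind a slow CPU a high-end card and a mid-range card simply tie.

```cpp
#include <algorithm>
#include <cstdio>

// Toy model: CPU and GPU work on successive frames in parallel, so the
// sustained frame rate is set by whichever stage takes longer per frame.
double fps(double cpu_ms, double gpu_ms) {
    return 1000.0 / std::max(cpu_ms, gpu_ms);
}

int main() {
    // Made-up per-frame costs: slow CPU 18 ms, fast CPU 8 ms;
    // high-end GPU 6 ms, mid-range GPU 12 ms.
    std::printf("slow CPU + high-end GPU:  %.0f FPS\n", fps(18, 6));   // ~56
    std::printf("slow CPU + mid-range GPU: %.0f FPS\n", fps(18, 12));  // ~56 -- a tie
    std::printf("fast CPU + high-end GPU:  %.0f FPS\n", fps(8, 6));    // 125
    std::printf("fast CPU + mid-range GPU: %.0f FPS\n", fps(8, 12));   // ~83
}
```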

There are in fact hardware bottlenecks.

There is also shitty programming.

There is a difference and both do in fact exist.
November 11, 2009 3:25:12 PM

Hardware bottlenecking occurs whether the OP realises it or not: whatever the weakest link in the system is, is the FIRST bottleneck.

Just because a Bugatti Veyron can go faster than almost any other production car doesn't mean it isn't being held back by its aerodynamics, engine design/power, transmission, tires, etc.

Since the CPU must FEED the GPU and read back from the GPU, the CPU very easily CAN bottleneck the rest of a system, especially a GPU, which is one of the few things that can keep up with or surpass a modern CPU's resources.
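
A minimal sketch of that feeding relationship, with made-up timings and deliberately no pipelining (real drivers overlap CPU and GPU work across frames): the GPU can't start a frame until the CPU has prepared and submitted it, so a slow CPU leaves a fast GPU idle.

```cpp
#include <chrono>
#include <cstdio>
#include <thread>

// Stand-ins for real work, simulated with sleeps (made-up timings).
void cpu_prepare_frame(int ms) { std::this_thread::sleep_for(std::chrono::milliseconds(ms)); }
void gpu_render_frame(int ms)  { std::this_thread::sleep_for(std::chrono::milliseconds(ms)); }

int main() {
    const int cpu_ms = 20, gpu_ms = 5;  // slow CPU feeding a fast GPU
    const int frames = 30;
    auto t0 = std::chrono::steady_clock::now();
    for (int f = 0; f < frames; ++f) {
        cpu_prepare_frame(cpu_ms);  // game logic + draw submission; GPU idles
        gpu_render_frame(gpu_ms);   // only 5 of every 25 ms doing real work
    }
    auto elapsed = std::chrono::duration_cast<std::chrono::milliseconds>(
        std::chrono::steady_clock::now() - t0).count();
    // Expect ~40 FPS: the GPU could do 200 FPS alone, but sits idle 80% of the time.
    std::printf("%.1f FPS\n", frames * 1000.0 / elapsed);
}
```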
November 11, 2009 3:30:47 PM

I hate it when people whine about poorly written software and have never written a line of code in their life.
November 11, 2009 3:31:11 PM

My life is now complete...

In other news, I had a Pentium D in a machine paired up with an ATI 4850; MAX settings turned out to be a slideshow in World of Warcraft (and that isn't even a demanding game).

Two weeks later, I threw in an E4500 and OC'd it to the SAME exact speed I had the Pentium D running at, using ALL the same hardware except the Pentium D. Now I am able to MAX settings completely on a 22-inch monitor, and my FPS went up substantially. 'Splain that one to me, please...
November 11, 2009 3:41:32 PM

Me, JayDee, and TGGA agree on something?

*hides*
November 11, 2009 3:42:02 PM

I hope the scope of these types of questions/statements doesn't increase as LRB enters the gfx market.
November 11, 2009 3:44:33 PM

gamerk316 said:
Me, JayDee, and TGGA agree on something?

*hides*


Pretty sure EVERYONE agrees on this 'cept for nitros85.
November 11, 2009 3:57:55 PM

We all may yet be surprised. Spend some time in the CPU section, where the CPU can do no wrong, heheh.
November 11, 2009 4:12:20 PM

Out of all the cards in that recent article, the GTX 295 scales nearly linearly with better processors/cores, whereas the 4870 X2 not so much. Overall it seems that SLI scales better than CrossFire, but depends on a powerful CPU, like the i7, to unleash itself.

As a personal thought, I would prefer nVidia's better scaling to make full use of GPU horsepower rather than saving some dollars on the CPU. But it looks like single-GPU cards (GTX 285/260) do not depend on multiple cores to give their best. With dual cores, there were many situations where GTX 295 = GTX 285!!! That is a hardware bottleneck and a disadvantage for those with dual cores.

Also, it remains to be seen how they will actually use DX11's multi-threaded rendering: how it will affect both single cards and SLI/CrossFire, and how well it works with more cores.

Overall, you have to have a quad core for multiple GPUs if you want to make the most of them. Or else get a single card with more shaders, like the 4850 beating the 3870 X2.

These days, good budget builds = dual core + single GPU (not even X2s).
FTW build = quad core with multi-GPU.
November 11, 2009 4:19:22 PM

You don't need hotter, louder, more power-drawing tech, nor SLI, to get the same FPS as a 295; just get a 5870.
November 11, 2009 4:19:48 PM

Coming to software, there are many reasons why a game is never optimized for YOUR system:

1. They need to maintain compatibility across a variety of platforms. This alone is the biggest reason.
2. You could write the best quality code, test it and optimize it, but there is a limit on budget vs. time.
November 11, 2009 4:21:41 PM

I see your point, and somewhat agree: more cores are better when software uses more cores, but more than that, faster clocks are where we really see the improvement. Not that the CPU is the slowdown and OCing it gives better FPS... /sarcasm off
November 11, 2009 4:24:58 PM

Or better, the Fermi :D But going from nVidia's history:

6800 Ultra: I could be wrong, but >$500 for some versions.
8800 GTX: $600.
GTX 280: $650.
Fermi: let's keep our fingers crossed and pray! It will have all the goodness of single cards. But where the hell are the leaked benchmarks? :)
November 11, 2009 4:27:56 PM

cyberkuberiah said:
Out of all the cards in that recent article, the GTX 295 scales nearly linearly with better processors/cores, whereas the 4870 X2 not so much. Overall it seems that SLI scales better than CrossFire, but depends on a powerful CPU, like the i7, to unleash itself.

Uh, it's not really that cut and dried. Are you only looking at the 1280x1024 charts? I think the configuration that shows the most potential is the better performer, really. The GPU companies shouldn't back users into a corner with expensive components that they aren't making a profit off of. :)

cyberkuberiah said:
As a personal thought, I would prefer nVidia's better scaling to make full use of GPU horsepower rather than saving some dollars on the CPU. But it looks like single-GPU cards (GTX 285/260) do not depend on multiple cores to give their best. With dual cores, there were many situations where GTX 295 = GTX 285!!! That is a hardware bottleneck and a disadvantage for those with dual cores.

For all intents and purposes, for people who care about not spending thousands of dollars on a computer, the dual-GPU solution isn't worth it when you can get roughly 10-15% less performance for 50% of the price. :/ It makes it even worse that even if you do spring for the money, you may end up with less than or merely on-par performance compared to much less expensive cards!

cyberkuberiah said:
Also, it remains to be seen how they will actually use DX11's multi-threaded rendering: how it will affect both single cards and SLI/CrossFire, and how well it works with more cores.

Well, it has only been out in its final form for, what, a month? It'll probably increase speed on both fronts, but I wouldn't be surprised if it didn't particularly benefit dual-PCB cards.

cyberkuberiah said:
Overall, you have to have a quad core for multiple GPUs if you want to make the most of them. Or else get a single card with more shaders, like the 4850 beating the 3870 X2.

Not to mention the change in architecture and generational differences... The HD 4850 is a lot newer; it's beating the 3870 X2 simply because of its advancements in that time frame, and the 4850 X2 was still better than the 4850, so it's a bit unfair to compare the two. You're right that a more powerful CPU is probably necessary for dual-GPU cards, but I don't think you're going to have as much of a problem (based on the articles) with ATi's solutions, as the 4870 X2 seems to hold its own better with a less expensive configuration than the GTX 295 does. Hopefully we can advance to a point where you won't have to buy the two most expensive things to get the most out of them some day, but we can only hope. :p

cyberkuberiah said:
These days, good budget builds = dual core + single GPU (not even X2s).
FTW build = quad core with multi-GPU.

I don't think a dual-PCB GPU was ever in a budget build. ;)
November 11, 2009 4:36:29 PM

Yes, single-threaded performance always pays off, be it through OC'ing or better instructions per clock (Nehalem vs. Core 2). And look at how a lower-clocked Nehalem is comparable to an AMD Phenom at higher clocks, for example the i5 750 and the Phenom 965 BE.

I am looking forward to this kind of IPC improvement in Sandy Bridge, which should hopefully succeed Bloomfield, and I will sell my system (I don't even dream about Sandy Bridge being compatible with X58 in spite of having it, considering Intel's compatibility history).

Looking at the past, when I built my 1366 system in 2008, I thought I was hurrying by not waiting for dropped prices, but they never dropped anyway, be it mobos or CPUs; only DDR3 has gone down.
November 11, 2009 4:48:03 PM

Umm, it's not dual-PCB unless it's nVidia; ATi's X2s are dual GPUs on a single PCB.
November 11, 2009 4:57:34 PM

Aww, I'm late to this thread... I always love explaining bottlenecks!

Anyway... the OP should stay in school. Jay et al. have the bases covered.

A bottleneck is a fundamental of system design. It is one of the first things you will ever learn about if you take any courses related to it, in architecture, computer science, engineering, physics... and so on.

Sure, software can be a bottleneck too (multi-threading is a fine example of this), but you would have to know nothing about how a computer, or any complex system, works in order to think there could be no other limiting factors.

But to be fair... I have to work with people who believe the LHC is being sabotaged by time travelers... not believing in bottlenecks is a ways down the "wow, where are you hiding the acid?" list. I think I'm going to have to publish that list, in fact... Now, should alien abductions be higher or lower than religion?
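
Since multi-threading is the classic software bottleneck, here's a minimal sketch of the fix (a contrived workload): split the work across however many cores the machine reports instead of pegging one core while the rest idle.

```cpp
#include <algorithm>
#include <cstdint>
#include <cstdio>
#include <thread>
#include <vector>

// Contrived workload: sum a big array.
uint64_t sum_range(const std::vector<uint32_t>& v, size_t lo, size_t hi) {
    uint64_t s = 0;
    for (size_t i = lo; i < hi; ++i) s += v[i];
    return s;
}

int main() {
    std::vector<uint32_t> data(1u << 24, 1);

    // Single-threaded: pegs one core while the others idle -- on a quad
    // core that is a pure software bottleneck.
    uint64_t serial = sum_range(data, 0, data.size());

    // Multi-threaded: split the range across all reported cores.
    const unsigned n = std::max(1u, std::thread::hardware_concurrency());
    std::vector<uint64_t> partial(n, 0);
    std::vector<std::thread> pool;
    const size_t chunk = data.size() / n;
    for (unsigned t = 0; t < n; ++t) {
        size_t lo = t * chunk;
        size_t hi = (t + 1 == n) ? data.size() : lo + chunk;
        pool.emplace_back([&, t, lo, hi] { partial[t] = sum_range(data, lo, hi); });
    }
    uint64_t total = 0;
    for (unsigned t = 0; t < n; ++t) { pool[t].join(); total += partial[t]; }

    std::printf("serial %llu, threaded %llu on %u threads\n",
                (unsigned long long)serial, (unsigned long long)total, n);
}
```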
November 11, 2009 5:33:51 PM

@ brockh

Look at Far Cry 2 at any resolution; that kind of scaling is utopian! Every dual-card owner would like to see it. The i7 gives that edge over even the Q9550; that's why the i5 750 with a $110 Gigabyte UD2 is attractive. It would have been even more jazzy if the i5 750 were around 160 dollars.

But of course, X2s and CrossFire setups are never part of budget builds anyway. One needs a better PSU, mobo, etc. etc. etc. :D

Unrelated, but it's scary that HD 4000 prices on Newegg have shot up. No more $100 4850s to be found, perhaps due to the lack of supply of GT200 chips. And HD 5000 is way overpriced now for the performance it gives. nVidia, what are you doing?
November 11, 2009 6:13:27 PM

Two bottlenecking truths:

1) ANY part can bottleneck a system; depending on what you are doing, it can be the hard drive, CPU, graphics card, memory, or bus/controllers.

2) BALANCING is the key to building a computer.
November 11, 2009 6:42:30 PM

The 5870 is overpriced? Compared to what?
November 11, 2009 6:55:37 PM

Come on, JDJ, keep up: everything is overpriced and should be reduced :D

To the OP [:mousemonkey:5]

Mactronix
November 12, 2009 6:12:39 PM

1. It's not ATI's fault but a supply problem. ATI would rather sell more units at a lower price for higher total revenue than a few $400 chips. I would not recommend patience to anyone over just paying this amount.

2. Another indirect cause is the lack of competitive threats. Remember how the GTX 280 was $649 at launch, and people who got it were not too happy when it dropped just a few months later, or were they? Consider the pre-Athlon days for CPUs. That being said, the i5 750, while a solid value proposition, has Hyper-Threading disabled just to coax people into getting an i7, not because of a tech limitation or anything like that.

3. God alone knows how and when these problems will get sorted out, like supply for ATI, and what Fermi, while truly innovative, will mean for gamers.

Again, it looks like the Christmas/New Year holiday season, just a month or so away now, may not be enough time for all this to sort itself out.
November 12, 2009 6:40:41 PM

End of the month
November 19, 2009 12:41:55 AM

Epic fail on everyone's posts.

Also, LOL @ "17 pages about CPU bottleneck": that thread is actually about building a system, with only a few pages dedicated to CPUs, and those don't show bottlenecks at all.
November 19, 2009 12:44:49 AM

Also, you all epic fail at reading comprehension, blinded by your own nerd rage, too worked up over a "stupid post" to actually use your brains.

I even admitted that a CPU can bottleneck a game's performance.
Unfortunately, none of you even read THE FIRST SENTENCE of my post.
Instead, you read the title and attacked the thread.
Rather unintelligently... seeing as I never once said the CPU cannot bottleneck a game's performance; IN FACT, I specifically said it CAN.


CPUs never bottleneck a GPU. Sure, either the CPU or the GPU can bottleneck a game's performance, but the CPU does NOT bottleneck the GPU.

CPUs never bottleneck a GPU. Sure, either the CPU or the GPU can bottleneck a game's performance, but the CPU does NOT bottleneck the GPU.

CPUs never bottleneck a GPU. Sure, either the CPU or the GPU can bottleneck a game's performance, but the CPU does NOT bottleneck the GPU.

CPUs never bottleneck a GPU. Sure, either the CPU or the GPU can bottleneck a game's performance, but the CPU does NOT bottleneck the GPU.

CPUs never bottleneck a GPU. Sure, either the CPU or the GPU can bottleneck a game's performance, but the CPU does NOT bottleneck the GPU.

Maybe if I post it multiple times, you will finally understand.

The ignorance and disabled mental capacity of Tom's Hardware's community fails to surprise me.

All you guys do is reiterate what you hear other ignorant people state.

"Upgrade to the i7 bc the core2duo will bottleneck your GPU!!!!!111"
November 19, 2009 1:30:22 AM

You're wrong, and you need some more years behind you. And yes, a CPU can bottleneck a GPU's functions.
November 19, 2009 1:38:18 AM

It never does...

I'm glad the moderators are so low-brow that they stoop to subtle insults ("need some more years behind you") -_-

Sorry child, but a fancy 'Moderator' title in italic blue doesn't entitle you to dispense misinformation on the assumption that the OP is immature.

It's a good reflection of Tom's Hardware, though. The moderators are as ignorant as the rest of the community.

Monkey see, monkey do. It works.

AnandTech > Tom's Hardware.

Always has been. Always will be.

Poor Tom, he's like the blue-collar website in a white-collar world :(
November 19, 2009 1:43:36 AM

JAYDEEJOHN said:
You're wrong, and you need some more years behind you. And yes, a CPU can bottleneck a GPU's functions.


Show supporting evidence that a CPU can bottleneck a GPU.

Oh wait, you can't. Why? Because they don't.

CPUs can bottleneck a game's performance, causing it to lag.
CPUs can bottleneck an entire system if they're that crappy.

But CPUs CAN'T bottleneck a GPU.


Also, I don't generalize like you. In some twisted, unrealistic setup, yes, a CPU could bottleneck a GPU's functions. Also, you forgot that "gpu's" is possessive...

If you want to somehow program GPU drivers to specifically bottleneck a CPU, then by all means go ahead! It's certainly possible. But in reality, the CPU can't bottleneck the GPU. If you're gonna talk about bottlenecks, at least use the right terms. Say "the CPU can bottleneck gaming performance." A game's performance isn't the GPU; it's the... game's performance...

You need some more years behind you, and a lot of research to educate yourself away from the ignorant myth that the CPU somehow bottlenecks the GPU rather than the entire system.

You might as well be saying "the CPU bottlenecks the DVD drive, causing low FPS!", because that would be as true as what you're saying. In fact, I think your bottleneck is not your CPU or GPU, but your brain and its inability to process information.

Sorry :(
No upgrade for that!
November 19, 2009 1:47:39 AM

nitros85 said:
Hardware Bottleneck is a Myth


Very true. Pentium 4 HT / 2MB cache + 2GB DDR2 + 9800 GT = CoD4 @ 1280x1024 on Medium with 4x AA = 60 FPS.

^^ No ***.
November 19, 2009 1:59:50 AM

http://www.tomshardware.com/reviews/radeon-hd-5970,2474...
Left 4 Dead at 1680x1050 is the best example of a CPU bottlenecking a GPU that I can find.

Now let's define bottleneck. The American Heritage® Dictionary of the English Language, Fourth Edition, defines a bottleneck as:
a hindrance to progress or production.

In this case, it would be anything that hinders the production of frames that can be displayed on the screen. The CPU is the only thing that feeds the GPU the information it needs to produce frames; if the CPU is unable to keep up, then it becomes a bottleneck to the GPU, not to the whole system, as it is not hindering data being transferred to or from the RAM or the HDD.

As the GPU is completely in charge of rendering the frames displayed on screen, anything that reduces the number of frames the GPU can produce, aside from the GPU itself, is therefore bottlenecking the GPU.

The game's performance, however, stands bottleneck-free, as nothing is hindering the CPU from running the game as fast as it possibly can aside from the CPU itself, and a component cannot bottleneck itself.
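
This is also why the "several tiers of GPUs, same FPS" test from earlier in the thread works. A minimal sketch of that diagnostic, with made-up timings: scale the GPU load (e.g. by changing resolution) and if the frame rate doesn't move, the CPU is the bottleneck.

```cpp
#include <algorithm>
#include <cstdio>

// Pipelined model again: sustained frame time = the slower stage.
double frame_ms(double cpu_ms, double gpu_ms) { return std::max(cpu_ms, gpu_ms); }

int main() {
    const double cpu_ms = 12.0;      // made-up fixed CPU cost per frame
    const double gpu_base_ms = 9.0;  // made-up GPU cost at the native resolution
    for (double scale : {0.5, 1.0, 2.0}) {  // lower res, native, higher res
        double gpu_ms = gpu_base_ms * scale;
        std::printf("GPU load x%.1f -> %.0f FPS (%s-bound)\n", scale,
                    1000.0 / frame_ms(cpu_ms, gpu_ms),
                    cpu_ms >= gpu_ms ? "CPU" : "GPU");
    }
    // FPS is flat at the two lower loads: the CPU, not the GPU, sets the cap.
}
```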
November 19, 2009 2:00:21 AM

Umm, I wish you could call me son, but besides that, you haven't a clue, and insulting me won't get you far here. Maybe elsewhere, but not here.
Get my hint?
This isn't the first time with you, and my patience is coming to an end.
You need to grow up. Don't repeat something 30 times like a little kid badgering his parents by asking over and over.
Your actions prove what and who you are.
Now shape up or ship out, and if you don't know what that means, I can tell you and explain it.
November 19, 2009 3:28:39 AM

nitros85 said:
Also, you all epic fail at reading comprehension,


What YOU lack is reading comprehension, and even worse, you lack a basic understanding of how a GPU works.

Your argument doesn't even hold, since, as I first pointed out to you, a GPU must be fed by a CPU regardless of what task it's doing. So it can be bottlenecked by a CPU in games, 3D modeling, GPGPU tasks, and pretty much anything you can throw at it where the CPU cannot keep up with the requirements of the GPU, whenever the GPU needs to be fed by or output to the CPU.

End of story.