Why do we accept the garbage that GPU makers spew?

Zellio

Distinguished
Sep 17, 2006
72
0
18,630
With AMD for many years and now Conroe, CPUs are quiet, small, efficient, don't use a lot of power, don't get hot, and are very powerful.

GPUs, on the other hand, keep getting bigger, are loud, run hot, and are NOT efficient when you compare them to CPUs...

If you were to instead have a PC with another CPU acting as the GPU, you'd have a much more efficient PC, I guarantee it. I'm not talking about a dual core or a PC with two chips, but a PC with a CPU and memory...

My Allendale runs at 41C OVERCLOCKED. My motherboard runs at 31C. My Allendale is FASTER than my X1900XT 512, and yet the damn thing runs at 60C. And now we hear that the new ones will be twice as hot...
 

trinitron64

Distinguished
Jun 25, 2006
302
0
18,780
Perhaps the polite thing to do would be for the gaming industry and the GPU manufacturers to tone it all down... because you want a cool and quiet GPU.

What if the demands on CPUs today were astronomical? Don't you think CPU makers would be forced to run their products hotter to squeeze as much performance out of them... leading HSF producers to create larger, louder, and more powerful products to cool them?

I think what you demand is unreasonable.

GPUs ARE cool and quiet, my friend, but only in hindsight. By that I mean sit on your hands... in two years, buy a video card, underclock the hell out of it, and play a game that stresses GPUs today. Your hindsight rig will be cool and quiet, I assure you.

:roll:
 

bigspin

Distinguished
Sep 16, 2006
97
0
18,630
You claim new GPUs aren't so damn hot? Ha!!!!!
My old mid-range X1600 goes up to 55C, but my Pentium D 830 runs at a max of 52C when I run Prime95 + BF2 with the stock cooler. So how can I agree with you, brother?
 

shadowduck

Distinguished
Jan 24, 2006
2,641
0
20,790
Simple answer:

Users have decided that eye candy sells games. Screw the actual content of the game; if it looks damn good on high settings (bragging rights too), people buy it.

So, what do you, the GPU makers, do? Produce cool and quiet products that cannot play the games the way people want them? Or produce cool and quiet GPUs that cost $1000 due to all the engineering?

Obviously, since neither of those options is viable, they produce GPUs that run hot with lots of fans. It's all about the bottom line. There is no reason to change when game companies pay you to design your GPU to work better in their games, and users snap it up and fall for marketing crap like QuadSLI.
 

Sean618

Distinguished
Aug 22, 2006
109
0
18,680
It's fairly obvious that you don't have any idea when it comes down to the actual electronics and workings behind the components in a computer, and neither do I, and neither do most of the people on this forum. You can only make those comments if you actually understand how they work.

If you were to instead have a PC with another CPU acting as the GPU, you'd have a much more efficient PC, I guarantee it.
Where the hell do you get this ****? I very rarely swear, but seriously, you have NO facts. I agree that to an outsider it looks as if the CPU companies are much better, but we know nothing about it. Don't you think that if two major companies are fighting all out to make the fastest graphics card, they would develop the best they can with the technology available? I mean, if they could just take a CPU and redesign it a bit, wouldn't they?

Maybe the GPU companies should make the RAM, as they are already using GDDR4, which I'm sure you can 'guarantee' is faster.

This **** goes on all the time. I hear people saying how the vet ripped them off. Yeah, well, my parents are vets, and they get a **** salary for someone who has to go through 7 years at university, has to work 8 til 7, and has to be on duty every other day in case of an emergency. They have one of the highest suicide rates and stress levels. Don't complain about what you don't know!
 

megame255

Distinguished
Jun 24, 2006
264
0
18,780
This **** goes on all the time. I hear people saying how the vet ripped them off. Yeah, well, my parents are vets, and they get a **** salary for someone who has to go through 7 years at university, has to work 8 til 7, and has to be on duty every other day in case of an emergency. They have one of the highest suicide rates and stress levels. Don't complain about what you don't know!

Wow, nice analogy. I'm pretty sure the complaint that started this thread was about GPU manufacturers releasing hot, noisy products, not about how many hundreds of dollars the vet charges for short, simple procedures.
In any case, if it's too much work for too small a reward, don't become a veterinarian, and if you find video cards too hot and noisy, you can always buy cheaper, cooler, and quieter - or not at all.
 

Zellio

Distinguished
Sep 17, 2006
72
0
18,630
It's not really about them being hot and noisy. It's the fact that they're a combination of all that while CPUs are much, much more efficient.
 

Zellio

Distinguished
Sep 17, 2006
72
0
18,630
The only reason I can guess at is that, while CPU makers are held accountable (they are sold to everyone and get blamed for lots of stuff), they must keep their product efficient...

GPU makers sell their product to the high-end elite and have the market cornered. They can do what they want with their product because they know there is no other option for high-end video gamers, and apparently, they do...

Likewise, while there aren't many high-end GPU options for buyers, there are PLENTY of options for CPUs...

EDIT: There are about as many high-end CPUs as GPUs, but the fact is the average person buys a high-end CPU, not a high-end GPU...
 

Sagekilla

Distinguished
Sep 11, 2006
178
0
18,680
Just my 2 cents on the situation. I gotta run quick, so I didn't have time to read much, but here's my answer: a dual-core CPU from AMD runs at 2.4 GHz, with TWO cores. The rough equivalent from a graphics card is a high-end video card with 24 pipes running at 500 MHz. Bad comparison, but the GPU is optimized for video games. If you had your CPU alone rendering all those scenes, I doubt you'd get 1/10th the fps you would with the help of a video card.

I know how bad an analogy it is to compare an entire CPU core to a single pipeline in a GPU, but look at a 500 MHz GPU with 2 pipelines. They consume like 10 watts tops and can use passive heatsinks. Don't forget that the 'stock' HSF on your CPU is probably bigger than the one on your video card!! I know mine is.
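
Purely as an illustration of that comparison, here is a back-of-envelope sketch using only the clock speeds and core/pipe counts quoted above; it ignores what each unit actually accomplishes per cycle, so treat the ratio as hand-waving rather than a benchmark:

[code]
# Back-of-envelope: raw "lanes x clock" for the two chips described above.
cpu_cores, cpu_clock_hz = 2, 2.4e9     # dual-core CPU at 2.4 GHz (from the post)
gpu_pipes, gpu_clock_hz = 24, 500e6    # 24-pipe GPU at 500 MHz (from the post)

cpu_lane_cycles = cpu_cores * cpu_clock_hz   # 4.8e9 lane-cycles per second
gpu_lane_cycles = gpu_pipes * gpu_clock_hz   # 1.2e10 lane-cycles per second

print(gpu_lane_cycles / cpu_lane_cycles)     # -> 2.5x the raw parallel width
[/code]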

Go ahead and flame me all you want if you think my views are a bit skewed (which I think they might be...), but I'm off to go play golf. Have fun, all.
 
The only reason I can guess at is that, while CPU makers are held accountable (they are sold to everyone and get blamed for lots of stuff), they must keep their product efficient...

GPU makers sell their product to the high-end elite and have the market cornered. They can do what they want with their product because they know there is no other option for high-end video gamers, and apparently, they do...

Likewise, while there aren't many high-end GPU options for buyers, there are PLENTY of options for CPUs...

EDIT: There are about as many high-end CPUs as GPUs, but the fact is the average person buys a high-end CPU, not a high-end GPU...

I don't know why you think that CPUs are 'faster' or 'more powerful' or 'more efficient' than GPUs.

GPUs do a very specific task, and they do it very quickly in parallel, i.e., doing lots of it at the same time; CPUs have only just gotten (in the past 2 years or so) to doing two things at the same time. If you want a comparison of the power levels of a GPU compared to a CPU, think about this.

ATI (definitely) and Nvidia (I believe) are planning to use spare GPU processing power to do physics calculations in response to Ageia. These are calculations that the CPU can't do quickly enough, which is what created the need for Ageia in the first place. Yes, at the moment the Ageia drivers are not very good, but they are improving, and a CPU on its own can only run the demos with the eye candy turned off.

If you are referring to the fact that GPUs are hotter and need more power, then yes, they do, but having so much to do in parallel will cause a lot of heat to be generated. Also, with a GPU most of the data is changing every 1/60th of a second, whereas for a CPU running a couple of apps most of the data is constant; even moving data to and from memory creates heat.
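
A minimal, illustrative sketch of what that per-pixel workload looks like; the resolution and the shade() function here are made up for illustration, not taken from any real game or driver:

[code]
# Conceptual sketch: the same tiny function is applied independently to every
# pixel, every frame, so the work spreads naturally across many GPU pipelines.
WIDTH, HEIGHT = 1280, 1024            # ~1.3 million pixels, redrawn ~60x/second

def shade(x, y):
    # Stand-in for a pixel shader: any small per-pixel colour calculation.
    return ((x ^ y) & 0xFF, (x * y) & 0xFF, (x + y) & 0xFF)

# CPU-style: one core walks every pixel in sequence.
frame = [[shade(x, y) for x in range(WIDTH)] for y in range(HEIGHT)]

# GPU-style (conceptually): many pipelines each run shade() on a different
# pixel at the same instant, which is exactly why the chip is so parallel
# and why all that simultaneous switching dissipates so much heat.
[/code]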

You can run most things with a low-end card; I'm running XP with an old Ti4800 and it's very happy. But... to play modern games you will need significant amounts of processing power, hence the arms race between Nvidia and ATi to provide the greatest video processing power on a card. So the fact that people want to play games is forcing the high-end GPU to exist.

You chose to buy an X1900XT; it wasn't needed in order to run the PC, or in fact to run most games, since most of them are perfectly happy on much older hardware. I'm running BF2 and Oblivion at 1280x1024 on a 6600GT and they look fine to me.
 

dverduzco10

Distinguished
Sep 10, 2006
17
0
18,510
The only reason I can guess at is that, while CPU makers are held accountable (they are sold to everyone and get blamed for lots of stuff), they must keep their product efficient...

GPU makers sell their product to the high-end elite and have the market cornered. They can do what they want with their product because they know there is no other option for high-end video gamers, and apparently, they do...

Likewise, while there aren't many high-end GPU options for buyers, there are PLENTY of options for CPUs...

EDIT: There are about as many high-end CPUs as GPUs, but the fact is the average person buys a high-end CPU, not a high-end GPU...

I don't know why you think that CPUs are 'faster' or 'more powerful' or 'more efficient' than GPUs.

GPUs do a very specific task, and they do it very quickly in parallel, i.e., doing lots of it at the same time; CPUs have only just gotten (in the past 2 years or so) to doing two things at the same time. If you want a comparison of the power levels of a GPU compared to a CPU, think about this.

ATI (definitely) and Nvidia (I believe) are planning to use spare GPU processing power to do physics calculations in response to Ageia. These are calculations that the CPU can't do quickly enough, which is what created the need for Ageia in the first place. Yes, at the moment the Ageia drivers are not very good, but they are improving, and a CPU on its own can only run the demos with the eye candy turned off.

If you are referring to the fact that GPUs are hotter and need more power, then yes, they do, but having so much to do in parallel will cause a lot of heat to be generated. Also, with a GPU most of the data is changing every 1/60th of a second, whereas for a CPU running a couple of apps most of the data is constant; even moving data to and from memory creates heat.

You can run most things with a low-end card; I'm running XP with an old Ti4800 and it's very happy. But... to play modern games you will need significant amounts of processing power, hence the arms race between Nvidia and ATi to provide the greatest video processing power on a card. So the fact that people want to play games is forcing the high-end GPU to exist.

You chose to buy an X1900XT; it wasn't needed in order to run the PC, or in fact to run most games, since most of them are perfectly happy on much older hardware. I'm running BF2 and Oblivion at 1280x1024 on a 6600GT and they look fine to me.

I couldn't agree more with regard to the arms race between Nvidia & ATi. You can run a number of games at 1280x1024 with a card that is a year or two old, but there are people who want more: faster, brighter, crisper.

Years ago, when 3dfx was fiddling with the idea of having a separate power supply for their top-end graphics card, people thought they were insane. Now, with the frame rates and high resolutions that are available, it almost sounds reasonable.

It all boils down to whether or not the public will buy the next generation of high-temperature, high-power-consumption GPUs... my guess is they will.
 
What the heck????? By vet, do you mean veterinarian????

Holy cow, this thread is weird.

Yes, a very odd thread. I assume you are not talking to me about the vets, though. I have always found it odd that vets have a longer training period than doctors. I suppose they work on more species, though, and the species can't talk (mostly), so all diagnoses have to be inferred from very hidden symptoms.
 

Zellio

Distinguished
Sep 17, 2006
72
0
18,630
It isn't, because gaming is not what a CPU is intended for.

The GPU does all the gaming tasks. It's always better to have separate processing units doing separate things.

However, that's not to say that the GPU is in any way efficient. Not only does it sometimes require 2-4 GPUs, they are always clocked lower than CPUs, run hotter, and need faster RAM than CPUs...

Yet they run things worse than a CPU does.

Yes, it's a completely unfair comparison: a CPU that is running 2D screens versus a GPU that runs while the CPU helps it. However, all that is required of a CPU is a large amount of cooling on top of it. A GPU must be boosted up with extensive cooling, better RAM, 2 or more GPUs, and then what?

One other thing, about the 2D screens: it takes a pretty decent GPU to actually make your Windows screens move fast, while there's never a problem with the limited things a CPU does. Considering that XP doesn't take up very many resources, why the hell would it take a good amount of power just to get a window to slide up quickly...?
 

Sean618

Distinguished
Aug 22, 2006
109
0
18,680
OK, my fault for the last bit, and yes, I do mean veterinarian. I just get really pissed off at people talking about something they have no idea about. I don't mind if they ask 'why are CPUs so much more efficient than GPUs?', but don't state it as a fact without any data. Sure, CPUs are more efficient than GPUs, but as quite a few people pointed out, you can't compare them directly like that.
 

Zellio

Distinguished
Sep 17, 2006
72
0
18,630
I know what you're gonna say: that a 9800 Pro or an X800 could make windows move fast.

The fact is, it shouldn't even take a 4 MB ATI Rage to make 2D, low-power-consuming Windows load fast.

Lemme clarify: I don't mean RAM loading, I mean menu loading, the kind that is GPU-dependent, like lowering a window.
 

Zellio

Distinguished
Sep 17, 2006
72
0
18,630
I don't get high-powered GPUs just to not use them.

My point in this topic isn't that I don't like GPUs; it's that GPU makers are lazy and don't try hard enough.

They could do a lot better.

But I seriously doubt they want to. They know they are our only option.
 

Sean618

Distinguished
Aug 22, 2006
109
0
18,680
Well, if you really wanted efficiency, then why have you got an X1900XT when the 7900GT is much more efficient? Let me guess: because it was a lot more powerful. I doubt that ATi and Nvidia are lazy; as they're in competition, they are trying their hardest to better each other. The problem is, if they divert their funds to making a more efficient GPU instead of a faster one, then everyone will buy the other maker's faster one, like the X1900XT.
 
You appear to be talking to yourself right now, so I'll help you out a bit.

If you don't like GPUs, don't have one... Prior to the GeForce there was no such thing as a GPU; it was just a graphics card. The GPU brought us hardware T&L, hardware transform and lighting. This was the first real acceleration of complex calculations by taking them away from the CPU and using dedicated hardware to do the work. This is more efficient than having a general-purpose processor do the work. GPUs have progressed from there and taken more types of calculations into hardware (SM3.0, etc.); again, these are more efficient than running on a general-purpose processor.
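
For a rough idea of what T&L offloads, here is a minimal sketch of the per-vertex transform-and-light step; the matrix, vertices, and light direction are made-up example values, not taken from any real driver:

[code]
# Hedged illustration of per-vertex transform and lighting (the "T&L" work).
import numpy as np

model_view = np.eye(4)                           # placeholder transform matrix
vertices   = np.array([[ 0.0,  1.0, 0.0, 1.0],   # one example triangle
                       [-1.0, -1.0, 0.0, 1.0],
                       [ 1.0, -1.0, 0.0, 1.0]])
normal     = np.array([0.0, 0.0, 1.0])
light_dir  = np.array([0.0, 0.0, 1.0])

transformed = vertices @ model_view.T              # "transform" step
diffuse = max(float(normal @ light_dir), 0.0)      # "lighting" step (diffuse term)

# Repeated per vertex, every frame, for large meshes: exactly the kind of
# repetitive math that dedicated hardware handles far better than a CPU.
print(transformed[:, :3], diffuse)
[/code]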

You need to understand the amount of processing power required to run a modern game with eye candy at very high resolution. There is a limit to how much parallelism can take place (based on the size of the die and therefore fault tolerance, having to keep results in step, having sufficient access to the data needed to perform calculations, etc.).

Market dynamics force Nvidia and ATi to try to outdo each other; this encourages the exact opposite of the lazy behaviour you are alluding to. If there were only one player in the market, there would be no pressure for them to improve, and we'd get whatever they had to offer at the price they set, as opposed to getting something beyond what the competition has to offer at a price just above what the competition charges, which forces prices down and forces both to develop further.

The issue with window moving is probably due to the bloatware that is XP, and it will only get worse with Vista. My little 2 MB S3 ViRGE coped with windows at 1280x1024 on '95 quite happily.

You have fallen into the trap of not understanding efficiency and thinking that high clocks are better. Higher clock speeds do not mean that your CPU is faster; in fact, being able to do the same work at a lower clock speed is a sign of high efficiency. Efficiency is work out / work in, so if a GPU takes 1 second to render 50 frames at 100 watts and a CPU takes 10 seconds to render the same 50 frames at 100 watts, the GPU is 10x more efficient. Do you remember the old software-only rendering modes for games in the GF2 era? With the CPU doing the GPU's job, the results were many frames slower. I think one of the 3DMark tests used to compare GPU rendering vs CPU rendering; you'll see the frame rate and image quality difference, because you are not comparing apples with apples and the GPU image is much, much better.
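
A quick worked version of that work-out / work-in arithmetic, using only the hypothetical 50-frame, 100-watt numbers from the paragraph above:

[code]
# Efficiency as work out / work in: frames produced per joule of energy used.
def frames_per_joule(frames, seconds, watts):
    return frames / (seconds * watts)   # joules = watts * seconds

gpu = frames_per_joule(frames=50, seconds=1,  watts=100)   # 0.50 frames/J
cpu = frames_per_joule(frames=50, seconds=10, watts=100)   # 0.05 frames/J

print(gpu / cpu)   # -> 10.0, i.e. the GPU is 10x more efficient at this job
[/code]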

How often do you actually NEED the 2-4 GPUs that you mention?
 
No problem. I'm getting very tempted to tell him to go back to school and accept that his friend has a bigger card than he does; it'll prepare him for real life and the car/wife/house/job/anatomical envy that he'll have for the rest of his life.
 

kaotao

Distinguished
Apr 26, 2006
1,740
0
19,780
I don't get high-powered GPUs just to not use them.

My point in this topic isn't that I don't like GPUs; it's that GPU makers are lazy and don't try hard enough.

They could do a lot better.

But I seriously doubt they want to. They know they are our only option.

:roll: What is your point? Oh, you don't have one. You and this thread make no sense.
 

kaotao

Distinguished
Apr 26, 2006
1,740
0
19,780
You appear to be talking to yourself right now, so I'll help you out a bit.

If you don't like GPUs, don't have one... Prior to the GeForce there was no such thing as a GPU; it was just a graphics card. The GPU brought us hardware T&L, hardware transform and lighting. This was the first real acceleration of complex calculations by taking them away from the CPU and using dedicated hardware to do the work. This is more efficient than having a general-purpose processor do the work. GPUs have progressed from there and taken more types of calculations into hardware (SM3.0, etc.); again, these are more efficient than running on a general-purpose processor.

You need to understand the amount of processing power required to run a modern game with eye candy at very high resolution. There is a limit to how much parallelism can take place (based on the size of the die and therefore fault tolerance, having to keep results in step, having sufficient access to the data needed to perform calculations, etc.).

Market dynamics force Nvidia and ATi to try to outdo each other; this encourages the exact opposite of the lazy behaviour you are alluding to. If there were only one player in the market, there would be no pressure for them to improve, and we'd get whatever they had to offer at the price they set, as opposed to getting something beyond what the competition has to offer at a price just above what the competition charges, which forces prices down and forces both to develop further.

The issue with window moving is probably due to the bloatware that is XP, and it will only get worse with Vista. My little 2 MB S3 ViRGE coped with windows at 1280x1024 on '95 quite happily.

You have fallen into the trap of not understanding efficiency and thinking that high clocks are better. Higher clock speeds do not mean that your CPU is faster; in fact, being able to do the same work at a lower clock speed is a sign of high efficiency. Efficiency is work out / work in, so if a GPU takes 1 second to render 50 frames at 100 watts and a CPU takes 10 seconds to render the same 50 frames at 100 watts, the GPU is 10x more efficient. Do you remember the old software-only rendering modes for games in the GF2 era? With the CPU doing the GPU's job, the results were many frames slower. I think one of the 3DMark tests used to compare GPU rendering vs CPU rendering; you'll see the frame rate and image quality difference, because you are not comparing apples with apples and the GPU image is much, much better.

How often do you actually NEED the 2-4 GPUs that you mention?

Word. :)
 
With AMD for many years and now Conroe, CPUs are quiet, small, efficient, don't use a lot of power, don't get hot, and are very powerful.

GPUs, on the other hand, keep getting bigger, are loud, run hot, and are NOT efficient when you compare them to CPUs...

If you were to instead have a PC with another CPU acting as the GPU, you'd have a much more efficient PC, I guarantee it. I'm not talking about a dual core or a PC with two chips, but a PC with a CPU and memory...

My Allendale runs at 41C OVERCLOCKED. My motherboard runs at 31C. My Allendale is FASTER than my X1900XT 512, and yet the damn thing runs at 60C. And now we hear that the new ones will be twice as hot...

I understand your rant, but I think your dismay is misdirected. GPUs and video cards are specialized hardware, and for the work they perform they are very efficient and run at reasonable temperatures.

With motherboard interconnects and memory speeds as they are today, having a socketed CPU performing GPU functions would not result in a more efficient machine. Now, a socketed GPU with GDDR4 memory and HyperTransport 3.0 might have the potential to perform as well as or better than a dedicated GPU on a video card, but that technology would have to be R&D'ed to determine whether it is a viable solution.