
Why do we accept the garbage that gpu makers spew?

Last response: in Systems
September 18, 2006 1:10:23 PM

With AMD for many years and now Conroe, CPUs are quiet, small, efficient, don't use a lot of power, don't get hot, and are very powerful.

GPUs, on the other hand, keep getting bigger, are loud, run hot, and are NOT efficient when you compare them to CPUs...

If you instead had a PC with a second CPU acting as the GPU, you'd have a much more efficient PC, I guarantee it. I'm not talking about a dual core or a PC with two chips; I mean a second CPU with its own memory...

My Allendale runs at 41C OVERCLOCKED. My motherboard runs at 31C. My Allendale is FASTER than my X1900 XT 512, and yet the damn thing runs at 60C. And now we hear that the new ones will be twice as hot..
September 18, 2006 1:52:40 PM

Perhaps the polite thing to do would be for the gaming industry and the GPU manufacturers to tone it all down... because you want a cool and quiet GPU.

What if the demands on CPUs today were astronomical? Don't you think CPU makers would be forced to run their products hotter to squeeze out as much performance as possible, leading HSF producers to create larger, louder, and more powerful products to cool them?

I think what you demand is unreasonable.

GPUs ARE cool and quiet, my friend, but only in hindsight. By that I mean: sit on your hands, buy a video card in two years, underclock the hell out of it, and play a game that stresses today's GPUs. Your hindsight rig will be cool and quiet, I assure you.

:roll:
September 18, 2006 2:06:03 PM

You call the new GPUs not so damn hot? Ha!!!!!
My old mid-range X1600 goes up to 55C, but my Pentium D 830 runs at 52C max when I run Prime95 + BF2 with the stock cooler. So how can I agree with you, brother?
September 18, 2006 2:16:02 PM

Simple answer:

Users have decided that eye candy sells games. Screw the actual content of the game; if it looks damn good on high settings (bragging rights too), people buy it.

So, what do GPU makers do? Produce cool and quiet products that can't play games the way people want? Or produce cool and quiet GPUs that cost $1000 because of all the engineering?

Obviously, since neither of those options is viable, they produce GPUs that run hot with lots of fans. It's all about the bottom line. There's no reason to change when game companies pay you to design your GPU to work better in their games, and users snap it up and fall for marketing crap like Quad SLI.
September 18, 2006 2:19:59 PM

It's fairly obvious that you don't have any idea when it comes to the actual electronics and the workings behind the components in a computer, and neither do I, and neither do most of the people on this forum. You can only make comments like that if you actually understand how they work.

Quote:
If you instead had a PC with a second CPU acting as the GPU, you'd have a much more efficient PC, I guarantee it.

Where the hell do you get this ****? I very rarely swear, but seriously, you have NO facts. I agree that to an outsider it looks as if the CPU companies are much better, but we know nothing about it. Don't you think that if two major companies are fighting all out to make the fastest graphics card, they're developing the best they can with the technology available? If they could just take a CPU and redesign it a bit, wouldn't they?

Maybe the GPU companies should make the RAM, since they're already using GDDR4, which I'm sure you can 'guarantee' is faster.

This **** goes on all the time. I hear people saying the vets ripped them off; well, my parents are vets, and they get a **** salary for people who had to go through 7 years at university, work 8 till 7, and have to be on duty every other day in case of an emergency. Vets have one of the highest suicide and stress rates. Don't complain about what you don't know!
September 18, 2006 2:47:23 PM

Quote:
This **** goes on all the time. I hear people saying the vets ripped them off; well, my parents are vets, and they get a **** salary for people who had to go through 7 years at university, work 8 till 7, and have to be on duty every other day in case of an emergency. Vets have one of the highest suicide and stress rates. Don't complain about what you don't know!


Wow, nice analogy. I'm pretty sure the complaint that started this thread was about GPU manufacturers releasing hot, noisy products, not about how many hundreds of dollars the vet charges for short, simple procedures.
In any case, if it's too much work for too small a reward, don't become a veterinarian; and if you find video cards too hot and noisy, you can always buy cheaper, cooler, and quieter, or not buy at all.
September 18, 2006 3:33:14 PM

It's not really about them being hot and noisy. It's that they are all of those things at once, while CPUs are much, much more efficient.
September 18, 2006 3:37:57 PM

The only reason I can guess is that CPU makers are held accountable (they sell to everyone and get blamed for lots of stuff), so they must keep their product efficient...

GPU makers sell their product to an elite few, and have the market cornered. They can do what they want with their product because they know there is no other option for high-end gamers, and apparently, they do...

Likewise, while there aren't many high-end GPU options for buyers, there are PLENTY of options for CPUs...

EDIT: There are around the same number of high-end CPUs as GPUs, but the fact is the average person buys a high-end CPU, not a high-end GPU...
September 18, 2006 4:03:38 PM

Just my 2 cents on the situation. I've gotta run, so I didn't have time to read much, but here's my answer: a dual-core CPU from AMD runs at 2.4 GHz with TWO cores. The rough equivalent from a graphics card is a high-end video card with 24 pipes running at 500 MHz. It's a bad comparison, but the GPU is optimized for video games. If your CPU alone were rendering all those scenes, I doubt you'd get a tenth of the fps you get with a video card's help.

I know how bad an analogy it is to compare an entire CPU core to a single GPU pipeline, but look at a 500 MHz GPU with 2 pipelines: it consumes maybe 10 watts tops and can use a passive heatsink. And don't forget that the 'stock' HSF on your CPU is probably bigger than the one on your video card!! I know mine is.
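To put the cores-vs-pipes comparison above into rough numbers, here is a back-of-envelope sketch. The counts and clocks are the ones quoted in the post; treating a CPU core and a GPU pipeline as interchangeable "workers" is the same crude assumption the comparison itself admits to, so this is pure illustration, not a benchmark.

```python
def rough_throughput(units, clock_ghz):
    """Parallel units x clock, in (very loose) billions of ops per second."""
    return units * clock_ghz

cpu = rough_throughput(units=2, clock_ghz=2.4)   # dual-core AMD at 2.4 GHz
gpu = rough_throughput(units=24, clock_ghz=0.5)  # 24-pipe card at 500 MHz

# Even at less than a quarter of the clock, the GPU's sheer width wins:
print(cpu, gpu)  # 4.8 12.0
```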

Go ahead and flame me if you think my views are a bit skewed (which I think they might be), but I'm off to play golf. Have fun, all.
September 18, 2006 4:14:36 PM

Quote:
The only reason I can guess is that CPU makers are held accountable (they sell to everyone and get blamed for lots of stuff), so they must keep their product efficient...

GPU makers sell their product to an elite few, and have the market cornered. They can do what they want with their product because they know there is no other option for high-end gamers, and apparently, they do...

Likewise, while there aren't many high-end GPU options for buyers, there are PLENTY of options for CPUs...

EDIT: There are around the same number of high-end CPUs as GPUs, but the fact is the average person buys a high-end CPU, not a high-end GPU...


I don't know why you think that CPUs are 'faster' or 'more powerful' or 'more efficient' than GPUs.

GPUs do a very specific task, and they do it very quickly in parallel, i.e. doing lots of it at the same time; CPUs have only just gotten (in the past 2 years or so) to doing two things at the same time. If you want a comparison of the power of a GPU versus a CPU, think about this.

ATI (definitely) and Nvidia (I believe) are planning to use spare GPU processing power to do physics calculations in response to Ageia. These are calculations the CPU can't do quickly enough, which is what created the need for Ageia in the first place. Yes, at the moment the Ageia drivers are not very good, but they are improving, and a CPU on its own can only run the demos with the eye candy turned off.

If you are referring to the fact that GPUs run hotter and need more power, then yes, they do; but doing so much in parallel generates a lot of heat. Also, with a GPU most of the data changes every 1/60th of a second, whereas for a CPU running a couple of apps most of the data is constant, and even moving data to and from memory creates heat.
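A toy sketch of what "a very specific task, done lots at a time" means. This isn't real shader code, and `shade` is an invented example operation; the point is only that each pixel's result depends on nothing but that pixel, which is exactly the property that lets hardware with many pipelines process many pixels at once.

```python
def shade(pixel):
    """Brighten one pixel; uses nothing but that pixel's own values."""
    r, g, b = pixel
    return (min(r + 10, 255), min(g + 10, 255), min(b + 10, 255))

framebuffer = [(100, 150, 200)] * 8  # a tiny 8-pixel "frame"

# Serial, CPU-style: one pixel after another. Because no pixel depends
# on any other, a GPU can run thousands of these at once -- and has to
# redo them every 1/60th of a second as the data changes.
shaded = [shade(p) for p in framebuffer]
print(shaded[0])  # (110, 160, 210)
```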

You can run most things with a low-end card; I'm running XP on an old Ti4800 and it's very happy. But... to play games you need significant amounts of processing power, hence the arms race between Nvidia and ATI to provide the greatest video processing power on a card. So the fact that people want to play games is what forces the high-end GPU to exist.

You chose to buy an X1900 XT; it wasn't needed to run the PC, or in fact to run most games; most of them are perfectly happy on much older hardware. I'm running BF2 and Oblivion at 1280x1024 on a 6600GT and they look fine to me.
September 18, 2006 4:24:07 PM

What the heck????? By vet, do you mean veterinarian????

Holy cow, this thread is weird.
September 18, 2006 4:46:55 PM

Quote:
You can run most things with a low-end card... to play games you need significant amounts of processing power, hence the arms race between Nvidia and ATI to provide the greatest video processing power on a card.

I couldn't agree more about the arms race between Nvidia & ATI. You can run a number of games at 1280x1024 with a card that is a year or two old, but there are people who want more: faster, brighter, crisper.

Years ago, when 3dfx was toying with the idea of a separate power supply for their top-end graphics card, people thought they were insane. Now, with the frame rates and high resolutions that are available, it almost sounds reasonable.

It all boils down to whether or not the public will buy the next generation of high-temperature, high-power-consumption GPUs... my guess is they will.
September 18, 2006 4:48:10 PM

Quote:
What the heck????? By vet, do you mean veterinarian????

Holy cow, this thread is weird.


Yes, a very odd thread; I assume you're not talking to me about the vets, though. I have always found it odd that vets have a longer training period than doctors. I suppose they work on more species, though, and the species can't talk (mostly), so all diagnoses have to be inferred from very hidden symptoms.
September 18, 2006 4:56:16 PM

It isn't, because gaming is not what a CPU is intended for.

The GPU does all the gaming tasks; it's always better to have separate processing units doing separate things.

However, that's not to say the GPU is in any way efficient. Not only does it sometimes take 2-4 GPUs, they are always clocked lower than CPUs, run hotter, and carry more RAM than CPUs...

Yet they run things worse than a CPU does.

Yes, it's a completely unfair comparison: a CPU that is drawing 2D screens versus a GPU that works while the CPU helps it. But all a CPU requires is a decent amount of cooling on top of it. A GPU must be boosted with extensive cooling, faster RAM, 2 or more GPUs, and then what?

One other thing about the 2D screens: it takes a pretty decent GPU to actually make your Windows screens move fast, while there's never a problem with the limited things a CPU does. Considering that XP doesn't take up many resources, why the hell would it take a good amount of power just to get a window to open quickly...?
September 18, 2006 4:59:54 PM

OK, my fault for the last bit, and yes, I do mean veterinarian. I just get really pissed off at people talking about something they have no idea about. I don't mind if they ask 'why are CPUs so much more efficient than GPUs?', but don't state it as fact without any data. Sure, CPUs are more efficient than GPUs in general, but as quite a few people have pointed out, you can't compare them directly like that.
September 18, 2006 5:00:44 PM

I know what you're gonna say: that a 9800 Pro or an X800 could make Windows move fast.

The fact is, it shouldn't even take an ATI Rage with 4 MB to make 2D, low-power-consuming Windows load fast.

Let me clarify: I don't mean RAM loading, I mean menu loading, the kind that is GPU dependent, like lowering a window.
September 18, 2006 5:11:51 PM

I don't get high-powered GPUs just to not use them.

My point in this topic isn't that I don't like GPUs; it's that GPU makers are lazy and don't try hard enough.

They could do a lot better.

But I seriously doubt they want to. They know they are our only option.
September 18, 2006 5:27:30 PM

Well, if you really wanted more efficient, why have you got an X1900 XT when the 7900 GT is much more efficient? Let me guess: because it was a lot more powerful. I doubt that ATI and Nvidia are lazy; since they're in competition, each is trying its hardest to better the other. The problem is that if they diverted their funds into making a more efficient GPU instead of a faster one, everyone would buy the other maker's faster one, like the X1900 XT.
September 18, 2006 5:31:03 PM

You appear to be talking to yourself right now, so I'll help you out a bit.

If you don't like GPUs, don't have one.... Prior to the GeForce there was no such thing as a GPU; it was a graphics card. The GPU brought us hardware T&L, hardware transform and lighting. This was the first real acceleration of complex calculations: taking them away from the CPU and using dedicated hardware to do the work, which is more efficient than having a general-purpose processor do it. GPUs have progressed from there and taken more types of calculation into hardware (SM3.0 etc.); again, these are more efficient than a general-purpose processor.

You need to understand the processing power required to run a modern game with eye candy at very high resolution. There is a limit to how much parallelism can take place (based on the size of the die and therefore fault tolerance, having to keep results in step, having sufficient access to the data needed to perform the calculations, etc.).

Market dynamics force Nvidia and ATI to try to outdo each other; it encourages the exact opposite of the lazy behaviour you are alluding to. If there were only one player in the market, there would be no pressure to improve, and we'd get what they had to offer at the price they set. Instead, we get something beyond what the competition has to offer, at a price just above what they are offering it at, which forces their price down and forces them to develop further.

The issue with window moving is probably down to the bloatware that is XP, and Vista will expand on it further. My little 2 MB S3 ViRGE coped with Windows at 1280x1024 on '95 quite happily.

You have fallen into the trap of not understanding efficiency and thinking that high clocks are better. A higher clock speed does not mean your processor is faster; in fact, doing the same work at a lower clock is a sign of high efficiency. Efficiency is work out / work in: if a GPU takes 1 second to render 50 frames at 100 watts, and a CPU takes 10 seconds to render the same 50 frames at 100 watts, the GPU is 10x more efficient. Remember the old software-only rendering modes for games in the GF2 era? With the CPU doing the GPU's job, the results were many frames slower. I think one of the 3DMark tests used to compare GPU rendering vs CPU rendering; you'll see the frame rate and image quality difference, and you're not even comparing apples with apples, because the GPU image is much, much better.
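That frames-and-watts example works out like this (the numbers are the illustrative ones above, not measurements):

```python
def frames_per_joule(frames, seconds, watts):
    """Work out per energy in: frames per joule (joules = watts * seconds)."""
    return frames / (seconds * watts)

gpu = frames_per_joule(frames=50, seconds=1, watts=100)   # 0.5 frames/J
cpu = frames_per_joule(frames=50, seconds=10, watts=100)  # 0.05 frames/J

print(round(gpu / cpu))  # 10 -> the GPU is 10x as efficient at this task
```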

How often do you actually NEED the 2-4 GPUs that you mention?
September 18, 2006 5:40:13 PM

I was just about to edit mine and talk about Windows :lol: 
September 18, 2006 5:44:14 PM

No problem. I'm getting very tempted to tell him to go back to school and accept that his friend has a bigger card than he does; it'll prepare him for real life and the car/wife/house/job/anatomical envy he'll have for the rest of it.
September 18, 2006 5:49:13 PM

Quote:
I don't get high-powered GPUs just to not use them.

My point in this topic isn't that I don't like GPUs; it's that GPU makers are lazy and don't try hard enough.

They could do a lot better.

But I seriously doubt they want to. They know they are our only option.


:roll: What is your point? Oh, you don't have one. You and this thread make no sense.
September 18, 2006 5:52:50 PM

Quote:
You appear to be talking to yourself right now, so I'll help you out a bit.

If you don't like GPUs, don't have one....

How often do you actually NEED the 2-4 GPUs you mention?


Word. :) 
September 18, 2006 6:00:45 PM

Quote:
With AMD for many years and now Conroe, CPUs are quiet, small, efficient, don't use a lot of power, don't get hot, and are very powerful.

GPUs, on the other hand, keep getting bigger, are loud, run hot, and are NOT efficient when you compare them to CPUs...

If you instead had a PC with a second CPU acting as the GPU, you'd have a much more efficient PC, I guarantee it. I'm not talking about a dual core or a PC with two chips; I mean a second CPU with its own memory...

My Allendale runs at 41C OVERCLOCKED. My motherboard runs at 31C. My Allendale is FASTER than my X1900 XT 512, and yet the damn thing runs at 60C. And now we hear that the new ones will be twice as hot..


I understand your rant, but I think your dismay is misdirected. GPUs and video cards are specialized hardware, and for the work they perform they are very efficient and run at reasonable temperatures.

With motherboard interconnects and memory speeds as they are today, having a socketed CPU performing GPU functions would not result in a more efficient machine. Now, a socketed GPU with GDDR4 memory and HyperTransport 3.0 might have the potential to perform as well as or better than a dedicated GPU on a video card, but that technology would have to be R&D'ed to determine whether it is a viable solution.
September 18, 2006 6:01:33 PM

Quote:
You appear to be talking to yourself right now, so I'll help you out a bit.

If you don't like GPUs, don't have one....

How often do you actually NEED the 2-4 GPUs you mention?


Word. :) 

My first - Word. (I assume it's a good thing?) :D 
September 18, 2006 6:06:31 PM

Quote:
You appear to be talking to yourself right now, so I'll help you out a bit.

If you don't like GPUs, don't have one....

How often do you actually NEED the 2-4 GPUs you mention?


Word. :) 

My first - Word. (I assume it's a good thing?) :D 

Word...I mean yes.
September 18, 2006 6:17:08 PM

You have to realize that GPUs have much more processing power than regular CPUs do. If a CPU were run as a GPU it would probably die within a week, because CPUs do not handle graphical workloads very well. It only makes sense that something with much more processing power runs that much hotter, and until the die size is shrunk yet again we are not going to see any decrease in heat output.
September 18, 2006 6:36:32 PM

Quote:
It isn't, because gaming is not what a CPU is intended for.

The GPU does all the gaming tasks; it's always better to have separate processing units doing separate things.

However, that's not to say the GPU is in any way efficient. Not only does it sometimes take 2-4 GPUs, they are always clocked lower than CPUs, run hotter, and carry more RAM than CPUs...

Yet they run things worse than a CPU does.

Yes, it's a completely unfair comparison: a CPU that is drawing 2D screens versus a GPU that works while the CPU helps it. But all a CPU requires is a decent amount of cooling on top of it. A GPU must be boosted with extensive cooling, faster RAM, 2 or more GPUs, and then what?

One other thing about the 2D screens: it takes a pretty decent GPU to actually make your Windows screens move fast, while there's never a problem with the limited things a CPU does. Considering that XP doesn't take up many resources, why the hell would it take a good amount of power just to get a window to open quickly...?


First, you cannot compare the CPU and GPU industries, as they are entirely different things. CPU makers (Intel, AMD) have time for research and development on a new core architecture, because a CPU is not required to change rapidly all the time, while GPU makers like Nvidia and ATI focus on an entirely different market (gaming), which demands new products to meet game requirements and give a better gaming experience. In the CPU market, CPUs have basically already exceeded everything you really need, but that's not the case in the GPU market, where game developers keep creating new games that can bring your GPU to a crawl (Oblivion, and soon Crysis). GPU makers don't have enough time to build a better architecture at that pace.

You can argue that they don't have to launch new high-end cards so often, but here's the thing: if they only made mid-range cards, then sure, you wouldn't complain about the heat, the noise, and the power consumption, but you also wouldn't have games like Crysis and Oblivion to play, because the hardware couldn't handle them.

The average product cycle of a new high-end card is only about 6 months. If you think it is the manufacturers' fault and a dubious scheme to get consumers' money, then you haven't done any research at all. Nvidia and ATI only make about $$ million in profit, while AMD and Intel make hundreds of millions. It is still very costly to produce a video card even if you just make crappy cards, since a video card requires a processor, a board, and memory (like a computer in itself). GPU makers focus on a much smaller market (not every computer needs a discrete video card, but every computer must have a CPU), yet they are forced to shrink their product cycle due to market demand. The GPU industry is much harder than you think it is. Remember back in 1998 when Intel tried to jump into the graphics market: they failed terribly and had to retreat to integrated graphics chips. Now I ask you, do you want an integrated graphics chip?

With all of the above said, GPU makers (Nvidia and ATI) focus on a market that requires incredibly rapid change, but making a new core architecture is very hard and time consuming. That is why they have to crank up the power usage to make their products more powerful than their competitors', and that is where your consumer complaints about noise and power consumption come from. Please remember, nobody forces you to buy a new card to play Crysis or other new games; it is you who always wants more in a short amount of time. If no one wanted to play Crysis or Oblivion at the highest possible settings, then ATI and Nvidia would not have to make a new card every 6 months, so stop complaining.
September 18, 2006 6:55:50 PM

I don't even know if all this 'GPUs are actually better than CPUs' talk is necessary. I bet if you put a GPU cooler on your CPU, it'd be just as hot.

Same with a CPU cooler on a GPU.

I don't think any of us like how hot our graphics cards get, but your argument is based on nothing at best.
September 18, 2006 6:56:26 PM

Quote:

GPUs, on the other hand, keep getting bigger, are loud, run hot, and are NOT efficient when you compare them to CPUs.....


Gee, my X1900 XTX is so quiet that I can't hear it, and so cool that it rarely hits 50C. Oh, it's watercooled, and that cost an extra $100. Must be a dumb company that makes it. Oh yes, it's also quieter than the CPU and fan that I have. So I don't think it's the company's fault if your GPU is so loud, hot, and inefficient. It's probably your fault for not buying a quiet, cool, and efficient GPU in the first place.
September 18, 2006 8:05:50 PM

Quote:
It's not really about them being hot and noisy. It's that they are all of those things at once, while CPUs are much, much more efficient.


CPUs are more efficient in general. But just once, try running the FEAR (or any modern game) graphics on a 2S Woodcrest at 3.0 GHz with a 1333 MHz FSB (one CPU for AI and physics and the other for graphics) and you'll get frame rates worse than even an FX 5500.

This is because a CPU is a general-purpose circuit, as opposed to a VPU, which is built with the sole purpose of "drawing" graphics. The difference will blur somewhat with the DX10 VPUs coming out shortly, but it will remain. Any chip built with one thing in mind has no flexibility whatsoever, but it is damn fast at what it is designed for.

I personally think that with upcoming CPU architectures we'll see more modular designs, with the core of the CPU telling specialized parts what to do. We already have this with the new PPU (physics processing unit), hardware video decoding on VPUs, and even specialized AI units in the not-so-distant future. These are all things that used to be done on the CPU, but its limitations pushed some hardware companies to develop specialized hardware for them.

Now, imagine a CPU line (maybe the upcoming Nehalem with its modular design) with embedded hardware decoding of H.264 and other codecs. What about a Core2QuatroMultimedia or a Core2QuatroGamers? A new version of DirectX comes out? Bring out a new Core2QuatroGamers with an updated VPU core. Don't want to buy a new CPU, but the VPU embedded on the core is getting too weak? Disable that part of the CPU and buy a separate VPU. That would be hell for Intel and AMD, with 2-3 times the SKUs to keep in stock, but I think it is coming in some form.

But in the end, any specialized chip will beat even a Core 2 Duo X6800 at what it is designed for; just don't try anything else with it.

I agree that both ATI and Nvidia have lessons to take from AMD and Intel on that front. Maybe some sort of disabling of unused units, like the Core 2 Duo does; that would help. Most people don't watch a hi-def movie while playing HL2: Episode One, so it would help to disable the video decoding part to save at least some watts. A game uses no PS3.0 code? Disable that part of the pipeline. You see my point. I think that's where ATI (with AMD's help) and Nvidia will be going very soon.
September 18, 2006 9:25:09 PM

At least edit your name. You have it spelled wrong. :lol:  :wink:
September 19, 2006 12:22:30 AM

Who cares? If you want to play the newest games and you want them on max, then you're going to have to make some sacrifices, and I'm not even talking about the prices lol... As for temps, my GF 6600 with the Silencer averages 48C and maxes at 54C when gaming :) 
GOD I do love Asus :) 