
+++ Is it possible for nVIDIA & ATI to...

December 6, 2006 1:37:07 PM

OK, I'm not the technical kind of person.

Is it possible for nVIDIA & ATI to actually reduce the power consumption of these new DX10 cards? With over 600 million transistors they consume about 145 watts each. Does this mean the newer, better upcoming cards, say an 8900GTX, will consume 180W and continue to go up and up?

My question is: is it possible at all for nVIDIA and ATI to actually do something about this problem, or is it impossible based on the rule of 'more transistors = more power'?

This is bad for us consumers - our electric bills, and worse, the rapid depletion of limited fossil fuels (lol, OK, I'm getting too far off).


December 6, 2006 2:15:22 PM

They do. Radeons and Geforces will automatically clock-throttle back when they aren't under load. If you're not playing a game they use much less power.

As far as gaming, if you want more power efficiency, you need a slower card. You have to pay to play I'm afraid.
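
Roughly why that helps: dynamic switching power goes as capacitance x voltage^2 x clock frequency, so dropping both clocks and voltage at idle adds up fast. A quick Python sketch with made-up numbers (the capacitance value and the 2D/3D clocks and voltages below are hypothetical, just picked so the loaded case lands near the ~145W mentioned above):

Code:
# Rough sketch of why idle downclocking saves power: dynamic switching power
# is approximately C * V^2 * f. Every number here is hypothetical.

def dynamic_power(capacitance_f, voltage_v, freq_mhz):
    """Approximate dynamic switching power in watts."""
    return capacitance_f * voltage_v ** 2 * freq_mhz * 1e6

C = 1.5e-7  # effective switched capacitance in farads (made-up value)

load = dynamic_power(C, 1.30, 575)  # hypothetical 3D clocks/voltage under load
idle = dynamic_power(C, 1.10, 275)  # hypothetical throttled 2D clocks/voltage

print(f"under load: {load:.0f} W, idle: {idle:.0f} W "
      f"(about {idle / load:.0%} of the loaded figure)")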
December 6, 2006 2:16:18 PM

Smaller process technology reduces power consumption. The same chip built at 65nm instead of 90nm takes a lot less power. That's one reason why Core 2 uses less power than the Pentium 4.

You'll see power consumption go down if/when they move to smaller tech. However, they'll probably use it right back up again with more transistors.
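
To put some rough numbers on that (classic scaling rules of thumb, not vendor data; the transistor counts, capacitance, voltages and clocks below are hypothetical):

Code:
# Back-of-the-envelope die-shrink math. A 90nm -> 65nm shrink cuts switched
# capacitance and lets voltage drop a little, but the savings usually get
# spent on more transistors and higher clocks.

def power(transistors, cap_per_transistor_f, voltage_v, freq_mhz):
    return transistors * cap_per_transistor_f * voltage_v ** 2 * freq_mhz * 1e6

p90        = power(681e6,  2.1e-16,       1.30, 575)  # hypothetical 90nm part
p65_shrink = power(681e6,  2.1e-16 * 0.7, 1.20, 575)  # same design at 65nm
p65_bigger = power(1000e6, 2.1e-16 * 0.7, 1.20, 700)  # 65nm, more transistors + clock

print(f"90nm: {p90:.0f} W, straight 65nm shrink: {p65_shrink:.0f} W, "
      f"65nm with more transistors/clock: {p65_bigger:.0f} W")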
December 6, 2006 2:18:29 PM

Quote:
OK, I'm not the technical kind of person.

Is it possible for nVIDIA & ATI to actually reduce the power consumption of these new DX10 cards? With over 600 million transistors they consume about 145 watts each. Does this mean the newer, better upcoming cards, say an 8900GTX, will consume 180W and continue to go up and up?

My question is: is it possible at all for nVIDIA and ATI to actually do something about this problem, or is it impossible based on the rule of 'more transistors = more power'?

This is bad for us consumers - our electric bills, and worse, the rapid depletion of limited fossil fuels (lol, OK, I'm getting too far off).


Die shrinks, but don't expect power to drop dramatically: clock speeds go up with each die shrink, so the usual power benefit is largely offset.
December 6, 2006 2:49:08 PM

I think that's a pretty valid point. I hope GPU manufacturers think about this before we need 2kW PSUs :D.

It seems like people are starting to look more at performance-per-watt measurements (Intel's Core 2 Duo seems like proof of that), so maybe nVidia/ATI will come around and do a more efficient redesign.
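
For what it's worth, the metric itself is just performance divided by watts; a toy comparison with made-up numbers:

Code:
# Toy performance-per-watt comparison. The fps and wattage figures are
# invented purely to illustrate the metric, not benchmark results.

cards = {
    "fast but hungry card": {"fps": 100, "watts": 145},
    "slower, leaner card":  {"fps": 70,  "watts": 75},
}

for name, c in cards.items():
    print(f"{name}: {c['fps'] / c['watts']:.2f} fps per watt")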
December 6, 2006 3:15:27 PM

Die shrinks, and maybe with the R700, when AMD decides to use SOI for Radeons... by then maybe at 45nm.
Anonymous
December 6, 2006 3:58:25 PM

There was an article before the 8800 came out saying these cards would be the 'lowest' performance-per-watt we should see. I don't know how accurate that is, but I expect GPU makers to go the way CPU makers went (a few years ago with the K8, and recently with the C2D).

Modular designs should also help, offering different performance levels depending on the heat/power budget you want.

I personally think there will always be a market for people who don't mind a 2kW PSU and a triple-slot cooler as long as they get a performance boost out of it. Most of the time, the last 5-10% of performance is pretty costly in terms of heat.
December 6, 2006 4:50:16 PM

Quote:
They do. Radeons and Geforces will automatically clock-throttle back when they aren't under load. If you're not playing a game they use much less power.


I wonder if the new Vista graphics stuff will cause the GPU to stay in 3D mode all the time now and draw a lot of power even when you're not gaming.
December 6, 2006 5:34:15 PM

Maybe in the next die shrink
December 6, 2006 6:14:25 PM

In my opinion, graphics companies are about 12-24 months behind the CPU companies, at least as far as fab process goes. So I'd say they are just about now working on more efficient VPUs for the same performance, just like AMD is addressing the problem right now and Intel did 6 months ago.

Now, about the famous electrical bill. If such a VPU consumes 150W as opposed to 50W for your old one, that's 100W more. If your computer is used for gaming 3 hours per day on average, that makes 300 Wh (watt-hours) of extra energy per day. If you replace just one 100W incandescent light bulb with a low-energy 23W one of the same light output, you save 77W for every hour it's on. So if that bulb is used about 3.9 hours a day, OK maybe 5 hours accounting for PSU inefficiency, you've gained back that extra consumption. And since such a bulb costs about three times the price but lasts about five times longer, there is no extra cost in doing so.
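
A quick sanity check of that math (the wattages and hours are the figures from the paragraph above; the 80% PSU efficiency is just an assumption):

Code:
# Sanity check of the "new GPU vs. one swapped light bulb" arithmetic above.
# All inputs are the hypothetical figures from the post, not measurements.

gpu_extra_w  = 150 - 50   # new card draws 100W more than the old one
gaming_hours = 3          # hours of gaming per day
extra_wh_day = gpu_extra_w * gaming_hours        # 300 Wh per day

bulb_saving_w   = 100 - 23                       # incandescent swapped for 23W
hours_to_offset = extra_wh_day / bulb_saving_w   # ~3.9 hours of bulb use

# Being pessimistic about PSU efficiency (say 80%), the wall draw is higher:
extra_wh_at_wall     = extra_wh_day / 0.80
hours_to_offset_wall = extra_wh_at_wall / bulb_saving_w   # ~4.9 hours

print(f"extra energy: {extra_wh_day} Wh/day ({extra_wh_at_wall:.0f} Wh at the wall)")
print(f"bulb hours needed to offset: {hours_to_offset:.1f} "
      f"({hours_to_offset_wall:.1f} counting PSU losses)")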

Hey, I just solved the problem of inflating electrical bills for all of you guys. :lol: No need for any thanks, by the way... :wink: I only want to show that the difference is not as huge as we think it is.

The real problem, I think, is that some people will need to upgrade their PSU, and that's a pain.

Like I said, it was my 2 cents of the day. Let me know if you think I'm wrong.
December 6, 2006 7:17:05 PM

In my opinion, the GPU folks are pulling a similar stunt right now to what the CPU folks did years ago. In the CPU world they were just throwing MHz at the speed problem, and now the GPU folks are just throwing transistors at the speed problem. It cannot go on like this indefinitely; they WILL have to throttle their energy usage back, simply because most households aren't wired to support that kind of power draw on a single circuit. Having to isolate their electronic components on separate circuits won't make many people happy.

Someone will come up with a better way to do the GPU and everyone else will have to play catch-up. Right now, both ATI and nVidia are just trying to milk what they can out of their current technology and delay the inevitable redesign costs. Let's hope they don't pull an Intel and hold on to their "NetBurst technology" too long.

EDIT: And if they don't watch it, someone might just start adding GPU technology as extra "cores" on their processors. It could happen.
December 6, 2006 10:26:43 PM

I totally agree with you. That's exactly how I opened my earlier post.

What I said is that the problem is not that bad yet! Consider that a high-end PC with (extreme case) an A64 FX-74 and dual GeForce 8800GTX would consume at worst 700 watts, and that's the absolute maximum. In less than 12 months from now AMD will have a much better-performing solution that consumes fewer watts.

Anyway, those 700 watts translate to at most 1000 watts at the power outlet. Actually, since PSU efficiency is in the 80-90% range and not 70%, the real number is probably closer to 850 watts. So 1000 watts on 120 volts translates to 8.4 amps (or about 7 amps at the better efficiency). If most houses are like the ones here in Quebec, they have at least 15-amp or even 20-amp breakers. That leaves from 6.6 to 11.6 amps free (8 to 13 amps at the better efficiency). Unless there's a microwave or a toaster on the same circuit, nothing is out of control yet.
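
The same breaker arithmetic, spelled out (the 700W system draw, the PSU efficiency range and the 120V / 15-20A breakers are the assumptions from this post):

Code:
# Spelling out the breaker math above. The 700W system draw, the PSU
# efficiency range and 120V / 15-20A breakers are assumptions from the post.

system_watts = 700
voltage      = 120

for efficiency in (0.70, 0.85):               # pessimistic vs. realistic PSU
    wall_watts = system_watts / efficiency    # power drawn from the outlet
    amps       = wall_watts / voltage
    for breaker in (15, 20):
        headroom = breaker - amps
        print(f"eff {efficiency:.0%}: {wall_watts:.0f} W at the wall, "
              f"{amps:.1f} A, {headroom:.1f} A left on a {breaker} A breaker")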

But it's absolutely true that these companies will have to do something about it, if they're not already on the case. AMD might be ATI's savior in this area, and Intel might use its excellent fab process to join the fight with a high-end VPU that doesn't consume too much. Who knows. :wink:
December 6, 2006 11:55:13 PM

Gone are the days when a PC consumed no more than 100W... now it's 10x that, lol...

Anyway, nVIDIA should merge with Intel and share/steal their chip-making tech...
December 7, 2006 12:15:03 AM

Quote:
Gone are the days when a PC consumed no more than 100W... now it's 10x that, lol...

Anyway, nVIDIA should merge with Intel and share/steal their chip-making tech...


Nah, man, nVIDIA should merge with Intel, who should merge with ATI/AMD, who should then proceed to merge with Google, and then with Mittal Steel, and Halliburton, and then they should merge with Rosie O'Donnell, and then Playboy, and then they would rule the world. Not to mention the fact that video cards would then run off of carebear love rather than electricity.

Uh-huh. That's what should happen.

Why are there three "+"s in the thread name?
December 7, 2006 12:59:04 AM

A few days ago on THG there was an article on a new design for transistors. Reports say it could halve power consumption, but it won't be ready for production for probably 10 years or so. So over the long run, power consumption looks to stay under control; however, as with anything, the balance WILL vary with time. P4s drank the juice, while C2Ds don't.

The 8800s also drink the juice, but the next-gen cards probably won't.

It will, as always, be a little bit of everything that helps. More efficient PSUs can also help out.

As for 2kW PSUs... I don't really see it happening. I think there will be substantial reductions to prevent that. Also, with the way regulations are going, I wouldn't be surprised if legislation is passed preventing any single household device from consuming more than a certain amount of power, or something similar.

In short: yes, ATI and Nvidia WILL produce more power-efficient cards. Hopefully they'll be available 18-24 months from now.
December 7, 2006 2:02:11 AM

I believe that eventually NVIDIA and ATI will both come up with a power-saving feature at the very least. Or, better yet, hopefully they'll come up with a high-performance card that doesn't use so much power. Anyway, I think the best answer is to ask both companies that question and see what answer you get. Good luck.

Dahak

primary gaming rig
AMD X2-4400+@2.4 S-939
EVGA NF4 SLI MB
2X EVGA 7800GT CO IN SLI
2X1GIG DDR IN DC MODE
WD300GIG HD
EXTREME 19IN.MONITOR 1280X1024
ACE 520WATT PSU
COOLERMASTER MINI R120

secondary gaming rig
GIGABYTE MB AGP
AMD X2 3800+ S-939
2X512 DDR IN DC MODE
X1650PRO 512 AGP
17IN.MONITOR
MAXTOR 120GIG HD
450WATT PSU
December 7, 2006 2:30:03 AM

Quote:
In my opinion, the GPU folks are pulling a similar stunt right now to what the CPU folks did years ago. In the CPU world they were just throwing MHz at the speed problem, and now the GPU folks are just throwing transistors at the speed problem. It cannot go on like this indefinitely; they WILL have to throttle their energy usage back, simply because most households aren't wired to support that kind of power draw on a single circuit. Having to isolate their electronic components on separate circuits won't make many people happy.

Someone will come up with a better way to do the GPU and everyone else will have to play catch-up. Right now, both ATI and nVidia are just trying to milk what they can out of their current technology and delay the inevitable redesign costs. Let's hope they don't pull an Intel and hold on to their "NetBurst technology" too long.

EDIT: And if they don't watch it, someone might just start adding GPU technology as extra "cores" on their processors. It could happen.


It's true that video chip makers are sort of following the "MHz" analogy you made. However, CPU makers never managed to make each new generation of CPUs twice as fast through MHz alone, the way GPU makers are doing now with more shader units.

What I mean is, GPUs keep advancing significantly with each generation... CPUs didn't. Each Pentium 4 generation was barely faster than the previous one, and the Athlon didn't do much for years until the A64.

And someone said it earlier... when GPUs move to 65nm or 70nm, things will be a lot better. Right now chip makers are at their limit with 90nm, so they have to move to 65nm (or thereabouts) very soon. I'd venture to say we won't see power requirements like the 8800GTX's again for years to come. The R600 might be as bad... but again, that'll be the end of it for a while. The 8800 series is sort of the straw ON the camel's back.
December 7, 2006 2:58:51 AM

Actually, CPUs are about two years or more behind in terms of performance; the high-end GPUs are crazy fast. In terms of process size, you are correct.

The power consumption problem does need to be solved, though!