A QUESTION ABOUT POWER CONSUMPTION

HolyDoom Witch

Distinguished
Nov 24, 2008
Hi!

I have an HD 4870 (Sapphire 512 MB; I might as well add, a September 2008 model - so old :cry: ). As you know, the stock clocks are 750/900 MHz. I only recently installed this card, though (this month).

While on the desktop, the core clock comes down to 500 MHz. As soon as a 3D application is opened, the clock jumps back to the stock (or overclocked) speed. The memory clock remains the same.

Now, here is what I have done: I have UNDERclocked it. The card is set to 500 MHz core and 450 MHz memory permanently (via an applied profile).

The good thing with this card is that as soon as a 3D application launches, the memory clock apparently also adjusts back to the stock speed automatically. (Otherwise I wouldn't be able to play Terminator Salvation and TR8 at 100+ frame rates with my E2140 @ stock 1.6 GHz!)

At stock speeds, this card is supposed to draw 165 watts maximum (per a support ticket from Sapphire). The support people won't say what the minimum consumption is, but I understand that the higher the clock speeds, the higher the power consumption.

Can anyone tell me what my card's idle power consumption is in the current scenario?

Thanks

HDW
 

spinny

Distinguished
Jun 4, 2006
If you have onboard graphics, you could measure system power draw with and without the video card installed, I guess. I'm not too familiar with more advanced power measurements.
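Something like this back-of-envelope calculation, if you try it. A minimal sketch; the meter readings and the 75% efficiency figure below are made-up placeholders, not real measurements:

Code:
# Estimate a card's idle draw from two wall-meter readings:
# one with the discrete card installed, one on onboard video.
# All numbers here are placeholders, not real measurements.

PSU_EFFICIENCY = 0.75        # assumed PSU efficiency at this load

wall_with_card = 160.0       # watts at the wall, card installed, idle desktop
wall_without_card = 95.0     # watts at the wall, onboard graphics only

# The difference at the wall includes PSU conversion losses, so the
# card itself draws roughly efficiency * difference on the DC side.
card_idle_dc = PSU_EFFICIENCY * (wall_with_card - wall_without_card)

print(f"Estimated card idle draw: {card_idle_dc:.0f} W")   # ~49 W here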

I'm curious what the point of getting this information is, though. Care to share?
 

HolyDoom Witch

Distinguished
Nov 24, 2008
Maybe everyone starts underclocking their GPUs permanently for desktop use, and it becomes a factor in load shedding! :lol:

Seriously though, I thought this was a really good step towards saving a LOT of power, especially if combined with the CPU. Not that I'm unconcerned about my own electricity bill. This way we get to extend the lives of our GPU and CPU, and maybe we could put off an upgrade a couple of years longer than we do now. Lower electricity bills month after month, fewer upgrades: it's savings all the way.

What the hell does one really need? An overclocked card for a certain game? Fine then: to play such a game, just apply an overclock profile and launch the game! And after the gaming session, apply the underclock profile again!
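In script form it could be as simple as this. The profile commands here are purely hypothetical stand-ins for whatever your card's control panel or vendor tool actually exposes:

Code:
# Hypothetical game launcher: apply an overclock profile, run the game,
# then restore the underclock profile afterwards. Both "gpuprofile"
# commands are made-up placeholders for a real vendor profile tool.
import subprocess

APPLY_OC = ["gpuprofile", "apply", "gaming"]    # hypothetical command
APPLY_UC = ["gpuprofile", "apply", "desktop"]   # hypothetical command

def play(game_cmd):
    subprocess.run(APPLY_OC, check=True)        # clocks up for the session
    try:
        subprocess.run(game_cmd, check=True)    # blocks until the game exits
    finally:
        subprocess.run(APPLY_UC, check=True)    # always drop the clocks again

play(["TerminatorSalvation.exe"])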

Look at this:
http://archive.atomicmpc.com.au/forums.asp?s=2&c=7&t=9354
Read the first paragraph, and you come to know that with clocks just ~100 MHz lower, you have 36 watts less consumption.
(7900GTX = 650/1600 MHz
7900GT = 560/1500 MHz)

So I thought the idle consumption of my GPU, supposedly an odd 60 to 70 watts, should become no more than 20 to 30 watts, maybe less, since I lowered the clocks by 250/450 MHz, which is a big difference! It would mean we have quite effectively reduced the idle power consumption of a GPU using the overclocking facility we nowadays have in the cards' control panels! Though even after I underclock to such an extent, the memory still holds at 1800 MHz (there are supposed to be 4x 900 memories in a 4870, i.e. 3600 MHz). But then, my core goes down to 500 MHz (like I said, automatically, for the desktop), much less than even a 7900 series. And like the 4870, all the 7900s are also 256-bit. http://www.nvidia.com/page/geforce_7900.html

I think people, especially the enthusiasts, hate to underclock their card even for desktop use. They don't stop to think that if a computer can easily run all the 2D stuff even on onboard video (especially these days), then a dedicated GPU underclocked to its minimum clocks still has power to spare. Not using such a wonderful facility seems plain STUPID, whether or not the idea occurs to us. Don't you think so? As a matter of fact, if the idle power consumption I found a bit extravagant hadn't bothered me so much, I wouldn't have thought of this either!

Thx
 

4745454b

Titan
Moderator
Whoa, hold up boss. You seem to be getting a few things confused.

Maybe everyone starts underclocking their GPUs permanently for desktop use, and it becomes a factor in load shedding!

While Intel and AMD were on the ball a lot earlier on the CPU side than the GPU companies were, they are all doing this now. We got Cool'n'Quiet/SpeedStep first, but 2D/3D clocks are now common for video cards. This means we are already doing this.
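The 2D/3D switching boils down to a simple driver policy: watch GPU load and pick a clock profile accordingly. A toy sketch of the idea; the threshold and clock values are invented for illustration:

Code:
# Toy model of a driver's 2D/3D clock policy: pick clocks from load.
# The 20% threshold and the clock values are invented for illustration.

CLOCKS_2D = (500, 900)   # (core MHz, memory MHz) on the desktop
CLOCKS_3D = (750, 900)   # stock 3D clocks of an HD 4870

def pick_clocks(gpu_load_percent):
    """Return (core, mem) clocks for the current GPU load."""
    return CLOCKS_3D if gpu_load_percent > 20 else CLOCKS_2D

print(pick_clocks(3))    # (500, 900): idle desktop
print(pick_clocks(95))   # (750, 900): a 3D app is running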

with clocks just ~100 MHz lower, you have 36 watts less consumption.
(7900GTX = 650/1600 MHz
7900GT = 560/1500 MHz)

There's a bit more to it than that. Higher frequency does not automatically equal higher power consumption. There are two ways to increase the frequency of a chip. The first is to increase the latency, or pipeline stages, of a circuit. While this can be effective, it lowers IPC. The other is to increase the voltage. This causes an increase in heat, faster circuit death due to electromigration, and, as you've pointed out, an increase in power consumption. The 7900GTX uses more power not because it has an extra 90 MHz on the core, but because Nvidia supplies more voltage to the 7900GTX than to the 7900GT.
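To a first approximation, dynamic power goes as P ≈ C·V²·f, so voltage counts twice as hard as frequency (it enters squared). A quick illustration; the voltages are invented, since the real 7900GT/GTX core voltages aren't given here:

Code:
# First-order dynamic power model: P proportional to C * V^2 * f.
# The voltages below are invented for illustration, not real specs.

def relative_power(v, f, v0, f0):
    """Power relative to a (v0, f0) baseline, assuming equal capacitance."""
    return (v / v0) ** 2 * (f / f0)

# 7900GT as baseline at an assumed 1.2 V / 560 MHz core;
# 7900GTX at an assumed 1.4 V / 650 MHz core.
print(f"Voltage + frequency: ~{relative_power(1.4, 650, 1.2, 560):.2f}x")  # ~1.58x
print(f"Frequency alone:     ~{relative_power(1.2, 650, 1.2, 560):.2f}x")  # ~1.16x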

You can also save power by using a better PSU. A 75% efficient 500 W PSU supplying 250 W pulls 333 W from the wall, while an 85% efficient unit pulls only 294 W, an extra (333 - 294) = 39 W to deliver the same 250 W. There is a reason why people look for 80+ units.
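Spelled out:

Code:
# Wall draw for a fixed 250 W DC load at different PSU efficiencies.
def wall_draw(dc_load_w, efficiency):
    return dc_load_w / efficiency

load = 250.0
for eff in (0.75, 0.80, 0.85):
    print(f"{eff:.0%} efficient: {wall_draw(load, eff):.0f} W at the wall")
# 75%: 333 W, 80%: 312 W, 85%: 294 W -> the 85% unit saves ~39 W over 75%.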

Not using such a wonderful facility seems plain STUPID, whether or not the idea occurs to us.

Again, it has come to mind, and they are working on it. I believe it is Enermax who developed the first PSU that can run with no output load; current PSUs will burn themselves out if there is no load on them. Intel and others are developing sleep states so deep they will consume nearly no power. Trust me when I say that people are working on this.
 

HolyDoom Witch

Distinguished
Nov 24, 2008
Now this information seems useful, in that my next PSU is going to be... what is it... Enermax or the like. Though their site http://www.enermaxusa.com/psu.php does not list any such feature, current or upcoming. http://www.enermaxusa.com/news/index.php?action=7&PHPSESSID=37008022a368b898316960d74278805d

But you are right all the same, because the auto-adjusting core clock in the 4870 is proof that people in this industry are now taking power consumption seriously. But I think they have JUST STARTED to think about it. It's as if they got so blinded by making SUUUUUUUPER cards that they created resource-hogging, expensive monsters, and then suddenly woke up one day.

If there were an intelligent power supply that could, say, zero out the wattage consumption as and when all the computer's activity drops right down, that would be an elegant solution, since power consumption would then be centrally controlled for every unit in the ATX case. (If the PSU is going to have ICs in it, its price may well even double. One way or another the money goes!) Though you would agree that such a device could not cut power to individual devices unless the devices cut themselves off. (A PSU with that ability would cost more than the computer itself.)

But I still think it is a good idea to keep a graphics card completely underclocked for the desktop, as long as it can rise to stock speeds automatically.

I estimate my card's desktop power consumption right now at between 20 and 30, say 25 watts. Considering the computer is at the desktop most of the time, this is a great setting for an HD 4870.
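For what it's worth, here is the bill math under my own numbers: roughly 40 W saved at idle (say 65 W stock idle down to 25 W). The hours per day and the tariff are assumptions; adjust to taste:

Code:
# Rough electricity savings from idling at ~25 W instead of ~65 W.
# Hours per day and the price per kWh are assumptions, not measurements.

saved_watts = 65 - 25        # W saved while idle (estimates from this thread)
hours_per_day = 8.0          # assumed desktop/idle time per day
price_per_kwh = 0.10         # assumed tariff, in your local currency

kwh_per_year = saved_watts * hours_per_day * 365 / 1000
print(f"~{kwh_per_year:.0f} kWh/year saved, "
      f"~{kwh_per_year * price_per_kwh:.2f} per year at this tariff")
# ~117 kWh/year under these assumptions.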

Thx