Radeon HD5870 Underclocking

January 11, 2010 7:16:51 PM

I have HD 5870s in CrossFire and I was wondering if lowering the voltage will decrease my temps. I don't want to lower my core clock or memory because I don't want to lose performance, just the voltage. The temp on card one gets up to 87C with the fan speed up to 40%, because I have hardly any room between the two.

So will lowering the voltage decrease my temps? And will my performance stay the same?
January 11, 2010 7:47:38 PM

I found some good results with overclocking the core and underclocking the memory. I gained performance and lost heat.
January 11, 2010 9:24:04 PM

Well, what would you set the core and memory clocks to in order to gain performance and lose heat in CrossFire?
January 11, 2010 9:59:08 PM

I decided to run a little test on 3DMark Vantage to see how my configuration compares against the default settings.

In order to avoid giving my setup an advantage, I ran the default settings first, so the GPU was likely a little warm before running my current settings.

Default settings of 850/1200 reached a max of 73C by the end of 3DMark Vantage, starting from a temp of 34C. The GPU score was 17351.

I changed to my current settings of 900/1150 and started at a temp of 36C, because it was taking too long to drop back to 34C. It reached a max of 71C at the end of 3DMark Vantage. The GPU score was 17953.

Recap: stock settings scored 17351 with a max temp of 73C.
900/1150 settings scored 17953 with a max temp of 71C.

You could try lowering the core speed a little to find lower temps while still performing close to stock settings.
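
(For scale: 17953 / 17351 is about 1.035, so that is roughly a 3.5% higher GPU score along with a 2C lower peak temp, from +50 MHz core and -50 MHz memory.)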
January 11, 2010 10:02:12 PM

mstang783 said:
I have HD 5870s in CrossFire and I was wondering if lowering the voltage will decrease my temps. I don't want to lower my core clock or memory because I don't want to lose performance, just the voltage. The temp on card one gets up to 87C with the fan speed up to 40%, because I have hardly any room between the two.

So will lowering the voltage decrease my temps? And will my performance stay the same?

If you lower the voltage without reducing clocks, you will generate more heat! The reason is that you will still need the same power (watts = amps x volts), so if the volts go down, the amps go up, which means more heat!
January 11, 2010 10:12:13 PM

rolli59 said:
If you lower the voltage without reducing clocks, you will generate more heat! The reason is that you will still need the same power (watts = amps x volts), so if the volts go down, the amps go up, which means more heat!


You're assuming you actually need the same amount of power. That's not the case. Electrical circuits usually have some leeway in how much power they actually need in order to function.

Edit: Also, you should say that power = volts * amps, since power is not a vector, and both current and voltage can be; though it's strange for current to be one in the cases where it is. Just being precise, I guess.
January 11, 2010 10:21:53 PM

frozenlead said:
You're assuming you actually need the same amount of power. That's not the case. Electrical circuits usually have some leeway in how much power they actually need in order to function.

Edit: Also, you should say that power = volts * amps, since power is not a vector, and both current and voltage can be.

I am an engineer; I work with electricity all the time! We are only talking about DC electricity here, not a complicated 3-phase system. If you don't reduce the power needed by the user (the GPU) and you change the volts, there is going to be a change in amps!
January 11, 2010 10:55:52 PM

rolli59 said:
I am an engineer; I work with electricity all the time! We are only talking about DC electricity here, not a complicated 3-phase system. If you don't reduce the power needed by the user (the GPU) and you change the volts, there is going to be a change in amps!


I'm an engineer also, sir; you too should understand the need for precision in statements like that. True, the card uses the same amount of power, but there is a tolerance to this value, as there is with all working components, which you should also know. By lowering the voltage, you can attempt to exploit this.
January 11, 2010 11:14:06 PM

I was referring to the OP's question about whether lowered voltage would reduce heat at stock clocks. The simple answer is no! I just explained it a little, regardless of tolerances! But what I said is correct!
January 11, 2010 11:22:22 PM

Why would the GPU's power draw be a constant?

Reducing the voltage going to the core should LOWER the power consumption; the current will go down proportionately.

Come now, folks :p

I_D = k[(V_GS - V_tn) * V_DS - V_DS^2 / 2]

You are implying that increasing the voltage would decrease the current in a card, which is not the case; current increases with voltage, and this is why we adjust voltage when we try to overclock. If it is too low, the current will not be sufficient to drive the IC. Are you sure you are not thinking about the supply power (which no one should ever mess with) instead of the voltage programmed in the BIOS to supply through the VRMs?

Lowering the voltage will lower the power requirements and reduce heat output, OP. Mind you, you need to edit the BIOS to change the voltage, and any drop might cause instability.

Actually, to put it in perspective: the power dissipated by the IC is proportional to the square of the voltage, so raising or lowering it makes a huge impact on heat and power consumption.
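
To make the square law concrete, here is a minimal Python sketch of the standard CMOS dynamic-power relation P = C_eff * V^2 * f. The capacitance and both voltage figures are illustrative guesses, not measured HD 5870 values; only the 850 MHz stock clock comes from this thread.

def dynamic_power(c_eff, volts, freq_hz):
    # Approximate CMOS switching power in watts: P = C * V^2 * f.
    return c_eff * volts ** 2 * freq_hz

C_EFF = 1.5e-7   # hypothetical effective switched capacitance, in farads
FREQ = 850e6     # stock core clock from this thread, in Hz

stock = dynamic_power(C_EFF, 1.15, FREQ)        # 1.15 V: assumed stock-ish core voltage
undervolted = dynamic_power(C_EFF, 1.05, FREQ)  # 1.05 V: assumed undervolt

print(round(stock, 1), "W")              # ~168.6 W
print(round(undervolted, 1), "W")        # ~140.6 W
print(round(undervolted / stock, 3))     # (1.05/1.15)^2 = ~0.834, a ~17% drop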
January 11, 2010 11:32:54 PM

Power is demanded by the user, in this case the GPU. If you increase the volts, you reduce the amps. It is no different from a PSU: if it is pulling 460 watts out of the wall at 115 volts (US), it is drawing 4 amps; at 230 volts (Europe), it is only drawing 2 amps. (Ignoring all losses and tolerances.)
January 11, 2010 11:38:08 PM

We are talking about the GPU itself. The power that it needs to draw is not constant. Why on earth would it be?

The current the VRM supplies to the IC (the GPU in this case) is determined by the BIOS, and follows the equation I posted above.

If you reduce the BIOS-dictated voltage to the GPU, you reduce the power that the GPU (and the card itself) runs at. You also reduce the current, which increases the chances of instability, while at the same time reducing temperatures.
January 11, 2010 11:38:39 PM

I do know that in the OC forum this is commonly done for CPUs, although they end up raising voltage, since the higher your OC, the more voltage is needed to remain stable.

Anyway, ATI Tool can do this, and RivaTuner probably does too. The downside of both of these tools is that the idle clocks will not drop like they will using ATI Overdrive. This might end up causing more heat issues in the long run.
January 11, 2010 11:51:23 PM

daedalus685 said:

If you reduce the BIOS-dictated voltage to the GPU, you reduce the power that the GPU (and the card itself) runs at. You also reduce the current, which increases the chances of instability, while at the same time reducing temperatures.


The chances of instability are dictated by the tolerances of the particular card you have for any given load level. I'm not too good at explaining things, I guess.
January 11, 2010 11:55:42 PM

frozenlead said:
The chances of instability are dictated by the tolerances of the particular card you have for any given load level. I'm not too good at explaining things, I guess.


Come again? I'm not sure what that has to do with my post. I mentioned it to point out that lowering voltages might be worthwhile; it is by no means certain.

Also, the chances of instability are not only dictated by the initial tolerances of the IC, but by age, binning, etc. But isn't all of that blatantly obvious? Some chips may be set at 1.5 V with a tolerance of 0.5, so you are safe within that 0.5, or maybe even beyond; but some are not, and there is no way to know what the tolerances are, even if you know your failure point.
January 12, 2010 12:15:17 AM

That's what I mean: if you get a tolerant chip that can stand performing at a lowered voltage, you'll get lowered temperatures with no loss in performance (or one small enough to be considered statistical error). Isn't that what this thread's about? I'm not saying you can know the tolerances; I'm just saying you may exploit the possibility of them by lowering your voltage and testing it, the result being essentially the same performance with lower temperatures.
January 12, 2010 12:23:20 AM

frozenlead said:
That's what I mean: if you get a tolerant chip that can stand performing at a lowered voltage, you'll get lowered temperatures with no loss in performance (or one small enough to be considered statistical error). Isn't that what this thread's about? I'm not saying you can know the tolerances; I'm just saying you may exploit the possibility of them by lowering your voltage and testing it, the result being essentially the same performance with lower temperatures.


Yes, yes, but the other chap was contending that the current would increase if the voltage dropped, on the assumption that the power product is a constant; it is not.

I attempted to correct him. I'm a physicist, not too familiar with tolerances and whatnot, lol. Thus there is no need to bring them up, as the situation can be explained without them ;) . Lower voltage results in lower power and heat (though a lot of the heat comes from the waste power in the clock cycling and is thus still there at any voltage with a constant clock rate). I mention the possible crashing and whatnot because it is a side effect and possibility of reducing the drive voltage. While the tolerances in the design allow you to undervolt and still get a working GPU, I don't often think of it that way. At any rate, that was not my point of contention at all. It was the statement about increasing current that I wished to correct; you are entirely correct with the talk of tolerances.
January 12, 2010 12:26:36 AM

daedalus685 said:
Yes, yes, but the other chap was contending that the current would increase if the voltage dropped, on the assumption that the power product is a constant; it is not.

I attempted to correct him. I'm a physicist, not too familiar with tolerances and whatnot, lol. Thus there is no need to bring them up, as the situation can be explained without them ;) . Lower voltage results in lower power and heat (though a lot of the heat comes from the waste power in the clock cycling and is thus still there at any voltage with a constant clock rate). I mention the possible crashing and whatnot because it is a side effect and possibility of reducing the drive voltage. While the tolerances in the design allow you to undervolt and still get a working GPU, I don't often think of it that way. At any rate, that was not my point of contention at all. It was the statement about increasing current that I wished to correct; you are entirely correct with the talk of tolerances.

Here is the other chap. I presume that you recognize Ohm's law, as a physicist!
January 12, 2010 12:29:39 AM

Sigh... yes, I do. You do not understand how a GPU works, though.

I already linked the equation governing the current in an IC (since it is not ohmic, it is not Ohm's law, by the way). What more do you want?

Beyond that: V = IR. If V goes down and R is a fundamental property of the circuit, you expect I to increase?? What was that about Ohm's law?

I assume you are trying to refer to Joule's law, W = IV, but you are mistakenly assuming W is a constant, and forgetting we are dealing with semiconductors.
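
Since the two models give opposite predictions, here is a quick numeric contrast in Python; the resistance, power, and voltage values are arbitrary illustrations, not card specs.

V_OLD, V_NEW = 1.15, 1.05   # hypothetical core voltages, in volts

# Fixed-resistance model (Ohm's law, V = I*R): current FALLS when voltage drops.
R = 0.05                    # arbitrary illustrative resistance, in ohms
print(V_OLD / R, "A ->", V_NEW / R, "A")    # 23.0 A -> 21.0 A

# Fixed-power model (W = I*V, with W held constant): current RISES when voltage drops.
W = 170.0                   # arbitrary illustrative power draw, in watts
print(W / V_OLD, "A ->", W / V_NEW, "A")    # ~147.8 A -> ~161.9 A

# A real CMOS core follows neither exactly: its power draw itself falls roughly
# with V^2, so voltage, current, and heat all drop together.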
January 12, 2010 12:34:19 AM

rolli59 said:
Here is the other chap. I presume that you recognize Ohm's law, as a physicist!


Ohm's law only holds for Ohmic materials...

daedalus is correct, however; your issue is your assumption that the power draw of the card must remain the same. It doesn't.

Edit: Yes... on second thought, wasn't this about the power equation?
January 12, 2010 12:40:35 AM

daedalus685 said:
Sigh... yes, I do. You do not understand how a GPU works, though.

I already linked the equation governing the current in an IC. What more do you want?

Beyond that: V = IR. If V goes down and R is a fundamental property of the circuit, you expect I to increase??

I know that I simplified the circuitry. I know as well that the GPU is not under constant load, but it will need a minimum wattage to function, and that is always a product of volts and amps.
January 12, 2010 12:45:05 AM

rolli59 said:
I know that I simplified the circuitry. I know as well that the GPU is not under constant load, but it will need a minimum wattage to function, and that is always a product of volts and amps.


Graaaaa

Load has nothing to do with it; the peak power consumed by the GPU is not a constant. It depends on the square of the voltage (which is proportional to the current, as shown earlier), the capacitance, and the frequency. There is a floor below which the IC will not work; this is because the switching time becomes longer than the clock period. It has nothing to do with minimum power; it has to do with lost bits due to switching failure (errors).

The total power consumed by the card is indeed volts times amps at the power plug. But if you reduce the voltage to the GPU core, the power drawn goes DOWN, and thus the current at the line drops; the voltage coming into the card is always 12 V.
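
To put rough numbers on that (illustrative only, not measured values): if undervolting cut the core's draw from about 170 W to about 140 W, the current through the card's 12 V input would fall from roughly 14 A to roughly 12 A, while the 12 V rail voltage itself never moves.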