Solved

Confused: Need help clarifying power management

April 26, 2014 3:41:16 AM

So, hi again everyone.

I finally got my power meter. All these numbers are based on the current rig listed in my profile.

Running at idle with HDMI plugged in, the meter shows an average of 120W for the PC and 80W for my LCD TV.

I am running my Phenom II X6 with the CPU clock restricted to the 800-1600 MHz range. Changing that to the full range adds about 10-15W at idle, but it's not really needed unless I actually have to use the full CPU. HWMonitor indicates idle usage of 67W, and with this config it sometimes jumps to 97W; no problem there.

During normal usage (streaming, etc.) the power draw shifts by about +20W, nothing big.

But here is the issue I'm really confused about: I have an HD 6790, which everywhere is listed with a max TDP of 150W. When I run 3DMark or play games I see a jump of about +90W on the meter, which would put the card's idle and normal usage at about 60W constantly.

So my system is drawing an average of 120-140W in normal usage and about 230W under load. I am not exactly a maths expert or a top-tier PC power-analysis guy. Based on this info, is my PC's power consumption normal if my CPU is taking 67W and my GPU is taking 40-60W at idle and light usage?

If I changed my GPU to an HD 7770, what would change at idle, in normal use, and under stress?
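As a rough sanity check on these readings, here is a small back-of-envelope sketch in Python. The PSU efficiency, the GPU figures and the "everything else" figure are illustrative assumptions, not values from this thread, so treat the output as a ballpark only:

# Back-of-envelope check of the wall readings above. PSU efficiency, the GPU
# figures and the platform figure are assumptions, not measured values.

PSU_EFFICIENCY = 0.85             # assumed mid-range (roughly Bronze) efficiency
CPU_IDLE_W, CPU_LOAD_W = 67, 97   # HWMonitor readings quoted in the post
GPU_IDLE_W, GPU_LOAD_W = 20, 95   # assumed typical HD 6790 idle / gaming draw
PLATFORM_W = 25                   # assumed motherboard, RAM, drives and fans

def wall_draw(*components_w: float) -> float:
    """Convert the DC draw of the listed components into AC draw at the meter."""
    return sum(components_w) / PSU_EFFICIENCY

print(f"estimated idle/normal at the wall: {wall_draw(CPU_IDLE_W, GPU_IDLE_W, PLATFORM_W):.0f} W")
print(f"estimated gaming load at the wall: {wall_draw(CPU_LOAD_W, GPU_LOAD_W, PLATFORM_W):.0f} W")
# Roughly 132 W and 255 W with these guesses, so meter readings of 120-140 W
# and ~230 W look like the expected ballpark rather than a sign of a fault.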



i7Baby
April 26, 2014 3:49:32 AM

Those figures don't seem out of the ordinary. The TDP is the Thermal Design Power - it covers the card at full load, including overclocking.

See http://www.hwcompare.com/11892/radeon-hd-6790-vs-radeon...

The HD 7770 has a TDP of only 80W, so idle would hardly change, gaming (normal use) would be about 20W less, and stress testing would be about 50W less.
April 26, 2014 9:00:54 AM

OK, so now it seems I've discovered an issue that's keeping my GPU constantly in its multi-monitor output state, eating 30W more than normal permanently.

Correct me if I'm mistaken. Either way, I have no clue how to fix it. My display is on HDMI.

Honestly, it does feel like my GPU is drawing more than intended. The idle temperature doesn't feel right sitting at 46°C, and this constant 40W power draw has me totally confused. I'm sure I have set everything up correctly; even display detection shows only one display.

Help from graphics card experts appreciated. Thanks.
April 26, 2014 3:51:26 AM

i7Baby said:
Those figures don't seem out of the ordinary. The TDP is the Thermal Design Power - it covers the card at full load, including overclocking.

See http://www.hwcompare.com/11892/radeon-hd-6790-vs-radeon...

The HD 7770 has a TDP of only 80W, so idle would hardly change, gaming (normal use) would be about 20W less, and stress testing would be about 50W less.


So you are telling me that an HD 7770 would idle and work at the same power level as the HD 6790? That doesn't sound quite right.
Vic 40
April 26, 2014 9:39:35 AM

Here they measured your GPU at a maximum of 125 watts of usage:
http://www.techpowerup.com/reviews/AMD/HD_6790/20.html

There's also a difference between what a part uses and what is measured at the wall socket, since PSU efficiency needs to be applied too. If your card used 150 watts, the figure at the wall socket would be higher because of this.
The efficiency rating of your PSU needs to be applied in reverse: if the card uses 150 watts and the PSU is, let's say, Bronze rated and actually runs at 85% efficiency, that would be 150 / 0.85 ≈ 176.5 watts at the wall socket. Maybe the math isn't entirely exact, but you get the point.
What I mean to say is that you can't just mix TDP and usage at the wall socket in the same calculation.
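To put that rule into a form you can plug your own numbers into, here is the same conversion as a tiny Python sketch. The 85% efficiency is just the assumed figure from the post above; real efficiency varies with load and PSU model:

def at_the_wall(dc_watts: float, efficiency: float = 0.85) -> float:
    """AC power the meter would show for a given DC load inside the case."""
    return dc_watts / efficiency

print(at_the_wall(150))  # ~176.5 W at the socket for a card drawing 150 W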
i7Baby
April 26, 2014 5:55:18 AM

No - I said the same at idle but LESS at normal use and stress testing.
April 26, 2014 10:28:41 AM

Vic 40 said:
Here they measured your GPU at a maximum of 125 watts of usage:
http://www.techpowerup.com/reviews/AMD/HD_6790/20.html

There's also a difference between what a part uses and what is measured at the wall socket, since PSU efficiency needs to be applied too. If your card used 150 watts, the figure at the wall socket would be higher because of this.
The efficiency rating of your PSU needs to be applied in reverse: if the card uses 150 watts and the PSU is, let's say, Bronze rated and actually runs at 85% efficiency, that would be 150 / 0.85 ≈ 176.5 watts at the wall socket. Maybe the math isn't entirely exact, but you get the point.
What I mean to say is that you can't just mix TDP and usage at the wall socket in the same calculation.


Yes, I do get the point, but it's not about the power usage under load; it's about the idle power draw.
According to every benchmark I have seen, this card should idle under 40°C and at about 15 watts. I noticed this in the past when I was using multiple displays, and I forget whether I ever asked for help about it.
I had my main display on DVI and my sub-display on HDMI; with the HDMI disconnected, my idle temperature was around 35°C. Even with the sub-display powered off, just connecting the HDMI cable to it brought the card to the state it is in right now: idling at 45°C with an obvious additional power draw.
I am wondering if it is a hardware defect that forces multi-monitor power draw over HDMI even though only one display is connected.
April 26, 2014 6:15:02 AM

i7Baby said:
No - I said the same at idle but LESS at normal use and stress testing.


Hmm, OK, now I see. I went and checked a few things and now I understand. So basically, whatever the situation, on this tier of card the idle power draw is about the same, maybe a 5-7 watt difference; the real difference only shows up under load.

Thanks, that's one thing cleared up. But something is still confusing me; I am still digging and it is killing me.

http://www.techpowerup.com/reviews/asus/hd_7770_directc...

If I follow this chart, it looks like my card is idling in its multi-monitor state rather than its single-monitor state. Don't you think?
Because if my average idle system draw is around 120 watts and my CPU is limited to a 67-97W power draw, doesn't that mean my card is not behaving normally?
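As a quick check on that suspicion, the idle figures quoted in this thread (about 15W at single-monitor idle versus about 45W in the multi-monitor state) can be run through the same wall-socket conversion. The 85% PSU efficiency is an assumption, so this is a rough estimate rather than a measurement:

PSU_EFFICIENCY = 0.85        # assumed, as in the efficiency example earlier
SINGLE_MONITOR_IDLE_W = 15   # expected single-monitor idle quoted in this thread
MULTI_MONITOR_IDLE_W = 45    # rated multi-monitor idle quoted in this thread

extra_at_wall = (MULTI_MONITOR_IDLE_W - SINGLE_MONITOR_IDLE_W) / PSU_EFFICIENCY
print(f"extra meter draw if the card is stuck in multi-monitor idle: ~{extra_at_wall:.0f} W")
# About 35 W, which lines up with the ~30 W of unexplained idle draw described above.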
Vic 40
April 26, 2014 10:54:33 AM

I can't really say; maybe a driver update will help with this, or clocking the card down via the Catalyst Control Center (CCC), or another setting in CCC.
April 26, 2014 11:10:14 AM

Vic 40 said:
I can't really say; maybe a driver update will help with this, or clocking the card down via the Catalyst Control Center (CCC), or another setting in CCC.


I have tried everything a non-expert can think of, lol. I'm fairly knowledgeable, but I have no clue at all why a card that is supposed to idle at 15W is idling at 40W on HDMI but not on DVI or D-Sub. My guess is it's a hardware defect - nothing serious, but for people who actually look into it, like me, it is quite disturbing, lol. Only graphics card experts would be able to work that one out.
ko888
April 26, 2014 12:20:13 PM

The Radeon HD 6790 has an HDMI sound chip that would draw more power if HDMI is being used.
April 26, 2014 12:31:29 PM

ko888 said:
The Radeon HD 6790 has an HDMI sound chip that would draw more power if HDMI is being used.


Really?! Honestly, so it's standard on the hardware? That's kind of a "wow, I didn't have a clue all this time" moment. And if I changed this card to an HD 7770, would the power draw become normal on HDMI, or does that one also have a sound chip that takes up power unnecessarily?
ko888
April 26, 2014 12:45:40 PM

Lascar said:
ko888 said:
The Radeon HD 6790 has an HDMI sound chip that would draw more power if HDMI is being used.


Really?! Honestly, so it's standard on the hardware? That's kind of a "wow, I didn't have a clue all this time" moment. And if I changed this card to an HD 7770, would the power draw become normal on HDMI, or does that one also have a sound chip that takes up power unnecessarily?


The Radeon HD 7770 GHz Edition also has an HDMI sound chip. The chip is essentially standard on any current graphics card that has an HDMI port.
April 26, 2014 12:51:47 PM

ko888 said:
Lascar said:
ko888 said:
The Radeon HD 6790 has an HDMI sound chip that would draw more power if HDMI is being used.


Really?! Honestly, so it's standard on the hardware? That's kind of a "wow, I didn't have a clue all this time" moment. And if I changed this card to an HD 7770, would the power draw become normal on HDMI, or does that one also have a sound chip that takes up power unnecessarily?


The Radeon HD 7770 GHz Edition also has an HDMI sound chip. The chip is essentially standard on any current graphics card that has an HDMI port.


I meant, does it also take up 25 watts, or is it an improved chip with lower power consumption? I mean, at idle you're doing nothing - no audio or anything. How about on the R7 series? But the issue still stands: all the benchmarks are run at 1080p, which means they use HDMI, so how come they get 15W at idle and I get 40W?

Best solution

ko888
April 26, 2014 1:40:28 PM

Lascar said:
I meant, does it also take up 25 watts, or is it an improved chip with lower power consumption? I mean, at idle you're doing nothing - no audio or anything. How about on the R7 series?


25 Watts would burn out an HDMI sound chip.

Do you experience the same power difference when using the HDMI port on your motherboard compared to its DVI port when using the motherboard's integrated graphics?

The Radeon R7 2xx cards implement ZeroCore Power Technology that cuts graphics card power consumption when the OS tells the monitor to go into sleep mode.

ZeroCore Power Technology was also available in the Radeon HD 7770 GHz Edition and would cut the idle power consumption down to 3 Watts from the normal 7 Watts when ZeroCore kicked in.
i7Baby
April 26, 2014 4:08:10 PM

If you were really worried about power draw, you might consider updating to a card from the latest generation, e.g. a GTX 750 Ti - better graphics, a lot less power draw. Google it.
ko888
April 26, 2014 4:26:13 PM

Nothing comes close to the GeForce GTX 750 Ti's performance per watt.
April 27, 2014 3:01:27 AM

I am not that worried about power draw, just that it is weird.

You don't find it weird?
April 27, 2014 3:21:29 AM

ko888 said:
Lascar said:
I meant, does it also take up 25 watts, or is it an improved chip with lower power consumption? I mean, at idle you're doing nothing - no audio or anything. How about on the R7 series?


25 Watts would burn out an HDMI sound chip.

Do you experience the same power difference when using the HDMI port on your motherboard compared to its DVI port when using the motherboard's integrated graphics?

The Radeon R7 2xx cards implement ZeroCore Power Technology that cuts graphics card power consumption when the OS tells the monitor to go into sleep mode.

ZeroCore Power Technology was also available in the Radeon HD 7770 GHz Edition and would cut the idle power consumption down to 3 Watts from the normal 7 Watts when ZeroCore kicked in.


Nice explanation. From all this I am pretty sure now that my card has a hardware bug that keeps it in multi-monitor output mode on HDMI, because normally it should draw about 15 watts but it sits at about 40W all the time. The rated average for multi-monitor output is about 45 watts at idle, which would explain it.
Also, I was asking all of this because I will be changing cards, but I'm not buying a GTX 750 Ti and killing off my 650W PSU. I don't game a lot - some gaming at 720p, that's it. An R7 250X or HD 7770 it is.

Thanks for the help; that clarified a lot. It must be that my graphics card is defective somehow. It has been running for almost 3 years, so it's about time.
i7Baby
April 27, 2014 5:15:00 AM

The GTX 750 Ti only needs 20A on the 12V rail and a minimum 400W for the system, so your 650W will handle it easily.
April 27, 2014 5:27:33 AM

i7Baby said:
The GTX 750 Ti only needs 20A on the 12V rail and a minimum 400W for the system, so your 650W will handle it easily.


Sorry, I meant I am not going to waste 50 bucks and PCI-E power lines for 10 fps when I can grab an HD 7770 or R7 250X for 100 bucks and get almost the same performance.

I am gaming at 720p medium and I don't need AA, so I think it should be quite enough.
i7Baby
April 27, 2014 5:41:30 AM

OK then why all the concern about power consumption?
April 27, 2014 5:50:12 AM

i7Baby said:
OK then why all the concern about power consumption?


To find issues that may or may not affect the proper functioning of my rig. Finding out that my GPU is using more power than intended is a concern; it might die at any time, who knows.

As for the power consumption: if my rig is drawing more power than intended - like this obvious additional 30-watt draw where there shouldn't be one - there may be something wrong with my motherboard, my card, or my PSU. Don't you agree that finding the source of the issue and fixing it is a good way to mitigate the risk of future problems?