Power consumption - old power-hungry display vs new 1080p display

Khe

I have an old Acer AL732, which is listed as having an operational power consumption of 50 W and a resolution of 1280 x 1024.

If I were to replace it with a Viseo 243D, which has a resolution of 1920 x 1080 and a listed operational power consumption of 20 W, should I then expect my computer (all parts of it in total) to use less or more power in the following scenarios:


  • browsing the web, editing documents, etc. (i.e. "ordinary" PC use)
  • gaming at the maximum resolution of each display
  • gaming at the same resolution on each display (e.g. 1280 x 1024 on the new display, if possible)

Obviously, the display itself should consume less power. But what about the graphics card and processor - will the extra pixels they have to render mean that, in sum, there is barely any gain - or even an increase in power draw? (A quick pixel-count comparison is at the end of this post.)

This might seem like an odd question, but I am really curious about this.
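
For scale, here is that pixel-count comparison as a minimal Python sketch, using the two resolutions listed above:

```python
# Pixel counts for the two displays mentioned above.
old_pixels = 1280 * 1024  # Acer AL732: 1,310,720 pixels
new_pixels = 1920 * 1080  # Viseo 243D: 2,073,600 pixels

print(f"old: {old_pixels:,} px, new: {new_pixels:,} px")
print(f"increase: {new_pixels / old_pixels - 1:.0%}")  # about 58% more pixels
```

So at native resolution, the GPU would have roughly 58% more pixels to render.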
 
Solution

leeb2013

Very difficult to answer, since I bet no one on here was involved in designing and testing the GPUs and CPUs you refer to under various scenarios in order to tabulate power consumption.

As I've got a 50/50 chance of being right, and the only way to prove me wrong is for you to buy a power meter and test before and after under various scenarios, I'll guess that you will save power and that it will be negligible in the grand scheme of things.

Khe

So, just to be clear, you estimate that the extra power used by the GPU and CPU will, for most purposes, be much less than the 30 W difference between the two monitors?

I assume that gaming at the higher resolution should give a significant increase in power consumption.
 


I don't know about "much" less. 30 W is pretty insignificant.
 

Khe

Maybe I did not formulate my question clearly enough. My question was whether I should expect the increase in power consumption caused by the extra pixels to cancel out the 30 W saved by switching to the more power-efficient monitor.

Yes, it is a small amount of power (though over a longer period of time it could make a significant difference, e.g. if local electricity prices are high enough; so it is not a completely theoretical question).
 


Yes, you would expect a small drop in total power.
If you used it 8 hours a day and the power saving is 20 W, you would save 20 W × 8 h × 365 days = 58.4 kWh over a year.
At 30 cents per kilowatt-hour, that is $17.52 per year.
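
That arithmetic as a small Python sketch, using the same assumptions (20 W saved, 8 hours a day, 30 cents per kWh):

```python
# Yearly savings from a 20 W reduction, 8 h/day, at $0.30/kWh.
saving_watts = 20
hours_per_day = 8
price_per_kwh = 0.30  # USD

kwh_per_year = saving_watts * hours_per_day * 365 / 1000
print(f"{kwh_per_year:.1f} kWh/year")                    # 58.4 kWh/year
print(f"${kwh_per_year * price_per_kwh:.2f}/year saved") # $17.52/year
```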
 


Khe

Yeah, I guess there is not much hope of getting a definite answer without actually testing it out in practice. Thanks for the input, folks.