High end vs low end power consumption

Erchz

Reputable
Mar 6, 2015
High-end GPUs/CPUs have higher power consumption because they are more powerful, but does that mean that when they face similar tasks they would consume more than their lower-end counterparts?

E.g.: some non-demanding game at low-mid settings that an 840M would be able to run.
Would a 980M need more power (talking about wattage, because I'm interested in the power draw and heat) to run it at the same low-mid settings?
Would it need less? Or would there be no significant difference?
 

leeb2013

Honorable
Greater performance doesn't necessarily mean higher power usage. E.g., Intel Haswell CPUs perform better than AMD FX CPUs yet use less power. It comes down to the CPU architecture and the silicon fabrication process size.

Additionally, modern CPUs have many power-saving features, including reducing clock speeds.
 

Erchz

Reputable
Mar 6, 2015


First of all, thank you for your answer, but I'm not talking about different architectures.
Take another example: a 970M and a 980M video card, where the difference is the number of active cores and the potential performance. Would they have similar power draw when running mildly demanding applications?
 

Kari

Splendid
Based on my own monitoring of my cards, I'd say the bigger the chip, the more power it uses. Though with modern and smart power gating, the difference is growing smaller.

So a small chip that can run a game at, say, 60 fps at 100% load will use fewer watts than a chip twice the size (with the same architecture!) at 50% load, IF the clocks and voltage are the same. The power consumption can be divided into different parts: one part is caused by leakage currents, which depend on the chip size etc., and only one part depends on the load % itself.
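To make that split concrete, here's a toy model of static (leakage) vs. dynamic power. All coefficients and sizes are made-up illustrative values, not real GPU figures:

```python
# Rough sketch of the static + dynamic power split (illustrative
# numbers only, not real GPU specs).

def chip_power(size_units, load, voltage, clock_ghz,
               leak_per_unit=0.5, dyn_coeff=2.0):
    """Simplified model: static (leakage) power scales with chip size
    and voltage; dynamic power scales with size, load, clock, and V^2."""
    static = size_units * leak_per_unit * voltage
    dynamic = size_units * dyn_coeff * load * clock_ghz * voltage ** 2
    return static + dynamic

# Small chip at 100% load vs. a chip twice the size at 50% load,
# with the same clocks and voltage:
small = chip_power(size_units=1.0, load=1.0, voltage=1.0, clock_ghz=1.0)
big   = chip_power(size_units=2.0, load=0.5, voltage=1.0, clock_ghz=1.0)
print(small, big)  # the bigger die burns more, purely from extra leakage
```

With equal clocks and volts, the dynamic parts come out the same here (double the size at half the load), so the whole difference is the leakage term.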

The situation changes if the big chip can stay at lower clocks to do the task at hand, in which case it might be more energy efficient... :D
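To illustrate that lower-clock case (again with toy numbers; the idea that half the clock lets the chip drop to 0.8 V is a hypothetical assumption, since dynamic power scales with clock times voltage squared):

```python
# Why lower clocks can help the big chip: dynamic power scales
# roughly with clock * V^2, and lower clocks allow lower voltage.
# The specific voltages here are made-up illustrative values.

def dynamic_power(clock_ghz, voltage):
    return clock_ghz * voltage ** 2  # relative units

full_speed = dynamic_power(1.0, 1.0)  # full clock at 1.0 V
half_speed = dynamic_power(0.5, 0.8)  # half clock, hypothetically 0.8 V
print(full_speed, half_speed)  # the downclocked case uses far less
```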

Also if you're doing gpu encoding or something like that where the big chip gets the job done quicker, it probably will end up consuming less energy in total for the given task.
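A quick back-of-the-envelope version of that race-to-idle argument, with entirely made-up wattages and times:

```python
# Toy "race to idle" comparison: the big chip draws more power but
# finishes the fixed job sooner, then idles. All figures are
# hypothetical, not measured values.

def energy_joules(power_watts, seconds):
    return power_watts * seconds

# Over the same 100 s window: small chip encodes the whole time at 40 W;
# big chip encodes for 50 s at 60 W, then idles 50 s at 10 W.
small_total = energy_joules(40, 100)
big_total = energy_joules(60, 50) + energy_joules(10, 50)
print(small_total, big_total)  # total energy can favor the big chip
```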

The 970M and 980M are based on the same chip, and even the disabled units on the 970M could consume/leak power... I'm not sure how they've actually disabled them, but traditionally the cut-down part has been the less energy-efficient one (if the clocks and volts are close to each other; sometimes the full chip has been set at really high clocks and needs higher voltage to hit those clocks, so the efficiency suffers).
 