GeForce RTX 4060 May Consume More Power Than an RTX 3070

GeForce RTX GPU (Image credit: Shutterstock)

Nvidia's upcoming mid-range GeForce RTX 40-series (Ada Lovelace) graphics cards might not be free from the shackles of high power consumption. Prominent hardware leaker kopite7kimi believes that the GeForce RTX 4060 will consume more power than Nvidia's current-generation GeForce RTX 3070, and far more than the GeForce RTX 3060 12GB it replaces. But the real question is: how much more power are we talking about? Even though kopite7kimi has a solid track record, we still recommend treating the rumor with caution.

Unfortunately, the leaker doesn't know the exact power specifications of the GeForce RTX 4060, so it's entirely an educated guess as to what they'll be. However, the new 16-pin power connector and the GeForce RTX 3090 Ti alone suggest that the next generation of Nvidia GPUs will consume far more power than the current Ampere generation.

To quickly refresh your memory, the GeForce RTX 3070 Founders Edition consumes around 220W, with factory-overclocked SKUs boosting up to around 240-250W. The GeForce RTX 3060 12GB and GeForce RTX 3060 Ti consume 170W and 200W, respectively, with higher-end AIB partner cards drawing slightly more than those values.


There's a possibility that the GeForce RTX 4060 may consume anywhere from 290W to 350W, which is where the GeForce RTX 3070 Ti and GeForce RTX 3080 sit in terms of power consumption. We've already seen reports of flagship GeForce RTX 40-series cards possibly consuming as much as 450W to 600W of power (which the GeForce RTX 3090 Ti's power budget makes plausible). So technically, there's plenty of power headroom in Nvidia's future lineup for a mid-range 350W GPU.

It will be interesting to see how the market handles a 300W to 350W mid-range graphics card. In the current market, mid-rangers like the GeForce RTX 2060 and GeForce RTX 3060 and AMD's Radeon RX 6000-series counterparts all sit well within the 200W power bracket, with some featuring power budgets as low as 170W.

A jump to 300W or 350W in a single generation could pose a serious problem for budget gamers who want to either build a new system or upgrade their existing one with a new GPU. In addition, the 50% to 75% bump in power will undoubtedly require higher-wattage, higher-quality PSUs that cost more, catching out users who weren't expecting such a significant jump from next-generation GPUs.
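For reference, the 50% to 75% figure follows from the roughly 200W that today's RTX 3060 Ti draws. A quick sketch of the arithmetic (percent_increase is a hypothetical helper, not anything official):

```python
def percent_increase(old_watts: float, new_watts: float) -> float:
    """Return the increase from old_watts to new_watts as a percentage."""
    return (new_watts - old_watts) / old_watts * 100

# 200W (RTX 3060 Ti class) rising to the rumored 290W-350W range:
print(percent_increase(200, 290))  # 45.0
print(percent_increase(200, 300))  # 50.0
print(percent_increase(200, 350))  # 75.0
```

Measured against the 170W RTX 3060 12GB instead, the same rumored range works out to an even steeper 70% to 106% increase.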

Additionally, bigger and more airflow-oriented computer cases might become necessary to fit a GPU with a cooler designed to dissipate 300W or more of heat. In the end, many gamers might need to upgrade several components for what was meant to be a single GPU upgrade.

Aaron Klotz
Freelance News Writer

Aaron Klotz is a freelance writer for Tom’s Hardware US, covering news topics related to computer hardware such as CPUs and graphics cards.

  • thisisaname
    One thing I think we can be sure of: these are not going to launch or be available anytime soon. Even if the 4090 launched today, it would be a good six months before we see this card.
    Reply
  • hasten
    I don't know if the psu requirement is going to be a problem seeing how often I read "is this 1300w titanium enough to drive my 5600x and 2070 super?" on forums...
    Reply
  • Giroro
    Case design is going to have to change. Plus, say goodbye to the idea that a computer with good performance can be compact or quiet.
    Reply
  • derekullo
    hasten said:
    I don't know if the psu requirement is going to be a problem seeing how often I read "is this 1300w titanium enough to drive my 5600x and 2070 super?" on forums...
    The way I have always bought power supplies was to pick one rated 50% higher than the maximum load of the entire computer.
    Reply
  • salgado18
    That might be true. The quest for more performance is always hindered by power consumption, so one way to increase performance in the next generation is to throw more power at it. Worked for Intel, right? But it also means that we could "upgrade" to lower-numbered cards to keep the same power draw while increasing FPS.
    Reply
  • KananX
    Hate to say it, but… told you so. If high end is extreme on power, mid-range won’t be mild either.
    Reply
  • InvalidError
    Giroro said:
    Plus, say goodbye to the idea that a computer with good performance can be compact or quiet.
    No need to say goodbye, you just need to adjust your expectations.

    Game developers will always put the bulk of their effort in making sure their games play and look good enough on the lowest common denominator (recommended spec) they want to officially support to avoid alienating a large chunk of their potential market. If gamers start rejecting overpriced power-hungry GPUs, there will be increased focus on maintaining "good performance" (the devs' opinion, may differ from yours) on older/lower-end hardware.
    Reply
  • Phaaze88
    Just run the AC for a bit longer, or lower its setting a bit. Electricity is cheap. /s

    Some may start caring when the room temperature goes up 5C or so after about an hour of playtime.

    Giroro said:
    Case design is going to have to change.
    If the PC is still in the same room with the user, what would a new design change with the energy still being released in one's room?
    Reply
  • aalkjsdflkj
    I suspect I'm in the minority, but I just won't buy a GPU that has high power consumption numbers. I'm very sensitive to noise and my computer room gets direct sunlight in the afternoon and evening, so I need to keep temperatures down as well as noise. Hopefully the rumors that AMD's upcoming GPUs will be efficient are true, because there's no way I'm buying anything like a 300+W GPU.
    Worst case, I'll just undervolt, turn on frame-rate limits, or find some other way to run a power hungry GPU a little more efficiently.
    Reply
  • KananX
    Simply don’t buy it and don’t support it if you’re not fine with GPUs using over 400 or over 500W of power. If you ask me, 350W is already a lot, and now they don’t care anymore. They’ve crossed a line.
    Reply