RTX 4080 and RTX 4070 Power Consumption Dropped By Up to 30%

16-pin PCIe Power Connector, Asus RTX 3090 Ti
(Image credit: Tom's Hardware)

Resident Twitter leaker @kopite7kimi has updated the public with new power consumption expectations for the upcoming RTX 4080 and RTX 4070. The wattage of the RTX 4070 has gone down a bit, to 285W from a previously projected 300W. The RTX 4080's power target, meanwhile, plummeted to just 320W, matching the RTX 3080's official TBP. Previous rumors had suggested a 450W TBP for the xx80-series GPU.

We don't know why the power expectations have dropped so unexpectedly, especially for the RTX 4080. However, Kopite posted another tweet around the same time saying the high 40-series power consumption had become problematic, resulting in a reduction of power targets to Ampere levels. Presumably, clock speeds and performance may have been adjusted as well.

We don't know Kopite's sources directly, but from this data we can presume that Nvidia's engineers have been desperately trying to make the rumored ultra-high power figures work on the RTX 4080 and RTX 4090, only to find that the TSMC 5nm silicon didn't behave as desired at such extreme watts and amps.

As usual, apply liberal helpings of salt, as there's no official word on any of this yet. Kopite does have a strong track record as an accurate leaker, however, and leaks about Ada's high power consumption have been circulating for a long time. Some of that may be speculation fueled by the introduction of the new 600W 16-pin power connector, and there's still a chance Nvidia's 40-series GPUs will go back up in power consumption.

The lowered power consumption could be good or bad news for gamers. On the one hand, a considerably lower TBP makes the top-tier GPUs much easier to cool, reducing system heat and noise levels as well.

On the other hand, a 29% reduction in power use on the RTX 4080 could substantially reduce GPU performance. That's the problem with early rumors and leaks: Nvidia is still working to determine the best clock speeds, voltages, and power levels for its new Ada architecture, so everything remains in a state of flux.

That's part of why we've long been skeptical of claims that Nvidia's Ada Lovelace GPUs could use as much as 600W of power, which would be a massive increase over Ampere. There will undoubtedly be some custom designs that push the new GPUs to the limit, but just because the new PCIe 5.0 12VHPWR 16-pin connector can deliver up to 600W doesn't mean cards that use it will draw that much power. Judging by the increasing rate of leaks, we should get official confirmation of the RTX 40-series specs when the cards are announced in the near future.
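For context on that 600W figure, a bit of back-of-the-envelope math shows what the connector has to carry. This is a quick sketch of our own arithmetic (it assumes the load splits evenly across the connector's six +12V supply pins), not anything out of the PCIe 5.0 spec:

```python
# Rough math on the 12VHPWR connector's 600W ceiling.
# Assumption (ours): the six +12V supply pins share the load evenly.
RAIL_VOLTAGE = 12.0  # volts on the 12V rail
MAX_POWER = 600.0    # watts, the connector's rated maximum
SUPPLY_PINS = 6      # +12V contacts carrying the current

total_current = MAX_POWER / RAIL_VOLTAGE       # 50 A through the connector
current_per_pin = total_current / SUPPLY_PINS  # ~8.3 A per +12V pin

print(f"{total_current:.0f} A total, {current_per_pin:.1f} A per pin")
```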

RTX 4080 and RTX 4070 Power Consumption Changes

| Model | Current Power Estimate | Previous Power Estimate | Percentage Drop |
| --- | --- | --- | --- |
| RTX 4080 | 320W | 450W | 29% |
| RTX 4070 | 285W | 300W | 5% |
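The percentage drops in the table are straightforward arithmetic on the leaked figures; here's a minimal sketch of the calculation, rounded to whole percentages:

```python
# Percentage drop from the previous to the current leaked power estimate.
estimates = {
    "RTX 4080": (450, 320),  # (previous, current) in watts
    "RTX 4070": (300, 285),
}
for model, (previous, current) in estimates.items():
    drop = (previous - current) / previous * 100
    print(f"{model}: {drop:.0f}% drop")  # RTX 4080: 29%, RTX 4070: 5%
```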
Aaron Klotz
Freelance News Writer

Aaron Klotz is a freelance writer for Tom’s Hardware US, covering news topics related to computer hardware such as CPUs and graphics cards.

  • bigdragon
    Note for the writer/editor: the 4070 watt columns are reversed.

    I'm not sure how this is unexpected. Nobody liked the high power consumption numbers rumored for the 40-series. Sales would be hurt if gamers had to buy a new power supply to go with each 40-series card. Nvidia doesn't need a problem wholly within their control scaring away customers. Definitely makes sense that there would be a smaller bump to power consumption instead of the ridiculous jump previously rumored.

    The important number for me is VRAM. Hopefully Nvidia isn't stingy about VRAM this time around. I want to see a minimum of 16GB on the 4080 and a minimum of 12GB on the 4070 -- preferably also 16GB there.
    Reply
  • JarredWaltonGPU
    bigdragon said:
    Note for the writer/editor: the 4070 watt columns are reversed.

    I'm not sure how this is unexpected. Nobody liked the high power consumption numbers rumored for the 40-series. Sales would be hurt if gamers had to buy a new power supply to go with each 40-series card. Nvidia doesn't need a problem wholly within their control scaring away customers. Definitely makes sense that there would be a smaller bump to power consumption instead of the ridiculous jump previously rumored.

    The important number for me is VRAM. Hopefully Nvidia isn't stingy about VRAM this time around. I want to see a minimum of 16GB on the 4080 and a minimum of 12GB on the 4070 -- preferably also 16GB there.
    Oops, I fixed the 4070 power columns. But yeah, I've long been skeptical of the 450-600W power claims for the 4080 and 4090. Custom cards will probably hit those levels, but I suspect the Founders Edition will be far more reasonable.

    Part of me wonders if it wasn't all a disinformation campaign. Get people angry about 600W cards, then announce 450W cards and everyone is happy. Where 3090 Ti was "Wow, this uses a ton of power!" the 4090 will be, "Hey, this only needs 450W, not 600W. Awesome!"
    Reply
  • thisisaname
Or their guess was wrong and they have put out a lower guess; put out enough guesses and you're going to be right sometime.

So many leaks and so much disinformation, and they wonder why sales of current cards have fallen so much in the last few months. Who wants to buy now when the new generation is going to be so much better?
    Reply
  • PiranhaTech
    I lean towards this being good news. It doesn't feel like as much of a technology improvement if the next generation has a huge TDP jump. I'm not crazy about the amount of heat my PC puts out when gaming, and my setup is maybe mid tier.

There's more than likely improved technology here, especially from Nvidia, but when the TDP jumps that far, it feels like what AMD had to do with Bulldozer and Jaguar.
    Reply
  • King_V
    Yeah, I think that a disinformation campaign is possible, though it seems kind of risky at best.

    That said:
    but we can presume from this data that Nvidia's engineers have been desperately trying to get these rumored ultra-high power consumption figures to work on the RTX 4080 and RTX 4090 but found the TSMC 5nm silicon didn't behave as desired at these extreme watts and amps.

Ok, see, I am glad they're bringing the power consumption down, but if that's because they wanted to push the silicon to its limits and simply couldn't, then this isn't exactly virtuous.

    "We wanted to consume gobs of power to get the last couple of percent of performance, but we couldn't, so, now we're forced into doing something more responsible" isn't exactly a ringing endorsement of any kind of good intent.

    Ok, Nvidia, you're doing better, but grudgingly so, so I'm still giving you the stink-eye.
    Reply
  • pyrofire95
    Dear all writers.
The more salt you tell us to put in our belief of a rumor, the more you're saying we should believe it.
    Reply
  • King_V
    pyrofire95 said:
    Dear all writers.
The more salt you tell us to put in our belief of a rumor, the more you're saying we should believe it.
    It sounds like you're saying the opposite of what you should be.
    Reply
  • warezme
As someone looking forward to a 4090 upgrade, I wasn't thrilled about the alleged power usage of the 4000-series cards, even though I already have a 1000W PSU and an updated 1060W unit waiting in the wings. I am hoping the power draw of the 4090 is not as high as has been rumored.
    Reply
  • alceryes
    JarredWaltonGPU said:
    Oops, I fixed the 4070 power columns. But yeah, I've long been skeptical of the 450-600W power claims for the 4080 and 4090. Custom cards will probably hit those levels, but I suspect the Founders Edition will be far more reasonable.

    Part of me wonders if it wasn't all a disinformation campaign. Get people angry about 600W cards, then announce 450W cards and everyone is happy. Where 3090 Ti was "Wow, this uses a ton of power!" the 4090 will be, "Hey, this only needs 450W, not 600W. Awesome!"
Or... (puts on AMD fanboy cynic hat)
NVIDIA realized that they're not gonna catch AMD in raw rasterization power next gen (at least not on initial release) and decided to scale back the watts they were pushing to try to catch the RX 7900 XT's rumored performance.
    Reply
  • Toyashi
I love how it says that the power consumption was dropped "unexpectedly," as if anyone who isn't legally insane expected Nvidia to release a series of GPUs that would be nigh impossible to air cool, let alone turn into OEM models for pre-builts.

Also, nobody in their right mind would ever conclude that a physically smaller die on a smaller, more efficient manufacturing process would consume MORE power than an older and larger die.
It literally doesn't make sense.
I think people should stop listening to Twitter users with anime profile pics and their "insider knowledge," and instead let logic and critical thinking prevail every now and then.
    Reply