Nvidia RTX 4080 Allegedly Overclocks to 3GHz at Default 320W TDP

GeForce RTX 4080 - Founders Edition
(Image credit: Nvidia)

According to a recent Chiphell forum post, Nvidia's GeForce RTX 4080 16GB was benchmarked in 3DMark TimeSpy while overclocked to an impressive 3GHz. At that speed, the chip outperformed Nvidia's previous-generation RTX 3090 Ti (overclocked to 2,200MHz) by roughly 3,500 points in the same benchmark.

A GPU-Z screenshot in the Chiphell forum post shows the RTX 4080 16GB clocking up to a flat 3,000MHz. Most impressively, the card hits that 3GHz peak while staying close to its default 320W power rating, with GPU-Z recording a peak power draw of 333W. Temperatures were also very good at 60C.

This is very different from the 3GHz RTX 4090 results we've seen online, where the end-user was forced to run the GPU at 600W of power consumption to hit such high clock speeds. If the RTX 4080 16GB 3DMark results are accurate, it appears that lower core count models will have better overclocking capabilities than the RTX 4090.

For now, take this data with a grain of salt, since the RTX 4090 and RTX 4080 16GB review embargoes have not yet been lifted. Once we get the cards in for testing, we'll have a better understanding of how well these GPUs can really overclock.

Overall, it appears that Nvidia's Ada Lovelace architecture, built on the TSMC 4N node, has enough headroom to achieve a flat 3GHz clock speed regardless of SKU. Of course, the silicon lottery will play a role in how many individual GPUs reach 3GHz. Still, based on the results we're seeing now for the RTX 4080 and RTX 4090, we wouldn't be surprised if most RTX 40-series GPUs can hit this frequency.

TimeSpy Score Analysis

The RTX 4080 16GB achieved a 28,929-point graphics score in 3DMark's TimeSpy benchmark (the non-Extreme version). This puts it exactly 3,518 points, or 13.8%, ahead of the best RTX 3090 Ti result we found on 3DMark's benchmark browser, which posted a 25,411-point graphics score with a hefty overclock of 2,265MHz.
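The percentage uplift quoted above follows directly from the two graphics scores; as a quick sanity check of the article's figures:

```python
# Scores taken from the article: overclocked RTX 4080 16GB vs. the best
# RTX 3090 Ti result found on 3DMark's benchmark browser.
rtx_4080_score = 28_929
rtx_3090_ti_score = 25_411

delta = rtx_4080_score - rtx_3090_ti_score          # point difference
uplift_pct = delta / rtx_3090_ti_score * 100        # relative uplift

print(delta)                 # 3518 points
print(round(uplift_pct, 1))  # 13.8 (percent)
```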

Unfortunately, we don't have any RTX 4090 TimeSpy results to compare against, so take this result for what you will. Again, RTX 4090 reviews will be out soon, so we'll have a better idea of how fast this 3GHz 4080 is once third-party reviews, including our own, are out for the 4090.

Aaron Klotz
Freelance News Writer

Aaron Klotz is a freelance writer for Tom’s Hardware US, covering news topics related to computer hardware such as CPUs and graphics cards.

  • salgado18
    That is not steam from LN2 in the picture, that's smoke right after it hit the clock. /sarcasm
    Reply
  • giorgiog
    The NVIDIA marketing machine is trying so hard... Sorry, not interested. Still rockin' my 3080 from November 2020 with no plans to upgrade for at least 2 years.
    Reply
  • blacknemesist
    13% faster than the 3090 ti? The 3090 ti was BS for its price so comparing them does still not make the 4080 either good or bad, it just isn't as bad as the 3090 ti in price-to-performance.
    Reply
  • ikernelpro4
    It also costs 3 Giga Dollars.
    Reply
  • atomicWAR
    blacknemesist said:
    13% faster than the 3090 ti? The 3090 ti was BS for its price so comparing them does still not make the 4080 either good or bad, it just isn't as bad as the 3090 ti in price-to-performance.
    Agreed. The RTX 4080 16GB should be a 700-800 card. At this rate I'll just get a RTX 4090 as it sadly will have a better price to performace ratio at this point.
    Reply
  • Mandark
    Until people start voting with their dollars and stop buying it they’re going to keep increasing prices. I wonder are people that stupid? That they would keep purchasing at these high prices?
    Reply
  • renz496
    giorgiog said:
    The NVIDIA marketing machine is trying so hard... Sorry, not interested. Still rockin' my 3080 from November 2020 with no plans to upgrade for at least 2 years.
    Those that already own high end 30 series most likely not interested to upgrade for the next series. If people buy high end gpu every generation the problem is the user. P
    Reply