According to prolific Twitter leaker @kopite7kimi, Nvidia's upcoming GeForce RTX 4090 has smashed well past the 15,000-point barrier in 3DMark Time Spy Extreme, posting a record-breaking graphics score of 19,000 points in the famous benchmark. If Kopite's information is accurate, that puts the RTX 4090's performance well ahead of anything available today, including RTX 3090 Ti GPUs chilled with liquid nitrogen.
For some perspective, the current reigning champion of the 3DMark Time Spy Extreme benchmark is user "biso biso," whose LN2-cooled EVGA RTX 3090 Ti Kingpin Edition graphics card punched out a world-record graphics score of 14,611 points.
"RTX 4090, TSE >19000" (July 18, 2022)
Again, if Kopite's data is real, engineering samples of Nvidia's RTX 4090 are already hitting Time Spy Extreme scores 30% higher than the best overclocked RTX 3090 Ti results today (never mind stock RTX 3090 Tis). We could see even higher scores once the RTX 4090 hits the market.
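The 30% figure follows directly from the two scores cited above; a quick sanity check of the arithmetic (using only the leaked 19,000-point score and the 14,611-point LN2 record):

```python
# Leaked RTX 4090 graphics score vs. the current LN2 world-record
# RTX 3090 Ti graphics score in 3DMark Time Spy Extreme.
leaked_4090_score = 19_000
record_3090ti_score = 14_611

# Relative lead of the leaked score over the standing record.
gain = (leaked_4090_score - record_3090ti_score) / record_3090ti_score
print(f"RTX 4090 lead over the LN2 record: {gain:.1%}")  # ~30.0%
```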
This data seems to confirm what we've already heard: Nvidia's RTX 40 series lineup will feature one of the most significant single-generation performance jumps Nvidia has ever delivered.
Current rumors suggest that the top die for the 40 series, AD102, will pack 71% more CUDA cores and SMs than Ampere's top die, GA102. It will also reportedly feature similar or higher clock speeds than the RTX 30 series, thanks to a more efficient TSMC 5nm process. We don't expect the rest of Nvidia's 40 series dies to see the same core-count increase, but even with 71% as the ceiling, all of them are expected to pack substantially more cores.
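To see what that rumored 71% figure would imply, here is a rough back-of-the-envelope sketch starting from the publicly documented full GA102 die (84 SMs with 128 CUDA cores each); the resulting AD102 numbers are an extrapolation from the rumor, not a confirmed spec:

```python
# Publicly documented full GA102 configuration (Ampere whitepaper).
ga102_sms = 84
cores_per_sm = 128
ga102_cores = ga102_sms * cores_per_sm  # 10,752 CUDA cores

# Rumored ~71% increase in SMs/cores for AD102 (unconfirmed).
rumored_increase = 0.71
ad102_sms = round(ga102_sms * (1 + rumored_increase))  # ~144 SMs
ad102_cores = ad102_sms * cores_per_sm                 # ~18,432 CUDA cores

print(f"Implied AD102: ~{ad102_sms} SMs, ~{ad102_cores:,} CUDA cores")
```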
Power consumption is also rumored to rise substantially, with flagship 40 series cards reportedly drawing as much as 500 to 600 watts.
Those sky-high power requirements should at least be offset by correspondingly large performance gains. As always, take this news with a grain of salt, since we don't have official confirmation of these figures. But they do square with the information we already have on Nvidia's next-generation GPUs.