RTX 5070 allegedly delayed until early March to counter AMD RX 9070 launch

RTX 5090 Gallery Shot
(Image credit: Nvidia)

A new rumor suggests that Nvidia's mid-range RTX 5070, previously slated to launch later this month, has been delayed to early March, per MEGAsizeGPU on X. Its elder Ti-class sibling should still arrive on time. However, considering the infamous RTX 5090/RTX 5080 launch, we cannot comment on the GPU's retail availability. If true, it's probably no coincidence that this "delay" lines up with AMD's Radeon RX 9070 series launch.

To recap, the RTX 5070 was unveiled by Jensen Huang at CES last month, with the GB205 GPU at its heart sporting 48 SMs (6,144 CUDA cores) and 12GB of GDDR7 memory. The GPU offers a 192-bit interface, equating to six memory modules, populated with 28 Gbps GDDR7 ICs for 672 GB/s of bandwidth. Nvidia has set a launch MSRP of $549 for the RTX 5070, $50 cheaper than the last-generation RTX 4070. Rather than declaring a definite launch date, Nvidia said the RTX 5070 would launch alongside the RTX 5070 Ti in February, but that apparently no longer holds.
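For readers who want to check the math, peak GDDR bandwidth follows directly from bus width and per-pin data rate. Here's a minimal sketch (the helper name is ours, not Nvidia's) that reproduces the 672 GB/s figure:

```python
# Peak theoretical bandwidth for a GDDR memory subsystem:
# bandwidth (GB/s) = (bus width in bits / 8 bits per byte) * per-pin data rate (Gbps)

def gddr_bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak theoretical memory bandwidth in GB/s."""
    return bus_width_bits / 8 * data_rate_gbps

# RTX 5070: 192-bit bus (six 32-bit GDDR7 modules) at 28 Gbps per pin
print(gddr_bandwidth_gb_s(192, 28))  # prints 672.0
```

The same formula explains the comparison points in this article: a wider bus or faster ICs both scale bandwidth linearly.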

Renowned tipster MEGAsizeGPU, in a tweet, alleged that the RTX 5070 is delayed, with retail availability pushed back to early March, likely to thwart AMD's RDNA 4 launch. In light of recent leaked benchmarks putting the RX 9070 XT just behind the RTX 4080 Super (in raster), Nvidia's RTX 5070 marketing may hinge on pricing instead of performance.

During the RTX 50 reveal, Jensen Huang proudly asserted that the RTX 5070 equals the RTX 4090 (with MFG), which might be misleading for many unsuspecting consumers. The RTX 5080 edges out the RTX 4080 Super by just 9%, quite disappointing for a gen-on-gen uplift, especially considering the RTX 4080 was 38% faster than the RTX 3080 Ti. As for other capabilities, Blackwell offers many new features such as Reflex 2.0, Smooth Motion, Multi Frame Generation, and... more AI horsepower. Still, it's best to temper expectations and anticipate a 15-20% uplift in (raster) performance, at best, over the RTX 4070.

Both cards in the RX 9070 family are reportedly packed with 16GB of memory, and as a result, AMD is marketing these GPUs as 4K-ready. The same cannot be said about the RTX 5070, which might struggle with VRAM-hungry titles at higher resolutions. Even though Nvidia is working on Neural Texture Compression (NTC) technology to reduce texture memory footprint, it's still a ways off from mainstream adoption.

Hassam Nasir
Contributing Writer

Hassam Nasir is a die-hard hardware enthusiast with years of experience as a tech editor and writer, focusing on detailed CPU comparisons and general hardware news. When he’s not working, you’ll find him bending tubes for his ever-evolving custom water-loop gaming rig or benchmarking the latest CPUs and GPUs just for fun.

  • valthuer
    8 years after the 11 GB 1080 Ti, it’s nothing short of disappointing that we’re witnessing the release of 12 GB GPUs like the 5070.

    With AAA gaming becoming more demanding with each passing day, 16 GB should be a bare minimum for graphics cards, IMHO.
    Reply
  • edzieba
    valthuer said:
    8 years after the 11 GB 1080 Ti, it’s nothing short of disappointing that we’re witnessing the release of 12 GB GPUs like the 5070.

    With AAA gaming becoming more demanding with each passing day, 16 GB should be a bare minimum, IMHO.
    On the other hand, the first game directly impacted by not having 12GB vRAM available (at UHD render resolution) - The Last Circle - was only just released, whilst the first GPU with 12GB vRAM available was released a decade ago (2015's GTX Titan X). Real-world vRAM requirements in games have just not scaled up that much over time, and per-DRAM-die memory bandwidth has continued to grow without the need to brute-force it with wider bus widths (which requires more DRAM dies, which means more RAM capacity by default because leading-edge DRAM dies are only made in so small a capacity).
    Reply
  • atomicWAR
    Looks like Nvidia wants to try and steal some of AMD's thunder. But considering the state of the launched 50 series products, I find it more likely Nvidia might end up helping AMD rather than hurting them. 50 series cards have been so disappointing. Unless AMD really botched their GPUs, this may well play out in their favor. Plus, hearing rumors AMD might actually launch a higher-end SKU now, I suspect their 'refresh' chips might have more oomph than Nvidia would like.
    Reply
  • atomicWAR
    edzieba said:
    On the other hand, the first game directly impacted by not having 12GB vRAM available (at UHD render resolution) - The Last Circle - was only just released, whilst the first GPU with 12GB vRAM available was released a decade ago (2015's GTX Titan X). Real-world vRAM requirements in games have just not scaled up that much over time, and per-DRAM-die memory bandwidth has continued to grow without the need to brute-force it with wider bus widths (which requires more DRAM dies, which means more RAM capacity by default because leading-edge DRAM dies are only made in so small a capacity).
    Hardly; 12GB of VRAM even at 1440p has been problematic since the launch of 40 series cards, and even longer at 4K/UHD. Granted, it's not all games, but it's too many to be ignored. The simple truth is 12GB is not enough VRAM for modern gaming. Games like Ratchet & Clank: Rift Apart can exceed 12GB of VRAM at 1080p with max settings. Honestly, 16GB should be the bare minimum cards launch with in this day and age, IMO.
    Reply
  • valthuer
    atomicWAR said:
    Games like Ratchet & Clank: Rift Apart can exceed 12GB of VRAM at 1080p with max settings.

    Yep. And the Resident Evil 4 remake requires 13.73 GB of VRAM at max 1080p settings.

    Along with 2-3 more games, this was the very reason I decided to get rid of my 4070 Ti back in 2023 and buy a 4090.

    I was getting CTDs with Direct3D fatal errors, due to my card's insufficient VRAM.

    Haven't encountered any problems ever since.

    Memory bandwidth sure is helpful, but it's not enough to save you by itself.
    Reply
  • Shiznizzle
    atomicWAR said:
    Hardly; 12GB of VRAM even at 1440p has been problematic since the launch of 40 series cards, and even longer at 4K/UHD. Granted, it's not all games, but it's too many to be ignored. The simple truth is 12GB is not enough VRAM for modern gaming. Games like Ratchet & Clank: Rift Apart can exceed 12GB of VRAM at 1080p with max settings. Honestly, 16GB should be the bare minimum cards launch with in this day and age, IMO.
    I have a 3060 12GB. A comment I read this week stated that 8GB of VRAM was not enough, or barely enough, to run at 1080p. Since this is my monitor setup, I fired up the most demanding game in my library, Shadow of the Tomb Raider, to see what it was demanding. I wanted to see for myself what the score was.

    This was a game which did run on my previous 1060 6GB, but ran nowhere near max settings. It was a game of compromises between getting decent refresh rates and eye candy. Lots of sacrifices. I would rather have the game be playable with playable refresh rates and no stuttering, than lots of eye candy and the FPS yo-yo-ing, which is noticeable.

    So I fired up the 3060 and turned everything to max at 1080p. My monitor is limited to 60Hz, so that is my refresh rate. It did run at max and at 60 FPS. Memory usage was at nearly 7GB, though. And that was at 1080p.

    More and more people want to move up to 1440p. I am one of them. With the upcoming 5060 and its 8GB, I think that is not enough for me. But then again, I am going AMD this time around no matter what the specs say. Linux is easier to deal with on an AMD card.

    Methinks the 5060 will barely be able to handle 1440p, with sub-60 FPS on its 8GB of VRAM. Lots of people are going to be disappointed.
    Reply
  • RTX 2080
    Shiznizzle said:
    I have a 3060 12GB. A comment I read this week stated that 8GB of VRAM was not enough, or barely enough, to run at 1080p. Since this is my monitor setup, I fired up the most demanding game in my library, Shadow of the Tomb Raider, to see what it was demanding. I wanted to see for myself what the score was.

    This was a game which did run on my previous 1060 6GB, but ran nowhere near max settings. It was a game of compromises between getting decent refresh rates and eye candy. Lots of sacrifices. I would rather have the game be playable with playable refresh rates and no stuttering, than lots of eye candy and the FPS yo-yo-ing, which is noticeable.

    So I fired up the 3060 and turned everything to max at 1080p. My monitor is limited to 60Hz, so that is my refresh rate. It did run at max and at 60 FPS. Memory usage was at nearly 7GB, though. And that was at 1080p.

    More and more people want to move up to 1440p. I am one of them. With the upcoming 5060 and its 8GB, I think that is not enough for me. But then again, I am going AMD this time around no matter what the specs say. Linux is easier to deal with on an AMD card.

    Methinks the 5060 will barely be able to handle 1440p, with sub-60 FPS on its 8GB of VRAM. Lots of people are going to be disappointed.
    The XX60 series has been a 1080p-class card for at least the last decade. They can be used at 1440p, but they aren’t intended for it.

    When the 2060 super came out, everyone said to skip the 2070 for 1440p and buy the 2060 super instead. The same thing happened with the 3060 Ti. Problem is, they very quickly became inadequate for 1440p gaming.

    The RTX 5060 is intended for 1080p. It’ll do that adequately. You want more VRAM so that you can play at 1440p? Buy a 5070. Want even more? Buy a 5070 Ti. Or an AMD card, it’s up to you.
    Reply
  • Gururu
    I just can't help but feel that the 90X0 lineup is going to be smooth and polished. They've patiently waited and have been rewarded with news after news of Nvidia falling on its face.
    Reply
  • bourgeoisdude
    Gururu said:
    I just can't help but feel that the 90X0 lineup is going to be smooth and polished. They've patiently waited and have been rewarded with news after news of Nvidia falling on its face.
    Never underestimate AMD's ability to mess up. I'm cautiously optimistic, but if they release the 9070xt at $700, they missed the point.

    That being said, I'm still pretty happy with my 7800xt.
    Reply
  • Mr Majestyk
    bourgeoisdude said:
    Never underestimate AMD's ability to mess up. I'm cautiously optimistic, but if they release the 9070xt at $700, they missed the point.

    That being said, I'm still pretty happy with my 7800xt.
    Did you see the leaked Canadian pricing? It's atrocious. AIB cards at $850+ USD.
    Reply