Next-gen GPUs likely arriving in late 2024 with GDDR7 memory — Samsung and SK hynix showed chips at GTC

GDDR7 at GTC 2024
(Image credit: Tom's Hardware)

Samsung and SK hynix were showing off their upcoming GDDR7 memory solutions at GTC 2024. We spoke with the companies, as well as Micron and others, about GDDR7 and when we can expect to see it in the best graphics cards. It could arrive sooner than expected, based on some of those discussions, and that has some interesting implications for the next generation Nvidia Blackwell and AMD RDNA 4 GPUs.

Micron didn't have any chips on display, but representatives said their GDDR7 solutions should be available for use before the end of the year — and the finalized JEDEC GDDR7 standard should pave the way. The chips shown by SK hynix and Samsung meanwhile would have been produced prior to the official standard, though that may not matter in the long run.

What's interesting about the information shown by SK hynix and Samsung is that both were showing off 16Gb (2GB) devices. We asked the companies about memory capacities and were told that 16Gb chips are in production and could show up in shipping products by the end of the year. 24Gb (3GB) chips, on the other hand, aren't going to be in the initial wave and will very likely show up in 2025.

Horizon Forbidden West PC performance charts

(Image credit: Tom's Hardware)

More importantly, it's the GPUs below the top two solutions — from Nvidia, AMD, and Intel — where those 24Gb chips become important. We're now seeing plenty of games where 12GB of VRAM is basically the minimum you'd want for maxed out gaming performance. Look at Horizon Forbidden West as an example, and pay attention to the RTX 3080 10GB and RTX 4070 12GB cards. Those two GPUs are basically tied at 1080p and 1440p, but performance drops substantially on the 3080 at 4K with maxed out settings, as the game exceeds that card's 10GB of VRAM.

If we're already getting games that need 12GB, it would only make sense to begin shipping more mainstream-level GPUs that have more than 12GB. AMD's RX 7800 XT and RX 7900 GRE both have 16GB for around $500–$550, while Nvidia's RTX 4070 and RTX 4070 Super only have 12GB because they use a 192-bit memory interface. But if Nvidia waits for 24Gb GDDR7, that same 192-bit interface can easily provide 18GB of total VRAM — and double that figure in clamshell mode with chips on both sides of the PCB.

Even more critically, 24Gb GDDR7 means a narrower 128-bit interface — which has been a serious cause for concern with the RTX 4060 Ti and RTX 4060 — wouldn't be as much of a problem. Those would still be able to provide 12GB of memory with one device per 32-bit channel, so there'd be no need for a consumer RTX 4060 Ti 16GB. And naturally the same math applies to AMD, where a future upgraded RX 7600-level GPU would get 12GB instead of 8GB.
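The capacity math in the last two paragraphs is simple enough to sketch in a few lines. This is our own illustrative function, not anything from a vendor spec: a GDDR bus is divided into 32-bit channels, with one device per channel (or two in clamshell mode, one on each side of the PCB), and total VRAM is just devices times per-chip density.

```python
def vram_gb(bus_width_bits: int, chip_density_gbit: int, clamshell: bool = False) -> int:
    """Total VRAM in GB for a given bus width and per-chip density."""
    channels = bus_width_bits // 32              # one 32-bit channel per device
    devices = channels * (2 if clamshell else 1)  # clamshell doubles the device count
    return devices * chip_density_gbit // 8       # 8 Gbit = 1 GB

print(vram_gb(192, 16))                   # 12 GB — a 192-bit card with today's 16Gb chips
print(vram_gb(192, 24))                   # 18 GB — the same bus with 24Gb GDDR7
print(vram_gb(192, 24, clamshell=True))   # 36 GB — clamshell with 24Gb chips
print(vram_gb(128, 24))                   # 12 GB — a 128-bit bus with 24Gb chips
```

The last line is the key one for the mainstream tier: a narrow 128-bit interface jumps from 8GB to 12GB without resorting to a clamshell design.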

And it's not just about memory capacity, though that's certainly important. GDDR7 will have speeds of up to 32 Gbps according to Samsung, while SK hynix says it will have up to 40 Gbps GDDR7 chips available. Even if we stick with the lower number, that's 128 GB/s per device, or 512 GB/s for a 128-bit interface, and 768 GB/s for a 192-bit interface. Both would be a substantial increase in memory bandwidth, which would take care of the second concern we've had with the lower tier GPUs of the current generation. 40 Gbps GDDR7 would bump the 128-bit interfaces up to 640 GB/s and the 192-bit bus to 960 GB/s, though we suspect we won't see such configurations in consumer GPUs until late 2025 at earliest.
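Those bandwidth figures fall out of one formula: per-pin data rate times bus width, divided by eight bits per byte. A quick sketch (again our own helper, purely for checking the arithmetic):

```python
def bandwidth_gbs(bus_width_bits: int, gbps_per_pin: int) -> int:
    """Peak memory bandwidth in GB/s: pin rate x bus width / 8 bits per byte."""
    return bus_width_bits * gbps_per_pin // 8

print(bandwidth_gbs(32, 32))    # 128 GB/s — a single 32-bit device at 32 Gbps
print(bandwidth_gbs(128, 32))   # 512 GB/s — 128-bit bus at 32 Gbps
print(bandwidth_gbs(192, 32))   # 768 GB/s — 192-bit bus at 32 Gbps
print(bandwidth_gbs(128, 40))   # 640 GB/s — 128-bit bus at 40 Gbps
print(bandwidth_gbs(192, 40))   # 960 GB/s — 192-bit bus at 40 Gbps
```

For reference, the same formula puts an RTX 4060 Ti (128-bit, 18 Gbps GDDR6) at 288 GB/s, which shows how large the generational jump would be.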

This isn't new information as such, but it's all starting to come together now. Based on what we've heard and seen, we expect the next-gen Nvidia and AMD GPUs will fully embrace GDDR7 memory, with the first solutions now likely to arrive before the end of 2024. Those will be extreme performance (and price) models, with wider interfaces that can still provide 16GB to 32GB of memory. The second wave could then take the usual staggered release approach and come out in 2025, once higher capacity non-power-of-two GDDR7 chips are widely available. Hopefully this proves correct, as we don't want to see more 8GB graphics cards launching next year — not unless they're priced well south of the $300 mark (like $200 or so).

Jarred Walton

Jarred Walton is a senior editor at Tom's Hardware focusing on everything GPU. He has been working as a tech journalist since 2004, writing for AnandTech, Maximum PC, and PC Gamer. From the first S3 Virge '3D decelerators' to today's GPUs, Jarred keeps up with all the latest graphics trends and is the one to ask about game performance.