AMD says RDNA 4 GPUs are coming in early 2025 — RX 8000 will deliver ray tracing improvements, AI capabilities

AMD Radeon GPU
(Image credit: AMD)

Sales of AMD's Radeon graphics processors dropped sharply in the third quarter compared to the same period a year ago as the company prepared to introduce its next-generation products several months down the road, AMD's chief executive Lisa Su announced on Tuesday. The first products based on the all-new RDNA 4 architecture — which could become some of the best graphics cards — are set to be released in early 2025. 

"Graphics, revenue declined year over year as we prepare for transition to our next generation Radeon GPUs based on our RDNA 4 architecture," Su said at the company's conference call with analysts and investors. "In addition to a strong increase in gaming performance, RDNA 4 delivers significantly higher ray tracing performance, and adds new AI capabilities. We are on track to launch the first RDNA 4 GPUs in early 2025." 

AMD's lineup of new cards isn't expected to compete at the highest end of the market — AMD recently told us it plans to avoid the 'king of the hill' strategy. Instead, the company says it will focus on volume, which typically means mid-range models.

Ray tracing performance improvements in AMD's RDNA 4 architecture have been discussed in the industry for a while, though this is the first time a high-ranking AMD official has confirmed them. Some preliminary details about RDNA 4's ray tracing enhancements have already leaked, and that information suggests RDNA 4 GPUs could roughly double the ray tracing performance of RDNA 3 GPUs, though at this point this is largely speculation. 

As for additional AI capabilities, it is reasonable to expect RDNA 4 to support more instructions aimed at AI workloads and data formats better suited to machine learning. What remains to be seen is whether all of these AI enhancements will be enabled on client graphics cards or reserved for Radeon Pro add-in boards aimed at workstations and servers. We also know that AMD is working to make its FSR upscaling fully AI-driven.

An announcement in early 2025 almost certainly means that AMD intends to formally introduce its first RDNA 4-based graphics processors at CES. Launching a gaming GPU right after the holiday season is uncommon in the industry: both AMD and Nvidia typically try to address gamers with new products before the holidays, when new games are released. That was the case with AMD's RDNA 2 and RDNA 3 families and multiple lineups before that. 

Still, GPU designers tend to unveil their new laptop GPUs alongside their notebook partners, and the latter prefer to showcase their latest products at trade shows. Therefore, if AMD decides to start rolling out its RDNA 4-based offerings with notebook GPUs (another uncommon decision), then CES would seem to be the right time for a launch.

Anton Shilov
Contributing Writer

Anton Shilov is a contributing writer at Tom’s Hardware. Over the past couple of decades, he has covered everything from CPUs and GPUs to supercomputers and from modern process technologies and latest fab tools to high-tech industry trends.

  • Lucky_SLS
AMD is delaying the launch to price their top-end GPUs against the 5070/5080 from Nvidia.

For both GPUs and CPUs, AMD announces after the competition to maximise their profit.
    Reply
  • watzupken
I think they are delaying it to 2025 to clear as much RDNA3 stock as possible during the holiday sales. But I'm looking forward to seeing how much improvement RDNA4 brings.
    Reply
  • ezst036
    AMD's chips would be more competitive if they would just simply make them bigger.
    Reply
  • Amdlova
AMD is focused on enterprise now... gaming is a bonus
    Reply
  • YSCCC
    ezst036 said:
    AMD's chips would be more competitive if they would just simply make them bigger.
    Cheaper maybe...

TBF, if the 7900XT and XTX were not priced so close to the Nvidia competition, they would be more competitive. Their issue is drawing more power, with a drawback in RT, and yet not being significantly cheaper, so when someone wants to buy a GPU, the price difference doesn't justify the gaps in game optimisation and RT.
    Reply
  • Notton
    ezst036 said:
    AMD's chips would be more competitive if they would just simply make them bigger.
    I doubt that.
RDNA3 isn't selling because it's too expensive for its perceived value. They have good raster, but most buyers in 2024 want ray tracing.
    FSR+AFMF was a disaster until this year, but by then Nvidia had already answered it with a smoothly working DLSS+framegen, on top of ray tracing.

Hopefully AMD finally learns its lesson and prices the 8800XT aggressively for its feature set. When you're behind, merely catching up and offering the same price isn't going to persuade buyers.
    Reply
  • tamalero
I wonder how long until they actually achieve full chiplet to reduce costs.
    Reply
  • kyzarvs
    YSCCC said:
    Cheaper maybe...

TBF, if the 7900XT and XTX were not priced so close to the Nvidia competition, they would be more competitive. Their issue is drawing more power, with a drawback in RT, and yet not being significantly cheaper, so when someone wants to buy a GPU, the price difference doesn't justify the gaps in game optimisation and RT.
    Depends where you are. As someone who has just bought a 7900XT for 1440p gaming in the UK, price was a major factor. According to the Toms 1440p Ultra chart, the 7900XT falls above a 4070ti Super and below a 4080. The '70 has lower RAM, lower raster and costs more. The '80 is better all round and costs a *lot* more. It was a no-brainer for me.
    Reply
  • ottonis
    YSCCC said:
    Cheaper maybe...

TBF, if the 7900XT and XTX were not priced so close to the Nvidia competition, they would be more competitive. Their issue is drawing more power, with a drawback in RT, and yet not being significantly cheaper, so when someone wants to buy a GPU, the price difference doesn't justify the gaps in game optimisation and RT.
I wonder how much of a difference RT actually makes nowadays. Is there a substantial body of (AAA) games that rely on RT, and do these indeed look considerably worse with fewer/lesser RT effects?
    I think all this RT stuff is more marketing than a real-world issue, because the games I have seen look only marginally better with RT compared to rasterization.
    Reply
  • salgado18
    ottonis said:
I wonder how much of a difference RT actually makes nowadays. Is there a substantial body of (AAA) games that rely on RT, and do these indeed look considerably worse with fewer/lesser RT effects?
    I think all this RT stuff is more marketing than a real-world issue, because the games I have seen look only marginally better with RT compared to rasterization.
It is mostly marketing, but marketing shapes public perception and public perception shapes demand. AMD lost the GPU battle because of RT and DLSS, even with better raster perf/price. Consumers view Radeons as inferior because GeForces have the latest tech and Radeons don't.

Doubling RT is good, but Nvidia has already doubled a couple of times, so to be really competitive they should at least triple RT performance. Also add some AI power to it, and develop FSR 4 to take advantage of all of it.

It doesn't look good, but I trust AMD's ability to at least stay in the game.
    Reply