AMD's RDNA 3 GPUs Gain Vulkan 1.3 Conformance

(Image credit: PowerColor)

AMD's next-generation graphics cards based on the RDNA 3 architecture have passed conformance tests for Khronos Group's Vulkan 1.3 application programming interface. Conformance indicates that the new graphics processors from AMD are functional and that their drivers implement the API correctly.

AMD plans to formally unveil its forthcoming graphics cards based on the RDNA 3 architecture next week, on November 3, 2022. With the new GPUs roughly a week away, it is not particularly surprising that their drivers for Windows 10 and for Ubuntu with Linux kernel 5.15 already pass the Vulkan 1.3.3.1 conformance tests, according to the list of conformant products published at Khronos.org (via VideoCardz).

The list currently includes not only an Undisclosed Product belonging to the AMD RDNA 3 family of GPUs, but also an Undisclosed Product belonging to an undisclosed AMD family of GPUs. While we do not know for sure how many of AMD's new products are compliant with the Vulkan 1.3 API, it is safe to say that at least two of them can pass the Vulkan 1.3.3.1 conformance tests.
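
For context, passing Khronos' conformance test suite means the driver exposes the Vulkan 1.3 API correctly. The snippet below is a minimal sketch (not part of the official test suite) of how an application can check which Vulkan version the loader and each GPU driver report; it assumes the standard Vulkan headers and a Vulkan 1.1 or newer loader are installed.

```c
/* Minimal sketch: query the Vulkan version reported by the loader and by
 * each GPU driver. Assumes vulkan/vulkan.h and a Vulkan 1.1+ loader are
 * installed; illustrative only, not part of Khronos' conformance suite. */
#include <stdio.h>
#include <vulkan/vulkan.h>

int main(void) {
    /* Version supported by the loader/runtime on this machine. */
    uint32_t loader_version = VK_API_VERSION_1_0;
    vkEnumerateInstanceVersion(&loader_version);
    printf("Loader supports Vulkan %u.%u.%u\n",
           VK_API_VERSION_MAJOR(loader_version),
           VK_API_VERSION_MINOR(loader_version),
           VK_API_VERSION_PATCH(loader_version));

    /* Create an instance targeting Vulkan 1.3, then ask each physical
     * device which API version its driver exposes. */
    VkApplicationInfo app = {
        .sType = VK_STRUCTURE_TYPE_APPLICATION_INFO,
        .apiVersion = VK_API_VERSION_1_3,
    };
    VkInstanceCreateInfo create_info = {
        .sType = VK_STRUCTURE_TYPE_INSTANCE_CREATE_INFO,
        .pApplicationInfo = &app,
    };
    VkInstance instance;
    if (vkCreateInstance(&create_info, NULL, &instance) != VK_SUCCESS) {
        fprintf(stderr, "vkCreateInstance failed\n");
        return 1;
    }

    uint32_t count = 0;
    vkEnumeratePhysicalDevices(instance, &count, NULL);
    if (count > 8) count = 8;   /* keep the sketch simple */
    VkPhysicalDevice devices[8];
    vkEnumeratePhysicalDevices(instance, &count, devices);

    for (uint32_t i = 0; i < count; ++i) {
        VkPhysicalDeviceProperties props;
        vkGetPhysicalDeviceProperties(devices[i], &props);
        printf("%s reports Vulkan %u.%u.%u\n",
               props.deviceName,
               VK_API_VERSION_MAJOR(props.apiVersion),
               VK_API_VERSION_MINOR(props.apiVersion),
               VK_API_VERSION_PATCH(props.apiVersion));
    }

    vkDestroyInstance(instance, NULL);
    return 0;
}
```

A driver that has passed the Vulkan 1.3 conformance tests would be expected to report an apiVersion of 1.3.x for the corresponding device here.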

AMD's Navi 3x graphics processors, based on the company's RDNA 3 architecture, are projected to deliver substantially higher performance than existing Radeon RX 6000-series offerings. The top-of-the-range Navi 31 is also expected to use a multi-chiplet design.

Unofficial sources indicate that AMD plans to reveal two high-end RDNA 3-based graphics cards (a flagship offering and one positioned slightly below it) this year and put them on sale in the second half of December. While prices of the new boards are unknown, there are rumors that AMD might position them higher than it positioned the Radeon RX 6900 XT at launch.

Anton Shilov
Contributing Writer

Anton Shilov is a contributing writer at Tom’s Hardware. Over the past couple of decades, he has covered everything from CPUs and GPUs to supercomputers, and from modern process technologies and the latest fab tools to high-tech industry trends.

  • helper800
    If the top card AMD releases is even within 10-20% of the 4090, it's going to have an MSRP of $1,200 and maybe more. The lower-tier card is probably going to be $1,000. Cost seems to have been increasing at roughly twice the rate of the percentage performance increase for the last six years.

    If TSMC says the next process is going to have 45% more performance for the same power, Nvidia sees that and says, "So for 20% more power we get 52% more performance," and proceeds to make a product like a 4090, or 3090, or 2080 Ti.
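
As a rough illustration of the arithmetic in the comment above: if a new node delivers 45% more performance at the same power, and one assumes (a common back-of-the-envelope rule, not something the commenter, TSMC, or Nvidia has stated) that performance scales roughly with the cube root of power, then a 20% higher power budget lands in the same ballpark as the quoted 52% figure.

```c
/* Back-of-the-envelope sketch of the node-scaling arithmetic above.
 * Assumption (illustrative only): performance grows roughly with the
 * cube root of power at a fixed design point. */
#include <math.h>
#include <stdio.h>

int main(void) {
    double iso_power_gain  = 1.45;  /* +45% performance at the same power */
    double power_increase  = 1.20;  /* +20% power budget */

    /* Extra performance bought by the higher power budget under the
     * cube-root assumption: 1.20^(1/3) is roughly 1.06. */
    double extra_from_power = cbrt(power_increase);

    double total_gain = iso_power_gain * extra_from_power;
    printf("Estimated total gain: +%.0f%%\n", (total_gain - 1.0) * 100.0);
    /* Prints roughly +54%, in the same ballpark as the quoted +52%. */
    return 0;
}
```
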
  • -Fran-
    AMD is probably asking Rocket Scientists and Brain Surgeons how to price this new batch of GPUs, because if they've been paying attention to nVidia's blunders (to call them something), their price will determine 50% of the reactions online, if not more.

    Whether AMD wants to admit it or not, they do not have the mindshare nVidia has, so they have to be cautious. And even more so now, since they're definitely in the spotlight while a lot of people on the fence are waiting for AMD to woo them into their fold. This type of opportunity does not come often, and they had better execute it in a humble yet aggressive way.

    Again, this needs to be an HD4870 moment.

    Regards.
  • KraakBal
    helper800 said:
    If the top card AMD releases is even within 10-20% of the 4090, it's going to have an MSRP of $1,200 and maybe more. The lower-tier card is probably going to be $1,000. Cost seems to have been increasing at roughly twice the rate of the percentage performance increase for the last six years.

    If TSMC says the next process is going to have 45% more performance for the same power, Nvidia sees that and says, "So for 20% more power we get 52% more performance," and proceeds to make a product like a 4090, or 3090, or 2080 Ti.
    Hopefully this means AMD will comfortably beat Nvidia at lower wattage in a generation or two. Unless Nvidia manages to sell a 1 kW GPU by then...
  • ikernelpro4
    helper800 said:
    If the top card AMD releases is even within 10-20% of the 4090, it's going to have an MSRP of $1,200 and maybe more. The lower-tier card is probably going to be $1,000. Cost seems to have been increasing at roughly twice the rate of the percentage performance increase for the last six years.

    If TSMC says the next process is going to have 45% more performance for the same power, Nvidia sees that and says, "So for 20% more power we get 52% more performance," and proceeds to make a product like a 4090, or 3090, or 2080 Ti.
    It can perform as close to the 4090 as it wants, no DLSS equiv. = no point in buying it.

    AMD can make some good and cheap cards and chips but they have no idea when it comes to A.I.

    If I'm not mistaken, they don't even have Tensor cores. This is going to be their first gen tensor cores. Nvidia has been feet-deep in A.I for uncountably many years.

    FSR ≠ DLSS!
  • helper800
    ikernelpro4 said:
    It can perform as close to the 4090 as it wants, no DLSS equiv. = no point in buying it.

    AMD can make some good and cheap cards and chips but they have no idea when it comes to A.I.

    If I'm not mistaken, they don't even have Tensor cores. This is going to be their first gen tensor cores. Nvidia has been feet-deep in A.I for uncountably many years.

    FSR ≠ DLSS!
    Even so, that is a feature, not a requirement. We will not know the viability of AMD's product until they release it and we see the numbers. My point on pricing still stands.
  • MASOUTH
    ikernelpro4 said:
    It can perform as close to the 4090 as it wants, no DLSS equiv. = no point in buying it.

    AMD can make some good and cheap cards and chips but they have no idea when it comes to A.I.

    If I'm not mistaken, they don't even have Tensor cores. This is going to be their first gen tensor cores. Nvidia has been feet-deep in A.I for uncountably many years.

    FSR ≠ DLSS!

    Yes, uncountably.

    Just like the number of digits I have on one hand. I tried to use the other hand to count them but it turns out that hand was no help as it is uncountable as well.


    As far as there being no point in buying it without a DLSS equivalent: you don't have to like it or the way it does its business, but FSR is in fact functional even if it's not as good as DLSS. G-Sync is technically superior to FreeSync as well, but that has been far from a deal breaker. It's all going to come back to performance and cost.
  • ikernelpro4
    MASOUTH said:
    Yes, uncountably.

    Just like the number of digits I have on one hand. I tried to use the other hand to count them but it turns out that hand was no help as it is uncountable as well.


    As far as there being no point in buying it without a DLSS equivalent: you don't have to like it or the way it does its business, but FSR is in fact functional even if it's not as good as DLSS. G-Sync is technically superior to FreeSync as well, but that has been far from a deal breaker. It's all going to come back to performance and cost.
    First of all, I have no idea what you're trying to say with the countability analogy.

    Second of all, again, you can't compare DLSS with FSR because they are completely different.
    One is trained on supercomputers at 16K and uses tensor cores; the other doesn't have any tensor cores and merely uses clever algorithms.

    Lastly, G-Sync by itself has been dead for years; literally everyone has been using the (unlocked) drivers for ages, with which you can use G-Sync on a FreeSync monitor.

    There's no point in buying a non-DLSS card, a.k.a. an AMD GPU, unless you're getting a REALLY amazing deal on a card that performs like a higher-tier card (for example, getting a card that performs like a 3090 Ti / 3090 / 3080 Ti at a massive discount).

    Otherwise, for a small price or performance difference, no. You're spending hundreds of dollars; you may as well get as much for your cash while you're at it.

    It's all about price and performance. DLSS with a 3080 would be sweet, but not at the current price range.
    Example: the 6800 XT performs similarly to the 3080 for 700€ (sometimes the XT outperforms the 3080, other times the 3080 outperforms the XT; on average they are identical FPS-wise).

    The 3080 costs at least 800€.

    The 6800 XT has only recently dropped a bit in price; before that (at least on the price charts for my EU country), it cost almost 800€, which is where my point comes into play: is the price difference really worth not having DLSS at that point? No, because DLSS can truly do wonders FPS-wise for many years to come.
    And it's not just DLSS; you also get those tensor and RT cores.

    People forget it's not just DLSS and cores. It's also the tech behind them: SDKs, pieces of code, etc.
    Blender renders with OptiX (ray tracing, denoising) MUCH faster than with CUDA. That's something this fancy "ray tracing" and DLSS movement has also solidified.

    Again, it depends on you, your situation, and your location.
    I can totally see people not caring about DLSS or AAA titles and just getting the 6800 for 700€, i.e., much less than the 3080. But then again, spend wisely, wait for sales; it's your money after all.
  • RodroX
    At the end of the day, for around 90% of gamers, what's important before anything else is whether they have enough cash to buy a card or not.

    If they do have the cash, then it's time to look at the prices for both Nvidia and AMD, see what they can afford in that range, and only then check the add-in technologies like DLSS, FSR, ray tracing, DP 2.1, Vulkan 1.3.3.1, FreeSync, G-Sync, etc.

    Some of them, a tiny group who care to learn about a product before buying it, may be able to stretch a little more to get their favorite brand or the technology they want or need. But for the bulk of gamers, who have no idea whether Nvidia is this or that, or whether AMD has X or Y, it's all about money, what they can afford, and what's available in their part of the world.