AMD's next-generation graphics cards based on the RDNA 3 architecture have passed conformance tests for Khronos Group's Vulkan 1.3 application programming interface. The listing indicates that AMD's new graphics processors are functional and that their drivers can pass the official conformance test suite.
AMD plans to formally unveil its forthcoming graphics cards based on the RDNA 3 architecture next week, on November 3, 2022. Since the new GPUs are little more than a week away, it is not particularly surprising that their drivers for Windows 10 and for Ubuntu with Linux kernel 5.15 are mature enough to pass Vulkan 1.3 conformance tests, based on the list of conformant products published at Khronos.org (via VideoCardz).
The list currently includes not only an Undisclosed Product that belongs to the AMD RDNA 3 family of GPUs, but also an Undisclosed Product that belongs to AMD's Undisclosed Family of GPUs. While we do not know for sure how many of AMD's new products are compliant with the Vulkan 1.3 API, it is safe to say that at least two of them can pass the Vulkan 1.3 conformance tests.
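For context on what "Vulkan 1.3" means at the API level: Vulkan encodes its version numbers into a single 32-bit integer, and a conformant driver reports a packed value of at least 1.3.0 through `vkEnumerateInstanceVersion`. The sketch below mirrors the bit layout of the `VK_MAKE_API_VERSION` macro from the Vulkan headers (variant in bits 31–29, major in 28–22, minor in 21–12, patch in 11–0); the helper names are my own, not part of any SDK.

```python
# Sketch of Vulkan's packed version encoding, mirroring the
# VK_MAKE_API_VERSION macro layout from vulkan_core.h:
#   bits 31..29: variant, 28..22: major, 21..12: minor, 11..0: patch.
# Function names here are illustrative, not Vulkan API calls.

def make_api_version(variant: int, major: int, minor: int, patch: int) -> int:
    """Pack version components into one 32-bit integer."""
    return (variant << 29) | (major << 22) | (minor << 12) | patch

def decode_api_version(v: int) -> tuple:
    """Unpack a Vulkan version integer back into its components."""
    return ((v >> 29) & 0x7, (v >> 22) & 0x7F, (v >> 12) & 0x3FF, v & 0xFFF)

# The value a Vulkan 1.3 driver would report (VK_API_VERSION_1_3):
VK_API_VERSION_1_3 = make_api_version(0, 1, 3, 0)
print(decode_api_version(VK_API_VERSION_1_3))  # (0, 1, 3, 0)
```

In a real application you would compare the integer returned by `vkEnumerateInstanceVersion` against this packed constant rather than decoding it field by field.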
AMD's codenamed Navi 3x graphics processors based on the company's RDNA 3 architecture are projected to substantially increase performance compared to existing Radeon RX 6000-series offerings. Also, the top-of-the-range Navi 31 is expected to use a multi-chiplet design.
Unofficial sources indicate that AMD plans to reveal two high-end RDNA 3-based graphics cards — a flagship offering and the one that is positioned slightly below — this year and start their sales in the second half of December. While prices of the new boards are unknown, there are rumors that AMD might position them higher than it positioned its Radeon RX 6900 XT at launch.
If TSMC says the next process node will deliver 45% more performance at the same power, Nvidia sees that and says, "So for 20% more power we get 52% more performance," and proceeds to make a product like a 4090, 3090, or 2080 Ti.
Whether AMD wants to admit it or not, they do not have the mindshare Nvidia has, so they have to be cautious. And even more so now, since they're definitely in the spotlight while a lot of people on the fence are waiting for AMD to woo them into their fold. These kinds of opportunities do not come often, and they had better execute in a humble yet aggressive way.
Again, this needs to be an HD4870 moment.
AMD can make some good and cheap cards and chips but they have no idea when it comes to A.I.
If I'm not mistaken, they don't even have tensor cores. This is going to be their first generation of tensor cores. Nvidia has been knee-deep in A.I. for uncountably many years.
FSR ≠ DLSS!
Just like the number of digits I have on one hand. I tried to use the other hand to count them but it turns out that hand was no help as it is uncountable as well.
As far as "no point buying it without a DLSS equivalent" goes: you don't have to like it or the way it does its business, but FSR is in fact functional, even if it's not as good as DLSS. G-Sync is technically superior to FreeSync as well, but that's been far from a deal breaker. It's all going to come back to performance and cost.
Second of all, again you can't compare DLSS with FSR because they are completely different.
One is trained on supercomputers at 16K and uses tensor cores; the other doesn't have any tensor cores and merely uses clever algorithms.
Lastly, G-Sync by itself has been dead for years; literally everyone has been using the (unlocked) drivers for ages, with which you can use G-Sync on your FreeSync monitor.
There's no point in buying a non-DLSS card, aka an AMD GPU, unless you're getting a REALLY amazing deal on a card that performs like a higher-tier card (for example, getting a card that performs like a 3090 Ti / 3090 / 3080 Ti at a massive discount).
Otherwise for a small price or performance difference, no. You're spending hundreds of dollars, you may as well get as much for your cash while you're at it.
It's all about price and performance. DLSS with a 3080 would be sweet but not at the current price range.
Example: the 6800 XT performs similarly to the 3080 for 700€ (sometimes the XT outperforms the 3080, other times the 3080 outperforms the XT; on average they are identical fps-wise).
The 3080 costs 800€ at least.
Though the 6800 XT has only recently dropped a bit in price; before that (at least on the price charts for my EU country), it cost almost 800€, which is where my point comes into play: is the price difference really worth not having DLSS at that point? No — DLSS can truly do wonders fps-wise for many years onward.
Not just that — you also get those tensor and RT cores.
People forget it's not just DLSS and cores. It's also the tech behind them: SDKs, pieces of code, etc.
Blender renders with OptiX (ray tracing, denoising) MUCH faster than with CUDA. That's something this fancy "ray tracing" and DLSS movement has also solidified.
Again it depends on you and your situation and location.
I can totally see people not caring about DLSS or AAA titles and just getting the 6800 for 700€, aka much less than the 3080. But then again: spend wisely, wait for sales — it's your money after all.
If they do have the cash, then it's time to look at the prices for both Nvidia and AMD, see what they can afford in that range, and only then check the add-in technologies like DLSS, FSR, ray tracing, DP 2.1, Vulkan 1.3, FreeSync, G-Sync, etc.
Some of them — a tiny group who cares to learn about a product before buying it — may be able to stretch a little more to get their favorite brand, or the technology they want or need. But for the bulk of gamers who have no idea whether Nvidia is this or that, or whether AMD has X or Y, it's all about money: what they can afford and what's available in their part of the world.