
AMD Promises Open Source Multi-Platform FidelityFX Super Resolution

(Image credit: Shutterstock)

In a recent HotHardware Podcast with Scott Herkelman and Frank Azor, AMD spoke about FidelityFX Super Resolution, the alternative to Nvidia's DLSS technology it promised for RDNA 2. AMD says it wants its version of resolution upscaling and anti-aliasing to be an open-source API and a multi-platform capable technology, similar to what it's done with other FidelityFX technologies like CAS (Contrast Adaptive Sharpening).

Right now, AMD isn't ready to share technical details on its upscaling tech. However, Scott Herkelman was able to share AMD's goals for the technology and what the company wants to achieve with "its version of DLSS."

First, AMD wants to make this technology fully open source, with a non-proprietary API. That could give game developers a much easier time implementing the tech when it's ready: instead of coding for a specific piece of hardware in each game, one piece of code should work essentially plug and play. Second, the open standard will in turn make AMD's tech multi-platform capable. AMD says it wants Super Resolution to work on the new RDNA2-powered consoles, its own graphics cards, and even Intel and Nvidia GPUs (though obviously the first two items in that list are the most important).

This seems like great news for this type of technology. The biggest issue with DLSS is that it's locked into Nvidia GPUs and APIs. It's true that implementing DLSS in widely used game engines like Unreal Engine has made it relatively simple to enable. However, developers that build their own engines still need to put in the extra work, all for a technique that only works on RTX cards.

There are two major questions regarding AMD's forthcoming FidelityFX Super Resolution. The first is simple: Will it look good? We know that DLSS 2.x can look very good, sometimes even surpassing native rendering. Part of that is simply due to the blurriness of temporal AA, however. Remove TAA and use DLSS to remove aliasing and upscale a frame and it's possible to get a more pleasing final rendering result.

Supercomputer artistic rendering

(Image credit: Shutterstock)

The other question is how it will work. Will Super Resolution use an AI trained network to determine the best methods of upscaling and anti-aliasing? Will it be able to make use of hardware acceleration features like Nvidia's Tensor cores? While we don't know AMD's answer for sure, there's a good chance it's a "no" on both aspects.

AMD's Super Resolution needs to work on all platforms, and only Nvidia's hardware has Tensor cores, so AMD has no reason to build tech that relies on a feature it lacks. AI training, however, is still entirely possible: Microsoft alone has a massive amount of supercomputing power available in Azure. The trained algorithm just needs to run on standard GPU cores, and with double-rate FP16 math, RDNA2 GPUs could likely run such an inference network without too much difficulty.
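To give a rough sense of why no special hardware is strictly required: a trained network's forward pass is just dense arithmetic, and that arithmetic can be carried out in half precision on ordinary shader cores. The NumPy sketch below (with random, purely hypothetical weights) illustrates an FP16 layer of the sort an inference network might run; it is an illustration of the concept, not anything AMD has announced:

```python
import numpy as np

def dense_fp16(x, w, b):
    """One fully connected layer evaluated with FP16 inputs.

    Inputs and weights are stored in half precision (halving memory
    traffic); on GPUs with double-rate FP16, such as RDNA2, the math
    itself can also run at up to twice FP32 throughput. Here we
    accumulate in FP32 for stability, as real inference kernels do.
    """
    x16 = x.astype(np.float16)
    w16 = w.astype(np.float16)
    b16 = b.astype(np.float16)
    y = x16.astype(np.float32) @ w16.astype(np.float32) + b16.astype(np.float32)
    return np.maximum(y, 0.0)  # ReLU activation

# Hypothetical "trained" weights -- random here, for illustration only.
rng = np.random.default_rng(0)
x = rng.standard_normal((1, 64)).astype(np.float32)
w = (rng.standard_normal((64, 32)) * 0.1).astype(np.float32)
b = np.zeros(32, dtype=np.float32)
y = dense_fp16(x, w, b)
print(y.shape)  # (1, 32)
```

The round-trip through `float16` is where precision is traded for speed; for an upscaling network, that small loss is typically invisible in the final image.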

That leads into the algorithm itself. AMD already offers resolution upscaling and enhancement via the custom-tuned CAS algorithm, which can improve visuals and is extremely low impact when it comes to performance (less than 1 percent difference). But CAS right now doesn't do nearly as well with upscaling, so Super Resolution just needs to focus on that aspect. Instead of the current methodology where a game does the rendering, applies an overly blurry temporal AA filter, undoes some of the blurring via CAS, and then upscales the final result ... what if Super Resolution can combine all of those steps into one superior filter? That would be the goal, at least in our minds.
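To make the "combine the filters" idea concrete, here is a simplified, grayscale sketch of what contrast-adaptive sharpening does: each pixel is sharpened by an amount that depends on the contrast in its local neighborhood, so flat regions get a strong boost while already-detailed edges are left mostly alone (avoiding ringing). This is an approximation of the general technique, not AMD's shipped CAS shader:

```python
import numpy as np

def cas_like_sharpen(img, strength=0.2):
    """Simplified contrast-adaptive sharpening for a grayscale image in [0, 1].

    For each pixel, examine its 4-neighborhood. Where the local
    min/max range is small (flat area), apply strong unsharp-mask
    sharpening; where the range is large (a hard edge), back off.
    """
    img = img.astype(np.float32)
    p = np.pad(img, 1, mode="edge")
    n = p[:-2, 1:-1]   # neighbor above
    s = p[2:, 1:-1]    # neighbor below
    w = p[1:-1, :-2]   # neighbor left
    e = p[1:-1, 2:]    # neighbor right
    neighbors = np.stack([n, s, w, e])
    lo = np.minimum(neighbors.min(axis=0), img)
    hi = np.maximum(neighbors.max(axis=0), img)
    # Adaptive weight: low local contrast -> stronger sharpening.
    contrast = hi - lo                      # in [0, 1] for normalized input
    amount = strength * (1.0 - contrast)
    # Laplacian (unsharp-mask) term, scaled per pixel by the weight above.
    sharpened = img + amount * (4.0 * img - (n + s + w + e))
    return np.clip(sharpened, 0.0, 1.0)
```

A combined Super Resolution pass would, in spirit, fold a filter like this together with upscaling and anti-aliasing into a single step instead of chaining them.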

Naturally, AMD has a vested interest in making this work, but more importantly AMD's console partners have even more of a need for tech like Super Resolution. The PlayStation 5 and Xbox Series X both theoretically offer up to 4K at 120Hz support, but hitting 120 fps at native 4K with GPU hardware that's already slower than a Radeon RX 6800 isn't going to happen without reducing image fidelity, or some other trick. Super Resolution fills that niche.

Overall, it's great that AMD is working on a completely open version of upscaling and anti-aliasing. If the FidelityFX Super Resolution tech can bring DLSS-type quality to other GPUs, potentially including existing RX 5000 series and even Nvidia GTX cards, it'll be a game changer. We still need to see the result, however, and we also need game developers to adopt it. That of course will take time.

  • splashmaker
Based on how deep learning works, I don't think you'd want to combine it with CAS. The network should have visibility into all aspects of the image generation; CAS would be too basic and generate lower-quality outputs compared to a DL model. Overall I think you might be able to get good quality without tensor acceleration, but at a lower resolution or frame rate than with acceleration.
    Reply
  • VforV
The bigger question is: will it work on the RX 5000 series? Because if it does that's a huge performance gain for RDNA 1, extending its life... and making the RX 5700 XT even more attractive vs the 2070 Super than it already is.
    Reply
  • FlxDrv
    VforV said:
The bigger question is: will it work on the RX 5000 series? Because if it does that's a huge performance gain for RDNA 1, extending its life... and making the RX 5700 XT even more attractive vs the 2070 Super than it already is.
Yes it will. Super Resolution will use DirectML, and since even the new RDNA2 cards have no hardware-accelerated machine learning cores, there's no reason they can't enable the feature on older cards. You'll just need a DirectX 12 GPU for it to work, so it may work all the way back to GCN 1 (HD 7xxx), but RDNA2 and RDNA1 will probably get it first, with older cards following later if the feature needs adapting to each architecture.
I'm pretty confident that at least RDNA1 and RDNA2 GPUs will get it in the next big Adrenalin update that comes at the end of every year, so in the next few days to maybe a week or two, hopefully.
    Reply