AMD's FSR Uses Lanczos Tech, Just Like Nvidia's Years-Old Sharpening Filter

AMD FSR
(Image credit: AMD / Tom's Hardware)

When AMD originally announced it was working on FidelityFX Super Resolution (AMD FSR), an image upscaling and enhancement algorithm that would work on all of the best graphics cards and allow AMD to compete with Nvidia's DLSS, it raised a lot of questions. Would it use machine learning, or some other upscaling technique? AMD has apparently claimed the algorithm was built "entirely in-house," but people have looked at the source code and determined that it's largely based on the existing Lanczos resampling algorithm, which has been around for several decades and is already in use by Nvidia for its sharpening filter. Ah, the joys of open-source software.

To be fair, AMD has done more than just straight-up reuse Lanczos resampling. Specifically, FSR includes optimizations that allow it to run faster, along with additional filters that help remove any halos caused by the sharpening. But, perhaps most importantly, AMD threw its weight behind creating an open-source solution that game developers — or really anyone — could incorporate into their applications. The ideas behind upscaling and enhancing video content aren't remotely new, but sometimes it takes a bit of elbow grease to get everyone on the same page.
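For the curious, the heart of Lanczos resampling is a windowed sinc kernel. Here's a minimal 1D sketch in Python for illustration only; FSR's actual shader code is a heavily optimized 2D variant, and the helper names below are our own, not AMD's:

```python
import math

def lanczos_kernel(x: float, a: int = 3) -> float:
    """Lanczos window: sinc(x) * sinc(x/a) for |x| < a, zero outside."""
    if x == 0.0:
        return 1.0
    if abs(x) >= a:
        return 0.0
    px = math.pi * x
    # sinc(x) * sinc(x/a) with sinc(t) = sin(pi*t) / (pi*t)
    return a * math.sin(px) * math.sin(px / a) / (px * px)

def lanczos_resample_1d(samples, new_len, a=3):
    """Resample a 1D signal by taking a Lanczos-weighted sum of nearby
    source samples at each destination position (edges are clamped)."""
    n = len(samples)
    scale = n / new_len
    out = []
    for i in range(new_len):
        # Map the destination index back into source coordinates.
        x = (i + 0.5) * scale - 0.5
        lo = math.floor(x) - a + 1
        acc = wsum = 0.0
        for j in range(lo, lo + 2 * a):
            w = lanczos_kernel(x - j, a)
            acc += w * samples[min(max(j, 0), n - 1)]
            wsum += w
        out.append(acc / wsum)  # normalize so flat regions stay flat
    return out
```

Real implementations apply this separably along both image axes and fold in a sharpening pass; the sketch just shows the kernel shape and the weighted-sum structure that FSR builds on.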

One key point with FSR is that it's supposed to be incorporated into a game, so that it only gets applied to the 3D rendered content and not to other things like UI elements or text. Of course, there's nothing stopping people from using FSR to upscale everything — and in fact, several projects exist that aim to do just that — but some things are best left rendered at native resolution. By putting some work into optimizing and standardizing the FSR algorithm, AMD has already managed to get at least two dozen games to adopt the technology, along with Unreal Engine and Unity Engine.

Ironically, Nvidia itself added Lanczos upscaling and sharpening as a driver-level filter years ago. Could it have put in the work to get developers to use Lanczos upscaling instead of temporal upscaling — or in addition to temporal upscaling — back in the Pascal GPU generation? Absolutely. But instead, it was left as a mere filter while Nvidia put its engineering efforts into creating DLSS, an AI-driven upscaling and enhancement algorithm that's proprietary to Nvidia RTX GPUs. And to be fair, DLSS 2.0 and later revisions work quite well, arguably better than Lanczos resampling.

But as we noted in our FSR testing, having a general-purpose algorithm that works on any modern GPU — everything from Intel UHD 630 to Nvidia RTX 3090 to AMD RX 6900 XT — has a lot of benefits. For example, if we take a look at the current Steam Hardware Survey (focusing on the DirectX 11 GPUs), we find that only 17.6% of all PCs in the most recent survey have an RTX card. That means over 80% of the gaming market can't currently make use of DLSS. In contrast, every PC in the survey with a DirectX 11 or later GPU should be able to use FSR.

As a former software developer, I can attest to the fact that it's far easier to get management to greenlight a new feature when said feature benefits 100% of the intended user base, rather than only a small portion of the potential users. The needs of the many outweigh the needs of the few, or something like that.

Of course, the proof is in the eating of the pudding, and FSR pudding tastes nearly as good as natively rendered pudding — maybe a bit undermixed, but you almost wouldn't notice, at least when using the ultra quality or quality profiles. Let's just not get too carried away with congratulating AMD on creating something new and useful when what we really should be doing is asking what took so long.
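For reference, the quality profiles mentioned above correspond to fixed render-resolution scale factors (per AMD's published figures for FSR 1.0: Ultra Quality 1.3x, Quality 1.5x, Balanced 1.7x, Performance 2.0x per axis). A quick sketch of the arithmetic, assuming those values:

```python
# Per-axis scale factors for AMD FSR 1.0 quality modes (AMD's published values).
FSR_MODES = {
    "Ultra Quality": 1.3,
    "Quality": 1.5,
    "Balanced": 1.7,
    "Performance": 2.0,
}

def fsr_render_resolution(out_w: int, out_h: int, mode: str) -> tuple:
    """Internal render resolution FSR upscales from, for a given output size."""
    scale = FSR_MODES[mode]
    return round(out_w / scale), round(out_h / scale)
```

At 4K output, for example, Performance mode renders internally at 1920x1080 and upscales from there, while Quality mode renders at 2560x1440.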

If one of the companies behind our modern graphics APIs — meaning, Microsoft and DirectX or Khronos and Vulkan/OpenGL — had integrated Lanczos upscaling and promoted it as a useful function for game developers five or ten years ago, we could have been benefiting from the technology all those years. But then maybe people would have been able to sit on a slightly older GPU for another year or two before upgrading, which maybe wasn't deemed in the best interest of AMD or Nvidia until now.

Anyway, FSR is pretty useful, even if it's just a modified take on a decades-old algorithm. Sometimes the best solutions are the old solutions. DLSS is also great if you have an RTX card — sorry, GTX and AMD owners. Hopefully, we'll see more widespread adoption of both technologies in future games, because choice is almost always a good thing to have.

Jarred Walton

Jarred Walton is a senior editor at Tom's Hardware focusing on everything GPU. He has been working as a tech journalist since 2004, writing for AnandTech, Maximum PC, and PC Gamer. From the first S3 Virge '3D decelerators' to today's GPUs, Jarred keeps up with all the latest graphics trends and is the one to ask about game performance.

  • Alvar "Miles" Udell
    Here's the thing though, at least to me. Going by the Steam Hardware Survey 67% of users use 1920x1080 resolution, and GPUs have been capable of 1080p60 gaming for several years now, even the most popular card on the survey, the 5 year old GTX 1060, can do it with little to no detail compromises on most games.

    By using any subsampling-upscaling filter or feature, be it DLSS, FSR, or an inbuilt subsampling feature which some games have, you're making a detail compromise, for FPS purposes. When nVidia and AMD were caught doing this in their drivers, I remember there was a massive outcry from all the reputable tech sites, including TomsHardware, at how underhanded it was, but now in 2021 it's a desired feature to cut quality in the name of FPS?

    So to say FSR is a feature which benefits 100% of the userbase is a stretch. For laptops and IGPs certainly it's a benefit given the weaker power of them, and it's much akin to how this generation of consoles use dynamic resolution to maintain FPS, but for discrete card PC users...
    Reply
  • vinay2070
    Alvar Miles Udell said:
    Here's the thing though, at least to me. Going by the Steam Hardware Survey 67% of users use 1920x1080 resolution, and GPUs have been capable of 1080p60 gaming for several years now, even the most popular card on the survey, the 5 year old GTX 1060, can do it with little to no detail compromises on most games.

    By using any subsampling-upscaling filter or feature, be it DLSS, FSR, or an inbuilt subsampling feature which some games have, you're making a detail compromise, for FPS purposes. When nVidia and AMD were caught doing this in their drivers, I remember there was a massive outcry from all the reputable tech sites, including TomsHardware, at how underhanded it was, but now in 2021 it's a desired feature to cut quality in the name of FPS?

    So to say FSR is a feature which benefits 100% of the userbase is a stretch. For laptops and IGPs certainly it's a benefit given the weaker power of them, and it's much akin to how this generation of consoles use dynamic resolution to maintain FPS, but for discrete card PC users...
Two things.
It's required because ray tracing got introduced.
It makes sense because a few years back games were mostly played at 1080p 60Hz. Now 1440p high refresh rate and 4K are slowly becoming the norm. Upscaling 720p to 1080p might have looked a little rough back then; upscaling 1080p or 1440p to 4K doesn't.
    Reply
  • digitalgriffin
    Alvar Miles Udell said:
    Here's the thing though, at least to me. Going by the Steam Hardware Survey 67% of users use 1920x1080 resolution, and GPUs have been capable of 1080p60 gaming for several years now, even the most popular card on the survey, the 5 year old GTX 1060, can do it with little to no detail compromises on most games.

    By using any subsampling-upscaling filter or feature, be it DLSS, FSR, or an inbuilt subsampling feature which some games have, you're making a detail compromise, for FPS purposes. When nVidia and AMD were caught doing this in their drivers, I remember there was a massive outcry from all the reputable tech sites, including TomsHardware, at how underhanded it was, but now in 2021 it's a desired feature to cut quality in the name of FPS?

    So to say FSR is a feature which benefits 100% of the userbase is a stretch. For laptops and IGPs certainly it's a benefit given the weaker power of them, and it's much akin to how this generation of consoles use dynamic resolution to maintain FPS, but for discrete card PC users...

As games advance, a card that could run 1080p maxed out four years ago (say, an RX 580) can no longer do so. Details, shaders, and game engines get more complex all the time. I now consider an RX 580 bottom of the barrel for 1080p gaming: acceptable, but entry level.
    Reply
  • watzupken
In my opinion, Nvidia could have done the same, but of course it will always choose the proprietary route to differentiate itself from the competition, and also to entice existing users to upgrade. I don't recall Nvidia releasing a non-proprietary new technology in recent years. It's usually only after a few years, and under pressure, that it makes some technology more open.
    Reply
  • watzupken
    Alvar Miles Udell said:
    Here's the thing though, at least to me. Going by the Steam Hardware Survey 67% of users use 1920x1080 resolution, and GPUs have been capable of 1080p60 gaming for several years now, even the most popular card on the survey, the 5 year old GTX 1060, can do it with little to no detail compromises on most games.

    By using any subsampling-upscaling filter or feature, be it DLSS, FSR, or an inbuilt subsampling feature which some games have, you're making a detail compromise, for FPS purposes. When nVidia and AMD were caught doing this in their drivers, I remember there was a massive outcry from all the reputable tech sites, including TomsHardware, at how underhanded it was, but now in 2021 it's a desired feature to cut quality in the name of FPS?

    So to say FSR is a feature which benefits 100% of the userbase is a stretch. For laptops and IGPs certainly it's a benefit given the weaker power of them, and it's much akin to how this generation of consoles use dynamic resolution to maintain FPS, but for discrete card PC users...
The Steam survey is backward looking, and to be honest, recent game titles are getting increasingly taxing on the GPU. FSR basically opens up the possibility of running a game at an acceptable or preferred framerate. In my opinion, it makes sense to use FSR when one is otherwise forced to drop from 1080p straight down to 720p. FSR at ultra quality settings doesn't go all the way down to 720p, so it helps preserve some detail while smoothing out the jaggies that are very obvious at low resolution. Overall, I feel it's a win. Even if I were to compare FSR upscaling from 720p to 1080p against native 720p, I believe the former would still be the better alternative. So no complaints here. If you think that's cheating, then just don't use DLSS or FSR; the feature can be toggled on or off, and reviewers are transparent about whether DLSS or FSR is used. So I don't see any cheating in this case.
    Reply
  • Alvar "Miles" Udell
    vinay2070 said:
    2 things.
    It is required because Raytracing got introduced.
    It makes sense because few years back games were mostly played on 1080 60Hz. Now 1440P high refresh rate and 4K are slowly becoming a norm. And upscaling a 720P to 1080 might have looked a lil crap back then, Upscaling a 1080 or 1440P to 4K dont look crap.

1) Ray tracing is still in its infancy, and newer, far more efficient methods are being developed. Remember the ReSTIR algorithm, and Nvidia's brilliant demonstration video last year, which claimed it was 6-60x more efficient? Today's ray tracing is still very much a brute-force application.

    2) But TomsHardware's own review states that the quality loss becomes noticeable at the "balanced" setting and only gets worse from there. To quote, "There's an often perceptible loss in image quality, especially if you go beyond the Ultra Quality mode", and, "FSR has no qualms about scaling to higher fps, and if you don't mind the loss of image quality, running in Performance mode often more than doubles performance. (So does running at 1080p instead of 4K.) "

Which goes back to my original argument: If you're already compromising on detail and image quality by using FSR, why not just run at a lower resolution, especially at an even fraction of the native resolution so the upscaling isn't fractional? Or better yet, why not cut some of the details that may not amount to much visually, especially compared to artifacts or blurry textures? All you're really losing then is the meaningless ability to say you're playing at X resolution on Y weak card.
    Reply
  • watzupken
    Alvar Miles Udell said:

    Which goes back to my original argument: If you're already compromising on details and IQ by using FSR, why not just run it at a lower resolution, especially if you play an even fraction of the native resolution so the upscaling isn't fractional? Or better yet, why not cut out some of the details which may not amount to much of anything visually, especially compared to artifacts or blurry textures? All you're really losing then is the meaningless ability to say you're playing at X resolution on Y weak card.
The reason these upscaling technologies show you 1080p instead of saying 720p is that the resulting picture quality is supposed to land somewhere in the middle, ideally closer to the higher-resolution IQ. As I mentioned, you have the option of a straight downgrade in resolution or graphical settings to boost performance, but the end result more often than not looks a lot worse than running, say, DLSS or FSR at high quality settings. One immediate issue with downgrading from 1080p to native 720p is the jaggies, which aren't as bad when running FSR or DLSS.

Image quality loss at lower settings is not unexpected. The technology is basically trying to enhance whatever is on the screen, i.e., 720p or lower. Most reviews have concluded that one should not go lower than the Quality setting, and ideally stick to Ultra Quality for FSR. In my opinion, at 1080p there is very little reason to use anything less than the Quality setting. And as resolution decreases, the CPU bottleneck increases, so you are not going to see a meaningful improvement in FPS anyway, despite the image quality hit.
    Reply
  • Alvar "Miles" Udell
    watzupken said:
    Steam survey is backward looking, and to be honest, recent game titles are getting increasingly taxing on the GPU. FSR basically opens up the possibility of you running the game at an acceptable or preferred framerate. In my opinion, I think it makes sense to use FSR when one is forced to lower resolution from 1080p straight down to 720p. FSR at ultra quality settings don't go down directly to 720p, so that helps preserve some details, and at the same time, smoothens out the jaggies that is very obvious at low resolution. So overall, I feel it is a win. Even if I were to match FSR using a 720p (to upscale to 1080p) vs native 720p, I believe the former is still going to be a better alternative. So no complains here. If you think that is cheating, then just don't use DLSS or FSR. I think the feature is there where you can toggle on or off, so when it comes to review, reviewers are transparent about whether DLSS or FSR is used. So I don't see any cheating in this case.

    Nvidia Points Finger at AMD's Image Quality Cheat | Tom's Hardware (tomshardware.com)

That is what I was referring to. There was an earlier incident as well, but that article has been removed, due to age no doubt.

Personally I don't care what people do, they can run at 4K performance-mode minimal details on an RTX 3090 for over 9000 FPS if that's what they want. All I'm saying is it doesn't make sense to me to play at an upscaled higher resolution with visual artifacts rather than at an aspect-proportional lower resolution, especially if your goal is just to say you're playing at 4K on a weak card, since you can do that anyway with AMD's VSR.

    And I say this as someone who before last year had a Fury Nano (which AMD unofficially abandoned long ago, so I abandoned them) on a brilliant non-gaming LG 4K display, and 1920x1080 on a 4K display with nothing but the monitor's native scaler looks very nice.
    Reply
  • Alvar "Miles" Udell
    watzupken said:
    The reason why these upscaling technologies are showing you 1080p instead of saying 720p is because the results in terms of picture quality is supposed to land somewhere in the middle, and ideally closer to the higher resolution IQ. As I mentioned, you have the option for a straight downgrade in resolution or graphical settings to boost performance. But the end results more than often looks a lot worst than running say DLSS or FSR at high quality settings. One immediate issue with downgrading from 1080p to a native 720p is the jaggies, which is not as bad when running FSR or DLSS.

But you wouldn't drop from 1920×1080 straight to 1280×720; there's 1600×900, 1680×1050, and 1440×900 between those, and you'd start there, using your monitor's or driver's aspect-ratio scaling option so you don't get stretched garbage.
    Reply
  • JarredWaltonGPU
    Alvar Miles Udell said:
    Nvidia Points Finger at AMD's Image Quality Cheat | Tom's Hardware (tomshardware.com)
    That is to what I was referring. There was an incident before but that article has been removed due to age no doubt.
    It's one thing when a company does less work without disclosing what's going on. If AMD or Nvidia did FSR-like upscaling but didn't tell anyone the GPU was really only rendering at 80% of the resolution or whatever, that's cheating. If it's a setting in a game that you can choose to use -- or not use -- that's completely different. A lot of games have resolution scaling now, but they tend to use temporal scaling rather than something like FSR. Epic claims it has a scaling algorithm that's "at least as good as FSR" or some such, I think, but it also supports FSR in Unreal Engine if devs want that. As long as a game has settings that can be turned on/off -- like variable rate shading -- I'm all for potential improvements in performance that have a negligible loss in image quality being an option.
    Reply