Microsoft Unveils Its Own Version of Nvidia's RTX Super Resolution

(Image credit: Microsoft)

Microsoft is introducing a new upscaling feature to its Edge browser called Video Super Resolution - or VSR. This feature is a direct competitor to Nvidia's RTX Super Resolution and relies on similar machine-learning technology to upscale lower-resolution video to a higher resolution. The feature is currently available to Edge users running the Canary channel insider build of the browser.

Microsoft's implementation is designed specifically to reduce the amount of internet bandwidth required to stream videos to your PC, and is limited to videos at 720p or lower. Microsoft is using an AI upscaler to do the work, focusing on removing blocky compression artifacts to improve image quality.

VSR's limitation to 720p-and-below resolutions (for now?) targets customers with bad internet connections and older videos recorded before 1080p and 4K became the norm. This could be the first step in creating an upscaler that works with higher-resolution videos.

That contrasts with Nvidia's RTX Super Resolution, which, as far as we can tell, has no resolution limitations; if you wanted to, you could upscale a 360p YouTube video to 4K. Microsoft's cap could be a temporary restriction for testing purposes, or it could be an inherent limit of its AI upscaler. Either way, Nvidia's solution offers a lot more flexibility.

If you have access to VSR, all you need to do is enter edge://flags/#edge-video-super-resolution in the address bar and enable the feature. All video formats should work, with the exception of movies and videos protected by DRM.
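
If you prefer to script it, the same feature can presumably be toggled by launching Edge Canary with Chromium's --enable-features switch. A minimal sketch in Python follows; note that the internal feature name and the install path are assumptions on our part rather than anything Microsoft has documented, so the edge://flags route above remains the reliable one.

import subprocess
from pathlib import Path

# Typical Edge Canary install location on Windows (may differ on your machine).
edge_canary = Path.home() / "AppData/Local/Microsoft/Edge SxS/Application/msedge.exe"

# "EdgeVideoSuperResolution" is a guess at the internal feature name; if it doesn't
# take effect, enable the flag through edge://flags instead.
subprocess.run([str(edge_canary), "--enable-features=EdgeVideoSuperResolution"])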

Unlike RTX Super Resolution, VSR works with both Nvidia and AMD GPUs, with the requirements being an RTX 20-series GPU or newer, or a Radeon RX 5700-series GPU or newer. We suspect more GPUs will be supported down the line, but this will depend on how GPU-intensive Microsoft's AI upscaler is to run.
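
If you want a quick way to see where your card falls relative to those requirements, here is a rough sketch that reads the GPU names Windows reports and matches them against the families above. The name patterns are our own heuristic, not an official compatibility list.

import re
import subprocess

# Heuristic patterns for the stated requirements: GeForce RTX 20-series or newer,
# or Radeon RX 5700-series or newer. Matching model names like this is an
# approximation, not an official compatibility check.
SUPPORTED = [r"RTX\s*(20|30|40)\d{2}", r"RX\s*(5[789]|6\d|7\d)\d{2}"]

def gpu_names():
    # Query GPU names via the legacy wmic tool that ships with Windows.
    out = subprocess.run(
        ["wmic", "path", "win32_VideoController", "get", "name"],
        capture_output=True, text=True, check=True,
    ).stdout
    return [line.strip() for line in out.splitlines()[1:] if line.strip()]

for name in gpu_names():
    ok = any(re.search(p, name, re.IGNORECASE) for p in SUPPORTED)
    print(f"{name}: {'likely supported' if ok else 'likely unsupported'}")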

Mobile versions of these GPUs are supported as well, but there is a significant caveat: you need to manually force Microsoft Edge to run on the laptop's discrete GPU and have the laptop plugged into AC power at the same time.

Aaron Klotz
Freelance News Writer

Aaron Klotz is a freelance writer for Tom’s Hardware US, covering news related to computer hardware such as CPUs and graphics cards.

  • bit_user
    I'd hazard a guess that the resolution limit is due to not wanting to burn too much power upscaling videos that would benefit less from the process. I hope it's configurable.

    I also really hope it includes a decent deinterlacer, since some DVDs aren't simply 24p and there's plenty of terrestrial broadcast that's still 1080i.

    And what's the deal with DRM-protected content? That would include not just many DVDs, but any video from like Netflix, Amazon, etc. right? Pretty big limitation, there.
  • ThisIsMe
    Microsoft's implementation is designed specifically to reduce the amount of internet bandwidth required to stream videos to your PC, and is limited to just 720P videos or lower. Microsoft is using an AI upscaler to do the work, focusing on removing blocky compression artefacts to improve image quality.

    VSR's limitation to sub-HD resolutions (for now?) targets customers with bad internet connectors and older videos recorded before 1080P and 4K became the norm.

    FYI - 720p is HD resolution

    HD = 720p = 1280x720p = High Definition
    FHD = 1080p = 1920x1080p = Full High Definition
    QHD = 1440p = 2560x1440p = Quad High Definition (4xHD)
    UHD = 2160p = 3840x2160p = Ultra High Definition (4xFHD)

    Often 4K and UHD are used interchangeably, and sometimes UHD is used to refer to 4K+HDR.
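
    If you want to sanity-check the 4xHD / 4xFHD figures above, the pixel counts work out exactly (quick back-of-the-envelope in Python):

    resolutions = {"HD": (1280, 720), "FHD": (1920, 1080),
                   "QHD": (2560, 1440), "UHD": (3840, 2160)}
    pixels = {name: w * h for name, (w, h) in resolutions.items()}
    print(pixels["QHD"] / pixels["HD"])   # 4.0 -> QHD is 4x HD by pixel count
    print(pixels["UHD"] / pixels["FHD"])  # 4.0 -> UHD is 4x FHD by pixel count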
  • evdjj3j
    ThisIsMe said:
    FYI - 720p is HD resolution

    HD = 720p = 1280x720p = High Definition
    FHD = 1080p = 1920x1080p = Full High Definition
    QHD = 1440p = 2560x1440p = Quad High Definition (4xHD)
    UHD = 2160p = 3840x2160p = Ultra High Definition (4xFHD)

    Often 4K and UHD are used interchangeably, and sometimes UHD is used to refer to 4K+HDR.

    I came to say the same: 720p is HD, they don't even try. Time for a lesson, authors: 720p is HD, and Intel's first dGPU was the i740 from 1998.
  • bit_user
    ThisIsMe said:
    Often 4K and UHD are used interchangeably,
    I hate it when people use "2k" to refer to 2560x1440. It's not - that's more like "2.6k"! If "4k" is 3840x2160, then "2k" should mean 1920x1080, as that's exactly half in each dimension. Better just not to use "2k", at all, since using it to mean 1920x1080 at this point will likely invite confusion.

    ThisIsMe said:
    and sometimes UHD is used to refer to 4K+HDR.
    I can't endorse that, either. Resolution shouldn't be conflated with HDR.

    evdjj3j said:
    Intel's first dGPU was the i740 from 1998.
    Except they weren't even called GPUs, back then.
  • edzieba
    ThisIsMe said:
    FYI - 720p is HD resolution

    HD = 720p = 1280x720p = High Definition
    FHD = 1080p = 1920x1080p = Full High Definition
    QHD = 1440p = 2560x1440p = Quad High Definition (4xHD)
    UHD = 2160p = 3840x2160p = Ultra High Definition (4xFHD)
    1920x1080 is HD. 720p was a compromise (like EDTV) pushed by marketers wanting to sell lower quality display panels rather than by standards bodies setting transmission and storage resolutions (e.g. SMPTE, ITU-R, DCI).
    The situation was similar to the "HDR" screens available now that can accept a High Dynamic Range input, but only display it either squashed or clipped to fit within a standard dynamic range (and maybe with some global dimming thrown in to make things look even worse).

    ThisIsMe said:
    Often 4K and UHD are used interchangeably, and sometimes UHD is used to refer to 4K+HDR.
    Completely false.
    UHD = 3840x2160 broadcast standard resolution. Standardised before HDR was even a consideration.
    4K = 4096x2160 Digital Cinema Initiatives standardised resolution (and other requirements, such as encoding, subsampling, etc).
    '4K' = Colloquially adopted term for consumer UHD. Maybe it markets better, who knows, but we're stuck with the pointless confusion now.
  • -Fran-
    I wish they would just allow me to stream whatever into MPC-HC so I can use madVR instead.

    Much like I love streaming into Winamp and using all the DSPs it comes with.

    Regards.
  • palladin9479
    720p is definitely "HD" resolution, just like 1080p is "FHD" and so forth.

    https://www.tomshardware.com/reviews/what-is-hd,5745.html
    2160p being called "4K" is 100% a marketing gimmick; it sells better than saying "this device is in full 2160p". Video resolutions are measured on their vertical axis because that's how scanline rendering works, and it's the number of lines required for a single frame. The horizontal is measured as a multiplier of the vertical scanlines, and we call that the Aspect Ratio (4:3, 16:9, 16:10, etc.).
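
    To put numbers on that relationship: given the line count and the aspect ratio, the horizontal resolution falls straight out, e.g.:

    from fractions import Fraction
    aspect = Fraction(16, 9)
    for lines in (720, 1080, 1440, 2160):
        print(f"{lines} lines x 16:9 -> {int(lines * aspect)}x{lines}")  # e.g. 2160 -> 3840x2160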

    For those interested here is an organized list of resolutions and their actual names.

    https://en.wikipedia.org/wiki/Graphics_display_resolution
  • g-unit1111
    Another crappy app store feature?

  • bit_user
    deNameMo said:
    This might be the worst comment I've read in my entire life.
    Apparently that one hit close to home. What a way to call yourself out, though.

    deNameMo said:
    Anyone who refers to 1920x1080 as 2k and 2560x1440 as 2.6k should leave the internet,
    I don't. It was an appeal for consistency. In particular, using the term "2k" to mean 2560x1440 is highly misleading, because the width is 2/3rds of 3840 which most people seem to call "4k", regardless of whether you think they should.

    Mostly, what I want is for nobody to use the term "2k". Ever.

    deNameMo said:
    PS: 3840x2160 is not 4k, 4096x2160 is true 4k but that would require a read up on resolutions which would be too easy.
    I wouldn't mind seeing people stop calling it 4k, but that doesn't seem realistic. So, at least be consistent.
  • bit_user
    palladin9479 said:
    Video resolutions are measured on their vertical axis because that's how scanline rendering works and is the number of lines required for a single frame.
    That's only true since we moved past the NTSC/PAL era. I think the main reason it got adopted as a shorthand is that it was the number which came right before the "i" or the "p", which indicates whether the signal is interlaced or progressive.

    If you go back further, you'd see people talking about "lines of resolution" in a very different way. Because the vertical resolution was set by the video standard (486 visible lines in NTSC, 576 in PAL & SECAM) the only variable was the level of horizontal detail you could see. And that was determined by the signal bandwidth (e.g. transmission, video tape, digital video sampling frequency, etc.).

    To be specific, lines of resolution determined the maximum horizontal frequency you could discern (I forget how much attenuation was acceptable) per picture-height.

    https://en.wikipedia.org/wiki/Television_lines

    It's irrelevant for our current purposes. But anyone who's into vintage video gear (including retro gaming) will perhaps appreciate the history lesson. Because you might see some gaming console or video mode described this way, and now you'll understand exactly what it means.