Intel details game-boosting frame generation tech that takes a different approach: ExtraSS uses extrapolation, rather than the interpolation that AMD and Nvidia rely on

Intel Arc A750 Limited Edition
(Image credit: Tom's Hardware)

Intel is preparing to introduce its own frame generation technology, called ExtraSS, similar in purpose to DLSS 3 and FSR 3 (via WCCFTech). Detailed at SIGGRAPH Asia in Sydney, ExtraSS is not just a clone of DLSS 3 or FSR 3: instead of using frame interpolation, it uses frame extrapolation. Although the two methods pursue the same goal, extrapolation has some advantages that could set ExtraSS apart from its competitors.

On the whole, ExtraSS is pretty similar to DLSS's and FSR's respective frame generation technologies. Intel has built on top of XeSS and makes use of motion vectors and spatial data to improve visual quality, but with extrapolation, the data used to make a new frame is very different. Instead of using two rendered frames to create a new one to insert in between (that's the inter in interpolation), extrapolation takes just the most recent frame and projects it forward to generate the new one.
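
To make the distinction concrete, here is a minimal sketch in Python with NumPy (our illustration, not code from Intel's paper) of the data each approach consumes. The 50/50 blend and nearest-neighbor warp below are crude stand-ins for the real, far more sophisticated algorithms:

```python
import numpy as np

def interpolate(frame_a: np.ndarray, frame_b: np.ndarray) -> np.ndarray:
    """Interpolation (the DLSS 3 / FSR 3 family): synthesize the frame that
    sits between two rendered frames. frame_b must already exist before the
    in-between frame can be shown, which is where the latency cost comes from."""
    # A plain 50/50 blend stands in for the real flow-guided blending.
    return (frame_a.astype(np.float32) + frame_b.astype(np.float32)) * 0.5

def extrapolate(frame: np.ndarray, motion: np.ndarray) -> np.ndarray:
    """Extrapolation (ExtraSS's approach, greatly simplified): predict the
    next frame from the most recent rendered frame plus per-pixel motion
    vectors. No future frame is required, so nothing is held back."""
    h, w = frame.shape[:2]
    ys, xs = np.mgrid[0:h, 0:w]
    # Backward warp: each output pixel samples the spot it moved from.
    # Clamping at the border is one place disocclusion artifacts creep in.
    src_y = np.clip(ys - motion[..., 1].round().astype(int), 0, h - 1)
    src_x = np.clip(xs - motion[..., 0].round().astype(int), 0, w - 1)
    return frame[src_y, src_x]
```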

The obvious disadvantage here is the lack of extra data to feed into Intel's algorithm. Extrapolation requires a higher-resolution input and can still produce lots of visual glitches and artifacts, as Intel admits in its white paper. However, the benefit is a reduced latency penalty compared to interpolation, which has to delay frames so it can generate new ones (otherwise, they'd show up out of order).
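
Some back-of-the-envelope math shows why that matters (the numbers below are illustrative assumptions, not Intel's measurements). At a 60 fps render rate, an interpolator must hold the newest rendered frame back for roughly one frame time before it can be displayed, while an extrapolator only pays for the generation pass itself:

```python
# Illustrative latency comparison; all numbers are assumptions.
render_fps = 60
frame_time_ms = 1000 / render_fps      # ~16.7 ms per rendered frame
generation_cost_ms = 2.0               # hypothetical cost of the extra pass

# Interpolation: the generated frame goes *between* frames N and N+1, so
# frame N+1 is held back roughly one full frame time before display.
interpolation_penalty_ms = frame_time_ms + generation_cost_ms

# Extrapolation: the generated frame is predicted *ahead* of frame N, so
# only the generation pass itself adds delay.
extrapolation_penalty_ms = generation_cost_ms

print(f"interpolation adds ~{interpolation_penalty_ms:.1f} ms of latency")
print(f"extrapolation adds ~{extrapolation_penalty_ms:.1f} ms of latency")
```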

The reason Intel decided to go with extrapolation where its rivals chose interpolation is that the company has "a new warping method with a lightweight flow model," which apparently makes extrapolation more feasible. However, we don't have footage of the demo (which was done in Unreal Engine), so we can't properly evaluate how good ExtraSS is at this time.
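
Intel hasn't published the internals of that flow model, but the overall shape of the pipeline is straightforward to sketch. In the toy version below, a single global translation estimated by phase correlation stands in for the (presumably learned, per-pixel) lightweight flow model; the point is only to show where a flow estimate and a warp slot into an extrapolation pipeline:

```python
import numpy as np

def global_flow(prev: np.ndarray, curr: np.ndarray) -> tuple[int, int]:
    """Estimate one global (dy, dx) translation between two grayscale frames
    via phase correlation -- a crude stand-in for Intel's unpublished
    lightweight flow model, which presumably predicts per-pixel motion."""
    cross = np.fft.fft2(prev) * np.conj(np.fft.fft2(curr))
    cross /= np.abs(cross) + 1e-8      # keep the phase, drop the magnitude
    corr = np.fft.ifft2(cross).real
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    # Peaks past the midpoint wrap around to negative shifts.
    if dy > prev.shape[0] // 2:
        dy -= prev.shape[0]
    if dx > prev.shape[1] // 2:
        dx -= prev.shape[1]
    return int(dy), int(dx)

def extrapolate_next(prev: np.ndarray, curr: np.ndarray) -> np.ndarray:
    """Warp the newest frame one motion step forward: the displacement
    observed from prev -> curr is simply applied once more."""
    dy, dx = global_flow(prev, curr)
    return np.roll(curr, shift=(-dy, -dx), axis=(0, 1))
```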

The key drawback of DLSS 3 and FSR 3 has been their big latency penalty, which practically has to be mitigated by Nvidia's Reflex and AMD's Anti-Lag technologies, respectively. If its visual quality is good enough, ExtraSS could be a potent alternative, since it doesn't depend on additional features to get its latency to a playable level. That's a pretty big if, though, considering DLSS 3 and FSR 3 already struggle with visual quality despite using the higher-quality frame generation method.

Matthew Connatser

Matthew Connatser is a freelance writer for Tom's Hardware US. He writes articles about CPUs, GPUs, SSDs, and computers in general.

  • coolitic
    So they basically made their own implementation of the Blur Busters method that BB has been trying to get everyone to adopt: https://blurbusters.com/frame-generation-essentials-interpolation-extrapolation-and-reprojection/
    Reply
  • Order 66
    Interesting, frame generation needs lower latency, and I guess this is one of the only ways to decrease latency in the whole process of generating the frames. I can’t wait to see what the latency looks like.
    Reply
  • vanadiel007
    Nothing beats horsepower. Replacing the horses with goats is not going to work. Hopefully the industry keeps working on improving both software and hardware to provide better frame rates in the future.

    I am afraid these frame-boosting techniques will lead to more sloppy software, sloppier programming, and smaller improvements in hardware.
    Reply
  • DougMcC
    vanadiel007 said:
    Nothing beats horsepower. Replacing the horses with goats is not going to work. Hopefully the industry keeps working on improving both software and hardware to provide better frame rates in the future.

    I am afraid these frame-boosting techniques will lead to more sloppy software, sloppier programming, and smaller improvements in hardware.
    The notion that a lot of 'sloppy' software is responsible for slow frame rates is wildly off base. The realism of rendering has been marching steadily upward for years; that is the root cause. Programmers aren't getting worse; their job is getting harder.
    Reply
  • coolitic
    DougMcC said:
    The notion that a lot of 'sloppy' software is responsible for slow frame rates is wildly off base. The realism of rendering has been marching steadily upward for years; that is the root cause. Programmers aren't getting worse; their job is getting harder.
    Yeah, right... One of the big issues is that game engines like Unreal are built to let artists just drop stuff in, in a wide variety of silly formats and setups, without optimization. Artists and designers are overly prioritized in many workflows, and programmers hardly deal with the low-level guts of things anymore.

    Another big issue is that the "modern" programming paradigms of the past two decades have been the death of hardware-centric optimization.
    Reply
  • Alvar "Miles" Udell
    But think about applications where the highest visual fidelity isn't paramount, such as consoles (and handhelds like the Steam Deck) and iGPU setups where power constraints mean max detail levels aren't possible. Having a more streamlined technique is the better way to go.
    Reply
  • coolitic
    Alvar "Miles" Udell said:
    But think about applications where the highest visual fidelity isn't paramount, such as consoles (and handhelds like the Steam Deck) and iGPU setups where power constraints mean max detail levels aren't possible. Having a more streamlined technique is the better way to go.
    There's nothing inherently wrong with the technique, especially for the use case that Blur Busters detailed, which is effectively boosting a base 100 fps game to very high refresh rates. It's just that it unfortunately also doubles as an excuse for increasingly lazy devs to continue being so.
    Reply
  • thestryker
    vanadiel007 said:
    Nothing beats horsepower. Replacing the horses with goats is not going to work. Hopefully the industry keeps working on improving both software and hardware to provide better frame rates in the future.

    I am afraid these frame-boosting techniques will lead to more sloppy software, sloppier programming, and smaller improvements in hardware.
    While I don't disagree that we need better hardware, we absolutely need good frame generation technology too. Right now, the way Nvidia has been pushing frame generation as a universal improvement is toxic and bad for everyone. You need a good frame rate to start with before frame generation becomes a good technology.

    The reason I say it's necessary, though, is that high refresh rates are a big deal, but there's no hardware that can truly drive them. We're going to have 360 Hz 4K screens soon, and we're probably 3+ generations away from being able to drive them even with the highest-end hardware. If we can get $300 GPUs up to 90-120 fps at 4K, frame generation can start to pick up the slack, to everyone's great advantage.
    Reply
  • d0x360
    vanadiel007 said:
    Nothing beats horsepower. Replacing the horses with goats is not going to work. Hopefully the industry keeps working on improving both software and hardware to provide better frame rates in the future.

    I am afraid these frame-boosting techniques will lead to more sloppy software, sloppier programming, and smaller improvements in hardware.
    Unfortunately (or not), it's a technology that's only going to become more prevalent, especially when the console upgrades hit and it becomes a core part of their strategy along with AI-enhanced upscaling.

    The good news is that the tech is constantly improving image-quality-wise, and as long as you're hitting a native frame rate that's roughly half the output, latency isn't really an issue in all but the most twitchy of twitch shooters with low TTK. Latency-reduction technology essentially acts as a frame rate limit, so it's actually not so bad.

    The problems really start showing when a game is running at a native 30, even when only going to 60. That technically fits the half-native rule I talked about, but 30 is just too low. A native 40 would work at a 60 output because 40 fps is exactly halfway to 60 in terms of frame time (25 ms, midway between 33.3 ms at 30 fps and 16.7 ms at 60 fps), which is what matters more, but push beyond a 60 output from a native 40 and you have issues in basically everything.

    I play PC games on my LG C1, so I run a 117 fps cap, and I have a 4090, so my native frame rate is usually 60+. Most games at the upper end of demanding run between 60-80 fps and then get boosted to 100-117.

    I've actually been having quite a bit of fun using the DLSS 3 frame gen injection mods that replace FSR or even "upgrade" standard DLSS 2 so they support frame gen. Praydog just released a mod for Red Dead Redemption 2 that works nicely, and the one for the Dead Space remake is fantastic. It fixes that game's stutter, as does the one for Jedi Survivor, which oddly works better than the game's official DLSS 3 frame gen implementation.
    Reply
  • watzupken
    DougMcC said:
    The notion that a lot of 'sloppy' software is responsible for slow frame rates is wildly off base. The realism of rendering has been marching steadily upward for years; that is the root cause. Programmers aren't getting worse; their job is getting harder.
    I beg to differ here. It's true that the realism of rendering has gone up, but the question is: do you really need it to look so realistic? These folks are selling a game, not art pieces. Look at all the AAA titles that supposedly deliver UE5 "realism": more often than not, they fall flat because the requirements to run the game are simply too high. Which genius decided to shut out 70% of gamers by imposing super-high graphics requirements? I feel this is a self-inflicted problem. Does Baldur's Gate look super good? I doubt it, and yet it sells like hotcakes. I just feel game developers in general have shifted from making a fun game to milking an old franchise or title and using visuals to distract people from the fact that the game is boring.
    Reply