Intel Gen11 Graphics Will Support Integer Scaling

Intel Vice President Lisa Pearce has revealed via a video tweet that Intel will add integer scaling support at the end of August. The bad news is that only Intel processors with Gen11 or newer graphics will have access to the feature.

Credit: Intel

Modern monitors keep pushing resolutions higher. Unfortunately, not all games, especially older titles, were designed to work at those resolutions. The most common example is trying to play a game that only supports 1280x720 on a 4K gaming monitor (3840x2160). Although there are multiple techniques to scale the image up to the monitor's native resolution, the result isn't always pleasant.
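To make the mismatch concrete, here is a purely illustrative Python check (nothing from Intel's driver) of when a render resolution fits a panel at an exact integer factor:

```python
# Illustrative only: check whether a render resolution fits a panel's
# native resolution at an exact whole-number factor. Not Intel driver code.
def integer_scale_factor(src_w, src_h, dst_w, dst_h):
    """Return the integer scale factor if the destination resolution is an
    exact multiple of the source in both dimensions, otherwise None."""
    if dst_w % src_w == 0 and dst_h % src_h == 0 and dst_w // src_w == dst_h // src_h:
        return dst_w // src_w
    return None

print(integer_scale_factor(1280, 720, 3840, 2160))   # 3 -> each game pixel maps to a 3x3 block
print(integer_scale_factor(1920, 1080, 3840, 2160))  # 2 -> 1080p maps cleanly onto 4K
print(integer_scale_factor(1280, 720, 1920, 1080))   # None -> 1.5x would need interpolation
```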

Credit: Marat Tanalin / tanalin.com

AMD and Nvidia employ bilinear or bicubic interpolation for image scaling. However, the resulting image often turns out blurry, too smooth or too soft. Integer scaling, by contrast, preserves sharpness and the crisp, jagged edges of the original pixels. Pixel-art game and emulator aficionados will surely love the sound of that. The benefits of integer scaling extend beyond retro titles, though, as it lets you play old games at their native resolution on high-resolution monitors with lossless scaling.
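For the curious, here is a minimal sketch of what nearest-neighbor integer upscaling does, assuming a NumPy image array; the function name and toy image are illustrative, not Intel's implementation:

```python
# Minimal sketch of nearest-neighbor integer upscaling with NumPy.
import numpy as np

def integer_upscale(image, factor):
    """Repeat every pixel `factor` times along both axes, so each source
    pixel becomes a crisp factor-by-factor block with no blending."""
    return np.repeat(np.repeat(image, factor, axis=0), factor, axis=1)

# A 2x2 checkerboard scaled 3x: every value is copied, never averaged, so
# the hard edges that bilinear or bicubic filtering would blur stay intact.
tile = np.array([[0, 255],
                 [255, 0]], dtype=np.uint8)
print(integer_upscale(tile, 3))
```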

Despite constant petitions from AMD and Nvidia graphics card owners, neither chipmaker has implemented integer scaling up to this point, so Intel definitely deserves props for listening to users and taking the first step. However, integer scaling will only work on Intel processors with Gen11 graphics. The closest candidate is Ice Lake, which should land before the year ends.

According to Pearce, older Intel parts with Gen9 graphics lack hardware support for nearest-neighbor scaling. Intel did explore a software solution, but deemed it unviable because of the performance hit. The company expects to have integer scaling up and running by the end of August; you will be able to enable and disable the feature through the Intel Graphics Command Center software, as long as you have a compatible processor.

17 comments
  • koblongata
    Finally... Pixel perfect 1080P content for 4K displays... Been asking for it for years...

    Consider that 10nm Intel integrated graphics could do 1080P at normal settings for most new games just fine, AND play back 4K HDR videos just fine. No rush, or even no need, for a seriously expensive gaming rig for my 4K TV anymore...
  • joeblowsmynose
    So this gives more jagged edges while upscaling? That doesn't sound useful ... at all! Unless you need a pixelated image - like the one they posted in the article? I don't get it ....

    Edit: I guess if smoothing was applied after the upscaling ... but wouldn't that defeat the feature? Someone explain, maybe I'm just missing something ...
  • cryoburner
    Quote:
    I don't get it ....

    It's so that you can run a lower resolution that divides evenly into your screen's native resolution, while making it look more like that lower resolution is the display's native resolution. It effectively makes the pixels larger without needlessly blurring them in the process. This is arguably preferable for keeping the image looking sharp. Without integer scaling, if you run a game at 1080p on a 4K screen, for example, it will typically look a bit worse than if you were to run it on a 1080p screen of the same size, since pixels bleed into one another during the upscaling process. That blurring is necessary for resolutions that cannot be evenly divided into your screen's native resolution, but for those that can, you are probably better off without it.

    Unlike the example image, the pixels are still going to be relatively small if you are running, for example, 1080p on a 4K screen. You can still use anti-aliasing routines as well, but you won't have everything getting blurred on top of that when the image is scaled to fit the screen.