Nvidia AI tech claims to slash gaming GPU memory usage by 85% with zero quality loss — Neural Texture Compression demo reveals stunning visual parity between 6.5GB of VRAM and 970MB

Nvidia's Neural Texture Compression
(Image credit: Nvidia)

As games become more complex and photorealistic, the industry has increasingly leaned on upscaling technology to meet surging hardware demands. VRAM usage has been another casualty of this trend, rising sharply over the past few years. To combat this, Nvidia has developed a technology called "Neural Texture Compression" (NTC), which it revisited in today's GTC talk.

In one example, Nvidia ran a Tuscan Villa scene that consumed 6.5 GB of VRAM with standard block compression; switching to NTC reduced that to just 970 MB, and the image looks identical. An earlier demo from the company showed a flight helmet with 272 MB of uncompressed textures: block compression cut that down to 98 MB, while NTC reduced it to just 11.37 MB, roughly 24x smaller than the original.
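
For reference, those figures check out with some quick arithmetic. Here is a minimal sketch in Python, using only the sizes Nvidia quoted:

    # Compression figures quoted in Nvidia's demos (sizes in MB).
    villa_bc_mb = 6.5 * 1024          # Tuscan Villa scene, block compression
    villa_ntc_mb = 970                # same scene with NTC

    helmet_raw_mb = 272               # flight helmet, uncompressed textures
    helmet_bc_mb = 98                 # block compression
    helmet_ntc_mb = 11.37             # NTC

    # Reduction relative to block compression for the villa scene (~85%).
    print(f"Villa VRAM reduction: {1 - villa_ntc_mb / villa_bc_mb:.0%}")

    # Ratios versus the uncompressed helmet textures (~2.8x and ~24x).
    print(f"Block compression: {helmet_raw_mb / helmet_bc_mb:.1f}x smaller")
    print(f"NTC: {helmet_raw_mb / helmet_ntc_mb:.1f}x smaller")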

Introduction to Neural Rendering - YouTube

The company also demonstrated Neural Materials, which follows the same concept: letting a neural network evaluate and decompress material texture data instead of relying on computationally expensive BRDF math. Typically, multiple texture maps are stacked to build a material, and the GPU must calculate how light interacts with every layer during rendering.

Neural Materials instead asks the neural network how light will react in that scenario and shades the pixel accordingly. Because the network is trained on all of the material's texture data, it already knows the result for a given light direction and viewing angle. In its demo scene, Nvidia achieved up to 7.7x faster render times at 1080p with no loss in image quality.
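
To make that concrete, here is a minimal sketch of what "asking the network" could look like, written in PyTorch. The layer sizes and inputs (a per-texel latent material code plus light and view directions) are illustrative assumptions, not Nvidia's published architecture:

    import torch
    import torch.nn as nn

    class NeuralMaterial(nn.Module):
        """Tiny MLP standing in for a learned material: it maps a latent
        material code plus light/view directions directly to shaded RGB,
        replacing per-layer BRDF evaluation in the shader."""
        def __init__(self, latent_dim=16, hidden=64):
            super().__init__()
            self.mlp = nn.Sequential(
                nn.Linear(latent_dim + 6, hidden), nn.ReLU(),
                nn.Linear(hidden, hidden), nn.ReLU(),
                nn.Linear(hidden, 3), nn.Sigmoid(),  # RGB in [0, 1]
            )

        def forward(self, latent, light_dir, view_dir):
            # One small inference per pixel batch instead of sampling
            # and combining a stack of material texture maps.
            return self.mlp(torch.cat([latent, light_dir, view_dir], dim=-1))

    # Shade a batch of four pixels; at render time the latent codes would
    # be fetched per-texel from the compressed material representation.
    net = NeuralMaterial()
    latent = torch.randn(4, 16)
    light = nn.functional.normalize(torch.randn(4, 3), dim=-1)
    view = nn.functional.normalize(torch.randn(4, 3), dim=-1)
    rgb = net(latent, light, view)    # shape (4, 3)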

NTC is efficient because it runs on matrix acceleration engines, a separate hardware block in modern GPUs, so baseline shader performance isn't affected. Nvidia calls them Tensor Cores, Intel calls them XMX engines, and AMD calls them AI accelerators. Upscalers like DLSS, FSR, and XeSS live on the same hardware, reconstructing a low-resolution frame into a higher-resolution output, which makes NTC another piece of Nvidia's broader neural rendering ambition.

The concept of neural rendering doesn't enjoy wide acclaim in the community yet, and the phrase "neural network" might lead you to think this is just more AI slop. It's actually the opposite, and one of the better uses of AI, since it isn't generative at all: NTC is trained only on the specific set of textures it needs to reproduce during game development, so there's no room for hallucination.
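
One way to picture why hallucination isn't a risk: the network is deliberately overfit to a fixed set of textures, and its trained weights effectively become the compressed asset. Here is a minimal single-texture sketch (the sizes, architecture, and training setup are hypothetical; real NTC uses a more sophisticated latent-grid design):

    import torch
    import torch.nn as nn

    # Stand-in "texture": 256x256 RGB (random data for the sketch).
    texture = torch.rand(256, 256, 3)

    # Coordinate grid: every (u, v) texel position in [0, 1].
    u, v = torch.meshgrid(torch.linspace(0, 1, 256),
                          torch.linspace(0, 1, 256), indexing="ij")
    coords = torch.stack([u, v], dim=-1).reshape(-1, 2)
    targets = texture.reshape(-1, 3)

    # Tiny decoder: (u, v) -> RGB. Its weights *are* the compressed
    # texture, so it can only reproduce the data it was fit to.
    decoder = nn.Sequential(nn.Linear(2, 64), nn.ReLU(),
                            nn.Linear(64, 64), nn.ReLU(),
                            nn.Linear(64, 3), nn.Sigmoid())
    opt = torch.optim.Adam(decoder.parameters(), lr=1e-3)

    for _ in range(1000):             # overfitting is the point here
        loss = nn.functional.mse_loss(decoder(coords), targets)
        opt.zero_grad()
        loss.backward()
        opt.step()

    # ~4.5K weights standing in for ~197K stored color values.
    print(sum(p.numel() for p in decoder.parameters()), coords.shape[0] * 3)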

Textures consume by far the most VRAM in any game, so any technique that keeps them in check is a welcome addition. That said, it's important to note that this isn't exclusive to Nvidia: Microsoft has standardized the underlying capability as "Cooperative Vectors" in DirectX. Intel has previously shown off its own demo with noticeably better textures than block compression, and while AMD last talked about the tech in 2024, it's likely on board as well.
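
For a sense of scale, here is a rough back-of-the-envelope calculation of what a single material can cost (the texture sizes are assumptions; BC7 block compression stores one byte per texel versus four for raw RGBA8):

    # Memory for one 4096x4096 texture map with a full mip chain (~+33%).
    texels = 4096 * 4096
    mip_factor = 4 / 3

    raw_mb = texels * 4 * mip_factor / 2**20     # RGBA8, 4 bytes/texel
    bc7_mb = texels * 1 * mip_factor / 2**20     # BC7, 1 byte/texel
    print(f"RGBA8: {raw_mb:.0f} MB, BC7: {bc7_mb:.0f} MB per map")

    # A PBR material typically stacks several maps (albedo, normal,
    # roughness/metalness, AO, ...), which is how texture budgets
    # balloon into the gigabytes that NTC aims to shrink.
    print(f"~{bc7_mb * 5:.0f} MB block-compressed for a five-map material")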

Currently, no game supports Cooperative Vectors or Nvidia's Neural Texture Compression, but we should start to see it implemented soon, given the industry's trajectory. AI has become the answer to seemingly every age-old problem, and corporations are inventing new ways to incorporate it where it doesn't belong. Innovations like NTC, however, show that it can be implemented tastefully to make an actual, meaningful difference.

Hassam Nasir
Contributing Writer

Hassam Nasir is a die-hard hardware enthusiast with years of experience as a tech editor and writer, focusing on detailed CPU comparisons and general hardware news. When he’s not working, you’ll find him bending tubes for his ever-evolving custom water-loop gaming rig or benchmarking the latest CPUs and GPUs just for fun.

  • beyondlogic
    Admin said:
    Nvidia has just demoed its Neural Texture Compression technique again at a GTC talk, where it showed VRAM usage dropping from 6.5 GB to just 970 MB in a scene. NTC uses a neural network to decompress textures instead of standard block-based compression, reducing texture size and VRAM usage while also improving final image quality.

    Nvidia AI tech claims to slash VRAM usage by 85% with zero quality loss — Neural Texture Compression demo reveals stunning visual parity between 6.... : Read more

    It's decent, but I get the feeling moving objects and scenes are going to break down.

    Even the plate shown has a blurry look due to lack of RAM.

    This is just going to give GPU vendors an excuse for lower-memory cards, and we'll most likely see more horrid-looking games. lol.

    Is it good? Yes. Can it be used for bad? Also yes, lol.
    Reply
  • vanadiel007
    So this is how they are planning to keep the profit margins up during this RAM shortage: a new compression technology that only needs 1 Gig instead of 8 Gig VRAM to provide you those same great graphics.
    Reply
  • usertests
    Some form of neural texture compression is already confirmed to be coming to Xbox Helix, and is highly likely to be used by the PS6.

    So if you have concerns about this, too bad, it's inevitable.

    Will it be used by Nvidia and AMD to skimp out on VRAM and offer more low VRAM cards? Possibly. But if the next-gen consoles include around 24-36 GB of memory, 8-12 GB cards won't cut it forever.
    Reply
  • warezme
    Will this work on video streaming? Imagine the boon to network providers if streaming apps like Netflix, Hulu, etc. required a fraction of the bandwidth to deliver higher-resolution streams.
    Reply
  • psyconz
    beyondlogic said:
    It's decent, but I get the feeling moving objects and scenes are going to break down.

    Even the plate shown has a blurry look due to lack of RAM.

    This is just going to give GPU vendors an excuse for lower-memory cards, and we'll most likely see more horrid-looking games. lol.

    Is it good? Yes. Can it be used for bad? Also yes, lol.
    It's a lossless compression technique. Zero image quality loss. Not all AI is garbage trained on the internet...
    Reply
  • thestryker
    This is the technology that shows the most promise of the AI-based things Nvidia has put forth. As long as Nvidia's implementation isn't proprietary to the point that developers need a separate implementation for other vendors, I could see this getting adoption. I'm also curious whether it has to be used for everything being rendered or if it can work in conjunction with existing techniques.
    Reply
  • timsSOFTWARE
    beyondlogic said:
    It's decent, but I get the feeling moving objects and scenes are going to break down.

    Even the plate shown has a blurry look due to lack of RAM.

    This is just going to give GPU vendors an excuse for lower-memory cards, and we'll most likely see more horrid-looking games. lol.

    Is it good? Yes. Can it be used for bad? Also yes, lol.
    I doubt it will actually be "zero quality loss", but it may very well be a lot better than using a lower mip/lower resolution. People with lower-end cards will benefit the most, if it works on their hardware.
    Reply
  • usertests
    timsSOFTWARE said:
    I doubt it will actually be "zero quality loss", but it may very well be a lot better than using a lower mip/lower resolution. People with lower-end cards will benefit the most, if it works on their hardware.
    It will be interesting to see this play out, and how old you can go.

    I think it would likely work on at least the RTX 5050 and 9060 XT 8GB. But AMD's version coming to next-gen consoles would be running on RDNA5. AMD users have been slapped in the face when it comes to FSR4, and it can happen again with this.
    Reply
  • QuarterSwede
    psyconz said:
    It's a lossless compression technique. Zero image quality loss. Not all AI is garbage trained on the internet...
    Then the examples they gave aren’t indicative of this technique since you can clearly see quality loss …
    Reply
  • -Fran-
    You only need 640KB of memory. We all know that.

    Regards.
    Reply