The developer of CapFrameX, a popular frame-capture and analysis tool, has published a performance comparison of AMD's Radeon RX 6800 XT (RDNA 2), Intel's Arc A770 (Alchemist), and Nvidia's GeForce RTX 3090 (Ampere) and GeForce RTX 4090 (Ada Lovelace) in AV1 video decoding at 8K resolution. The results he obtained are pretty surprising.
The best graphics cards, like Nvidia's GeForce RTX 3090 or RTX 4090, offer formidable performance in demanding games at high resolutions, unlike Intel's Arc A770, which targets mainstream gamers. But when it comes to high-resolution video playback, everything comes down to video decoding, which is handled by a special-purpose hardware unit whose performance does not depend on the overall capabilities of the GPU.
To test the decoding capabilities of modern graphics processors, the developer of CapFrameX took the 'Japan in 8K 60 FPS' video from YouTube and played it back at 4K and 8K resolutions in the Chrome browser. Ideally, every GPU would deliver a constant 60 FPS; occasional drops, reflected in the 0.2% and 1% low figures, should not ruin the experience.
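For readers unfamiliar with the 0.2%/1% "low" figures quoted below, here is a minimal sketch of how such percentile lows are commonly derived from captured frame times. This is an illustration of the general technique, not CapFrameX's actual implementation, and the function name and sample numbers are invented for the example:

```python
def fps_metrics(frame_times_ms):
    """Return (average FPS, 0.2% low FPS) from a list of frame times in ms.

    The 0.2% low is the average FPS computed over only the slowest
    0.2% of captured frames, so it highlights stutter that a plain
    average would hide.
    """
    avg_fps = 1000.0 / (sum(frame_times_ms) / len(frame_times_ms))
    slowest = sorted(frame_times_ms, reverse=True)
    n = max(1, round(len(slowest) * 0.002))  # slowest 0.2% of frames
    low_fps = 1000.0 / (sum(slowest[:n]) / n)
    return avg_fps, low_fps

# Hypothetical capture: mostly 60 FPS frames (~16.7 ms each) plus two
# 60 ms stutter frames; the average stays near 60 FPS, but the
# 0.2% low drops to roughly 17 FPS.
avg, low = fps_metrics([16.7] * 998 + [60.0, 60.0])
```

This is why a card can report a ~60 FPS average yet still feel unwatchable: a handful of long frames barely move the average but dominate the percentile lows.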
In 8K, Intel's Arc A770 delivers smooth playback with 60 FPS on average and drops to 44 FPS in its 0.2% lows. By contrast, Nvidia's GPUs averaged 56.8 – 57.6 FPS and fell to 16.7 FPS at times, making playback uncomfortable to watch. Given how close the results of Nvidia's AD102 and GA102 are, we can only wonder whether Nvidia's current driver and Google Chrome can take advantage of Nvidia's latest NVDEC hardware (and whether Nvidia updated that hardware at all compared to GA102). With AMD's Radeon RX 6800 XT, the 8K video was 'unwatchable' as it suffered from low frame rates and stuttering.
In general, Intel offers industry-leading video playback support even at 8K. By contrast, Nvidia's software may need to catch up with Intel's when it comes to high-resolution AV1 video playback. And while AMD's Navi 21 formally supports AV1 decoding, 8K does not look like a nut it can crack, at least with current software.