H.266 VVC Video Codec Promises to Cut File Sizes in Half

(Image credit: Shutterstock)

Fraunhofer HHI (Heinrich Hertz Institute) today announced the new H.266 codec, also known as the Versatile Video Coding (VVC) codec. The H.266 standard is the result of three years of work from Fraunhofer HHI and its partners.

With an enhanced compression algorithm, the H.266 codec promises to maintain the visual quality of the current H.265 codec, also known as High Efficiency Video Coding (HEVC), at roughly half the file size. That would make H.266 well suited to efficiently transporting high-resolution content, such as 4K or 8K media, over mobile networks.

Furthermore, the H.266 codec supports HDR content and 360-degree videos.

Let's put this in perspective. A 90-minute 4K video consumes up to 10GB of space with the current H.265 codec. According to Fraunhofer HHI's figures, the same video at an identical level of quality would require only 5GB with H.266. The transition from H.265 to H.266 could be a major leap if the latter delivers on its promise.
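For a quick sanity check of those figures, average bitrate follows directly from file size and duration. Here's the arithmetic as a short Python snippet (assuming decimal gigabytes, which published figures like these typically use):

    # Back-of-the-envelope bitrate check for the 90-minute, 10GB example.
    SECONDS = 90 * 60                      # 90 minutes
    h265_bits = 10 * 8 * 1000**3           # 10GB in bits (decimal gigabytes)

    h265_mbps = h265_bits / SECONDS / 1e6  # average bitrate in Mbps
    h266_mbps = h265_mbps / 2              # H.266 claims ~50% of the size

    print(f"H.265: ~{h265_mbps:.1f} Mbps, H.266: ~{h266_mbps:.1f} Mbps")
    # -> H.265: ~14.8 Mbps, H.266: ~7.4 Mbps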

In its press release, Fraunhofer HHI said that the chips required for H.266, especially those for mobile devices, are still in development. This suggests that H.266 content may demand a hefty amount of processing power to encode or decode. Encoding shouldn't be a concern for most users, though, since they are far more likely to consume H.266 content than create it.

Dr. Thomas Schierl, head of the Video Coding and Analytics department at Fraunhofer HHI, said that "this autumn Fraunhofer HHI will publish the first software (for both encoder and decoder) to support H.266/VVC." 

Zhiye Liu
News Editor and Memory Reviewer

Zhiye Liu is a news editor and memory reviewer at Tom’s Hardware. Although he loves everything that’s hardware, he has a soft spot for CPUs, GPUs, and RAM.

  • mwestall
    MKV to MP4 isn't transcoding, encoding or any sort of coding, it's just demuxing/remuxing into a new container. An encode is from the uncompressed or lightly compressed original into a more compressed version. Which takes hours.
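    For anyone unclear on the difference, a remux with ffmpeg looks like this (called from Python here; just a sketch, and the filenames are placeholders):

        import subprocess

        # Remux: copy the existing bitstreams into a new container.
        # Nothing is re-encoded, so this takes seconds, not hours.
        subprocess.run([
            "ffmpeg", "-i", "input.mkv",
            "-c", "copy",        # stream copy: demux + remux only
            "output.mp4",
        ], check=True)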
    Interns writing copy today?
    Reply
  • JarredWaltonGPU
    mwestall said:
    MKV to MP4 isn't transcoding, encoding or any sort of coding, it's just demuxing/remuxing into a new container. An encode is from the uncompressed or lightly compressed original into a more compressed version. Which takes hours.
    Interns writing copy today?
    I assume someone must have already edited this to correct the error? But while MKV and MP4 are just containers, often you'll transcode to a lower bitrate in the process -- which is the real point. So if you have an MP4 captured at 50 Mbps and you want to convert that to a 16 Mbps MP4 (e.g., for YouTube), you're doing a transcode. You could get better quality (with a slower transcode) going to a ~10 Mbps H.265 format, and now apparently a 5 Mbps H.266 format will also deliver approximately the same quality.
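    To make that concrete, a hypothetical version of that transcode with ffmpeg via Python (the filenames and exact settings are made up for illustration):

        import subprocess

        # Transcode: fully decode the 50 Mbps source, then re-encode it
        # at a lower bitrate. Unlike a remux, this is computationally heavy.
        subprocess.run([
            "ffmpeg", "-i", "capture_50mbps.mp4",
            "-c:v", "libx264", "-b:v", "16M",  # H.264 at ~16 Mbps, e.g. for YouTube
            "-c:a", "copy",                    # leave the audio untouched
            "youtube_16mbps.mp4",
        ], check=True)

        # Comparable quality at a lower bitrate with H.265 (slower encode):
        subprocess.run([
            "ffmpeg", "-i", "capture_50mbps.mp4",
            "-c:v", "libx265", "-b:v", "10M",
            "-c:a", "copy",
            "hevc_10mbps.mp4",
        ], check=True)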

    I'm super curious to see what sort of hardware requirements will exist for decoding and encoding of H.266. I remember when H.264 was brand new and couldn't be decoded at anything close to real-time on most PCs. Then the hardware and software caught up. Then H.265 did it all again, and now H.266 looks to be repeating things. Of course, lots of stuff is still in H.264 because it's 'universally' supported these days. Plenty of PCs and smartphones still have issues with H.265 decoding.
    Reply
  • InvalidError
    mwestall said:
    An encode is from the uncompressed or lightly compressed original into a more compressed version.
    Although one would typically want to re-encode from as close to the original source as possible for the highest quality, a 3rd- or 4th-gen re-encode for whatever purpose (e.g., squeezing hours of video onto on-board storage for the kids' tablets) is still an encode too.
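    To illustrate, a rough sketch of a multi-generation re-encode with ffmpeg via Python (placeholder filenames; each pass fully decodes and re-encodes, so artifacts compound):

        import subprocess

        # Each generation decodes the previous output and re-encodes it,
        # so quality loss accumulates pass over pass.
        src = "original.mp4"
        for gen in range(1, 4):                   # 1st- through 3rd-gen re-encodes
            dst = f"gen{gen}.mp4"
            subprocess.run([
                "ffmpeg", "-i", src,
                "-c:v", "libx265", "-crf", "28",  # constant-quality re-encode
                dst,
            ], check=True)
            src = dst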

    JarredWaltonGPU said:
    I'm super curious to see what sort of hardware requirements will exist for decoding and encoding of H.266.
    I'm not particularly worried about decoding; I suspect most of the math is just more refined variants of existing algorithms, and current hardware will be able to accelerate most of it, just like older hardware could accelerate most of H.264 and H.265 before full-blown hardware decoders became mainstream. Encoding is where the real challenge is.
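    A quick way to see what acceleration a given machine already offers is to ask ffmpeg what it was built with (assuming ffmpeg is installed):

        import subprocess

        # List the hardware acceleration methods this ffmpeg build supports
        # (e.g. vaapi, dxva2, cuda); an empty list means software-only.
        result = subprocess.run(
            ["ffmpeg", "-hide_banner", "-hwaccels"],
            capture_output=True, text=True, check=True,
        )
        print(result.stdout)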
    Reply
  • JarredWaltonGPU
    InvalidError said:
    I'm not particularly worried about decoding; I suspect most of the math is just more refined variants of existing algorithms, and current hardware will be able to accelerate most of it, just like older hardware could accelerate most of H.264 and H.265 before full-blown hardware decoders became mainstream. Encoding is where the real challenge is.
    I haven't looked to see what's being done, but while the math may be similar in some aspects, there are usually some very computationally intensive sections. I'm sure most modern PCs with a good GPU and CPU will be fine decoding H.266. The real question is going to be stuff like laptops with integrated graphics. If the decoding complexity is four times higher (which isn't too unrealistic), anything prior to Ice Lake might come up short. I guess we'll see, and most likely H.266 won't see widespread use for many years -- at least if it's anything like HEVC/H.265. Most streaming videos are still using H.264 (or VP9 or some other codec) rather than H.265 AFAICT.
    Reply
  • InvalidError
    JarredWaltonGPU said:
    The real question is going to be stuff like laptops with integrated graphics. If the decoding complexity is four times higher (which isn't too unrealistic), anything prior to Ice Lake might come up short.
    I did an H.265/4K decode test on my i5-3470 a while ago and could simultaneously decode five files in software before things broke down. I think almost anything newer with quad cores should be fine with H.266, especially if any sort of (i)GPU acceleration is available.
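    For anyone who wants to reproduce that sort of test, here's roughly how I'd script it with ffmpeg via Python (the test file name is a placeholder; -f null decodes everything and discards the frames):

        import subprocess

        # Software-decode benchmark: decode as fast as possible, write nothing.
        # Launch several in parallel to find where the CPU falls over.
        procs = [
            subprocess.Popen([
                "ffmpeg", "-hide_banner", "-benchmark",
                "-i", "test_4k_hevc.mkv",
                "-f", "null", "-",          # decode only, discard the output
            ])
            for _ in range(5)               # five simultaneous decodes
        ]
        for p in procs:
            p.wait()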
    Reply
  • rickstockton
    This appears to be a big improvement in compression, while claiming to maintain high visual quality. But will the license terms be 'free' to use and re-implement? If it's not as "open" as AV1, then people like me will not be able to use it. Some content creators have extremely limited ability to pay for software and hardware upgrades. And some kind of VAAPI-equivalent hardware encoding is nearly mandatory for those of us without gigantic server farms of CPUs to encode content.

    I am personally retired with limited income, and I create and transcode videos for a non-profit. I currently depend on ffmpeg as my primary encoder. If the software for encoding videos in this algorithm isn't at least as efficient as VP9, while remaining encodeable on a desktop computer with a mid-grade video card, then I won't be able to go there.
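    For reference, the sort of VAAPI-accelerated encode I mean looks roughly like this with ffmpeg (placeholder filenames; /dev/dri/renderD128 is the usual Linux render node, but it can differ per machine):

        import subprocess

        # Hardware H.264 encode through VAAPI on a typical Linux desktop GPU.
        subprocess.run([
            "ffmpeg", "-vaapi_device", "/dev/dri/renderD128",
            "-i", "talk_raw.mov",
            "-vf", "format=nv12,hwupload",  # convert and upload frames to the GPU
            "-c:v", "h264_vaapi", "-b:v", "8M",
            "talk_h264.mp4",
        ], check=True)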
    Reply
  • Makaveli
    This is an interesting topic for sure.

    My place has a mix of hardware: my plasma TV can decode H.264 but not H.265.

    I have an Android box that can do both since it's newer.

    I would love to see the comparison of all 3 codecs in the future.
    Reply
  • Kamen Rider Blade
    H.266 has the technical side down along with the compression ratio IMO.

    The Licensing is the more interesting part of the story at the moment.

    Especially with the free codec option beating H.264 in compression ratio while maintaining video quality closer to H.265, even if it doesn't quite match H.265 at its best.

    I just hope they get the licensing aspect of the primary H.266 standard down pat and that it's not an entire <Mod Edit> like H.265.

    And we need all the browsers to jointly work on implementing JPEG XT and start getting JPEG XT into all the image editors and viewers.

    That could dramatically cut image file sizes while retaining quality.
    Reply
  • nofanneeded
    This is great news for TV channels, which are all moving to 4K in the near future and need low bandwidth to broadcast. It's also good for satellite channels, where fast internet is hard to find while the satellite link is low speed, and for ships at sea as well.
    Reply
  • cryoburner
    InvalidError said:
    I'm not particularly worried about decoding; I suspect most of the math is just more refined variants of existing algorithms, and current hardware will be able to accelerate most of it, just like older hardware could accelerate most of H.264 and H.265 before full-blown hardware decoders became mainstream. Encoding is where the real challenge is.
    If the format can offer image quality similar to H.265 at half the file size, I suspect one of the other formats would already be doing something similar unless there were some catch. My guess is that they aren't because the decoding performance demands become far higher. That, or the image quality isn't actually comparable at that level of compression.

    Kamen Rider Blade said:
    And we need all the browsers to jointly work on implementing JPEG XT and start getting JPEG XT into all the image editors and viewers.

    That could dramatically cut image file sizes while retaining quality.
    While support for more efficient image formats might be nice, there's arguably much less of a need there. Today's internet connections are mostly fast enough, and server costs cheap enough, that the bandwidth required for downloading images is generally not much of a concern. The same goes for storage, where hundreds of high-resolution images can be stored for a few cents. With video, file sizes tend to be far higher, so there can be much more benefit from moving to a new format.

    And most importantly, the standard JPEG format is pretty much universally supported in software and on computing devices stretching back a couple decades. And for lossless web images, PNG is widely supported, at least for software and devices from the last decade or so. Software doesn't tend to change overnight to support the newest image formats, and if a format isn't already widely in use, there's even less incentive for developers to devote resources to supporting it. The PNG format came out in the mid-90s, as a much improved and open alternative to GIF, but didn't really receive full, proper support in all major web browsers until close to 15 years later, despite there not really being any alternative for lossless web images.

    JPEG XT offers some additional features like HDR support, but HDR-capable hardware is still not all that widespread, and HDR images are relatively rare. So there arguably isn't that much need for those new features at the moment. And of course, there have been lots of other file formats pitched as being "JPEG successors" over the years that haven't really taken off. JPEG 2000, JPEG XR, WebP, HEIF and so on. And even if a format gets implemented properly in all major web browsers, most will likely stick to the older formats for some time, as there are lots of older devices that won't be compatible.
    Reply