
New Encoding Tech May Bring Billions of Color to Blu-ray

By - Source: Folded Space | 21 comments

One company is looking to bring 12-bit color encoding to Blu-ray.

Earlier this week, a company called Folded Space said that it has created encoding/decoding algorithms that will bring deeper color to media such as movies and TV shows, which are filmed using high dynamic range digital cameras. This set of algorithms is called DCE (Deep Color Encoding), and will process original content with 12 bits per color instead of the current 8 bits.

"With DCE, studios can now release Blu-ray discs and even next generation UHD/4K physical media to support what's commonly considered to be the most important, most visual improvement in next generation video," said John Schuermann, who leads business development for Folded Space.

Although the wave of 4K UHD TVs, Blu-ray players and other equipment is beginning to saturate the market, most of the media that will play on these devices is encoded with 8 bits per color channel. Look at that media up close and it's easy to see color "banding": visible stripes where a gradient has too few shades to blend smoothly.

By contrast, encoding media in 10-bit, 12-bit or even 16-bit all but eliminates banding, because there are billions of colors to draw on and each step in a gradient becomes too small to see. Folded Space's DCE can do this while keeping the content roughly the same size as the 8-bit version, and while preserving backwards compatibility. That means a single Blu-ray disc can deliver a movie that works on both 8-bit and 12-bit players.
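The math behind those claims is simple to sketch. The snippet below is a generic illustration of bit-depth arithmetic (not Folded Space's actual DCE algorithm): it counts the shades each bit depth provides per channel, and how many quantization steps a narrow gradient, such as a dim sky covering 10% of the brightness range, gets sliced into.

```python
def shades(bits: int) -> int:
    """Distinct levels a single color channel can represent."""
    return 2 ** bits

def steps_in_gradient(bits: int, fraction: float) -> int:
    """How many quantization steps a gradient spanning `fraction`
    of the channel's range is broken into."""
    return int(shades(bits) * fraction)

# A dim sky covering 10% of the brightness range:
for bits in (8, 10, 12):
    print(f"{bits}-bit: {shades(bits):5d} shades/channel, "
          f"{steps_in_gradient(bits, 0.10):4d} steps in the gradient")
```

At 8 bits that gradient gets only 25 steps, each one a potentially visible band edge; at 12 bits it gets 409, far too fine for the eye to pick out.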

Of course, the big drawback is getting the industry to accept the new technology. The change applies not only to the media pressed onto a Blu-ray disc, but to the players that read it as well. That means existing devices would need an Internet connection to receive a firmware update that supports the decoding algorithm.

"The company's proprietary yet simple and fast algorithms process original content with 12-bits per color and imperceptibly encode information about the fine color detail into a standard, backward compatible 8-bit Blu-ray disc," reads the company's press release. "Newer displays and Blu-ray players with the decoding algorithm can then restore a 12-bit equivalent of the original image in support of much greater color range of recently announced displays."

Folded Space plans on licensing the encoding algorithm to software partners free of charge to "stimulate" deep color and high dynamic range content production as soon as possible. The company also plans to license the decoding algorithm to player and display partners for a "modest" fee.

For more information about Folded Space and DCE, visit the company's website.

  • -5
    xenol , January 27, 2014 9:04 AM
    Either I don't care or I don't see it, but color banding never really seemed to bother me. I would think that the film editor would be cautious about scenes that may produce banding as well...
  • 6
    InvalidError , January 27, 2014 11:12 AM
    You can already have 10 bits with plain old H.264, and streams encoded in "hi10" often end up smaller for a given quality due to reduced pixel noise/errors.

    While banding with 8 bits usually is not too bad, there are times when it can be distracting enough to make me wish 10 bits was more common.
  • 7
    Kewlx25 , January 27, 2014 11:14 AM
    Quote:
    Either I don't care or I don't see it, but color banding never really seemed to bother me. I would think that the film editor would be cautious about scenes that may produce banding as well...


    With that attitude, let's just go back to black and white to save space.
  • 2
    Nolonar , January 27, 2014 11:15 AM
    Banding is especially noticeable in dark scenes.
    Normally, movies try to avoid banding by using dithering. This is particularly bad for compressing animation (cartoons, if you wish), where there are usually huge blocks of single-tone color, or very regular gradients, that would otherwise be easy to compress.

    I certainly look forward to the day the old 8 bit encoded video gets replaced by something newer, preferably 16 bit, since it's a power of 2 ;) 
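Nolonar's dithering point is easy to demonstrate. The toy sketch below (illustrative only, not any real encoder's dither) quantizes a smooth ramp to eight levels, with and without sub-step noise: the plain version produces a handful of hard band edges, while the dithered version hides them as fine grain, which is exactly the grain that compresses poorly.

```python
import random

def quantize(v: float, levels: int) -> int:
    """Map a 0..1 value onto `levels` discrete steps."""
    return round(v * (levels - 1))

random.seed(0)
ramp = [i / 999 for i in range(1000)]   # smooth 0..1 gradient
step = 1 / 7                            # size of one 8-level step

banded = [quantize(v, 8) for v in ramp]   # plain quantization
dithered = [quantize(min(max(v + random.uniform(-step / 2, step / 2), 0), 1), 8)
            for v in ramp]                # add +/- half a step of noise first

def transitions(xs):
    """Count value changes along the ramp (visible edges)."""
    return sum(a != b for a, b in zip(xs, xs[1:]))

# The banded ramp changes value only 7 times (7 hard band edges);
# the dithered one flickers between neighboring levels constantly.
print(transitions(banded), transitions(dithered))
```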
  • -4
    Grandmastersexsay , January 27, 2014 12:02 PM
    "When you look at the media up close and personal, it's easy to see color "banding," or rather a failed attempt to blend several colors together."

    No it is not. I have never once noticed any banding. The large dark blocks the previous poster mentioned have nothing to do with banding. That is simply poor compression, end of discussion.
  • 1
    Blazer1985 , January 27, 2014 12:11 PM
    8 bits is not a great limitation for content consumption; 10 or more bits are important for heavy color correction (content creation). Banding has more to do with heavy compression, black banding with both the compression and the TV/monitor you use.
  • -5
    Draven35 , January 27, 2014 7:11 PM
    12 bits is hardly high dynamic range... try 32 bits.
  • 2
    Blazer1985 , January 27, 2014 7:40 PM
    Anyway don't forget that we are talking about 8 bit per channel = 32 bit = what your display can actually display and more than your eyes can tell :-)
  • 1
    Draven35 , January 27, 2014 7:44 PM
    Quote:
    Anyway don't forget that we are talking about 8 bit per channel = 32 bit = what your display can actually display and more than your eyes can tell :-)


    Depends on your display, some displays can show 30-bit... and a portion of people can see better than 256 shades of grey and thus see more banding. A portion of these people work in graphic arts...
  • 2
    loosescrews , January 27, 2014 9:04 PM
    10 bits per channel seems like the logical progression. There is already content being produced and distributed in 10 bits per channel color (often called hi10 as mentioned by InvalidError) and displays are currently being sold that can display it. My two Dell U2711 monitors can handle it and I got them in 2010. Besides, I suspect that the current 8 bits per channel color was selected due to diminishing returns kicking in after that point.
  • 1
    Blazer1985 , January 28, 2014 2:52 AM
    On still images you may be right. But 99% of what you see on television is encoded by the camera in 4:2:0 or 4:2:2. That means only one chroma sample for every 4 or 2 pixels. And don't even try telling me you can catch them at 25/30 fps :-D
  • -3
    ddpruitt , January 28, 2014 5:33 AM
    8 bits per channel is 6 times more color than humans can perceive. The problems with banding aren't the number of bits, rather how those colors are encoded into bits. Don't believe me? Check out how some H.264 encoders screw up darks given the same bitrate as the original.
  • 2
    InvalidError , January 28, 2014 5:38 AM
    Quote:
    And don't even try telling me you can catch them at 25/30 fps :-D

    The biggest problem with 8 bits per plane, banding and video encoding is that at 8 bits, encoders cannot tell the difference between 1-2 LSB differences in white/black/color levels and noise, or they drop details of that magnitude because they require too much bandwidth for what little detail they seem to provide. So you end up with 6 effective number of bits (ENUB), and that greatly magnifies the blotching in shadows and highlights. With two extra bits, the encoder can tell the difference between noise and detail that much better, and encodes end up looking cleaner, with a full ~8 ENUB output from 10-bit source material.
  • 4
    InvalidError , January 28, 2014 5:53 AM
    Quote:
    8 bits per channel is 6 times more color than humans can perceive.

    Try looking at an 8x8 full-screen checkerboard pattern displaying all 256 shades of individual colors. The steps between individual shades are quite obvious on a still image, so the human eye can certainly discern more than 8 bits per channel worth of information when it is presented in a way that maximizes perception.

    This is why image and video encoders rely on human perception models to decide which details can be dropped without affecting perceived image quality too much.
  • -1
    amdfangirl , January 28, 2014 7:08 AM
    How can you have "Billions of Color"?
  • 1
    InvalidError , January 28, 2014 7:37 AM
    Quote:
    How can you have "Billions of Color"?

    10 bits per channel x 3 channels (R+G+B) = 30 bits. 2**30 = 1.07 billion.
    With 12 bits per channel, you have 36 bits total, which is about 68.7 billion.

    GPUs have been processing images in 16-32 bits per plane (high dynamic range) for several years already but displays capable of more than 8bpp are still uncommon... some high-speed panels are still using 6bpp with dithering.
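Those figures are straightforward to verify; the total color count is just 2 raised to the total number of bits:

```python
def total_colors(bits_per_channel: int, channels: int = 3) -> int:
    """Distinct colors representable: 2 ** (bits * channels)."""
    return 2 ** (bits_per_channel * channels)

for bpc in (8, 10, 12):
    print(f"{bpc} bits/channel = {bpc * 3}-bit color = "
          f"{total_colors(bpc):,} colors")
# 8  -> 16,777,216      (~16.8 million)
# 10 -> 1,073,741,824   (~1.07 billion)
# 12 -> 68,719,476,736  (~68.7 billion)
```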
  • 3
    amdfangirl , January 28, 2014 8:10 AM
    I'm pointing out a grammatical mistake :p .
  • 2
    mapesdhs , January 28, 2014 9:59 AM
    Kevin, I assume you meant 12 bits per channel, not 12 bits per colour.


    Btw, SGI was doing 48-bit RGBA more than 20 years ago. It's nothing new. Only the switch to flat panels has meant the fidelity of perceived images has gone down since then; in the past, CRTs could show the 4096 different shades of each colour just fine with 36-bit RGB. 12 bits for alpha has always been important for accurate low-light/contrast visual simulation and precision imaging/video, while 16-bit to 64-bit greyscale has been used in medical imaging, GIS and defense imaging analysis for ages as well, e.g. the Group Station for Defense Imaging was dealing with 60GB 2D images probably a decade before many of the readers of this forum were even born. :D

    Modern display tech is merely catching up to what high-end gfx was doing in the last century with CRTs. *yawn*

    Ian.

  • 0
    annymmo , January 28, 2014 11:34 AM
    Another supporter of moving to 16 bits per channel. Their adoption tactic sounds a bit like a patent trap. This greed and the need to get money make me think there should be more technology where the inventors just get some kind of prize of resources and money instead of having to bring it to market the current way. This protectionism is really bad for technological progress.
  • 1
    Draven35 , January 28, 2014 12:29 PM
    Quote:
    On still images you may be right. But 99% of what you see in television in encoded by the camera in 4.2.0 or 4.2.2 This means only one colour information every 4 or 2 pixels. And don't even try telling me you can catch them at 25/30 fps :-D


    Most stuff you see on TV was encoded by the camera in 4:2:2. Most news footage is 4:1:1 or 4:2:0 depending on the format they use. But it doesn't matter, because the encode for digital broadcast is 4:2:0. Most dramatic series are shot in 4:4:4, or some even still on film.
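For readers following the subsampling shorthand in this thread, the J:a:b ratios translate directly into raw data rates. A rough sketch, assuming uncompressed 8-bit samples (the sample counts per 4x2-pixel block follow the standard interpretation of the notation; the helper names and example numbers are illustrative, not any real codec's output):

```python
def samples_per_block(scheme: str) -> int:
    """Y + Cb + Cr samples in one 4x2-pixel block for a J:a:b scheme."""
    chroma = {"4:4:4": 16, "4:2:2": 8, "4:1:1": 4, "4:2:0": 4}[scheme]
    return 8 + chroma          # always 8 luma samples per 4x2 block

def raw_mbit_per_s(width: int, height: int, fps: float,
                   scheme: str, bits: int = 8) -> float:
    """Raw (uncompressed) video data rate in Mbit/s."""
    blocks = width * height / 8        # 4x2-pixel blocks per frame
    return blocks * samples_per_block(scheme) * bits * fps / 1e6

for s in ("4:4:4", "4:2:2", "4:2:0"):
    print(f"{s}: {raw_mbit_per_s(1920, 1080, 30, s):7.1f} Mbit/s raw")
```

So 4:2:0, the digital-broadcast format mentioned above, carries half the raw data of 4:4:4 before the compressor even starts.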