New Encoding Tech May Bring Billions of Color to Blu-ray

xenol

Distinguished
Jun 18, 2008
216
0
18,680
Either I don't care or I don't see it, but color banding never really seemed to bother me. I would think that the film editor would be cautious about scenes that may produce banding as well...
 

InvalidError

Titan
Moderator
You can already have 10 bits with plain old H.264, and streams encoded as "Hi10" often end up smaller for a given quality due to reduced pixel noise/errors.

While banding at 8 bits usually is not too bad, there are times when it is distracting enough to make me wish 10 bits were more common.
 

Kewlx25

Distinguished


With that attitude, let's just go back to black and white to save space.
 
D

Deleted member 1353997

Guest
Banding is especially noticeable in dark scenes.
Normally, movies try to avoid banding by dithering. This is particularly bad for the compression of animation (cartoons, if you wish), where there are usually huge blocks of single-tone color or very regular gradients that would otherwise be easy to compress; the dither noise ruins exactly that.

I certainly look forward to the day the old 8 bit encoded video gets replaced by something newer, preferably 16 bit, since it's a power of 2 ;)
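A rough sketch of why dithering hides banding but costs the encoder bits (NumPy assumed; the gradient range and dither amount are made-up illustration values, and real pipelines use fancier dithering than plain random noise):

```python
import numpy as np

def longest_run(values):
    """Length of the longest run of identical consecutive values."""
    best = run = 1
    for a, b in zip(values[:-1], values[1:]):
        run = run + 1 if a == b else 1
        best = max(best, run)
    return best

width = 1024
# A subtle gradient like a dark sky: banding-prone at 8 bits.
gradient = np.linspace(0.20, 0.30, width)

# Straight 8-bit quantization produces wide, flat bands.
plain = np.round(gradient * 255).astype(np.uint8)

# Random dithering: add roughly half an LSB of noise before quantizing.
noise = (np.random.rand(width) - 0.5) / 255
dithered = np.round(np.clip(gradient + noise, 0, 1) * 255).astype(np.uint8)

print("widest band without dithering:", longest_run(plain), "pixels")
print("widest band with dithering:   ", longest_run(dithered), "pixels")
# The dithered signal alternates between neighbouring codes, so the eye
# averages them into a smooth ramp -- but that added noise is exactly what
# hurts compression on flat animation backgrounds.
```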
 

Grandmastersexsay

Honorable
May 16, 2013
332
0
10,780
"When you look at the media up close and personal, it's easy to see color "banding," or rather a failed attempt to blend several colors together."

No, it is not. I have never once noticed any banding. The large dark blocks the previous poster mentioned have nothing to do with banding. That is simply poor compression, end of discussion.
 

Blazer1985

Honorable
May 21, 2012
206
0
10,690
8 bits is not a great limitation for content consumption; 10 or more bits are important for heavy color correction (content creation). Banding has more to do with heavy compression, and banding in dark areas with both the compression and the TV/monitor you use.
 

Blazer1985

Honorable
May 21, 2012
206
0
10,690
Anyway, don't forget that we are talking about 8 bits per channel = 24-bit color (32 bits with alpha) = what your display can actually display, and more than your eyes can tell :)
 

Draven35

Distinguished
Nov 7, 2008
806
0
19,010


Depends on your display; some displays can show 30-bit... and a portion of people can distinguish more than 256 shades of grey and thus see more banding. A portion of these people work in graphic arts...
 

loosescrews

Honorable
Jul 4, 2013
190
0
10,760
10 bits per channel seems like the logical progression. There is already content being produced and distributed in 10 bits per channel color (often called hi10 as mentioned by InvalidError) and displays are currently being sold that can display it. My two Dell U2711 monitors can handle it and I got them in 2010. Besides, I suspect that the current 8 bits per channel color was selected due to diminishing returns kicking in after that point.
 

Blazer1985

Honorable
May 21, 2012
206
0
10,690
On still images you may be right. But 99% of what you see on television is encoded by the camera in 4:2:0 or 4:2:2, which means only one colour sample for every 4 or 2 pixels. And don't even try telling me you can catch that at 25/30 fps :-D
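For anyone curious what that means in data terms, here is a tiny sketch of chroma subsampling on a toy plane (NumPy assumed; real encoders filter rather than naively average, so treat this as illustrative only):

```python
import numpy as np

# A toy 4x4 chroma plane (one of Cb/Cr); luma keeps full resolution.
chroma = np.arange(16, dtype=np.float32).reshape(4, 4)

# 4:2:0 keeps one chroma sample per 2x2 block of pixels.
subsampled_420 = chroma.reshape(2, 2, 2, 2).mean(axis=(1, 3))
print(subsampled_420.shape)  # (2, 2) -> a quarter of the chroma samples remain

# 4:2:2 keeps one chroma sample per horizontal pair of pixels.
subsampled_422 = chroma.reshape(4, 2, 2).mean(axis=2)
print(subsampled_422.shape)  # (4, 2) -> half of the chroma samples remain
```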
 

ddpruitt

Honorable
Jun 4, 2012
1,109
0
11,360
8 bits per channel is 6 times more color than humans can perceive. The problem with banding isn't the number of bits, but rather how those colors are encoded into bits. Don't believe me? Check out how some H.264 encoders screw up darks given the same bitrate as the original.
 

InvalidError

Titan
Moderator

The biggest problem with 8 bits per plane, banding and video encoding is that at 8 bits, encoders cannot tell the difference between 1-2 LSB variations in white/black/color levels and noise, or they drop details of that magnitude because they cost too much bandwidth to encode for what little detail they seem to provide. You end up with roughly 6 effective bits, which greatly magnifies the blotching in shadows and highlights. With two extra bits, the encoder can tell the difference between noise and detail that much better, and encodes end up looking cleaner, with a full ~8 effective bits from 10-bit source material.
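A back-of-the-envelope way to picture that "effective bits" argument (a deliberate simplification: real encoders quantize transform coefficients, not raw pixel values, and the two dropped LSBs stand in for detail the encoder smooths away):

```python
import numpy as np

# A 10-bit source ramp (0..1023) standing in for a subtle shadow gradient.
source_10bit = np.arange(1024)

# Delivered as 8-bit: two LSBs are gone before the encoder even starts.
as_8bit = source_10bit >> 2

# If the encoder then discards another ~2 LSBs as "noise", what survives in
# the shadows is only about 6 bits' worth of distinct levels.
after_encode = (as_8bit >> 2) << 2

print(len(np.unique(source_10bit)))  # 1024 levels in the 10-bit source
print(len(np.unique(as_8bit)))       # 256 levels at 8 bits
print(len(np.unique(after_encode)))  # 64 levels -> visible blotching
```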
 

InvalidError

Titan
Moderator

Try looking at an 8x8 full-screen checkerboard pattern displaying all 256 shades of a single color. The steps between individual shades are quite obvious on a still image, so the human eye can certainly discern more than 8 bits per channel worth of information when it is presented in a way that maximizes perception.

This is why image and video encoders rely on human perception models to decide which details can be dropped without affecting perceived image quality too much.
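If you want to eyeball that yourself, here is a quick sketch that writes out a simpler test pattern, a ramp of all 256 grey levels, assuming NumPy and Pillow are installed:

```python
import numpy as np
from PIL import Image

# 256 vertical bars, one per 8-bit grey level, each 8 pixels wide.
bar_width, height = 8, 512
shades = np.repeat(np.arange(256, dtype=np.uint8), bar_width)
ramp = np.tile(shades, (height, 1))          # shape: (512, 2048)

Image.fromarray(ramp, mode="L").save("grey_ramp_8bit.png")
# On many displays the steps between neighbouring bars are visible on a
# still image, which is the point: presented this way, the eye can resolve
# finer gradations than 8 bits provide.
```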
 

InvalidError

Titan
Moderator

10 bits per channel x 3 channels (R+G+B) = 30 bits. 2^30 = 1.07 billion.
With 12 bits per channel, you have 36 bits total, which is 68.7 billion.

GPUs have been processing images at 16-32 bits per plane (high dynamic range) for several years already, but displays capable of more than 8 bits per plane are still uncommon... some high-speed panels still use 6 bits per plane with dithering.
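The same arithmetic, spelled out:

```python
# Total displayable colors for a few common bit depths per channel.
for bits_per_channel in (8, 10, 12, 16):
    total_bits = bits_per_channel * 3          # R + G + B
    colors = 2 ** total_bits
    print(f"{bits_per_channel:2d} bits/channel -> {total_bits} bits -> {colors:,} colors")

#  8 bits/channel -> 24 bits ->          16,777,216 colors
# 10 bits/channel -> 30 bits ->       1,073,741,824 colors ("billions")
# 12 bits/channel -> 36 bits ->      68,719,476,736 colors
# 16 bits/channel -> 48 bits -> 281,474,976,710,656 colors
```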
 

mapesdhs

Distinguished
Kevin, I assume you meant 12 bits per channel, not 12 bits per colour.


Btw, SGI was doing 48-bit RGBA more than 20 years ago. It's nothing new. Only the switch to flat panels has meant the fidelity of perceived images has gone down since then; in the past, CRTs could show the 4096 different shades of each colour just fine with 36-bit RGB. 12 bits for alpha has always been important for accurate low-light/contrast visual simulation and precision imaging/video, while 16-bit to 64-bit greyscale has been used in medical imaging, GIS and defense imaging analysis for ages as well, e.g. the Group Station for Defense Imaging was dealing with 60GB 2D images probably a decade before many of the readers of this forum were even born. :D

Modern display tech is merely catching up to what high-end gfx was doing in the last century with CRTs. *yawn*

Ian.

 

annymmo

Distinguished
Apr 7, 2009
351
3
18,785
Another supporter of moving to 16 bits per channel here. Their adoption tactic sounds a bit like a patent trap. This greed and the need to make money make me think there should be more technology where the inventors just get some kind of prize of resources and money instead of having to bring it to market the current way. This protectionism is really bad for technological progress.
 

Draven35

Distinguished
Nov 7, 2008
806
0
19,010


Most stuff you see on TV was encoded by the camera in 4:2:2. Most news footage is 4:1:1 or 4:2:0 depending on the format they use. But it doesn't matter, because the encode for digital broadcast is 4:2:0. Most dramatic series are shot in 4:4:4, or some even still on film.
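For reference, a small sketch of how many samples each of those schemes keeps per 4x2 block of pixels (a simplified reading of the J:a:b notation; Cb and Cr counted together):

```python
# J:a:b -> (chroma samples in the first row of a 4-wide block,
#           chroma samples in the second row)
schemes = {"4:4:4": (4, 4), "4:2:2": (2, 2), "4:1:1": (1, 1), "4:2:0": (2, 0)}

for name, (row1, row2) in schemes.items():
    luma = 4 * 2                        # full-resolution luma samples per block
    chroma = 2 * (row1 + row2)          # two chroma planes (Cb and Cr)
    baseline = luma + 2 * (4 + 4)       # total samples for 4:4:4
    saving = 1 - (luma + chroma) / baseline
    print(f"{name}: {chroma} chroma samples per 4x2 block, "
          f"{saving:.0%} fewer samples than 4:4:4")
```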
 