Is there really a difference in the codec? I don't notice any difference other than hi-10bit 720p anime lagging, while 8-bit 720p looks just as good and runs smoother since it's less CPU intensive. So why do some people encode using hi-10bit?
Better colors, More colors, etc.
I hate it when I'm watching a movie with a large sky or underwater scene that has a gradient, fading from light to dark, and I can very clearly see color banding instead of a natural fade.
To see 10-bit properly you will need a monitor or TV that is also 10-bit.
If not, the CPU or video card will have to convert to 8-bit, and without dithering that truncation causes the banding mentioned above.
Because it has less banding at the same bitrate. With 8-bit encoding you eventually reach the point where the image quality (motion, blockiness, artifacts, etc.) is almost identical to the source, except the image still has banding on smooth color transitions. Fixing that requires an even higher bitrate, which is extremely wasteful since most of those bits are burned on further, unnoticeably "improving" the overall image quality. 10-bit encoding stores colors at higher precision, which means it trades some of that overkill image quality for less banding. Basically, 10-bit encoding quality is much more balanced: you don't need to bump up the bitrate just to improve banding specifically.
The effective color precision in an 8-bit encoded video is usually only around 6-7 bits, depending on bitrate. With 10 bits you get more precision than your 8-bit-per-channel (usually called 24-bit) monitor can display, but thanks to dithering the extra precision can still make a difference. In very dark scenes it's easy to see the difference between undithered 8-bit colors (not H264 8-bit, simply a pixel with 256 possible shades) and dithered colors.
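The two posts above can be sketched in a few lines of plain Python (variable names and the 0.3337 test value are made up for illustration): a coarser bit depth leaves visibly fewer distinct steps across a smooth gradient, and dithering trades the resulting banding for fine noise that preserves the true average shade.

```python
import random

def quantize(values, bits):
    """Round each value in [0.0, 1.0] to the nearest representable level."""
    levels = (1 << bits) - 1
    return [round(v * levels) / levels for v in values]

def dither_quantize(values, bits):
    """Add +/- half a level of random noise before rounding (simple dithering)."""
    levels = (1 << bits) - 1
    return [min(max(round(v * levels + random.uniform(-0.5, 0.5)), 0), levels) / levels
            for v in values]

# A smooth dark-to-light gradient, like one row of a sky scene.
gradient = [i / 9999 for i in range(10000)]
steps_8 = len(set(quantize(gradient, 8)))    # 256 distinct shades -> visible bands
steps_10 = len(set(quantize(gradient, 10)))  # 1024 distinct shades -> 4x finer steps

# On a flat region, plain 8-bit quantization snaps every pixel to the same
# slightly-wrong shade (one solid band), while dithering keeps the true
# average brightness even though each individual pixel is still 8-bit.
flat = [0.3337] * 100_000
plain_mean = sum(quantize(flat, 8)) / len(flat)          # exactly 85/255 ~ 0.33333
dither_mean = sum(dither_quantize(flat, 8)) / len(flat)  # ~ 0.3337
print(steps_8, steps_10, plain_mean, dither_mean)
```

This is also why the dark-scene example above is so telling: near black there are very few 8-bit levels available, so the snapping is easiest to spot.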
I can't really say since I do not encode using 10-bit.
Most people have 6-bit monitors, meaning they have either TN or e-IPS panels. These monitors can only truly reproduce 256k colors. Through the use of temporal dithering, approximately 16.7m colors can be simulated. Temporal dithering forces pixels to quickly pulsate between two colors to create a third color that the monitor itself cannot reproduce. For example, let's say a 6-bit monitor cannot display purple. Using temporal dithering, the pixel pulsates between red and blue so fast that it tricks your brain into thinking your eyes are seeing solid purple.
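The pulsating trick above is easy to sketch numerically (a hypothetical toy model, not how panel firmware actually works): each frame the pixel picks one of the two colors it *can* show, chosen so the running average converges on the target color the eye perceives.

```python
def frc_frames(target, lo, hi, n_frames):
    """Pick lo or hi each frame so the running average tracks the target."""
    frames, total = [], 0.0
    for i in range(n_frames):
        # Show the brighter color only while the average is still below target.
        pick = hi if total < target * (i + 1) else lo
        frames.append(pick)
        total += pick
    return frames

# The panel can show red (1.0 on the red channel) or blue (0.0 on the red
# channel), but "purple" needs a red intensity halfway between them.
frames = frc_frames(target=0.5, lo=0.0, hi=1.0, n_frames=60)
perceived = sum(frames) / len(frames)  # the eye averages the flicker
print(frames[:6], perceived)
```

With a 0.5 target the pixel simply alternates every frame; fast enough, and the brain integrates it into a single steady in-between shade.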
8-bit LCD panels (VA, H-IPS, S-IPS and P-IPS) can truly create 16.7m colors so there is no need for temporal dithering.
10-bit monitors are really monitors with 8-bit LCD panels and a color look-up table (LUT). There are over 1 billion colors in a 10-bit LUT, but the monitor can only display up to 16.7m at any given time. It allows for better color accuracy for people who do digital artwork. Using 10-bit color on a 10-bit monitor can cause a bit of lag, though, because for every color code the monitor receives, it basically has to compare against the 1 billion available colors in the LUT to determine exactly which one to display.
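The LUT idea described above can be sketched as a simple table lookup (a hypothetical linear 10-to-8-bit map; real monitor LUTs are built from calibration data, not this formula): each incoming 10-bit code (0-1023) indexes a precomputed table that names the nearest displayable 8-bit level (0-255).

```python
# Precomputed table: 1024 input codes -> 256 displayable levels.
lut = [round(code * 255 / 1023) for code in range(1024)]

def display(code_10bit):
    """Look up the panel level for an incoming 10-bit color code."""
    return lut[code_10bit]

print(display(0), display(511), display(1023))  # endpoints map exactly
```

Because the table is precomputed, the per-pixel "comparison" is a single array index; the accuracy win is that the mapping can be tuned per monitor instead of being a blind truncation.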
As for using 10-bit colors for anime? That seems like a huge waste of resources to me, both for encoding and decoding (playback). Unless anime has gotten to the point of photo realism, the actual color palette used in anime is far smaller than 16.7m colors.
Very interesting, but we're comparing the 8-bit version of the video codec vs. the 10-bit version here. It has nothing (or at least not much) to do with 8/10-bit monitors. Due to compression (depending on the bitrate of the video), the number of colors actually present is usually lower than what the codec can represent. Also, anime suffers far worse from banding than live-action movies, since scenes are usually made of solid colors. Sometimes there's lighting, or things like that, creating a color gradient that can look horrible. I tried to find an example, and this is the best I could find (scroll down a bit): http://hardforum.com/showthread.php?t=1724380
Ah, so there is a difference? How come I don't notice it in One Piece hi-10bit? Everything Yibis released seems to lag my PC during playback, and my single-core CPU can't handle my 720p subs anymore. In any case, how can you tell if your monitor or TV can do 10-bit? I'm using my PC on a Sony LCD KDL52X3500. It's maybe 5-8 years old, but it does true 24p Cinema and 1080p resolution.
Yeah, at the time it cost me 2700 quid, though I got it cheaper than its official price of 3500 back then. I think it was the best LCD, but I went into debt getting it, lol. Is the "10 Bit Panel" spec in that link what makes it 10-bit? Still, I need a decent PC to run hi-10bit, because my old PC now lags whenever I play anything 10-bit encoded. Shame I can't afford a new PC for 3-4 years unless I can score a job.
"10-bit" panels are in actuality 8-bit panels with AFRC (Advanced Frame Rate Control); also known as temporal dithering. A technology similar to what is used in 6-bit TN and e-IPS panels to simulate up to 16.7m colors from 256k actual colors. To reiterate my example above, it is like the panel cannot create purple so the pixels quickly pulsates between Red and Blue so fast that your brain registers solid purple. This applies to 10-bit IPS panels.
There are also 10-bit VA panels, and their "10-bit" tech works differently. These panels basically have double the number of sub-pixels, which are used to control brightness and also (hopefully) reduce gamma shifting to a certain extent. The sub-pixels are set up in zones, and typically only one half of the zones is turned on, except when high-contrast colors need to be displayed; then both zones are activated.
Sub-pixels... each pixel is made of three sub-pixels: red, green, and blue. By varying the intensity of each primary color, many shades of color can be displayed. When all sub-pixels are turned off you get black. When all sub-pixels are at their highest intensity you get white.
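The sub-pixel arithmetic also explains where the "256k" and "16.7m" figures quoted earlier in the thread come from: each sub-pixel gets 6 or 8 bits of intensity, and the total palette is the number of intensity levels cubed.

```python
# Each pixel = 3 sub-pixels (R, G, B), each with 2**bits intensity levels.
levels_8bit = 2 ** 8                 # 256 levels per sub-pixel
colors_8bit = levels_8bit ** 3       # 16,777,216 -> the "16.7m" figure

levels_6bit = 2 ** 6                 # 64 levels per sub-pixel
colors_6bit = levels_6bit ** 3       # 262,144 -> the "256k" figure

black = (0, 0, 0)                    # all sub-pixels off
white = (255, 255, 255)              # all sub-pixels at full intensity
print(colors_8bit, colors_6bit)
```

A 10-bit signal would be 1024 levels per channel, or 1024**3 ≈ 1.07 billion combinations, which is the "over 1 billion colors" number mentioned above.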