difference between 1.07 billion color monitor and 16.7 million color

danishlynx

Prominent
Oct 28, 2017
My title is in reference to the Samsung CHG70 (1.07 BILLION colors) and the Dell S2718D monitor.
I want the best color reproduction while watching movies,
but the Samsung has an anti-glare coating, which I think will reduce the vibrancy of the monitor like <mod edit> even though it has a 1.07 billion color palette,
while the Dell has a glossy panel which, even with only 16.7 million colors, will give me more vibrancy while watching.
Need your advice, guys.

<Moderator Warning: Watch your language in these forums>
 
Solution
The color depth (palette size) is a separate issue from the vibrancy of the monitor, so it won't affect it. A monitor with 1.07 billion colors doesn't necessarily mean that it can display additional colors that are outside the range of a monitor with 16.7 million colors. It just means it can display more "in-between" colors for smoother transitions in gradients.
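Those two headline numbers come straight from the bits per color channel: 8 bits per channel gives 16.7 million combinations, 10 bits gives 1.07 billion. A quick sketch of the arithmetic (the function name here is just for illustration):

```python
# Palette size = (levels per channel)^3 for the R, G, and B channels.
def palette_size(bits_per_channel):
    levels = 2 ** bits_per_channel  # distinct shades per channel
    return levels ** 3

print(palette_size(8))   # 16777216  -> the "16.7 million colors" figure
print(palette_size(10))  # 1073741824 -> the "1.07 billion colors" figure
```

Note the extra bits only subdivide the same range more finely; they don't push the most saturated red, green, or blue any further out.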

For vibrancy, the color depth/palette size won't tell you anything; you need to look at gamut coverage to get a general idea, usually represented as "100% sRGB" or something like that.
 