So I was looking around at computer monitors and then I was looking at TVs. I noticed that the very best TVs have about a 38,000:1 contrast ratio and cost almost $5000, while 1080p computer monitors claim up to a 100,000,000:1 contrast ratio and only cost around $300. Besides the difference in size, what justifies this HUGE cost difference? Especially since the computer screens theoretically look better anyway.
The TVs will be plasma displays that can achieve 38000:1 as *static* contrast. Computer monitors claim 100,000,000:1 as *dynamic* contrast, whereas the static contrast is only around 1000:1, or in some newer AMVA monitors 3000:1. The static contrast is the ratio between luminosity of black and of white when displayed simultaneously on the screen, whereas dynamic contrast compares an all-black screen (where the backlight can be turned way down or even off) to an all-white screen (where the backlight can be turned all the way up).
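To make that distinction concrete, here's a quick back-of-the-envelope calculation. The luminance numbers are illustrative, not from any real spec sheet: a backlight leaking a little light through "black" pixels gives you the ~1000:1 static figure, while measuring black on a separate frame with the backlight switched nearly off is what produces the headline dynamic number.

```python
# Contrast ratio = white luminance / black luminance, both in cd/m^2 (nits).
# All values below are hypothetical, chosen only to illustrate the math.

white_nits = 300.0        # full-white luminance at a typical backlight level
black_nits_static = 0.3   # backlight leakage through black pixels while
                          # white is on the same screen at the same time

static_contrast = white_nits / black_nits_static
print(f"static contrast:  {static_contrast:.0f}:1")

# "Dynamic" contrast measures black on a different, all-black frame,
# where the monitor can dim the backlight to near zero.
black_nits_dynamic = 0.000003  # near-zero leakage with backlight off

dynamic_contrast = white_nits / black_nits_dynamic
print(f"dynamic contrast: {dynamic_contrast:.0f}:1")
```

Same panel, same white level; only the black measurement changes, which is why the dynamic number can be five orders of magnitude bigger without the picture looking any different in normal mixed-content viewing.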
Yup, I'm guessing that's the answer. The computer monitors basically just turn their backlight off so they emit no light. Kind of cheating if you ask me, but I guess there aren't really industry standards on things like that.
Well, TVs are not made to be stared at up close. Monitors pack the individual pixels more densely, which gives you more detail at viewing distance. There are two types of contrast ratios, dynamic and native. Dynamic is not realistic; it's more of an estimate by the manufacturer under lighting conditions they set up themselves. Native is measured directly, which makes it way more reliable. Although there is an exception for certain companies such as AOC. AOC has a contrast system that automatically adjusts to suit whatever you're viewing. Since AOC's feature even turns off parts of the monitor's backlight to make immersive blacks in gaming scenarios, and uses high-quality components such as Samsung LCD panels and LEDs, I highly recommend it for you.