Do wide gamut monitors cause more eye strain?

I have some light sensitivity, and I was thinking about buying a Dell U2415...
But I've heard that wide-gamut colors are too saturated....
Does that cause more eye strain?
(I work in web design/development. I don't edit images.)


  1. Best answer
    LOL. No. The real world has the largest color gamut your eyes can see. A standard sRGB monitor can only reproduce a small fraction of those colors. On a CIE chromaticity chart, the colored horseshoe-shaped area is roughly the entire spectrum of colors your eyes can see; sRGB is just a small triangle inside it.

    AdobeRGB (the color space used by most wide-gamut monitors) is bigger, but still doesn't cover the entire range of human vision.
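    To put a rough number on the size difference, here's a quick sketch of my own (not from any standard, just the shoelace formula applied to the published xy chromaticities of each space's primaries). Both spaces share the same red and blue primaries; AdobeRGB's greener green primary makes its triangle about 35% larger on the CIE xy diagram:

```python
# Compare the areas of the sRGB and AdobeRGB triangles on the
# CIE 1931 xy chromaticity diagram. Note: xy area is only a crude
# proxy for gamut size, but it illustrates the point.

def triangle_area(p1, p2, p3):
    """Shoelace formula for the area of a triangle given (x, y) points."""
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    return abs(x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2)) / 2

# xy chromaticities of the red, green, and blue primaries
SRGB      = [(0.64, 0.33), (0.30, 0.60), (0.15, 0.06)]
ADOBE_RGB = [(0.64, 0.33), (0.21, 0.71), (0.15, 0.06)]

srgb_area  = triangle_area(*SRGB)
adobe_area = triangle_area(*ADOBE_RGB)
print(f"AdobeRGB / sRGB area ratio: {adobe_area / srgb_area:.2f}")  # ~1.35
```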

    If a wider gamut caused eyestrain, your eyes would be hurting all the time from looking at the real world.

    Eyestrain has many causes, but the main ones are:
    • Brightness - most people set their monitor too bright. The brightness of your monitor should be about the same as the ambient conditions; i.e. if you took a photo of your room and displayed it on the monitor, the image should look about as bright as the actual room. When the monitor is too bright, your pupils have to dilate and constrict as you shift your gaze between the monitor, things in the room, and back again. This gets tiring after a while.
    • Flicker, especially with motion or video. Pretty much everyone can see 30 Hz flicker, most people can see 60 Hz, and a few can see higher (100 Hz to several hundred Hz - I am one of these unfortunate people, and PWM drives me nuts). Your peripheral vision is more sensitive to flicker than your fovea (central vision). Unfortunately, most LED backlights use PWM (rapid on/off switching) to control brightness. If the frequency is too low, the image appears to strobe as your eyes move around it, which can cause fatigue.
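    The PWM point is easy to illustrate with made-up numbers (the frequency below is hypothetical, not a spec for any particular monitor): the backlight switches fully on and off every cycle, so the lower you set the brightness, the longer the fully-dark gap in each cycle, and the more visible the strobing.

```python
# Hypothetical PWM backlight: illustrate how the dark gap per cycle
# grows as brightness (duty cycle) drops.
pwm_freq_hz = 240        # assumed PWM frequency, for illustration only
brightness  = 0.30       # 30% brightness -> LED on 30% of each cycle

period_ms   = 1000 / pwm_freq_hz
dark_gap_ms = period_ms * (1 - brightness)
print(f"cycle: {period_ms:.2f} ms, dark gap: {dark_gap_ms:.2f} ms")
```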

    Most wide-gamut monitors let you switch between sRGB and AdobeRGB modes. You have to understand that the current photo and video standard is sRGB, so most pictures and movies are encoded with RGB values aimed at that color space (e.g. 255 red = the maximum red of sRGB). So to view most pre-produced images and video "correctly" on a wide-gamut monitor, you have to use it in sRGB mode. If you view these images natively in AdobeRGB mode, the colors will be oversaturated (255 red = the maximum red of AdobeRGB). To properly use AdobeRGB mode, you need to view the images in a program which is colorspace-aware (like Photoshop), and tell that program that your display is using the AdobeRGB color space. It will then map the sRGB image into AdobeRGB, producing the correct colors for the sRGB image while still allowing the more saturated colors that AdobeRGB provides. (255 red -> 220 red or whatever in AdobeRGB, which is equivalent to 255 red in sRGB.)
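    To make that "255 red -> 220 red or whatever" remapping concrete, here's a rough sketch of my own using the standard D65 sRGB-to-XYZ and XYZ-to-AdobeRGB matrices and an approximate 2.2 gamma for AdobeRGB (the spec's exact value is 563/256; a real color-managed pipeline is more careful than this):

```python
# Remap a fully saturated sRGB color into AdobeRGB code values,
# the way a color-managed program would before sending pixels to a
# display running in AdobeRGB mode.

def srgb_to_linear(c):
    """Undo the sRGB transfer curve (c in 0..1)."""
    return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

def encode_adobergb(v):
    """Apply AdobeRGB's ~2.2 gamma and quantize to 8 bits."""
    return round(255 * max(v, 0.0) ** (1 / 2.2))

def srgb_to_adobergb(r, g, b):
    """Convert 8-bit sRGB values to 8-bit AdobeRGB values."""
    rl, gl, bl = (srgb_to_linear(v / 255.0) for v in (r, g, b))
    # sRGB (linear) -> CIE XYZ, D65 white point
    x = 0.4124 * rl + 0.3576 * gl + 0.1805 * bl
    y = 0.2126 * rl + 0.7152 * gl + 0.0722 * bl
    z = 0.0193 * rl + 0.1192 * gl + 0.9505 * bl
    # XYZ -> AdobeRGB (linear), D65 white point
    ra =  2.04159 * x - 0.56501 * y - 0.34473 * z
    ga = -0.96924 * x + 1.87597 * y + 0.04156 * z
    ba =  0.01344 * x - 0.11836 * y + 1.01517 * z
    return tuple(encode_adobergb(v) for v in (ra, ga, ba))

print(srgb_to_adobergb(255, 0, 0))  # -> (219, 0, 0)
```

    Pure sRGB red lands at about 219 in AdobeRGB - in line with the "roughly 220" above - because AdobeRGB's red channel can go further out, so the same physical color needs a smaller code value.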

    Or you can just leave it in AdobeRGB mode and live with the oversaturated colors. In theory this could cause eyestrain because it produces slightly brighter images. But as I said, if you adjust the monitor brightness correctly, it will just look like an extremely colorful object in real life instead of a dully colored one.

    Edit: From a technical standpoint, standard LED monitors use blue LEDs with yellow phosphors: the blue light excites the phosphors, which convert some of the blue light into yellow light, and that yellow light has enough red and green in it to cover the sRGB color space. Wide-gamut LED monitors use separate red, green, and blue LEDs. This allows deeper color shades, thus covering AdobeRGB or larger.