LCD question

Bingy

Distinguished
Dec 16, 2008
Question 1: Hi, say you had two 24" LCD computer monitors: the first one, called A, has a native resolution of 1920x1200, and the second 24" one, called B, has a native resolution of 1680x1050. If you turned down the resolution of monitor A to 1680x1050 and left monitor B at its native resolution (1680x1050), which would look better and why? (How could you tell?)

Both of these monitors, A and B, are exactly the same except that monitor A has a native resolution of 1920x1200 and monitor B has a native resolution of 1680x1050.

Question 2: You have one 24" monitor (called A) with a native resolution of 1920x1200 and a second 22" monitor (called B) with a native resolution of 1920x1200.

Which would give you the most FPS if everything else on the computer was the same and only the monitor changed: monitor A or monitor B? Please say why.

Thanks

 
To answer question one: a monitor will always look better at its native resolution than a higher-resolution monitor looks when run below native. The 1680x1050 monitor will therefore look better at 1680x1050. However, the 1920x1200 monitor will look better at 1920x1200 than the 1680x1050 monitor does at 1680x1050, due to its smaller pixel pitch and higher resolution.
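If you want to put numbers on the pixel-pitch point, here's a quick Python sketch (assuming square pixels and exact 24" diagonals, purely for illustration):

```python
import math

# Rough sketch (assumption: square pixels, exact 24" diagonals) of the
# pixel-pitch difference between the two panels discussed above.

def pixels_per_inch(width_px, height_px, diagonal_in):
    """Pixel density along the diagonal of the panel."""
    diagonal_px = math.hypot(width_px, height_px)
    return diagonal_px / diagonal_in

ppi_a = pixels_per_inch(1920, 1200, 24)  # monitor A, 24" 1920x1200
ppi_b = pixels_per_inch(1680, 1050, 24)  # monitor B, 24" 1680x1050

# 25.4 mm per inch -> pixel pitch in millimetres
print(f"monitor A: {ppi_a:.1f} PPI, pixel pitch ~{25.4 / ppi_a:.3f} mm")
print(f"monitor B: {ppi_b:.1f} PPI, pixel pitch ~{25.4 / ppi_b:.3f} mm")
```

That works out to roughly 0.27 mm pitch for the 1920x1200 panel versus roughly 0.31 mm for the 1680x1050 panel, which is why the higher-resolution monitor looks sharper at its own native resolution.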

For question 2, there is no difference. The graphics card cares how many pixels it is driving, not how big the screen is.
 
1. The image quality hit is due to interpolation. An LCD panel has a fixed number of physical pixels. When you set the monitor to less than its native resolution (a.k.a. the max resolution), the image has to be scaled up to fill the panel, and the monitor places each source pixel in the best possible position on the physical pixel grid, blending neighbouring pixels where they don't line up.
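Here's a rough Python sketch of that idea (a pure illustration of bilinear-style scaling, not how any particular monitor's scaler actually works). Because 1920/1680 isn't a whole number, source pixels can't map 1:1 onto physical pixels, so the scaler has to blend neighbours and the image goes soft:

```python
# Rough sketch (pure-Python illustration only, not any real monitor's scaler)
# of why a 1680x1050 signal on a 1920x1200 panel has to be interpolated.

NATIVE = (1920, 1200)   # physical pixel grid of monitor A
SIGNAL = (1680, 1050)   # resolution the PC is actually sending

scale_x = NATIVE[0] / SIGNAL[0]   # 1.1428..., not a whole number
scale_y = NATIVE[1] / SIGNAL[1]   # 1.1428..., not a whole number
print(f"scale factors: {scale_x:.4f} x {scale_y:.4f}")

def source_blend(physical_col, scale):
    """Which source columns feed one physical column, and with what weights
    (simple bilinear-style weighting)."""
    src_pos = physical_col / scale   # where this physical pixel lands in the signal
    left = int(src_pos)              # source column to the left
    frac = src_pos - left            # fractional offset -> blend weight
    return left, left + 1, 1 - frac, frac

# Physical column 5 does not line up with any single source column,
# so its colour is a mix of two neighbouring source columns:
l, r, wl, wr = source_blend(5, scale_x)
print(f"physical column 5 = {wl:.3f} * source[{l}] + {wr:.3f} * source[{r}]")
```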

2. Frame rate depends on resolution, not the size of the monitor. Performance at 1920x1080 is no different on a 32" LCD HDTV than on a 65" Plasma HDTV.
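A quick Python sketch (illustrative numbers only) makes the same point: the work per frame is the pixel count, and screen size never enters into it:

```python
# Rough sketch (illustrative numbers only): the per-frame workload is the
# pixel count, and the physical size of the screen never enters into it.

def pixels_per_frame(width, height):
    return width * height

monitor_a = {"diagonal_in": 24, "res": (1920, 1200)}
monitor_b = {"diagonal_in": 22, "res": (1920, 1200)}

pa = pixels_per_frame(*monitor_a["res"])
pb = pixels_per_frame(*monitor_b["res"])

print(f'24" at 1920x1200: {pa:,} pixels per frame')
print(f'22" at 1920x1200: {pb:,} pixels per frame')
print("Identical pixel counts -> identical GPU workload -> identical FPS.")

# For contrast, resolution does change the workload:
ratio = pixels_per_frame(1680, 1050) / pa
print(f"1680x1050 renders only {ratio:.0%} of the pixels of 1920x1200")
```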