[Display] Will my monitor burn?

its_faith

Distinguished
Jan 2, 2012
Now here is the thing.

I have a Samsung SyncMaster SA300 20-inch monitor. Its maximum resolution appears to be 1600x900.

However, I installed ATI Tray Tools for its overclocking utilities, and now there are new resolutions in my display resolution options.

I have tried them all, and they all work just fine, but only in Windows; games don't seem to offer the new resolutions.

In the monitor's information menu it displays 1920x1080, 67.5 kHz, 60 Hz PP, analog. Before, when I was using 1600x900, it showed the same info, but instead of 67.5 kHz it showed only 60.

Now I want to keep 1080p, although I really think it's some kind of 1080i or some other trick xD. Will I have any trouble, something electrical or otherwise? I can upload an image if necessary.
 

willard

Distinguished
Nov 12, 2010
Your monitor can't display a higher resolution just because the computer tells it to. It only has so many pixels. Setting it at a resolution higher than what it supports will do nothing but lower your image quality by making the screen a bit fuzzy.

If you want your resolution to be 1920x1080, buy a new monitor. You can't trick your old one into doing it.

Edit for clarification:

What's happening is your computer is sending a display signal to the monitor that represents 1080p. Your monitor, being fairly smart, figures out that it can't display that 1080p image. Instead, it will convert it to 1600x900. What you're seeing is exactly the same resolution as before, it's just been garbled slightly in the conversion (thus the fuzziness).
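
As a sanity check, the numbers from the monitor's info menu fit that explanation exactly, assuming the standard 1080p60 signal timing of 1125 total scan lines per frame (1080 visible plus blanking):

```python
# Horizontal scan frequency = refresh rate x total scan lines per frame.
refresh_hz = 60
total_lines_1080p = 1125                        # standard 1080p60 timing: 1080 visible + 45 blanking
print(refresh_hz * total_lines_1080p / 1000)    # 67.5 (kHz) -- matches the OSD readout

# The panel still only has 1600x900 pixels, so the scaler must squeeze the image:
print(1920 / 1600, 1080 / 900)                  # 1.2 1.2 -- non-integer ratio, hence the fuzziness
```

So the monitor really is receiving a genuine 1080p signal; it just has to resample it down by a non-integer 1.2:1 factor to fit its panel, which is exactly where the fuzziness comes from.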

Games, however, work a bit differently. They do not take their resolution cues from Windows; instead, they enumerate the available display modes and handle resolution themselves. This is why you can't select the resolution in games: they understand that your monitor can't display it.
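
A minimal sketch of what that mode query can look like on Windows, using Python and ctypes against the Win32 EnumDisplaySettingsW call (the struct below is a truncated DEVMODEW; the dmSize field tells the API how much of it we actually provide):

```python
# Windows only: enumerate the display modes the graphics driver exposes,
# the same list a game typically builds its resolution menu from.
import ctypes
from ctypes import wintypes

class DEVMODEW(ctypes.Structure):
    # Truncated DEVMODEW: only the fields up to dmDisplayFrequency.
    _fields_ = [
        ("dmDeviceName", wintypes.WCHAR * 32),
        ("dmSpecVersion", wintypes.WORD),
        ("dmDriverVersion", wintypes.WORD),
        ("dmSize", wintypes.WORD),
        ("dmDriverExtra", wintypes.WORD),
        ("dmFields", wintypes.DWORD),
        ("dmPositionX", wintypes.LONG),
        ("dmPositionY", wintypes.LONG),
        ("dmDisplayOrientation", wintypes.DWORD),
        ("dmDisplayFixedOutput", wintypes.DWORD),
        ("dmColor", ctypes.c_short),
        ("dmDuplex", ctypes.c_short),
        ("dmYResolution", ctypes.c_short),
        ("dmTTOption", ctypes.c_short),
        ("dmCollate", ctypes.c_short),
        ("dmFormName", wintypes.WCHAR * 32),
        ("dmLogPixels", wintypes.WORD),
        ("dmBitsPerPel", wintypes.DWORD),
        ("dmPelsWidth", wintypes.DWORD),
        ("dmPelsHeight", wintypes.DWORD),
        ("dmDisplayFlags", wintypes.DWORD),
        ("dmDisplayFrequency", wintypes.DWORD),
    ]

user32 = ctypes.windll.user32
dm = DEVMODEW()
dm.dmSize = ctypes.sizeof(DEVMODEW)
modes, i = set(), 0
# Mode index 0, 1, 2, ... until the call returns 0 (no more modes).
while user32.EnumDisplaySettingsW(None, i, ctypes.byref(dm)):
    modes.add((dm.dmPelsWidth, dm.dmPelsHeight, dm.dmDisplayFrequency))
    i += 1
for w, h, hz in sorted(modes):
    print(f"{w}x{h} @ {hz} Hz")
```

Anything missing from this list is invisible to a game that enumerates modes this way, no matter what a tweak utility lets you force on the desktop.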
 
Solution