Resolution

noorerino

Hi,

I have an HP w2207h LCD monitor, which has a native resolution of 1680x1050. However, since I bought an HDMI cable, I now have the option to set the resolution to 1920x1080. Is it a true 1080p resolution? What's going on lol?

Thanks for your time
 
Solution

Maxx_Power

Rule #1 for LCD monitors: always run them at the panel's native resolution if you can. That gives you the lowest input lag, the sharpest picture, and so on.

People usually shop for a video card and CPU based on the screen resolution they run their games at.

So yes, leave it at 1680x1050. That is the native resolution of the panel inside the monitor.
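If you're on Windows (as this era of hardware suggests) and want to sanity-check what resolution the PC is actually sending, here's a minimal Python sketch using the standard Win32 GetSystemMetrics call; the NATIVE constant is just the w2207h's spec-sheet value:

```python
import ctypes

# Native panel resolution of the HP w2207h, per its spec sheet.
NATIVE = (1680, 1050)

# SM_CXSCREEN (0) / SM_CYSCREEN (1) report the current desktop resolution.
user32 = ctypes.windll.user32
current = (user32.GetSystemMetrics(0), user32.GetSystemMetrics(1))

if current == NATIVE:
    print(f"Running at native {current[0]}x{current[1]} - no scaling involved.")
else:
    print(f"Current mode {current[0]}x{current[1]} differs from native "
          f"{NATIVE[0]}x{NATIVE[1]} - the monitor's scaler will kick in.")
```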

noorerino

What is scaling? Sorry for my ignorance...
 

Maxx_Power

Scaling means downsizing or upsizing an existing image (or other content) so that it fits a given grid of pixels.

If you send 1080p and the panel can only do 1050 lines, the monitor has two options for the extra pixels:

1) Truncate and discard them, so you lose a portion of the picture.

2) Take a local average of the 1080 lines to produce 1050 lines. This is scaling.

Scaling usually adds input lag (the electronics need extra time to process the signal) and introduces visual artifacts from the resampling.
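To make the averaging idea concrete, here is a toy sketch in Python. This is purely illustrative, not how a monitor's scaler is actually built (real scalers are hardware filter pipelines), but it shows how 1080 lines get blended down to 1050:

```python
def downscale_rows(rows, target):
    """Resample a 1-D list of row values down to `target` rows
    by linearly blending the two nearest source rows."""
    src = len(rows)
    out = []
    for i in range(target):
        pos = i * (src - 1) / (target - 1)  # output row i in source coordinates
        lo = int(pos)
        hi = min(lo + 1, src - 1)
        frac = pos - lo
        # Neighbouring source rows get averaged together - this
        # blending is exactly where scaling blur comes from.
        out.append(rows[lo] * (1 - frac) + rows[hi] * frac)
    return out

rows_1080 = list(range(1080))               # stand-in for 1080 rows of pixels
rows_1050 = downscale_rows(rows_1080, 1050)
print(len(rows_1050))                       # 1050 - 30 rows' worth of detail merged away
```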
 

noorerino

Also, when I run Batman: Arkham Asylum's benchmark, I get higher FPS if I set the resolution to 1680x1050. Doesn't that mean there is an actual difference, and that I may really have 1080p resolution?
 

Maxx_Power

Any benchmark will run quicker at lower resolutions. When you set it to 1080p there are 1920x1080 pixels to crunch, whereas at 1680x1050 there are fewer. The benchmark really is rendering more pixels at 1080p, but your monitor is not able to display them. Because...

You don't actually HAVE that many pixels in the screen itself. The panel is clearly 16:10 aspect ratio; it can't magically turn into a higher-resolution, 16:9 panel. The receiving logic inside the monitor is down-sampling (scaling) the 1080p signal to match the native 1050-line resolution.

Think of it another way: when you boot up the computer, the video signal is 640x480, yet you usually see the whole screen being used. In that case, the lower-resolution output from the BIOS POST process is being up-sampled to the monitor's native resolution via scaling.
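The raw pixel counts make the benchmark difference plain; a quick back-of-the-envelope check in Python:

```python
# The GPU really does render more pixels at 1080p, even though the
# panel can't display them all.
pixels_1080p  = 1920 * 1080   # 2,073,600 pixels per frame
pixels_native = 1680 * 1050   # 1,764,000 pixels per frame

print(pixels_1080p / pixels_native)   # ~1.176 -> about 17.6% more work per frame
```

So the FPS drop at 1080p reflects extra rendering work, not extra detail you can actually see.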
 

noorerino

So I'd better just use 1680x1050 then. Getting a true 1080p image out of this monitor is physically impossible, so I'd just be making it do extra work it doesn't need. I still get a full screen at 1680x1050 anyway, so there's no point in setting it higher.
 


noorerino

Thanks for the info!
 
