
Running 1600x1200 on a 1680x1050 LCD

Posted in Computer Peripherals
January 21, 2007 2:31:41 AM

So I just found out today that my friend's new widescreen LCD can run 1600x1200 instead of 1680x1050. I want 1600x1200, since that's my preference for FPS gaming. My friend isn't much help right now because he's engulfed in World of Warcraft, so can someone tell me: can I run 1600x1200 without the monitor automatically stretching it to fit the widescreen? (That's what his did.) To put it another way: I want to buy a 1680x1050 LCD, run 1600x1200, and have black edges on both sides of the image (which makes sense, since 1600x1200 isn't a widescreen resolution). Thanks for your time.
camp
January 21, 2007 2:43:09 AM

1680x1050 to 1600x1200... the monitor doesn't have enough vertical pixels: it has 1050 lines, and 1600x1200 needs 1200.

It sounds like your friend has a higher-resolution widescreen (probably a 1920x1200). That is the only way to fit a full 1600x1200 image on a widescreen monitor; a lower-resolution panel (like 1680x1050) cannot display more lines than its native 1050.
January 21, 2007 10:47:22 PM

:evil:  God dammit, I don't feel like arguing and you're pushing my buttons. He has a 22-inch KDS LCD. The highest resolution he can choose is 1680x1050. I understand where you're coming from. :cry:  I agree with you, but I watched it happen with my own eyes. He went into the video settings of World of Warcraft, selected 1600x1200, and the game restarted showing what appeared to be 1600x1200; then, after only a few seconds, the image spread out to cover the unused sides of the monitor, so it was a non-widescreen picture stretched horizontally... I don't know what that means, but that's what happened. So my question remains the same: all I want is an LCD monitor that can display 1600x1200 with a 10 ms or less response time! I'm a hardcore FPS player, so maybe I'll just get a damn CRT.
January 21, 2007 11:00:29 PM

Quote:
:evil:  God dammit, I don't feel like arguing and you're pushing my buttons. He has a 22-inch KDS LCD. The highest resolution he can choose is 1680x1050. I understand where you're coming from. :cry:  I agree with you, but I watched it happen with my own eyes. He went into the video settings of World of Warcraft, selected 1600x1200, and the game restarted showing what appeared to be 1600x1200; then, after only a few seconds, the 4:3 image spread out to cover the unused sides of the monitor, so it was a 4:3-ratio picture stretched horizontally... I don't know what that means, but that's what happened. So my question remains the same: all I want is an LCD monitor that can display 1600x1200 with a 10 ms or less response time! I'm a hardcore FPS player, so maybe I'll just get a damn CRT.


You could try the Samsung 204B. Samsung claims a 5 ms response time.
January 22, 2007 2:06:39 PM

First thing you need to understand is that LCDs have pixels of fixed position and size. Your monitor has EXACTLY 1680 x 1050 pixels.

BUT, LCDs can take different resolutions and scale them to fit the display. This does not improve the source or generate extra optical pixels when the source is larger than the display.

Thus, a 1680 x 1050 pixel LCD can pretend to be 1920 x 1200 (for example) by throwing away every 8th pixel, both horizontally and vertically (it is usually not fast enough to blend the 8 pixels into 7, so it throws one away, unless it's a fairly high-end/expensive display). It can also pretend to be 840 x 525 (as an unusual example) by duplicating every pixel horizontally and vertically.

Also, the LCD can stretch a 1280 x 1024 image horizontally to fit 1680 x 1050 by repeating 5 of every 16 pixels.
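The pixel-dropping and pixel-repeating arithmetic above can be sketched in a few lines of Python (a nearest-neighbour model; real monitor scalers vary, and the function name is mine):

```python
# Nearest-neighbour model of an LCD scaler: each output pixel samples
# exactly one input pixel, with no blending.

def nn_map(src: int, dst: int) -> list[int]:
    """For each destination pixel, the index of the source pixel it copies."""
    return [d * src // dst for d in range(dst)]

# Shrinking a 1920-pixel line onto a 1680-pixel panel: pixels thrown away.
down = nn_map(1920, 1680)
print(1920 - len(set(down)))   # 240 dropped, i.e. every 8th pixel

# Stretching a 1280-pixel line to 1680 pixels: pixels shown twice.
up = nn_map(1280, 1680)
print(len(up) - len(set(up)))  # 400 repeats, i.e. 5 of every 16 pixels
```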

Many widescreen monitors can disable widescreen stretching in their on-screen controls: they scale to fit vertically, scale the horizontal by the same factor, and leave the rest of each line blank.

Thus, the best image you can get is to set your game output to 1280 x 1024 and set the monitor not to stretch horizontally. This lets the GAME do the sizing, since it is much better at it than the monitor is. Setting the output to 1600 x 1200 generates extra pixels that JUST GET THROWN AWAY!! This is not like anti-aliasing, where the graphics card blends pixels at an ultra-high resolution and outputs the blended (but still high-resolution) result. (Otherwise we would turn off anti-aliasing, set the output resolution to 2560 x 2048, and let the monitor do the anti-aliasing, LOL.)
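The "scale vertically, keep the aspect, leave the sides blank" behaviour is easy to work out with arithmetic. A minimal sketch (the helper name is mine, not anything the monitor exposes):

```python
# Where the image and the black bars land when a panel scales an input
# to fit while preserving the aspect ratio (the "no stretch" mode).

def letterbox(src_w: int, src_h: int, panel_w: int, panel_h: int):
    scale = min(panel_w / src_w, panel_h / src_h)   # limiting dimension wins
    disp_w, disp_h = round(src_w * scale), round(src_h * scale)
    bar_w = (panel_w - disp_w) // 2                 # per-side black bar width
    bar_h = (panel_h - disp_h) // 2                 # per-side black bar height
    return disp_w, disp_h, bar_w, bar_h

# The thread's original question: 1600x1200 shown on a 1680x1050 panel.
print(letterbox(1600, 1200, 1680, 1050))   # (1400, 1050, 140, 0)
```

So in no-stretch mode a 1600x1200 input would occupy a 1400x1050 area with 140-pixel black bars on each side, which is the behaviour the original poster was hoping for.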
January 22, 2007 2:43:39 PM

My question is exactly the opposite. I have an NVIDIA card that supposedly supports 1600x1200 resolution. I am looking at a new LCD monitor that is 1680x1050 widescreen. Will the card work right at this resolution? Will there be problems in the display, and if so, what kind? The others I am looking at are 1440x900 and 1280x1024.

My other question is about setting the resolution on the computer. One of the monitors lists a whole set of supported resolutions, with the native resolution first; that resolution goes beyond the limits of the card. When I set the resolution, what is the best approach: getting as close as possible to the ratio of the native resolution, or using the normal resolution for that size as shown by the other monitors?

Sorry to be such a noodge, but I am just getting into this and I find it very confusing. I read the reviews, and they all say the specs from the manufacturers are pretty much BS and that the makers play with the figures to make them appear better. Sounds normal for makers to me.

Thanks for any assistance. :? :? :?: :?:
January 22, 2007 2:52:23 PM

Quote:
My question is exactly the opposite. I have an NVIDIA card that supposedly supports 1600x1200 resolution. I am looking at a new LCD monitor that is 1680x1050 widescreen. Will the card work right at this resolution? Will there be problems in the display, and if so, what kind? The others I am looking at are 1440x900 and 1280x1024.

My other question is about setting the resolution on the computer. One of the monitors lists a whole set of supported resolutions, with the native resolution first; that resolution goes beyond the limits of the card. When I set the resolution, what is the best approach: getting as close as possible to the ratio of the native resolution, or using the normal resolution for that size as shown by the other monitors?

Sorry to be such a noodge, but I am just getting into this and I find it very confusing. I read the reviews, and they all say the specs from the manufacturers are pretty much BS and that the makers play with the figures to make them appear better. Sounds normal for makers to me.

Thanks for any assistance. :? :? :?: :?:


There are only two realistic choices:
1. The video card supports the resolution (not "close", but exactly, and it will state so).

2. You can set the driver to support the resolution you want.

Anything else will not run at its optimum.
January 23, 2007 1:27:50 PM

Nvidia has released new drivers for almost all of their chips (anything within the last 5 years or so?) that support widescreen resolutions like 1680 x 1050.
Any Nvidia chip that can output 1600 x 1200 will also be able to output 1680 x 1050 (using the latest driver), since that mode has fewer actual pixels (and most LCDs are limited to 60 Hz anyway).
So if your Nvidia card says it can do 1600 x 1200 at 60 Hz, then with the latest Nvidia drivers it can be set to output 1680 x 1050 at 60 Hz and match the native resolution of your LCD.
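The "fewer actual pixels" claim checks out with simple arithmetic:

```python
# Pixels per frame for the two modes discussed in this thread.
print(1600 * 1200)   # 1920000
print(1680 * 1050)   # 1764000, about 8% fewer pixels per frame
```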
January 23, 2007 1:55:59 PM

Thanks for the responses. You people have answered my question and I appreciate your help. It is good to see the community band together to help each other out. Reminds me of the old days, when it all started and we kept in touch with all our little tips and ideas about how to do things. :D  :D  :lol: