16:10 to 4:3 for games?

March 21, 2008 11:54:49 PM

My 22-inch widescreen monitor has a native resolution of 1680x1050, which is a 16:10 ratio. I want to play Rainbow Six Las Vegas, but that game only offers 4:3 and 16:9 resolutions, and at any of them the image looks terrible because it gets stretched. Is there any way I can make my screen fit one of those resolutions, either with software or by messing with my graphics card? I have an 8800 GTS.

I googled this problem and found some .bin files I could download for several 16:10 resolutions, but I didn't see any change, so I probably didn't get them to work.

March 22, 2008 6:26:57 AM

If you cannot find a way to patch the game, you're probably screwed.

For some reason, LCD screens are just horrible when they have to operate at a non-native resolution. Personally I don't really understand why that is, so it has to be a hardware reason, because in terms of (driver) software this is an almost trivial issue.

It's one of my major beefs with LCD screens nowadays, because it has effectively robbed us of one of the options for extending the life cycle of a gaming PC. At some point (after about 2 years?) you have to start reducing the game resolution so your older hardware can keep spitting out acceptable framerates. Because of the LCD's inability to cope with this at acceptable image quality, we're now almost forced to do a GPU upgrade instead.
March 22, 2008 9:23:58 AM

Hell, after 2 years it's time anyway.


The whole LCD thing has to do with fixed pixels. CRT monitors don't have that "problem." It's one thing I can live with, since the LCD monitors I've bought look WAY better than even my still-used 21" Pro Series ViewSonic 815 CRT.

March 22, 2008 10:16:31 AM

BigMac said:
If you cannot find a way to patch the game, you're probably screwed.

For some reason, LCD screens are just horrible when they have to operate at a non-native resolution. Personally I don't really understand why that is, so it has to be a hardware reason, because in terms of (driver) software this is an almost trivial issue.

It's one of my major beefs with LCD screens nowadays, because it has effectively robbed us of one of the options for extending the life cycle of a gaming PC. At some point (after about 2 years?) you have to start reducing the game resolution so your older hardware can keep spitting out acceptable framerates. Because of the LCD's inability to cope with this at acceptable image quality, we're now almost forced to do a GPU upgrade instead.


Well, think about it.

An LCD has a set number of pixels, say 1680x1050.

If I run an application at 1440x900, those 900 lines have to be stretched to 1050.

Obviously you cannot have every output pixel represented by 2 "real" pixels; that would require 1800 vertical pixels.

So you end up with some of the output pixels being represented by two real pixels, and some by one.

This means that the pixels you see are all different sizes, and they look horrible.
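
To make that concrete, here is a minimal sketch (mine, not from the thread) of the naive nearest-neighbor mapping such a scaler performs; counting shows 750 of the 900 source lines end up one panel line tall and 150 end up two:

```python
# Nearest-neighbor vertical scaling from a 900-line frame to a 1050-line
# panel. Illustrative only; real scalers do something similar in hardware.
from collections import Counter

SRC, DST = 900, 1050  # game lines vs. panel lines

# Map each panel line back to a source line (floor of its ideal position).
mapping = [int(y * SRC / DST) for y in range(DST)]

# How many panel lines does each source line occupy?
heights = Counter(mapping)
print(Counter(heights.values()))  # -> Counter({1: 750, 2: 150})
# 750 source lines are shown once, 150 are shown twice: rows alternate
# between 1 pixel and 2 pixels tall, which reads as uneven, blocky scaling.
```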
March 22, 2008 2:38:58 PM

I have had to use this site for a lot of older games...
March 22, 2008 4:44:47 PM

darkstar782 said:
Well, think about it.

An LCD has a set number of pixels, say 1680x1050.

If I run an application at 1440x900, those 900 lines have to be stretched to 1050.

Obviously you cannot have every output pixel represented by 2 "real" pixels; that would require 1800 vertical pixels.

So you end up with some of the output pixels being represented by two real pixels, and some by one.

This means that the pixels you see are all different sizes, and they look horrible.


Come on, think for a minute. This is bullshit.

It is very easy to interpolate from 1440x900 to 1680x1050 in such a way that it looks perfectly fine. LCDs seem to have an issue with proper interpolation, which means that apparently it is not solved at the driver level, and that can only mean it is expensive, performance-wise, to handle it there. That things look terrible if you do nothing at all is obvious.

Btw, same remark to the poster before you. "Fixed pixels" my arse.

The thing is: doing proper interpolation or subsampling (in case the game resolution is above the native LCD resolution) is probably more expensive, performance-wise, than just setting the game resolution to native. The bottom line is still that reducing the number of pixels to display (at acceptable image quality) has been lost as a tuning option.
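
For what it's worth, a small sketch (my own, assuming a grayscale frame stored as a NumPy array) of the kind of linear interpolation meant here; each output line is a weighted blend of its two nearest source lines, so no line gets abruptly doubled:

```python
import numpy as np

def upscale_rows(image: np.ndarray, dst_rows: int) -> np.ndarray:
    """Linearly interpolate an image from its current row count to dst_rows."""
    src_rows = image.shape[0]
    # Where each output row falls in source coordinates.
    pos = np.linspace(0, src_rows - 1, dst_rows)
    lo = np.floor(pos).astype(int)         # source row just above
    hi = np.minimum(lo + 1, src_rows - 1)  # source row just below
    frac = (pos - lo)[:, None]             # blend weight toward 'hi'
    return (1 - frac) * image[lo] + frac * image[hi]

frame = np.random.rand(900, 1440)          # rows x cols: a 1440x900 frame
print(upscale_rows(frame, 1050).shape)     # -> (1050, 1440)
```

The same blend applied along the horizontal axis as well gives bilinear scaling, which is soft but free of the uneven pixel sizes described above.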
March 22, 2008 7:48:02 PM

For NVIDIA cards (probably ATI too) you can set the LCD not to scale in the control panel. You'll get black bars around the edges instead of stretched pixels (which looks better, imo). 1280x1024 uses almost all of the 1680x1050 pixels top to bottom.
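
The arithmetic behind that, as a quick sketch (my own, using the numbers from the post):

```python
# Centering a 1280x1024 image 1:1 on a 1680x1050 panel with no scaling.
panel_w, panel_h = 1680, 1050
game_w, game_h = 1280, 1024

side_bar = (panel_w - game_w) // 2  # 200 px of black on the left and right
top_bar = (panel_h - game_h) // 2   # 13 px on the top and bottom
print(side_bar, top_bar)            # -> 200 13
```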
March 22, 2008 7:48:16 PM

infornography42 said:
I blame Microsoft.


Hurray! Oh boy, I love to play the blame game! Let's see... Hmmm... I blame... I blaaaame... Intel! Yeah, Intel!
March 22, 2008 9:18:31 PM

Well, you have to look at it this way... who controls the default, and therefore most widely used, driver for LCD screens, as well as the platform it runs on?

It is clearly Microsoft's responsibility to make this work.

It would be one thing if widescreen LCDs, or even LCDs in general, were a brand-new phenomenon that no one could have foreseen becoming as popular as they have. But they were widely used before XP hit the market. Honestly, LCD scaling should have been incorporated into XP. It wouldn't even have been a complicated or difficult change.
March 23, 2008 7:56:54 AM

BigMac said:
Come on, think for a minute. This is bullshit.

It is very easy to interpolate from 1440x900 to 1680x1050 in such a way that it looks perfectly fine. LCDs seem to have an issue with proper interpolation, which means that apparently it is not solved at the driver level, and that can only mean it is expensive, performance-wise, to handle it there. That things look terrible if you do nothing at all is obvious.

Btw, same remark to the poster before you. "Fixed pixels" my arse.

The thing is: doing proper interpolation or subsampling (in case the game resolution is above the native LCD resolution) is probably more expensive, performance-wise, than just setting the game resolution to native. The bottom line is still that reducing the number of pixels to display (at acceptable image quality) has been lost as a tuning option.


Well, you can try all the tricks you want, but the pixels on an LCD are fixed. They don't grow, they don't shrink, they don't move; they sit still. Do all the tricks you want, but it will never look good until the nature of an LCD changes. Like the man said, you can scale, but the pixels will never match up perfectly the way they do at the native resolution. I don't care what driver you use. GL with that.
March 25, 2008 10:32:49 PM

Have you tried using the Flat Panel Scaling option in NVIDIA's control panel and setting it to 'NVIDIA scaling - maintain aspect ratio'?
March 27, 2008 2:30:53 PM

septic said:
Have you tried using the Flat Panel Scaling option in NVIDIA's control panel and setting it to 'NVIDIA scaling - maintain aspect ratio'?


Exactly. Use NVIDIA's scaling for 1:1 and then choose either a 16:9 or 4:3 resolution in game. You should have black bars all around, with the game centered in the middle. Your monitor may also provide scaling (my Gateway does). Stretching is ugly, so let your monitor or graphics driver render the proper aspect ratio for you. I'm not sure if ATI has this option, but I imagine they must.
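
As a rough sketch of what "maintain aspect ratio" scaling works out to (fit_keep_aspect is a made-up helper for illustration, not any driver's actual API):

```python
# Scale the game image as large as it fits without distortion, then center it.
def fit_keep_aspect(game_w, game_h, panel_w, panel_h):
    scale = min(panel_w / game_w, panel_h / game_h)
    out_w, out_h = round(game_w * scale), round(game_h * scale)
    # Returns the scaled size plus the horizontal and vertical bar widths.
    return out_w, out_h, (panel_w - out_w) // 2, (panel_h - out_h) // 2

# A 4:3 mode on a 16:10 panel: scaled to full height, 140 px bars per side.
print(fit_keep_aspect(1024, 768, 1680, 1050))  # -> (1400, 1050, 140, 0)
```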
March 28, 2008 2:43:42 AM

You should use the NVIDIA settings and choose "don't resize" for the best effect. It leaves black bars on all sides if necessary, but that's better than interpolating those last 50 lines.
March 28, 2008 10:32:41 PM

Wait, the pixels on my ViewSonic CRT from 10 years ago could resize?!