I read a lot about running LCD monitors at their native resolution when possible to maximise display quality. Now with some of my games I have to make a compromise, as I lose a lot of FPS at native, so I need to use a lower resolution for decent gameplay.
Should I run at native with lower in-game quality settings, like texture quality etc., or would it be better to run at lower resolutions with higher quality settings?
I've read articles on the web with images comparing non-native vs native, and when they're put side by side like that I can see the difference. But on the whole, when I play games everything moves fast and I can't discern much difference if I switch from native to one a little lower.
If you don't see a difference, then do what works best for you. I personally leave mine at native, but my hardware can handle it, so it's a non-issue really. The only thing is to keep the refresh rate at default or you will burn out your LCD. Other than that, it's all up to you.
There are many more factors involved in getting good, smooth images/gameplay than just the monitor size/res. As you say, the in-game video settings can make or break the experience.
Take Oblivion, for example: lots of things to tweak, but it needs some experimenting to see which ones make a worthwhile difference. I personally would run at native and bump down the in-game settings to get a decent FPS first. If that failed, then I would consider lowering the res and start bumping stuff back up until it got too slow, to find a happy medium.
May I ask what your situation is, then? Are you running a low-end card or a big monitor, or is it the newer games that are stretching things for you?
If you really care about the difference, you'll want to upgrade or maybe even replace your system to match your games. Without knowing anything about your system or which games you play, though, it's hard to say what to do there.
If you don't or can't do that, then just adjust all the settings for whatever works best for you.
Thanks for the input so far, all. Generally I'm just interested in how things work and educating myself so that I can eke out better performance.
I like to game, but budget will always be a factor, so I will most likely always be compromising.
Until recently, ignorance was bliss, as I mostly played games at the recommended settings. But seeing the same games on higher-spec machines really opened my eyes to how much the extra eye candy adds to the experience.
I'm running a 19" widescreen LCD with a native res of 1440 x 900 and a GeForce 7600 GT (I did buy a Radeon X1950 Pro, which failed after 20 minutes, so I ended up returning it and getting the 7600).
Me being a geek, I was curious what you guys did and whether you could notice the difference at non-native resolutions.
I miss CRTs in that respect: you never had to worry about which resolution to play at.
The way interpolation works, some resolutions will look better than others on an LCD. For instance, on a 1280x1024 display, 640x480 can look pretty good because 640 is exactly 1:2 of 1280. The verticals would be a bit more messed up, though, since 480 is 15:32 of 1024. The simpler the ratio, the better the result. If your screen is 1650 pixels wide and you try a resolution 1532 pixels wide, it will look horrible (766:825).
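If you want to check this yourself, here's a quick Python sketch (the function name `scale_ratio` is just mine) that reduces a candidate resolution against the native one; the smaller the numbers in the reduced ratio, the cleaner the interpolation tends to be:

```python
from math import gcd

def scale_ratio(source: int, native: int) -> tuple[int, int]:
    """Reduce source/native pixels to lowest terms.
    Smaller terms (e.g. 1:2 or 2:3) mean a simpler, cleaner scale."""
    g = gcd(source, native)
    return (source // g, native // g)

# Examples from the discussion above:
print(scale_ratio(640, 1280))   # (1, 2)  -- clean halving, looks decent
print(scale_ratio(480, 1024))   # (15, 32) -- messier verticals
print(scale_ratio(960, 1440))   # (2, 3)  -- good fit for a 1440x900 panel
print(scale_ratio(600, 900))    # (2, 3)
```

Running candidate resolutions through this before trying them in-game saves some trial and error.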
Try out different resolutions until you find something that looks not too bad and still gives a good framerate. 960x600 with 2x or 4x AA should be a good trade-off on your screen (2:3 of 1440x900).