You can't force the projector to show an image that your video card itself does not support. My experience with projectors has been that you have to match the video card's output to the resolutions the projector supports; in this case, the highest resolution both sides share seems to be 1280x1024.
The projector has the following specs:
1024 x 768 XGA - Native
800 x 600 SVGA - Supported
1280 x 1024 SXGA - Supported
1400 x 1050 SXGA+ - Supported
My MSI x1900xt has the following specs:
640 x 480 200Hz
800 x 600 200Hz
1024 x 768 200Hz
1152 x 864 200Hz
1280 x 1024 160Hz
1600 x 1200 120Hz
1920 x 1080 120Hz
1920 x 1200 100Hz
1920 x 1440 90Hz
2048 x 1536 85Hz
As you can see, the video card cannot output the maximum 1400x1050 resolution that the projector can accept. I know that higher resolutions sent to the projector's native 1024x768 get downscaled back to exactly that, which can cost quality, but the 1280x1024 I tried last night was CLEARLY better than native.
Why would a 1280x1024 image look better on the projector than its native 1024x768 if it was downscaling it anyway?
I want to try 1400x1050 to see whether it gets downscaled the same way and gives a better image, just like 1280x1024 did.
Given that everything gets downscaled, does the input resolution make a difference? I think so. What do you think?
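To put rough numbers on the question above, here's a quick plain-Python sketch (my own example, using only the figures from the specs quoted earlier) of the downscale factors and how many extra pixels the projector's scaler gets to average down for each candidate resolution:

```python
# Hypothetical sketch: compare the candidate output resolutions discussed
# above against the projector's fixed 1024x768 panel. Variable names are
# my own; nothing here comes from any driver API.
native = (1024, 768)
candidates = [(1024, 768), (1280, 1024), (1400, 1050)]

for w, h in candidates:
    sx = native[0] / w                              # horizontal downscale factor
    sy = native[1] / h                              # vertical downscale factor
    oversample = (w * h) / (native[0] * native[1])  # extra pixels the scaler averages
    print(f"{w}x{h}: scale {sx:.3f} x {sy:.3f}, {oversample:.2f}x native pixel count")
```

One thing the arithmetic shows: 1400x1050 is 4:3 like the panel, so it scales uniformly (about 0.731 on both axes), whereas 1280x1024 is 5:4 and scales 0.800 x 0.750, so the projector squashes it slightly more vertically than horizontally.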
Anybody have any ideas how to do this? I've heard about PowerStrip. Does it work? What are the key issues when using software to force a resolution?
Well, it does work, as do a few other programs I've used, but the main thing is to check what your screen supports. The InFocus will accept higher resolutions, but it's still going to downscale them because the panel's pixels are fixed at 1024x768, so you're taxing your card more for a higher resolution that your projector will dumb down anyway.
IMO, just use 1024x768. It's better that your card outputs at native and doesn't work so hard, and since you're trying to output a non-standard resolution anyway, you can at least have it do less work trying to make 720p/1080i/p content match 14x10.
Of course, it depends on what you're watching, and whether the source is low resolution to begin with, since you're essentially doing two conversions in the process. Some images will look better because they play well to smoothed edges, and others will look worse. Likely the 10x7 is closer to what you're supposed to see.
There are other apps like RefreshForce and MultiRes, but be advised that playing with this can damage your display, or in this case your projector. As was mentioned, Omega offers this option thanks to an app (MultiRes, IIRC) included in the modded driver package.
If you find you need the resolution in other apps and it doesn't appear, you can try regedit hacks, but I'd wait until you're more comfortable with what that involves before going that route: mess it up badly and you could be stuck having to format/reinstall, and destroying your projector is always a possibility when forcing resolutions/refresh rates. It's less risky with LCDs, but it should always be a consideration.
Good point. I'm still confused why the movie I tested last night looked better with the downscaled 1280x1024 vs the native 1024x768. I changed the settings in Catalyst, and it was a big difference. Why would this be the case?
Like I mentioned, some things will benefit from as much smoothing as you can provide. What upscaling and downscaling essentially do is blur the image algorithmically, so if the image benefits from smoothing (something where fine edges are not as appreciated, such as motion, i.e. action films), it gains from feathered edges and less detail definition. Something with stunningly beautiful detail would benefit less. It will vary from source to source.

It also depends on how the image is being processed. If the card does a mediocre job of upscaling the image to either 768p or 1024p, but the projector compensates by doing a better job downscaling from 1024p to 768p, that path will often give you a better-looking result. The problem is you'll never have native content from the studios to compare, because none of those resolutions are standard.

Where you would likely see the most difference is with a computer image (text or a still picture); there you should see larger differences between native and non-native, because for video you will have some level of conversion no matter what.
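A toy illustration of the smoothing described above (pure Python, my own made-up example, using crude box averaging rather than whatever filter the projector actually implements): a hard black/white edge rendered at a higher resolution stays sharp if it happens to line up with the output grid, but gets feathered into an intermediate grey when it falls mid-bucket, which is exactly the blur/anti-aliasing effect being discussed.

```python
# Toy 1D example of why scaling smooths: box-average groups of `factor`
# input samples into one output sample, like a very crude video scaler.
def box_downsample(samples, factor):
    """Average non-overlapping groups of `factor` samples."""
    return [sum(samples[i:i + factor]) / factor
            for i in range(0, len(samples), factor)]

edge = [0.0] * 6 + [1.0] * 6      # hard edge, aligned to the output grid
print(box_downsample(edge, 3))    # edge lands on a bucket boundary: stays sharp

shifted = [0.0] * 5 + [1.0] * 7   # same edge, shifted off the grid
print(box_downsample(shifted, 3)) # edge falls mid-bucket: feathered to ~0.33
```

Whether that feathering helps or hurts is content-dependent, which matches the point above: motion-heavy video tends to tolerate (or even flatter under) the blur, while fine text suffers.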
I guess as long as movies are visually appealing, that's fine with me. They work great at 1280x1024.
Fonts at the native res were illegible. 1024x768 just sucked; I could barely read them (big shadow effect and jumbled). At 1280x1024 they are much better (I can see really small fonts relatively clearly). I wonder if the higher res will make the fonts even better.
Thanks again for all your help. I can't see why everyone doesn't have one of these babies. They are just purely AWESOME!!! You should see WoW when my warrior is about 1.5 metres tall!!! Wooohoooo!!!!!