1. All updates are present. Windows is fully updated as is DirectX, the Nvidia drivers, and more.
2. With the monitor in "Full ratio" (that is, it stretches to fill the screen), the monitor works decently (obviously not great, since it's not the native resolution) for all resolutions except 1680x1050 and 1600x900. On those two resolutions there is significant artifacting.
3. With the monitor in "Aspect ratio" (that is, it always uses the native resolution and just puts black areas around the edges), the same thing occurs. Looks BEAUTIFUL in all resolutions except 1680x1050 and 1600x900. At those two resolutions there is, again, significant artifacting.
4. Plugged the monitor into my old PC with an ATI Radeon 9800 128 MB, which is rated to go well over 1600x900. It artifacts the same way when set to those two resolutions.
5. Tried plugging two CRT monitors into the 8800 GTS but failed. Not sure if this is because of a bad DVI-to-VGA converter or what. This is really inconvenient, as it would have settled the issue for me.
My problem with going with my instinct and blaming the monitor is that it doesn't make sense! If we were JUST looking at "Full ratio," I could blame the artifacting on a problem the monitor has with displaying its native resolution (1680x1050). However, the same damn thing happens under "Aspect ratio," where it is ALWAYS in its native resolution and looks BEAUTIFUL at 1440x900 (one step below 1600x900). If the problem were with the native resolution, why isn't it occurring at ALL resolutions when set to aspect ratio?