So I have an NVIDIA 9800 GTX+ 1GB video card that isn't that old. Yesterday I hooked it up to my HDTV in my living room and the screen went crazy and unreadable. Impossible to see anything. So I unhooked it and hooked it back up to my monitor. The problem still happens: artifacts at any resolution higher than 800x600. It's also completely unusable when I raise the refresh rate above 50 Hz.
Now, the odd part is, this video card works perfectly with my old monitor, just not the one I'm currently using, which is a 1080p HD one. I really don't want to go back to my 17" LCD that only does 1280x1024.
Even weirder, the 1080p monitor works perfectly when using the on-board Radeon HD4200. So both the video card (9800 GTX+) and the 1080p monitor work fine, but they don't want to work together. The 1080p monitor only works with the on-board video card, and the 9800 GTX+ works perfectly with my 17" monitor. So what could be the problem? I'm stuck. No idea if it's the video card or the monitor, since both of them work fine in different situations.
I already tried dusting out the PCI-E card, but it's new enough that not much dust was even on it. I have thought about trying to underclock the video RAM on the 9800 GTX+, but it works perfectly with my other monitor, and I'd hate to cause more problems if that's not the reason.
Also, could a bad DVI-to-HDMI adapter cause this problem? I just recently bought a new game I've been playing in 1080p, and it's killing me not to be able to play. I work as a network engineer, so I'm pretty proficient with most computer problems, but this one seems to have me stumped. Any help would be greatly appreciated.
I'm currently running the 1080p monitor at 800x600 with a 75 Hz refresh rate. Anything higher causes artifacts, and if I get even close to 1080p the screen becomes completely unreadable.