I'm going through some issues that are driving me nuts.
I hope I can get some help. Sorry for the long text.
Issue 1: Whenever I play a game, the graphics become blurry and the in-game text becomes really fuzzy, "flickery" and impossible to read (to the point where it makes my eyes water). It started in World of Warcraft and I thought it was an issue with that specific game, but the problem persisted in other games as well.
When I minimize the game, the whole screen is blurry, but it gradually goes back to normal within a few seconds/minutes. Upon re-entering the game, everything is normal again, but the problem returns after a few seconds/minutes.
I found a topic in this forum about someone with a somewhat similar issue: http://forums.guru3d.com/showthread.php?t=339191.
Just like him, everything worked fine in my previous system, and I was using this very same monitor. The problem started after I upgraded my PC. Unlike him, however, I get the issue in both fullscreen and windowed mode.
I tried to fix it by following the solution proposed in that topic (see post 21), but, much to my surprise, my Nvidia control panel doesn't have the option shown there ("content type reported to display"). It should be right under "digital vibrance and hue", but in my control panel there's nothing under it.
I assume the "content type reported to display" option was removed at some point after that topic was posted, or it's simply not available on my setup. For that reason, I can't try the only potential fix I found.
This is really getting on my nerves, since it's impossible to play anything as is. The problem doesn't show up in screenshots, and it's only triggered by games. Everything else works fine.
Issue 2: Ever since I changed my PC, I'm no longer able to set my games to the monitor's native resolution (1920x1080 @ 60Hz).
World of Warcraft offers a choice between DX9 and DX11. When I choose DX11, things work fine. When I choose DX9, however, 1920x1080 is not available. The same goes for Rift, except that Rift only supports DX9, so I'm forced to play at a crappy, messy resolution.
That said, I'm assuming games that support DX11 will run properly, while games that don't will always lack the 1920x1080 option (which sucks, since I have yet to play some DX9-only games).
It might have something to do with the fact that I had to manually add 1920x1080 as my desktop resolution (in the Nvidia control panel), since it wasn't available by default. It then became selectable on my desktop, but not in my games.
My drivers are fully up to date.
My system configuration is as follows:
Intel i7 870 @ 2.93GHz
Nvidia GeForce GTS 450
It sounds to me like you're missing a driver for your monitor. Have you checked the manufacturer's website?
Another thing I'm not following: you say the issue isn't captured in screenshots? So it's a monitor-only problem? Can you hook up another monitor to your system to verify whether it's the monitor or the PC hardware?
What do you use to connect to the monitor, VGA or DVI?
I have the exact same problem as Issue 1, screenshot thing and everything. My graphics card is an NVIDIA GeForce GTX 760. I don't know anything about this stuff, so sorry I don't have the answer, but I hope someone does, for both of us.