1080p fuzzy image?

zmacleod1

Distinguished
Feb 20, 2010
16
0
18,510
Okay -- so I recently purchased a 24-inch ViewSonic 2m 1080p monitor. It turned out the DVI and HDMI ports were both bad, and for some reason the monitor would not go past 1600x1200 resolution (it just said "out of range"). A bunch of dead pixels, etc.

So I returned it and got a similar 24-inch ViewSonic monitor (the original was out of stock and I didn't feel like waiting), except this one is a 5m and it also lacks an HDMI port. Whatever, no big deal.

http://www.viewsonic.com/products/desktop-monitors/lcd/value-series/va2431wm.htm

So, I get home and rip the packaging open.
I hook up the monitor, plug in the VGA cable (it didn't come with a DVI cable), turn it on, and it starts up looking very nice at 1600x1200.
I pop the disc in, install the drivers, then switch over to 1920x1080 and ....

It works, but the image is extremely fuzzy and it gives me a headache. It also looks like it almost shakes, which is very bizarre. Additionally, whenever I put a window in the middle of the screen, a ghost image of it appears in the bottom left-hand corner.

WTF

I'm running an i7 930, 4 GB of RAM, an Asus P6T mobo, and an XFX HD 5770.

Could this be an issue with the graphics card? Is it possible that the VGA cable I'm using can't handle 1080p awesomeness?
It's quite frustrating that BOTH monitors I received from ViewSonic refuse to go over 1600x1200.
I have the latest Catalyst drivers installed and I'm effectively at a loss. I've looked all over the internet and couldn't find anything like this.
It seems very unlikely that two monitors back to back would act this strangely. Any help or suggestions would be GREATLY appreciated.
 

COLGeek

Cybernaut
Moderator
Do you have access to a DVI cable that you could swap in from another system to check out the monitor? If so, and the image is still fuzzy, return the ViewSonic.

BTW, are you using "stock" settings in Catalyst or have you been messing with the monitor settings? I recommend setting Catalyst back to defaults in case you have been trying to make adjustments. That will help forum members provide better assistance.

You are right that the analog VGA connection will be challenged (in terms of image quality) versus the DVI connection, especially at 1920x1080.
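For a rough sense of why (a back-of-envelope sketch -- the timing totals below are the standard CEA-861 numbers for 1080p60, so treat the exact figure as a ballpark): at 1920x1080 @ 60 Hz the card is pushing a pixel clock of roughly 148.5 MHz through the DAC, the adapter dongle, the cable, and the monitor's analog sampling circuit. If the monitor's clock/phase isn't locked onto that signal cleanly, you get exactly the fuzz and ghosting you're describing.

```python
# Back-of-envelope pixel clock for 1080p60 over analog VGA.
# H/V totals are the standard CEA-861 blanking for this mode; other
# timings (e.g. reduced blanking) would shift the number slightly.

def pixel_clock_mhz(h_total, v_total, refresh_hz):
    # total pixels per frame (active + blanking) times frames per second
    return h_total * v_total * refresh_hz / 1e6

# 1920x1080 active area plus standard blanking -> 2200 x 1125 total
print(pixel_clock_mhz(2200, 1125, 60))  # ~148.5 MHz on the VGA cable
```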
 

zmacleod1

Distinguished
Feb 20, 2010
16
0
18,510



Okay -- so I just bought a DVI cable, plugged it into the graphics card and then into the monitor, and it's not getting a signal.
The weird part is that my graphics card doesn't even have a VGA port; I have a little DVI-to-VGA adapter that I plug the VGA cable into so that my graphics card can support it.

I tried going into the monitor settings and changing "input" from D-SUB to DVI -- still nothing.
This is extremely frustrating. I'm starting to wonder if it's possible that my graphics card's ports are bad?
Still, that wouldn't make a lot of sense, considering the VGA cable already runs through a DVI adapter on one of those same ports.

I'm running Windows 7 Ultimate. I tried jumping into Screen Resolution and switching the "display this device on" setting to DVI, but the only option available is VGA.

ANY suggestions whatsoever?
 

zmacleod1

Distinguished
Feb 20, 2010
16
0
18,510



I've tried plugging it into both DVI ports; neither seems to detect that it's there.
I contacted ViewSonic support and they basically told me it had something to do with the screen refresh rate, which won't go above 30 Hz at 1920x1080.
But even when I switch down to 1600x1200, the refresh rate is at 60 Hz and it still won't detect DVI.

They told me to contact the graphics card manufacturer, saying that the screen refresh rate didn't have anything to do with the monitor.
 

COLGeek

Cybernaut
Moderator

This sounds like tech non-support to me. Any standard LCD typically operates at 60 Hz. I truly think that your monitor is the problem.
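If you want to double-check what the card and Windows are actually offering over the current connection, a quick script like the one below will dump every mode Windows advertises for the primary display (just a rough sketch -- it assumes Python with the pywin32 package installed; the same list is hiding under Screen Resolution > Advanced settings > Adapter > List All Modes). 1920x1080 at 60 Hz is well within what a single-link DVI connection can carry, so if that mode shows up in the list, the "30 Hz" story from their support line doesn't hold water.

```python
# Dump every display mode Windows advertises for the primary display.
# Requires the pywin32 package (win32api.EnumDisplaySettings wraps the
# Win32 EnumDisplaySettings call).
import win32api

modes = set()
i = 0
while True:
    try:
        dm = win32api.EnumDisplaySettings(None, i)  # None = primary display
    except win32api.error:
        break  # ran past the last mode
    if dm is None:
        break
    modes.add((dm.PelsWidth, dm.PelsHeight, dm.DisplayFrequency))
    i += 1

for w, h, hz in sorted(modes):
    print("%dx%d @ %d Hz" % (w, h, hz))
```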
 

zmacleod1

Distinguished
Feb 20, 2010
16
0
18,510
So should I take it back yet again?
When I talked to the ViewSonic tech, she told me that a lot of people had issues when upgrading from Windows XP to 7.
Basically, I bought a whole new system, except I transferred over my old hard drive and then just installed 7 on it.
Do you think wiping the hard drive and reinstalling might possibly help?
 

zmacleod1

Distinguished
Feb 20, 2010
16
0
18,510
Okay, figured it out.
It was the graphics card.
I tried hooking it up to the HDTV; that wouldn't work either.
So I made one last-ditch effort: I plugged my Nvidia 9400GT into one of the other PCIe slots, connected the DVI cable to it, and lo and behold, I'm running flawlessly at 1080p.

Guess that means I'll be in the market for a new graphics card, because it's probably too late to send this one back. I've had it for a couple of months now and I don't believe I purchased a warranty for it.