Artifacts using DVI, none with VGA adaptor?

Guest
Hiya,

I just bought a Sapphire 4870 on eBay because my old 8800GT was artifacting. When I use my DVI cable on one of the card's DVI outputs, it artifacts with green dots everywhere; on the other DVI output the picture is otherwise normal, but there is no green in it whatsoever.

At this point I was beginning to think my mobo might be causing the problem, or that I was really unlucky and got sold a bad card. BUT, if I use a DVI-to-VGA adaptor on either of the 4870's outputs, along with a VGA cable to my monitor, everything is 100% fine.

What does this mean? Could it be:

1. a bad DVI cable (I hope)
2. a bad DVI input on my monitor
3. or the GPU somehow working differently when used with an adaptor, and therefore not triggering the artifacts?

Any help is very much appreciated
 
Guest
Well, the same DVI cable works fine with my old PC connected to the same DVI input on the monitor.

I quickly switched the cable from the old PC to the new one and it worked fine. Then I restarted the PC and there were a couple of small artifacts, and now the screen blacks out constantly; after a while it blacks out for good until I restart.

I would now think I've been sold a bad card, but then why does it work perfectly when used with a VGA adaptor?
 
Guest
To add to the above: my PSU is 500W, and I know some people say that's a bit low for a 4870. The PSU is about three or four years old, but as I only have one HDD and one DVD drive I would have thought I was okay. Could the PSU failing to keep up cause the blackouts? But if that were true, shouldn't it happen when using the VGA adaptor too?
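
If it helps anyone sanity-check this, here's a rough back-of-the-envelope sum in Python. The wattages are ballpark figures I've assumed from typical spec sheets of that era, not measurements of this particular system:

# Rough power-budget check against a 500 W PSU. All figures below are
# assumed ballpark numbers for parts of this era, not measured values.
components = {
    "HD 4870 (load)": 150,   # reference board TDP is around 150 W
    "CPU (load)": 95,        # typical mid-range CPU of the time
    "Motherboard + RAM": 50,
    "HDD": 10,
    "DVD drive": 15,
    "Fans / misc": 15,
}

psu_rating = 500
total = sum(components.values())
print(f"Estimated load: {total} W of a {psu_rating} W rating")

# An ageing PSU may only deliver ~75% of its label rating reliably,
# so compare against a derated figure as well.
derated = psu_rating * 0.75
print(f"Derated (~75%) capacity: {derated:.0f} W, headroom: {derated - total:.0f} W")

On those assumptions the load comes to around 335 W, so even a tired 500 W unit should cope, which makes the PSU a less likely culprit.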

Help!
 
Guest
One last thing: ATITool doesn't consider these blackouts an error when running the artifact scanner. I know blacking out is not an artifact, but surely if it were the GPU it would register an error of some kind.
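
My guess is the scanner only checks what the GPU renders into memory, not what actually reaches the monitor. A rough sketch of the idea (hypothetical code, not ATITool's actual implementation):

# Sketch of a framebuffer-compare artifact scanner, and why a link-level
# blackout wouldn't register. Hypothetical code, not ATITool's.
def scan_for_artifacts(rendered, reference):
    """Count pixels where the rendered frame differs from the expected
    test pattern; mismatches indicate GPU-core or VRAM errors."""
    return sum(1 for got, want in zip(rendered, reference) if got != want)

# The frame is read back from VRAM *before* it ever goes out over the
# DVI link. If the digital transmitter drops the signal, the framebuffer
# is still pixel-perfect, so the scanner reports zero errors even though
# the screen goes black.
frame = [0xFFFFFF] * 16        # toy 16-pixel frame rendered by the GPU
reference = [0xFFFFFF] * 16    # expected output of the test pattern
print(scan_for_artifacts(frame, reference))  # -> 0, despite a dead link

If that's right, a failing digital output could black the screen out without ever producing a pixel error for the scanner to count.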
 
Guest
Me again. I've just learnt that the 4870's DVI outputs are DVI-I, so they carry the analogue signal as well as the digital one. So the graphics card probably DOES behave differently depending on whether it's sending an analogue or a digital signal, because the DVI-to-VGA adaptor doesn't convert the signal at all; it just feeds the analogue pins through to the VGA cable.

Would you experts agree that I have a card whose analogue output works but whose digital output doesn't? What goes wrong in a card to cause this?
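
In other words, DVI-I is really two independent outputs sharing one connector. Here's a toy model of the two paths (all names made up for illustration, not real driver code):

# Toy model of a DVI-I output: the digital (TMDS) path and the analogue
# (RAMDAC/VGA) path are separate circuits sharing one connector, so one
# can fail while the other keeps working. Illustrative only.
class DviIPort:
    def __init__(self, tmds_ok=True, ramdac_ok=True):
        self.tmds_ok = tmds_ok      # digital transmitter health
        self.ramdac_ok = ramdac_ok  # analogue DAC health

    def output(self, cable):
        if cable == "dvi-d":         # plain DVI cable: digital pins only
            return "clean picture" if self.tmds_ok else "artifacts/blackouts"
        if cable == "dvi-to-vga":    # passive adaptor: analogue pins only
            return "clean picture" if self.ramdac_ok else "artifacts/blackouts"
        raise ValueError(f"unknown cable type: {cable}")

# A card with a failing digital transmitter but a healthy analogue DAC
# would match the symptoms in this thread:
port = DviIPort(tmds_ok=False, ramdac_ok=True)
print(port.output("dvi-d"))       # -> artifacts/blackouts
print(port.output("dvi-to-vga"))  # -> clean picture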
 

x86overclock

Sorry to burst your bubble, but it's not your card at all; it's the DVI port on your monitor or the DVI cable you are using. Believe it or not, you can fry your monitor's ports when you fry your video card. My advice is to try your DVI on a TV or a different monitor; if the problem persists, try a different DVI cable. But I bet you fried the DVI port on your monitor. Mine went out last month because my DVI cable kept falling out or getting loose, and then one day the DVI port just started artifacting. I tried the DVI cable on my TV and there were no problems; tried it on my monitor again and the problems persisted, so I switched to the VGA port and voila, it works.