Screen flickering!

In Graphics & Displays
August 1, 2011 9:18:08 AM

I added a Zotac ZT-40604-10L graphics card to my i7 machine. Everything ran fine for a day or so, then the screen started flickering whenever I did anything remotely video-intensive (e.g., even just playing a normal HD video clip in WMM). The best way I can describe it: it looks as though the monitor were constantly changing the screen resolution. It just keeps flashing. Taking the card out and switching back to the integrated video fixes the issue. Any ideas what might cause this?


August 1, 2011 1:40:19 PM

Have you done any overclocking on the card? That sounds like major artifacting, which you normally get when a card is overclocked too far.
Also, how are the temperatures? This could happen if the card were seriously overheating (110 °C+), though a temperature like that would be just as big a problem as the artifacts themselves.
If you've overclocked, go back to stock clocks. If not, RMA it.
August 1, 2011 7:11:18 PM

The machine is overclocked (running @ 4.5 GHz), with idle temps at 37-40 °C. The card was also overclocked and ran fine for a while before the problems began. These are the steps I tried to isolate the problem:

1. Reset the graphics card to stock clocks - problem remained
2. Reset both the CPU and the graphics card to stock clocks - problem remained
3. Plugged the monitor into the integrated video DVI port (CPU at stock clocks) - no video at all
4. Removed the graphics card and plugged into the integrated video RGB port (CPU at stock clocks) - video working fine
5. Overclocked the CPU back to 4.5 GHz with the graphics card still removed, monitor still in the RGB port - video working fine

That's where I'm at right now. I'm just wondering whether the graphics card is dead or something. I always watched the temps closely: the CPU temps never went past 90 °C under load, and the graphics card temp never went past 60 °C. I'm also wondering why the integrated DVI port stopped working as it did.
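To keep an eye on the temps while reproducing the flicker, a quick logging script could help. This is just a rough sketch: it assumes an NVIDIA card with the `nvidia-smi` tool on the PATH, and the query flags may vary between driver versions.

```python
import subprocess
import time

def read_gpu_temp(raw=None):
    """Return the GPU core temperature in degrees C.

    `raw` lets you feed in captured nvidia-smi output directly
    (handy for testing); if it is None, nvidia-smi is invoked.
    """
    if raw is None:
        raw = subprocess.check_output(
            ["nvidia-smi", "--query-gpu=temperature.gpu",
             "--format=csv,noheader"]
        ).decode()
    return int(raw.strip())

def log_temps(samples=10, interval=2.0):
    """Print one temperature reading every `interval` seconds."""
    for _ in range(samples):
        print(read_gpu_temp())
        time.sleep(interval)
```

Running `log_temps()` while playing an HD clip would show whether the card's temperature actually climbs when the flickering starts.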

August 1, 2011 7:14:06 PM

The integrated DVI output always stops working while a discrete graphics card is installed; the card supplies its own output.
Your 2500K (2600K?) got that hot? I'm surprised and a little worried. Are you on stock cooling? You shouldn't be.
Yes, it sounds like the card is broken, but try underclocking it first just to see if you can stop the flickering. Even if that works, RMA it to get a card that performs as advertised.
August 1, 2011 7:39:22 PM

Thanks for the response!

And yeah, I'm actually still on stock cooling (2600K). I've been reading so many different things on cooling and temps, with some saying that 37-40 °C is fine for idle. It doesn't go above 50 °C unless I'm rendering videos and things like that. Have I been misinformed?

Thanks for the suggestions about the graphics card...I'll give it a try!
August 1, 2011 7:45:24 PM

But when you render videos it goes to *90*? Did you mix up the GPU and CPU temperatures?
August 1, 2011 8:29:11 PM

Oh no, the CPU core temps went to 90 °C; the video card temp never went above 60 °C.

And I'm not sure if this means anything, but I'm trying to remember the exact steps I took to get it working again, and I believe between steps 3 & 4, I removed the CMOS battery in hopes that it would fix things.
August 1, 2011 8:44:01 PM

Well, I'd think the absence of the broken discrete GPU is what fixed it.
August 1, 2011 8:46:57 PM

Oh, and one more thing: the computer did work fine with the Zotac when not doing anything intensive, so I guess the card isn't completely broken. The moment I watch videos or try to render, though, it starts to flicker.

Can cards be partially broken like that?
August 2, 2011 5:20:20 AM

Apparently lol
For whatever reason, the card is failing under load. This is what happens when you overclock a card too far: artifacts appear, not in everyday usage, but under stress. In your case it's happening even at stock clocks, which is not acceptable.
I put up a thread about tolerance for GPU artifacts a while back; the overwhelming opinion was that no artifacts should be tolerated. http://www.tomshardware.com/forum/266903-29-artifacts-o...