I purchased a Radeon HD 4650 (by Sapphire) graphics card over a year ago after asking for recommendations on this board. I was upgrading from integrated graphics and had serious space and power constraints for the upgrade. The card worked great for me (some gaming, mainly Photoshop and 3D modeling/rendering) for about 14 months. THEN... the last time I used this PC, the screen flickered at random. Today I turned it on and it loaded fine until the Windows startup/login screen. As soon as that appeared, the display went black after a second. Logging in blind and letting Windows start up did not bring anything back onscreen.
I restarted in safe mode, and everything loaded and appeared onscreen normally, except the resolution was extremely low (running off the integrated graphics). I checked both display adapters (the integrated graphics and the 4650), and neither showed any problems. I reinstalled the latest HD 4650 driver, and after enabling/disabling the display adapters and restarting a few times, I got the display working as desired (the same as when the 4650 functioned normally). I put the PC to sleep and woke it to see what would happen - now the Windows login screen shows up for a second and then permanently disappears, and once I log in, my desktop appears and the screen starts to flicker in a very set pattern. It's almost as if the screen shows up for a second, drops resolution for a second, then turns off, in a relentless cycle. Is the card toast? I am now running off my integrated graphics without any issues. I should mention that I have been using the same HDMI cable through the 4650 this entire time. At $50, I wouldn't be heartbroken to buy another card - but would that even fix my issue?
Now I'm thoroughly confused - the card seems to work fine for extended periods, then once the computer sleeps and I wake it, the same issue starts. I've reinstalled the drivers three times. Any suggestions? ANYONE? I'm about to shoot this thing...
So, for an update, and the solution to my problem...
Turns out the flickering, the cutting out, and the false hopes after messing with drivers can all be attributed to a failing HDMI cable. I finally figured this out after building a new PC and installing Windows 7, then the motherboard and GPU drivers. I rebooted, and as soon as the Windows login screen loaded, the screen went black. I was using the same monitor and cable as on the older desktop (the one with the issues).
After trying a different LCD monitor, I swapped the HDMI cable running from my GPU to my monitor. Instant success. The bad cable would not work with a newer LCD, and would not work between an Xbox and an HDTV, but it would display low-resolution screens on my LCD monitor. It would also display in high resolution sporadically, as long as there wasn't anything too intensive onscreen. For instance, I could sometimes display my desktop in high resolution, but the screen would cut out if I loaded Photoshop or a high-res image in the Windows image viewer. Hence my mistake of blaming GPUs and drivers. My new PC is running great with a new HDMI cable, and the older system is fine now (with the original GPU) hooked up through a DVI cable.
I suppose I should take this as a reminder to put as many details as possible into my original question (mainly the fact that I was using an HDMI cable) so someone who had seen this problem might have caught it. Honestly, the cable was the last thing on my mind as a point of failure.