My wife has a Samsung R540 laptop, and the DC jack went bad. I soldered in a replacement jack, but removing the old jack was really hard -- that factory solder really doesn't like to melt. After a long process of desoldering and gently urging the joints loose by pulling with pliers, I managed to remove the old jack. That put a fair amount of bending and flexing on that corner of the motherboard, so I was a bit concerned about the laptop working correctly once I put it all back together.
I also had to temporarily remove the heat sinks from the CPU and GPU because the cooling assembly was blocking access to some of the DC jack connections. I picked up some cheap thermal compound from Radio Shack. I'll admit I don't do the best job of cleaning the chips when removing old thermal paste, since I don't do this often -- I used a dry paper towel and facial tissue, then did a visual inspection to make sure no particles or lint were left behind. I don't think I applied too much thermal compound: just enough to coat the shiny surface, without much flowing out of the edges. (I actually did this twice -- I re-did the thermal compound and re-seated the heat sinks after the problem started occurring.)
I put it all back together, and the new DC power jack works like a dream. The laptop still boots, and everything functions fine except for one thing.
Enough background; now to the meat of the issue. Since reassembling the laptop, the screen always shuts off shortly after Windows 7 boots and the desktop appears. I don't mean the screen just goes black or Windows sleeps -- the built-in display disappears entirely as an available option in the Display control panel! And it does this consistently on every boot. The laptop screen comes on at full resolution and then fades off smoothly after Windows starts (it doesn't suddenly go black; it fades out as if being switched off by the OS). If you're quick enough, you can even right-click the desktop with an external display connected and watch the built-in display drop off the list. We hadn't made any changes to the hardware or the drivers before or at the time of the DC jack repair.
The ATI GPU still works with external displays -- both the VGA and HDMI outputs run at full resolution. The GPU temperature is 51 degrees C, so I'm not worried about heat; it drives 1080p over HDMI without any thermal anomalies.
The built-in display also works fine in Safe Mode. Uninstalling the ATI drivers fixes the problem; it only happens with the ATI drivers installed, and as soon as we reinstall the ATI display driver the problem comes back. We were on Windows 7 64-bit and did a factory refresh to Windows 7 32-bit -- the same problem came back immediately: as soon as Windows loaded and the factory ATI drivers took over, the screen went blank. We also tried downloading the latest drivers from ATI, but their catch-all .exe failed, saying the hardware is unsupported.
It seems like as soon as the official driver switches the built-in display into full gear, the display disables itself shortly afterward. I was thinking of installing Linux as an experiment: it might work because it doesn't use the ATI driver, or it might not if the GPU refuses to run the display at full color and resolution. I suspect I damaged the GPU in some way I can't measure, but I'd appreciate any further insight.
I installed Ubuntu Linux on the machine as an experiment, and it seems to work okay -- likely because it's not using the official ATI/AMD drivers. It's a little strange that it won't let me use the HDMI output and the built-in laptop screen simultaneously, but otherwise the built-in screen works at full resolution, and HDMI works too.
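Since the Ubuntu result is the key data point, one way to confirm exactly which outputs the open-source driver exposes is to run `xrandr --query` in a terminal and check which outputs report "connected". Here's a minimal sketch of that check; the helper function and the sample output below are my own illustration (the output names and resolutions are assumptions, not captured from the actual R540):

```python
# Sketch: parse `xrandr --query` text to list which outputs the Linux
# driver reports as connected. Sample text is illustrative only.

def connected_outputs(xrandr_text):
    """Return the names of outputs that xrandr reports as 'connected'."""
    names = []
    for line in xrandr_text.splitlines():
        parts = line.split()
        # Output lines look like "LVDS1 connected 1366x768+0+0 ...".
        # Compare the exact word so "disconnected" does not match.
        if len(parts) >= 2 and parts[1] == "connected":
            names.append(parts[0])
    return names

# Example text resembling xrandr output on a laptop with an external
# HDMI display attached (hypothetical, for illustration):
sample = """\
Screen 0: minimum 320 x 200, current 1366 x 768, maximum 8192 x 8192
LVDS1 connected primary 1366x768+0+0 (normal left inverted) 344mm x 194mm
   1366x768      60.00*+
VGA1 disconnected (normal left inverted right x axis y axis)
HDMI1 connected 1920x1080+1366+0 (normal left inverted) 509mm x 286mm
   1920x1080     60.00*+
"""

print(connected_outputs(sample))  # ['LVDS1', 'HDMI1']
```

If the built-in panel (usually named something like LVDS1 or eDP1) shows as connected under Linux, that would point away from a physically dead panel and toward the Windows ATI driver deciding to shut it off.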
I wish there were an alternative display driver for Windows 7 I could install; that would probably fix the problem. My wife needs Windows.