HDTV flickers/flashes/cuts out/turns on and off with PC attached to it

Ok, usually I can find or figure out a solution for everything, but I've searched high and low and have tried everything I could think of to get this to work.

My problem is that my HDTV picture cuts out momentarily (for a few seconds), but only when it's attached to my PC via an HDMI cable. Every other device I plug into the TV works fine, so I know it's not the TV.
The audio keeps running fine when it happens, but the TV's input panel pops up (as it does only when you first turn the TV on, switch inputs, or unplug/reattach an input), so that's telling me the signal is physically lost.

The problem only happens when I open graphically intense websites with video, or when I'm watching a high-resolution video in full-screen mode; at idle the image is stable. When I move the mouse around it glitches out longer, so I'm thinking it might be a lack of graphical power. Oh, and when I play a 3D game, the TV automatically switches to 1024x768 at 75 Hz and everything looks good and runs perfectly fine.

The TV is a Sharp AQUOS 46" 1080p 120Hz flat-panel LCD HDTV, and I'm running the HDMI cable directly from my motherboard's onboard HDMI output (the motherboard is the ASUS M4A78T-E AM3). I'm pretty sure it's not a lack of processing power or RAM, since I have an AMD Phenom II 965 quad core (3.4 GHz) and 4 GB of G.Skill Ripjaws DDR3 running at 1600 MHz.

My first thought was that my HDMI cable was an older version that was not capable of handling the higher bandwidth, so I bought a new HDMI cable of the latest version (v1.4). That didn't solve it.
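[Editor's note: a quick back-of-envelope check supports ruling the cable out. Using the standard CEA-861 timings for 1080p60, the required pixel clock fits within even the original HDMI 1.0 spec's 165 MHz TMDS clock ceiling, so any compliant cable should carry this signal; the sketch below just does that arithmetic.]

```python
# Back-of-envelope check: does 1080p60 exceed what even the oldest
# HDMI spec (1.0, with a 165 MHz max TMDS clock) can carry?
# Timings are the standard CEA-861 values for 1080p60 (active + blanking).
h_total, v_total, refresh = 2200, 1125, 60
pixel_clock_hz = h_total * v_total * refresh   # 148.5 MHz
hdmi_1_0_max_hz = 165_000_000                  # HDMI 1.0 TMDS clock ceiling

print(f"1080p60 pixel clock: {pixel_clock_hz / 1e6:.1f} MHz")
print(f"Fits in HDMI 1.0:    {pixel_clock_hz <= hdmi_1_0_max_hz}")
```

Since 148.5 MHz is under the 165 MHz limit that every HDMI version supports, upgrading the cable to v1.4 couldn't have been the fix here.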

So now I'm thinking it might be a grounding issue, since I installed the motherboard with the foam padding between the case and the I/O shield (like the foam in this example). I wasn't sure whether I should have installed it with that or not. Or I'm thinking I might have to buy a graphics card with an HDMI output, but why would they put an HDMI output on the motherboard if it weren't capable of playing high-res videos or content, unless they just threw it in and expected people to do little more than word processing or basic web browsing?

What do you think?
  1. Best answer
    Not sure which product you have, as there are two. Anyway, I don't know why the mobo would output HDMI when it's only equipped with a 3300 GPU; think of when engineers first thought up HD. Anyway, if you want a graphics card, a Radeon 5450 should be fine. Here are the two TVs I'm confused about, and your mobo specs:
  2. Yeah, that's the right motherboard. I got it off Newegg; here is the direct link:

    The Sharp model is LC46D85U. I just bought it last year, and it looks like it's already outdated since it's not even listed on Sharp's website, but here is the manual PDF:

    and here is the CNET review:

    Anyways, yeah, I figured it's probably a graphical power problem, or lack thereof. It's dumb that they put an HDMI output on a board without the power to back it up. I was looking to go with at least a 1GB Radeon HD 5750. I was trying to stick with a budget build, but it's not turning out that way; I might as well make it a gaming machine, otherwise it's just a really cool $600 media server that doesn't play video.
  3. You don't have to go as high as a 5750 just to play videos; try my 5450 suggestion.
    Any on the list should work, with no extra power supply connector needed.
  4. I finally found the ultimate solution!

    I spent hours trying various things, such as installing the latest graphics drivers and the latest DirectX, flashing the BIOS to the latest version, and reinstalling Windows 7 with the 64-bit version for its better graphics and memory capability (which worked for a while, until I started installing my needed programs, i.e. antivirus). All of these failed.

    So what I finally came up with sounds so easy now, but it was the last thing I thought of:

    What I did was go into the BIOS, and under the Advanced tab I found the integrated graphics settings, where I was able to change the video memory allocation. It was set to Auto (which wasn't doing its job), so I bumped it up to 512MB just to test it out, loaded Windows, and success! I ran 6 videos, 10 websites, 5 programs, and the media player, and everything worked perfectly without a hitch.

    Now I don't have to buy a graphics card unless I want to play high-end games, which I plan to hold off on for a while until I can afford it :)
  5. Good for you; my mobo doesn't let me do that. Just make sure your 4GB is enough to run all your programs well. A graphics card like a 5670 could really take over your integrated graphics' work and play some games at low quality in HD. It's around $100-110 right now.
  6. Best answer selected by mrhozer.