My build is as follows:
i5 3570K @ 4.5 GHz
Asus P8Z77-V LX
8 GB RAM
HD 7950
(Latest BIOS update as well)
So, I use my 7950 to drive my monitor and the other monitors at my desk, and that works flawlessly. The problem comes in when I connect my television to the HDMI port on the motherboard, which is driven by the i5 3570K's iGPU. I get artifacts and weird stuff on the screen, just like a burned-out graphics card, but only when the i5 3570K is overclocked. The overclock itself is completely stable: I've tested it with Prime95 for over 24 hours, run tons of IntelBurnTest passes, and played countless hours of games on it. When the processor is at stock clocks, the video has no issues at all: it runs smooth with no artifacts or anything abnormal.
Considering all of this, I have manually tried adding a voltage offset to the iGPU while the processor is overclocked, but I can't get rid of the problem; if anything, it usually makes it worse. So I'm kind of stumped and would love some help on the matter.
Thanks!