Integrated Graphics Interfering With Graphics Card and Temperatures?

shacoa

Honorable
Sep 26, 2012
15
0
10,510
Hello, and thanks in advance for help.

I've been having quite the experience with my computer lately, as anyone who recognizes my username can tell. Long story short: I had to RMA my graphics card and combed my entire computer for bad parts. I got my new graphics card back, and shortly thereafter my USB wireless adapter died, though that's unrelated.

While I was waiting on the RMA, I used my integrated graphics. Now that I have my GPU back, I notice my temperatures are a lot hotter than before, which is partly the weather, but they're still higher than I've ever seen.

Someone on another site mentioned it might be my integrated graphics interfering, since both my GPU and my CPU temperatures are hotter than ever.

Does this have any merit? Is there a way to disable my integrated graphics?

I also heard I should have deleted my integrated graphics drivers.
Finally, should I have completely deleted my AMD drivers and then reinstalled them with the new graphics card?

Sorry for all the questions, and thanks in advance for any help!
 
Temps are higher than before what? Before, when you had a video card in the first place, or before, when you were running the iGPU? What temps?

If you had the same model of card before, took it out, ran integrated, then put the same model back in, you shouldn't have to do anything with drivers. What chip, what motherboard? Some boards disable integrated graphics automatically when a card is installed, so there's nothing to "disable". Otherwise, you can disable it in the BIOS. If you didn't have to enable it in the first place after removing the card, though, then I don't see it being an issue.

Basically, need more info.

What do you mean temps are different?

When you removed your graphics card, what did you do exactly to get the iGPU running?
 

shacoa

I'm sorry, I'm usually good about being specific. I guess I rushed this question.

I have an Intel Core i5-2500K CPU and an XFX Radeon HD 7850 Ghost Thermal Core Edition.

Before I took the original graphics card out on July 27th, my temperatures while playing Guild Wars 2 on max settings were about 60 °C (65 °C max) for the GPU and 40 °C (45 °C max) for the CPU, even after playing for a while.

After July 27th, I activated the iGPU by plugging my HDMI cable into my motherboard instead. I can't say I did much more than that, though I did play around with the boot order in the BIOS. I feel like I set it to PCI first, though... I played like that until August 20th (two days ago).

I got the same exact model GPU back from the RMA, and now when I play Guild Wars 2, I quickly reach temps of 70 to 76 °C on my GPU and 47 to 52/53 °C on my CPU cores.

While idling, everything seems to hang around 32 to 33 °C, though, and once I close Guild Wars 2, the temps quickly drop back down to around there.

Hopefully this explains more. Sorry for my lack of information!

UPDATE: I should add that while using my iGPU, my CPU was getting around 45 °C while playing Civilization V on the lowest settings, if that means anything.
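If you want to compare before/after temps less anecdotally, one option is to log readings over a gaming session (most monitoring tools can export them) and summarize per component instead of eyeballing peaks. A minimal sketch in Python; the log format (component, temp pairs) and the sample numbers are hypothetical, loosely based on the temps reported above:

```python
from collections import defaultdict

def summarize(readings):
    """Group (component, temp_c) readings and report min/max/average per component."""
    by_part = defaultdict(list)
    for component, temp_c in readings:
        by_part[component].append(temp_c)
    return {
        part: {
            "min": min(temps),
            "max": max(temps),
            "avg": round(sum(temps) / len(temps), 1),
        }
        for part, temps in by_part.items()
    }

# Hypothetical session log, in degrees Celsius
session = [
    ("GPU", 70), ("GPU", 73), ("GPU", 76),
    ("CPU", 47), ("CPU", 50), ("CPU", 53),
]
print(summarize(session))
```

Logging the same game at the same settings before and after a hardware change makes it much easier to tell a real thermal problem from summer weather.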