i7-8700 drawing ~100W power causing high temps

ScottForum

Distinguished
Apr 29, 2014
58
2
18,535
I'm having high temp concerns with my CPU. Using IntelBurnTest at maximum load, many cores are hitting 90+ degree temperatures. My HWiNFO sensors indicate the CPU is drawing ~100W of power during maximum load tests. One suggestion I received was to set an AVX offset. Though this reduces temps, it effectively reduces my CPU frequencies...not a good solution for a new system.

Any thoughts on this?

[HWiNFO sensor screenshot]


CPU: i7-8700
Cooler: Noctua NH-L9x65
Mobo: GIGABYTE Z370 HD3P
 

ScottForum



I'm using a Noctua NH-L9x65 cooler...it should be good enough.

Reduce voltage...okay, though I don't see why that should be necessary for a brand-new system rated at 65W. I have a solid case and 5 fans (3 intake/2 exhaust) that I rarely hear.

This isn't a gaming computer, though I occasionally use it to edit video, etc., so it would not get to max that often.

I should ask whether the high temps and wattage I'm experiencing are normal or not.
 

rekiso102

Prominent
Oct 5, 2017
26
0
530
I'm fairly sure it's a voltage issue; overvolting usually generates a lot of extra heat. I'd recommend going into the BIOS and turning the voltage down a bit. See if that helps; if it doesn't, it could be something else.
 

ScottForum


Thank you, I'll try some settings changes and report back. What does reducing the voltage do...are there any negative side-effects? I would imagine reduced frequencies?
 
Just because a cooler has the Noctua name on it does not make it able to contend with 100-130 watts of heat, and, rest assured, if you are running MCE on (all cores at 4.6 GHz), you are generating nearly the same ~130 watts of heat as an 8700K. Heat goes up even more drastically for each additional 100 MHz and 0.025 V of core voltage...

Noctua's cooler in question, NH-L9x65, is a low profile, single fan, smallish cooler....

Do not expect it to contain/dissipate heat quite like an NH-D15, which is three times that size with two fans, each twice as big, with twice the airflow....
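The "heat goes up even more drastically for each additional 100 MHz and 0.025 V" point follows from the classic dynamic-power relation P ≈ C·V²·f: power scales with the square of voltage and linearly with frequency. A rough sketch, where the 100 W / 4.3 GHz / 1.20 V reference point is an illustrative assumption, not a reading from this thread:

```python
def dynamic_power(base_power_w, base_freq_ghz, base_volts, freq_ghz, volts):
    """Scale CPU dynamic power by the classic P ~ V^2 * f relation.

    base_power_w is an assumed reference draw; results are rough
    estimates, not measurements.
    """
    return base_power_w * (volts / base_volts) ** 2 * (freq_ghz / base_freq_ghz)

# Assumed baseline: ~100 W at 4.3 GHz and 1.20 V (illustrative only).
p_stock = dynamic_power(100, 4.3, 1.20, 4.3, 1.20)
p_oc = dynamic_power(100, 4.3, 1.20, 4.6, 1.25)  # +300 MHz, +0.05 V
print(round(p_stock), round(p_oc))  # the small bump adds ~16% more heat
```

This is also why the reverse works: shaving a little voltage off, as suggested above, cuts heat faster than it cuts anything else.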
 

ScottForum

I had also contacted Intel about this issue, and according to Intel, it's really no issue at all...my processor is performing within normal parameters. If thermal throttling kicks in, then there may be a problem. Since I rarely tax the system, I'm going to stick with what I have and only react if it starts to throttle.
 
As long as your normal load temps are not hitting 90-95C routinely, you should be fine; if you are hitting 85-90C in gaming, it's not cool enough, IMO...

Gigabyte also routinely enables MCE (all cores at max turbo) in its BIOS by default. If your cooling is not quite up to that sort of load, you can disable MCE in the BIOS.
 


Exactly. I have one on my 3rd-gen i5 system; it is rather small and low profile.


I suggest upgrading to a stronger cooler like an NH-U12S or NH-U14S.
 

Karadjgne

Titan
Ambassador
You'd have to understand what TDP for a cpu really is. TDP is thermal design power, not heat output, although the two are generally close enough that they're treated as the same thing for cooling purposes. TDP is assigned to a cpu subjected to a specific series of apps, and those apps are very mediocre. Very. TDP is what you'd expect under nominal usage: office apps, WinZip, video via the igpu, etc. It is not by any measure peak power, which runs at 1.5x to over 2x the cpu's TDP under extreme usage. An i7, especially the 6c/12t 8700, is fully capable of a power draw in excess of 130W when all its threads are maxed. Not the 65W TDP.
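The 1.5x-2x rule of thumb means a 65 W-TDP chip can legitimately pull roughly 98-130 W under all-thread loads, which lines up with the ~100 W the OP is seeing. A trivial back-of-envelope check (the multipliers are the post's rule of thumb, not an Intel spec):

```python
# Rule-of-thumb peak draw from TDP, per the post: ~1.5x to 2x
# under extreme all-thread loads. Not an official Intel figure.
TDP_W = 65  # i7-8700 rated TDP

peak_low_w = TDP_W * 1.5   # lower bound of the rule of thumb
peak_high_w = TDP_W * 2.0  # upper bound
print(peak_low_w, peak_high_w)  # 97.5 130.0
```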
Quote, Noctua:
"While it provides first rate performance in its class, it is not suitable for overclocking and should be used with care on CPUs with more than 65W TDP."
End quote.

The L9x65 is a 95W cooler, max. You are pushing it in excess of its rated heat range, and it's going to fall behind fast at those temps. For nominal usage it'd be fine, but under the extreme stress of a torture test like IBT it will fail to keep up. No ifs, ands, or buts. It is not designed for that purpose.

The only way to bring down torture-test temps, and the ~70%-load gaming temps, is to replace the cooler with something designed for such usage. That means at least a 140W-capable cooler such as the Hyper 212, Cryorig H7, Noctua U12S, or Corsair H45/55/60 AIOs, or better still a Cryorig H5, Noctua NH-D14/U14S, be quiet! Dark Rock Pro 3/4, Scythe Fuma Rev.B, or Corsair H80i v2/H90, etc.
 

ScottForum

This rig is not for gaming. I've had HWiNFO sensors running constantly, and the only time I really see these high temps is in video editing. That said, I probably should reduce the wattage drawn during turbo mode.

Although I do not have anything in my bios called MCE, there are several other settings, such as those below. What would be my best course of action here to ultimately reduce the watts drawn when in turbo mode?

I see headings such as Turbo Ratio, Power Limit TDP, Turbo Per Core Limit Control, Voltage Optimization.
 

Karadjgne

Gaming and production are 2 different beasts. Gaming will generally use ~55-70% of 2-8 threads. Production apps like editing, rendering, and compiling can and do use up to 100% of up to ~32 threads if available, and as such are massively more cpu-taxing than almost every game. Running a 1-hour cpu render is pretty close to running a 1-hour stress test, depending on the software used.

So yes, it's quite possible, even probable, that just some simple editing will use up to all 12 threads at times, far beyond the relatively low cpu usage of most games. This is going to drive temps way up.

Easy way to test this: run your editing with Hyper-Threading enabled and watch the temps. Then rerun the same job with Hyper-Threading disabled, so only 6 threads are in use instead of the full 12. Temps will be considerably lower.
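That A/B comparison is easy to script if you log temps while rerunning the same job. A small helper sketch; the function is hypothetical, not tied to HWiNFO. On Windows you would feed it values parsed from an HWiNFO CSV log, on Linux something like psutil could supply the readings:

```python
def max_temp_c(read_temps_c, samples):
    """Poll `read_temps_c` (a callable returning an iterable of per-core
    temps in Celsius) `samples` times and return the hottest single
    reading seen. Run once with Hyper-Threading on and once with it off,
    then compare the two maxima."""
    return max(t for _ in range(samples) for t in read_temps_c())

# Example with canned readings standing in for a real sensor source.
fake_log = iter([[70, 72], [88, 91], [75, 80]])
print(max_temp_c(lambda: next(fake_log), samples=3))  # hottest reading: 91
```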
 

ScottForum

I did a render that took approximately 25 mins without any core adjustments. During that time, half the threads' temps were 90+, with a couple reaching 94-95. Upon presenting this info/screenshots to Intel directly, they did not have any issue with the performance and stated the processor was running within normal behavior. Is there still reason to be concerned?

 

Karadjgne

Yes and no. At those temps you'll more than likely start to see thermal throttling. That will be an issue for you personally, as it slows down your render. As far as the cpu goes, it will only be an issue if those temps are regular and sustained: basically, if you plan on rendering daily for that amount of time or longer, the only temps your cpu will ever see are extreme ones, which will start causing lifespan and stability issues after a long while. For the odd render, once a month for 15 mins or so, no worries.

Honestly, I'd put your mind at ease, lower cpu temps considerably, and toss the little cooler in favor of something more appropriate for the usage. Any of the 140W+ budget coolers will bring those render temps back into the 70s; just don't opt for the cheapest, as more often than not they are stock-replacement grade and you'll still be where you are now.

Even a really good cooler like the 250w+ Scythe Fuma Rev.b is actually cheaper than your current Noctua and the difference in load temps will be night and day.