i7 5820k vs. i7 6700k Temps

jbseven

Distinguished
Dec 2, 2011
646
0
19,160
I'm having trouble finding a direct comparison of the two processors. Where I live, a 5820k build would only cost $50 more so it's definitely something worth looking into.

As it stands, the 5820k is a "win" in my eyes simply because it has more cores. It also has more PCIe lanes, but I'm not sure if SLI at x16/x16 is that much better than x8/x8.

However, higher temps would be a deal breaker for me, and it's hard to find a comparison of the two since one does not ship with a stock cooler. From what I remember, Haswell 4xxx CPUs had heat issues that caused a lot of people to delid their CPUs, and the 5820k also draws more power than the 6700k.

Can someone shed some light on operating temps under the same cooler?

 

Antonis117

Reputable
Aug 30, 2014
434
0
4,960
The 5820K is a 140W CPU and the 6700K is a 91W CPU, so the 5820K dissipates more heat and will run hotter with the same cooling solution. However, neither will be especially hot with a good cooler, and the 6700K will run cooler.

As others have asked, what do you want to do with this system? What software do you plan to run? What is the rest of your system? What if any expansion plans do you have? Why do you think the choice is between only those two and does not include the 4790K?

 

G-STAR01

Distinguished
Oct 4, 2011
130
0
18,760


You don't need it to run at x16/x16; it'll run just fine at x16/x8.
Even PCIe 3.0 x8/x8/x8 will be fine.

----

@OP, if you're gaming, you might want to get a cheaper i7+Z97 and put the money you save towards the GPU.
 
Solution

jbseven

Distinguished
Dec 2, 2011
646
0
19,160
I'll be doing both gaming and media encoding in equal measure. I also enjoy overclocking, which is why I'm looking at the performance-to-temperature ratio. The intended specs are a GTX 970 (with a planned upgrade to either a 980 or to 970 SLI later) and 16GB of DDR4-2666 CL15 RAM (which I intend to tighten). I don't see any reason to go with Haswell, which had a history of thermal issues requiring delidding and is also 10-15% slower than Skylake.

Just to point something out here: the 5820K has 28 PCI Express lanes, which means SLI or CrossFire is not running at x16/x16.
Thanks, I misinterpreted that one. Although I'm still not too sure if there is any difference between x16/x16 and x8/x8 in terms of performance.

The 5820K is a 140W CPU and the 6700K is a 91W CPU, so the 5820K dissipates more heat and will run hotter with the same cooling solution. However, neither will be especially hot with a good cooler, and the 6700K will run cooler.
It would be nice if wattage were the only factor affecting temps, but there are others: the 6700k uses a moderate-performance TIM under the lid while the 5820k is soldered. The only comparison I could make between the two comes from these reviews, which use the same cooler but different test setups:
6700k Temps
5820K Temps

Based on these charts, I'd say the two are pretty similar in terms of performance/temp, although this is still at best an educated guess. On price, the 5820K comes to $75 more than the 6700k (including motherboard, CPU cooler, and PSU), which would be worth it imho.

I think I'll be going with the 5820k unless I find a better comparison between the two showing a significant difference in thermal performance.
 

Antonis117

Reputable
Aug 30, 2014
434
0
4,960


Yes, but you have a restriction: the 5820K can't support 3-way SLI. For 3-way SLI you need a 40-lane CPU, because Nvidia demands at least x8 per GPU. AMD, on the other hand, has no such restriction. Anyway, this is just for information; it isn't the main point here.
 

Antonis117

Reputable
Aug 30, 2014
434
0
4,960


Yes, I don't think there is much difference between x16/x16 and x8/x8 or x16/x8, at least in real-world applications. In a benchmark you might notice the difference, but nothing major. And yes, I agree with the 5820K.
 

LookItsRain

Distinguished


You can do 3-way SLI with 28 lanes: 3 × 8 = 24, and the 5820k has 28 lanes in total.
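The lane arithmetic being argued over here is easy to sketch. This is a minimal illustrative snippet: the lane counts are the well-known figures for these CPUs, and the `fits` helper is hypothetical, not from any real tool — actual boards are hard-wired to specific slot splits, which is where the disagreement below comes from.

```python
# CPU-provided PCIe 3.0 lane budgets (well-known figures).
CPU_LANES = {"i7-5820K": 28, "i7-5930K": 40, "i7-6700K": 16}

def fits(cpu: str, slot_widths: list[int]) -> bool:
    """True if the requested slot widths fit within the CPU's lane budget."""
    return sum(slot_widths) <= CPU_LANES[cpu]

# 3-way SLI at x8/x8/x8 needs 24 lanes -- within the 5820K's 28:
print(fits("i7-5820K", [8, 8, 8]))   # True
# ...but two full x16 slots would need 32 lanes:
print(fits("i7-5820K", [16, 16]))    # False
```

Whether a given board actually offers x8/x8/x8 with a 28-lane CPU depends on how the vendor routed the slots, which is why support varies per motherboard.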
 

Antonis117

Reputable
Aug 30, 2014
434
0
4,960


Not every mobo supports it, so you can't simply say it is supported.
For example, here is what ASRock says:
*If you install CPU with 28 lanes, PCIE1/PCIE3/PCIE5 will run at x16/x8/x4.

**If M.2 PCI Express module is installed, PCIE5 slot will be disabled.

***If you install CPU with 28 lanes, 3-Way SLI™ is not supported.

****To support 3-Way SLI™, please install the CPU with 40 lanes.
 

LookItsRain

Distinguished


That's a mobo limitation, not an Intel CPU or chipset limitation; it has nothing to do with the 5820k.
 

Antonis117

Reputable
Aug 30, 2014
434
0
4,960


Yes, this is a mobo limitation with the 5820K, which means not every mobo supports 3-way SLI with the 5820K. I didn't say it is a CPU limitation. The 5820K can support 3-way SLI, but only on a mobo wired for x8/x8/x8.
 

LookItsRain

Distinguished
"Yes, but you have a restriction: the 5820K can't support 3-way SLI. For 3-way SLI you need a 40-lane CPU, because Nvidia demands at least x8 per GPU. AMD, on the other hand, has no such restriction. Anyway, this is just for information; it isn't the main point here."

You stated the 5820k does not support 3-way SLI, which it does; I simply showed why that statement is false.
 

Antonis117

Reputable
Aug 30, 2014
434
0
4,960
When I said the 5820K "does not" support 3-way SLI, it was because it is only supported by some specific mobos. When something is not supported by every mobo, you can't tell someone "go ahead, it supports 3-way SLI." What if he gets a mobo that does not support it? Besides, I don't think there is any point arguing over whether the 5820K supports 3-way SLI: it does, but not with every mobo. And yes, I admit my previous statement may look false because I didn't mention the motherboard limitation.
 
Just how far do you plan to overclock? The thermal issues you mentioned regarding TIM and delidding apply only at voltages too high for a long and happy CPU life. In addition, almost all of those issues were from when the CPUs were first released. The manufacturing process has been streamlined and improved since then, and the performance of the CPUs has improved accordingly.


Thermal limits, whatever they might be, can be dealt with by better cooling. Skylake seems to be more voltage limited, and there's not much that can be done about that.
 



I'd say that you are right. A CPU does not itself support SLI; it is the motherboard and the controllers on it that support SLI. If a CPU alone supported SLI, then SLI would work on any board the chip were used in, and that is not the case. That's how the word "support" is used in this context.

A CPU might have insufficient power to drive SLI GPUs, but that's another issue.
 

jbseven

Distinguished
Dec 2, 2011
646
0
19,160

That's a tough one, since I'll be trying to balance performance, temps, and lifespan. I will be overclocking until I hit a thermal barrier (70-75C) or until the system remains unstable despite added voltage. I'll start with an NH-D15 and buy liquid cooling depending on how the OC goes. It's just a hobby for me, so after seeing how far I can take it, I will likely return to a 'balanced' OC in the end.


Are you saying that Skylake now uses solder or a better TIM? I'd love to read up on any improvements since launch if you have a link. It stands to reason that a soldered IHS will perform better than your average 8 W/mK TIM.


Just for the record, I do not intend to get anything more than 2-way SLI. I'm not made of money, guys :)

But there does seem to be a CPU limitation (max PCIe lanes = 28), which can be worked around by choosing an appropriate motherboard for 3-way.





 
No. Skylake is more thermally efficient. Solder and other TIM solutions matter most at high voltages and temperatures that you are unlikely to reach in your quest for speed and long life.

Have you read through the reports on Skylake overclocking? Many overclockers are running out of voltage before they run out of thermal headroom. The issues you mention are most likely worth 100 MHz or less. You have yet to convince me why they should be an obstacle.

Yes, if you want to get to 5.0 GHz or more on Haswell, you need every break you can get, but that is extreme. My 4690K would run at low-70s temps under heavy load, at 1.378 V and 4.8 GHz. It runs very nicely at 4.6 GHz, and anything more is of very limited benefit.
 

jbseven

Distinguished
Dec 2, 2011
646
0
19,160

I'm not sure what you mean by thermally efficient, since "the die size ... is not large at all, ... but its thermal density will be high, which means that it will not be easy to cool it down." Secondly, the NGPTIM's performance is remarkably poor when compared to even Arctic's MX-2. Are you simply referring to the TDP?


This would definitely be something to consider. But taking Kitguru's reviews (5820k, 6700k) as a rudimentary sample, the 6700k maxes out at 4.7 GHz @ 1.40 V, 76C, while the 5820k maxes out at 4.5 GHz @ 1.28 V, 76C, both using the H100i cooler. At these settings, the 5820k performs better in CPU-oriented tasks than the 6700k, while gaming performance is pretty much neck and neck.

With air cooling, the temperature barrier would likely become more evident, and my only concern is the performance/temp ratio. That is, with 70-75C reached after OC, which would perform better. As it stands, it looks like the 5820k.
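The quoted overclocks above can be reduced to a crude "core-GHz" throughput proxy. This is a back-of-the-envelope sketch that assumes a perfectly parallel workload; it is an illustration of the reasoning, not a benchmark:

```python
# Kitguru's quoted max overclocks, reduced to a crude throughput proxy.
# "core-GHz" assumes perfect parallel scaling -- an assumption for
# illustration, not a measured result.
chips = {
    "i7-6700K": {"cores": 4, "oc_ghz": 4.7, "volts": 1.40},
    "i7-5820K": {"cores": 6, "oc_ghz": 4.5, "volts": 1.28},
}

for name, c in chips.items():
    core_ghz = c["cores"] * c["oc_ghz"]
    print(f"{name}: {core_ghz:.1f} core-GHz at {c['volts']:.2f} V")
# i7-6700K: 18.8 core-GHz at 1.40 V
# i7-5820K: 27.0 core-GHz at 1.28 V
```

On this proxy the six-core chip comes out roughly 40% ahead at a lower voltage, which matches the "better in CPU-oriented tasks" observation; games using four or fewer threads won't see that advantage.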
 
No. I'm referring to the energy dissipation per unit area. The Skylake die is a lot smaller, so its energy density is higher.
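To put rough numbers on "energy dissipation per unit area": the TDPs are Intel's official figures, while the die areas below are the approximate sizes commonly cited for these dies (an assumption for illustration, not measured values):

```python
# Heat flux (W per mm^2 of die) for the two chips. TDPs are official;
# die areas are approximate, commonly cited figures (assumptions).
chips = {
    "i7-5820K (Haswell-E)": {"tdp_w": 140, "die_mm2": 356},  # ~356 mm^2 8-core die
    "i7-6700K (Skylake)":   {"tdp_w": 91,  "die_mm2": 122},  # ~122 mm^2 quad-core die
}

for name, c in chips.items():
    print(f"{name}: {c['tdp_w'] / c['die_mm2']:.2f} W/mm^2")
# Despite the lower TDP, Skylake packs roughly twice the heat into
# each square millimetre, which is what makes it harder to cool per watt.
```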

You seem to have done your reading and you know, far better than I do, what your needs are. In that Kitguru review, the 6700K could go a little hotter, but there is likely no more voltage available, so it ran out of voltage first.

The 5820K is a great chip: for anything other than gaming it is capable of great things, and it holds its own in gaming, where it is merely cost-inefficient.