Does Temperature affect performance?

Does a 10-degree difference in temperature affect performance?

  • YES, DUH!!! You're a noob.
    Votes: 7 (38.9%)
  • IS HE NUTS?! No Way Sir!
    Votes: 11 (61.1%)
  • Total voters: 18
Status
Not open for further replies.

customisbetter

Distinguished
Apr 13, 2008
Ok, so my friend and I got into an argument after he said,
"yeah, my PCs run much better in my basement."
and I replied with
"WHAT THE ****?!"
"yeah, it's cooler down there so they run quicker"

That's not verbatim, but you see the point of my confusion. Is it possible for a 5-10 degree F difference to affect performance?
Note: nothing is overheating and throttling down.
 

brendano257

Distinguished
Apr 18, 2008
I voted "no" because it does not have a direct impact. In a cooler environment more OC'ing is possible. The only way there will be an effect is if the computer is running at temps that are throttling, or damaging components, in this case cooler temps would increase performance by not damaging parts, but other than that it would increase longevity on OC'd parts. I run my PC in my basement because of space, I don't think it will make a very large difference however.
 

grieve

Distinguished
Apr 19, 2004
I voted 2. IS HE NUTS?! No Way Sir!

Assuming both machines are under the accepted temperature for the system (example: 65 degrees under load):

Lower temperature is better for pushing the system via OC.

If you and I are playing a game on exactly the same machine setup, except yours is running @ 30 degrees and mine is @ 40 degrees, there will NOT be an FPS difference.
 

dobby

Distinguished
May 24, 2006
This one made me laugh a little. I know plenty of friends who believe stupid things about computers and are too stubborn to admit otherwise.

Heat only affects performance when it gets so hot that the motherboard starts throttling the CPU to cool the system down.

On the other hand, you can overclock further if the components are cooler, but that assumes he has overclocked so far that heat is the barrier.

However, if I had a 24/7 server I would put it in the coolest, least used place in the house, because cooler parts last slightly longer.
 

customisbetter

Distinguished
Apr 13, 2008
THANK YOU THANK YOU THANK YOU!!!
He has never overclocked and probably never will. He is a Mac guy, if that counts for anything.
I proposed running the 3DMark and PCMark tests, but I haven't gotten around to it yet.
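
(If anyone does want to settle this empirically without 3DMark, here is a minimal CPU-timing sketch in Python; the workload size and run count are arbitrary choices for illustration, not anything from this thread. Run it upstairs and in the basement and compare.)

Code:
# Fixed CPU-bound workload, timed best-of-N. Any real speed difference
# from a 10 F ambient change should show up here (it almost certainly won't).
import time

def workload(n=2_000_000):
    # arbitrary fixed amount of integer math
    total = 0
    for i in range(n):
        total += i * i % 7
    return total

def benchmark(runs=5):
    times = []
    for _ in range(runs):
        start = time.perf_counter()
        workload()
        times.append(time.perf_counter() - start)
    return min(times)  # best-of-N reduces scheduling noise

print(f"best run: {benchmark():.4f} s")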
 

darkguset

Distinguished
Aug 17, 2006
Ok, here is the thing:

In theory, even the slightest temperature rise will affect performance; even a 0.0001 °C rise will make an electronic component slower. That is because the materials obey Ohm's law and the resistance formula, R = ρ(l/A) and V = IR: by increasing the temperature you increase the resistivity, and therefore the resistance to electron flow through the "pipes", or conduction channels, that make up your CPU and other electronic components.
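
(The same argument in standard notation, using the textbook linear approximation for resistivity versus temperature, which the post does not spell out:)

Code:
R = \rho \frac{l}{A}, \qquad V = I R
% Resistivity grows roughly linearly with temperature:
\rho(T) \approx \rho_0 \left[ 1 + \alpha \, (T - T_0) \right]
% For copper, \alpha \approx 0.0039 per degree C, so a 10 F (~5.6 C) rise
% raises the resistance of a plain copper trace by roughly 2 percent.

(The resistance change is real, but a CPU runs at a fixed clock, so the change eats into thermal and timing margin rather than showing up as speed; that is why the practical slowdown described next is effectively zero.)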

Now, in reality, a 10 °F rise with no throttling involved will only make that computer something like 0.00000001% slower. You can imagine that this is not even close to traceable. But IF you could perform the exact same test (impossible unless you have a time machine!) on the exact same PC, with every variable identical except temperature, maybe you could see a difference in the Pi calculations, probably yielding a result on the order of 0.000000001% faster.

So the bottom line: In theory it affects it. In practice, the effect is so small, it is negligible.

So your friend is basically right (in theory), but in practice you are right, because the difference is so small that it is practically zero. Hence you are both right: ask your friend to accept reality (physics is nice, but you have to consider real-world applications as well), and you learned a physics lesson!
 
It does not run quicker, period. If you think it does, then you need to sell your computer, as you do not deserve the right to own one.

It does, however, run more reliably, as there is less resistance wearing down the components.

Huge difference.
 

the_one111

Distinguished
Aug 25, 2008
Your friend obviously doesn't know much about computers...

Then again, most Mac-philes think they know a lot about the tech world, and they don't...
 

sailer

Splendid
About the only time I could see 10 C affecting performance is if the computer was already running close to maximum temps and the extra 10 C pushed it over the edge. Yeah, it might affect an overclock as well, but not everyday performance.
 
darkguset - For clarification: metals have a positive temp coefficient, that is, as temp goes up, resistance goes up, as you stated. This is why a light bulb works and does not go poof like John's moustache. However, most semiconductors have a negative temp coefficient, which can result in thermal runaway, and then it does go poof.
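
(That distinction in symbols; this is the standard intrinsic-semiconductor result, not something stated in the thread:)

Code:
% Metals: resistivity rises roughly linearly with T (positive coefficient).
% Intrinsic semiconductors: the carrier concentration rises steeply with T,
n_i(T) \propto T^{3/2} \exp\!\left(-\frac{E_g}{2 k_B T}\right)
% so resistivity falls as the part heats up (negative coefficient).
% More current -> more heat -> more current: the feedback loop behind
% thermal runaway.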

My vote - you would never notice any NOTABLE change as long as the max temp is within reason.
 

dagger

Splendid
Mar 23, 2008
Aside from the theoretical drop in performance due to increased resistance at higher temperatures, heat also tends to cause more calculation errors and lower stability. Your PC hits minor errors all the time and recovers instantly without you noticing anything. This takes a toll on performance, but nothing remotely noticeable. The system would have crashed long before the performance loss from this effect made a dent.
 

customisbetter

Distinguished
Apr 13, 2008
Yeah, we just discussed this again. He misinformed me about the situation. Turns out he had the PC on his upper floor W/O A HEATSINK! Thus it didn't run, duh. So he took it into his 10-degree-colder basement, put a box fan over the socket, and WOW, it runs!! (Pentium 3, I think.) He did this b/c he didn't have sticky thermal paste. Sorry, this was kinda stupid, but hey, I learned some physics in the process.
 

senor14

Distinguished
Jan 26, 2012
Yes, temperature does affect performance. Those who think it does not have never really dived into how electronics work and really shouldn't be allowed to give advice on this subject. In short, keeping PCs cool allows the hardware to perform at its maximum level, if the firmware allows it to, and can let it run at higher speeds without concern of overheating (overclocking).

1. Run a PC with fans that move under 100 CFM and tell me how fast your computer runs after 10 minutes. Now run an OC/non-OC PC with water cooling. Electronics do have a maximum speed setting, but it is never constant... it fluctuates above and below its advertised speed.
2. Overclocking relies on the limitations of the hardware, as well as cooling efficiency. If you're not overclocking, your electronics need to be cooled regardless, but they do not require the amount of cooling that overclocking demands.
3. High temperatures can impair a component's ability to perform calculations. Yes, a temperature of ~100 C will outright fry nearly any hardware device, impair its on-board abilities, or, if you're lucky, make it shut down to prevent damage.
4. Excessive temperatures wear out hardware faster than you think. Keeping everything below 45 C helps ensure a part never becomes unstable.


Yes, servers also need adequate cooling during high-load periods in order to maintain efficiency.
http://www.extremetech.com/computing/90362-what-do-supercomputers-and-overclockers-have-in-common-water-cooling
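
(If you want to watch throttling happen on your own machine, here is a minimal monitoring sketch in Python using the third-party psutil library; assumptions: psutil is installed, and sensors_temperatures() is only available on some platforms, mainly Linux.)

Code:
# Print CPU load, frequency, and temperature every 2 seconds. Run a heavy
# workload alongside this; if the current frequency sags well below max as
# temps climb, you are watching thermal throttling.
import time
import psutil  # pip install psutil

def snapshot():
    line = f"load {psutil.cpu_percent(interval=None):5.1f}%"
    freq = psutil.cpu_freq()  # current/min/max in MHz; may be None
    if freq:
        line += f"  freq {freq.current:7.1f}/{freq.max:7.1f} MHz"
    temps = getattr(psutil, "sensors_temperatures", lambda: {})()
    for name, entries in temps.items():
        if entries:
            line += f"  {name} {entries[0].current:.0f} C"
            break
    return line

while True:
    print(snapshot())
    time.sleep(2)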
 
Heat and dirt are the culprits in electronics and mechanics.

If you run a car engine hot, without venting and cooling airflow, its performance will decline, up to the point the engine goes boom, clang, bang. This is not that dissimilar in electronics.
 