Honestly, I do not know why you are all so fixated on the lowest possible temperatures in the CPU die. What the hell do you care if a core is at 50C or 70C? If it doesn't crash under stress testing - what possible difference can it make?
Look, I know the physics of how CPUs die - and high temperatures in modern CPUs under overclocking have essentially no impact on the usable lifespan of the product.
Sure, cooking your CPU with overclocking will eventually kill it, BUT (and it's a big but!) the lifetime will merely have dropped from 15 years to 5 years.
Are you telling me you plan to still be using that CPU 5 years from now? Hell, even if you baked it to within an inch of its life, you still wouldn't be using that CPU 3 years from now.
Users need to understand that it is HEAT CYCLES which break a CPU. The maximum operating temperature really doesn't have much effect on lifespan, EXCEPT insofar as the increased temperatures are caused by the increased voltage supplied to achieve stable overclocks.

Over-volting is the most harmful thing you can do to a CPU. It accelerates electromigration: the electron current literally pushes metal ions around the interconnects, eventually killing thousands (or indeed, millions) of transistors in the core. CPUs are highly fault tolerant (you can burn multiple holes right through a die with a laser without killing the core), so they'll keep calculating successfully for a very long time, albeit at reduced speeds - which is counter to what you wanted when you overclocked in the first place...
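For the curious, the standard reliability model for electromigration is Black's equation, MTTF = A * J^(-n) * exp(Ea / kT), where J is current density, T is absolute temperature, Ea is an activation energy, and n is an empirical exponent. Here's a minimal Python sketch of the relative effect - the parameter values (n ≈ 2, Ea ≈ 0.7 eV) are textbook-style assumptions, not measurements of any particular CPU:

```python
import math

# Black's equation: MTTF = A * J**(-n) * exp(Ea / (k * T))
# Taking a ratio against a baseline cancels the constant A.
K_BOLTZMANN_EV = 8.617e-5  # Boltzmann constant in eV/K
N_EXPONENT = 2.0           # assumed current-density exponent
EA_EV = 0.7                # assumed activation energy in eV

def relative_mttf(j_ratio, temp_c, base_temp_c=50.0):
    """Electromigration MTTF relative to stock (j_ratio=1.0 at base_temp_c)."""
    t = temp_c + 273.15
    t_base = base_temp_c + 273.15
    return (j_ratio ** -N_EXPONENT) * math.exp(
        EA_EV / K_BOLTZMANN_EV * (1.0 / t - 1.0 / t_base)
    )

# A modest overvolt raises current density roughly in proportion.
# 20% more current at a 70C die cuts the modeled lifetime to a
# fraction of the 50C stock baseline - voltage, not temperature
# alone, does most of the damage.
print(relative_mttf(1.0, 50.0))  # baseline
print(relative_mttf(1.2, 70.0))
```

Note that the current-density term and the temperature term multiply: that's why raising voltage (which raises both J and T) wears a chip far faster than a hot-but-stock part ever would.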
If you run a CPU hot, it is better to leave it running 24/7 than to turn it off at night and start it up in the AM: that's just adding another damaging heat cycle to the core.
So, the answer to the big question is this: it doesn't matter what temperature your CPU is, provided that the CPU is stable.