I have read about a five-year experiment whose results supposedly showed that modern processors lose megahertz over time. Is this true?
What is the best overall benchmark that tests RAM, CPU, graphics, etc.? I want to run, say, three benchmarks now, then repeat them in a year and see if there are any differences, while using a ghost image of the original Windows installation from when the baseline was taken, to make the comparison more accurate.
Is there any truth to the claim that CPUs lose power over time?
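If you want a repeatable test alongside a full suite, here's a minimal sketch in Python of the idea: time a fixed CPU workload, keep the script unchanged, and re-run it on the same OS image a year later. The workload (sum of squares) and the run counts are just illustrative choices, not any standard benchmark:

```python
import time

def cpu_workload(n=200_000):
    # Fixed integer workload so the same script does the same
    # work every run: sum of i*i for i in 0..n-1.
    total = 0
    for i in range(n):
        total += i * i
    return total

def bench(runs=5):
    # Time the workload several times and keep the best run;
    # the minimum is the least affected by background noise.
    times = []
    for _ in range(runs):
        start = time.perf_counter()
        cpu_workload()
        times.append(time.perf_counter() - start)
    return min(times)

if __name__ == "__main__":
    print(f"best of 5 runs: {bench():.4f} s")
```

Log the printed number each time; if the CPU really were losing speed, the best-run time would creep upward year over year on the identical software image.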
The timing of the CPU is controlled by the system clock, so the only way to lose MHz would be for that clock to slow down, which I doubt. Perhaps with old capacitors and such the voltage regulation gets worse, but I can't see any major effect.
The real slowdown is probably more imagined: new software is designed for faster CPUs (we're still following Moore's law, after all).
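Since the clock rate is set by the system rather than worn into the silicon, one way to check the "losing MHz" claim directly is to log the frequency the OS reports over time. A minimal sketch, assuming a Linux-style `/proc/cpuinfo` (on other systems the function just returns an empty list):

```python
import re

def parse_mhz(cpuinfo_text):
    # Pull every "cpu MHz : NNNN.NN" entry (one per core)
    # out of /proc/cpuinfo-style text.
    return [float(m) for m in re.findall(r"cpu MHz\s*:\s*([\d.]+)", cpuinfo_text)]

def current_mhz(path="/proc/cpuinfo"):
    # Read the live values; returns [] if procfs isn't available
    # (e.g. on Windows, where a different API would be needed).
    try:
        with open(path) as f:
            return parse_mhz(f.read())
    except OSError:
        return []
```

Note that modern CPUs change frequency on purpose (power-saving states, thermal throttling), so a momentarily low reading is normal behavior, not degradation.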
I doubt it. CPUs just seem slower as the programs you use become bloated with patches. Heck, my K6-2 400 @ 500 MHz was always consistent in 3DMark and other benchmarks right up until I gave that system away in '08.