I think processors and architectures will keep improving, and the programs that use that processing power will keep demanding more of CPUs. If you run a simulation with 10 variables 100 different times, changing the variables each time, you'll probably get a decent idea of how likely a given event is. If you have the processing power to run the same simulation 1,000 times, you'd get a much more accurate estimate of that likelihood. Just as an example.
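That's basically Monte Carlo estimation: more runs means a tighter estimate. Here's a toy sketch in Python (the "simulation", its 10 variables, and the event threshold are all made up just for illustration):

```python
import random

def run_simulation(rng):
    # Stand-in for a real simulation with 10 random input variables.
    # The "event" here is their sum exceeding an arbitrary threshold.
    return sum(rng.random() for _ in range(10)) > 6.0

def estimate_probability(trials, seed=0):
    # Run the simulation `trials` times and count how often the event occurs.
    rng = random.Random(seed)
    hits = sum(run_simulation(rng) for _ in range(trials))
    return hits / trials

print(estimate_probability(100))     # rough estimate from 100 runs
print(estimate_probability(10_000))  # tighter estimate from 10,000 runs
```

The error of the estimate shrinks roughly like 1/sqrt(trials), so 100x the compute buys you about 10x the precision, which is why more processing power directly translates into better answers here.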