The title basically says what I'm asking; will processors eventually stop getting better?
If you think about it, the main reason processors get faster is that they fit more transistors in the same area. But now we're down to 22nm process nodes, and at that scale quantum mechanics starts to cause problems: electrons can tunnel between components that are too close together. Sure, we can make the transistors smaller... but then they have to be spaced farther apart. And once we get to a one-atom transistor, how do we make it faster?
Basically, once we hit the point where transistors are one atom... is there any way to make the CPU faster? Or will that be the end of Moore's law?
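Just as a rough back-of-envelope on how much runway is left (assuming the historical ~0.7x linear shrink per process node, a ~2-year cadence, and ~0.2nm as the approximate diameter of a silicon atom; all of those are ballpark numbers, not predictions):

```python
# Back-of-envelope: how many ~0.7x process shrinks fit between a 22nm
# node and atomic scale (~0.2nm silicon atom)? Purely illustrative.
feature_nm = 22.0
atom_nm = 0.2          # approximate silicon atom diameter (assumption)
shrink = 0.7           # rough historical per-node linear shrink
years_per_node = 2     # rough Moore's-law cadence

nodes = 0
while feature_nm * shrink > atom_nm:
    feature_nm *= shrink
    nodes += 1

print(f"~{nodes} more shrinks (~{nodes * years_per_node} years) to atomic scale")
# -> ~13 more shrinks (~26 years) to atomic scale
```

So even on the most naive extrapolation, the wall is a couple of decades out, not next year.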
There are still a lot of things you can do with a CPU. Tri-gate transistors continue to mature, which raises efficiency. Power consumption should keep going down, which could get us to 5nm by around 2020. Quantum computing may become practical within 10 years, and eventually we might see computers that work more like the brain.
Even if we can't go smaller, we can go more advanced: better systems, better architecture.
Theoretically, if we just had the design, a 60nm CPU could best a 32nm Sandy Bridge. If the design was THAT DARN GOOD. It's all about the implementation. We can always decrease latencies and make smarter CPUs, regardless.
Aside from that, there is optical computing: with light we can send multiple streams of data both ways, whereas with electrons we can only signal one way (on/off). The smallest possible particles making binary calculations across the shortest distances. And then quantum, and then brain-like, and then... why not? SINGULARITY COMPUTERS. Sentient singularity computers.
Not quite sure you can go smaller than that.
..although that's science fiction haha!
There is always a chance a group will come up with a better way of computing in general, one that uses a more efficient/complex encoding than binary, etc.
AMD has 16-core processors; 24 and 32 can't be far off. We just need better multi-threaded software and more CPUs/cores to continue as is. I mean really, as far as single-core performance goes, little progress has been made in the past few years.
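That's basically Amdahl's law: the serial fraction of a program caps the multi-core speedup no matter how many cores you throw at it. A quick sketch, using a hypothetical workload that is 90% parallelizable (the 0.9 figure is just an example, not a measurement):

```python
# Amdahl's law: speedup = 1 / ((1 - p) + p / n)
# where p is the parallel fraction and n is the core count.
def amdahl_speedup(p, n):
    return 1.0 / ((1.0 - p) + p / n)

# Hypothetical workload that is 90% parallelizable:
for cores in (16, 24, 32):
    print(f"{cores} cores -> {amdahl_speedup(0.9, cores):.1f}x speedup")
# 16 cores -> 6.4x speedup
# 24 cores -> 7.3x speedup
# 32 cores -> 7.8x speedup
```

Even with infinite cores that workload tops out at 1/0.1 = 10x, which is why better multi-threaded software (a higher parallel fraction) matters just as much as adding cores.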
Also, advancements are being made in things other than transistors. Eventually some different tech will take off.
I think the question is will demand for computing power continue to increase. It's already leveled off to the point where most people are happy with the performance of a phone or tablet. It's just a few of us geeks and power users that really care.
It's really a weird question. Every time you think they can't make a toilet better, someone finds a way. You ask about the processor of the future, and then try to limit the answer by measuring it by traditional processor rules. There may come a time when silicon-based processors are outdated and Moore's law becomes obsolete. Does a tri-gate transistor count as three when measuring for Moore's law?