Firstly we need to drop x86, and then we need to drop the transistor.
Technological advancement for CPUs comes down to a few factors:
1) Money
2) Competition (weak competition stifles innovation)
3) Demand for performance
4) Power consumption
5) Physical size limitations
If we ignore the first three, since they're obvious, the latter two are what have driven us to where we are today. If vacuum tubes hadn't used so much power and been so large, they could still be useful for computation. But demand for performance kept growing, and making computers the size of cities, slower than an Atom, and more power-hungry than the whole TOP500 combined wasn't practical, so there had to be a technological shift. Hence the discrete transistor. But you can't run Crysis on a machine built from billions of discrete transistors either; power consumption and physical size rule it out. So the IC was born.
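To put rough numbers on that (every figure here is a ballpark assumption for illustration, not a measurement): suppose a CPU holds on the order of a billion transistors, matching the "billions" above, that a discrete transistor takes up about 25 mm² of board space, and that a vacuum tube burns roughly a watt just heating its filament.

```python
# Back-of-envelope sketch: why discrete parts can't replace an IC.
# Every figure is a rough order-of-magnitude assumption, not a measurement.

TRANSISTOR_COUNT = 1e9         # transistors in a modern CPU (order of magnitude)
DISCRETE_FOOTPRINT_MM2 = 25    # board area per discrete transistor (~5 mm x 5 mm)
TUBE_HEATER_POWER_W = 1.0      # filament power per vacuum tube (conservative)

board_area_m2 = TRANSISTOR_COUNT * DISCRETE_FOOTPRINT_MM2 / 1e6  # mm^2 -> m^2
print(f"Discrete-transistor build: {board_area_m2:,.0f} m^2 of circuit board")

heater_power_gw = TRANSISTOR_COUNT * TUBE_HEATER_POWER_W / 1e9   # W -> GW
print(f"Vacuum-tube build: {heater_power_gw:,.1f} GW just for the heaters")
```

Even with generous assumptions you get tens of thousands of square metres of circuit board, or around a gigawatt of filament power before any computing happens, which is the whole case for integration in two lines of arithmetic.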
But since then we've only had relatively minor innovations. We've gone from microns to nanometres, but we're still running our computers on transistors, just as we have for 40 years. At some point we're going to need another technology shift, and it can't come soon enough.
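To give "microns to nanometres" a number (node figures are approximate, marketing-driven labels, so treat this as scale rather than precision): the Intel 4004 was built on a roughly 10-micron process in 1971, while recent leading-edge parts sit in the tens-of-nanometre range.

```python
# Rough sense of how far transistor scaling has been stretched.
# Node figures are approximate marketing labels, used only for scale.

old_node_nm = 10_000   # ~10 micron process, Intel 4004 era (1971)
new_node_nm = 32       # a recent 32 nm-class node (assumed for illustration)

linear_shrink = old_node_nm / new_node_nm
density_gain = linear_shrink ** 2   # density scales roughly with shrink squared

print(f"Linear shrink: ~{linear_shrink:,.0f}x")
print(f"Approximate density gain: ~{density_gain:,.0f}x")
```

A shrink of a few hundred times linearly, and on the order of a hundred thousand times in density, all from refining the same underlying device rather than replacing it.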