As transistor counts grow, the CPU will do more and better understand what you want and need.
In an interesting interview with Intel's Paul Otellini over at PC World, which does not unveil any big secrets, the executive lays out Intel's view of ARM and its idea of the future of computing, as well as thoughts on what could be done with the rapidly increasing transistor count of processors.
Of course, as CPU dies shrink, we have come to expect lower power consumption and greater overall performance to result in better efficiency. That has been the case since the departure from Netburst in 2006, but we have also seen GPU features integrated on the die, and there will be even more transistor budget in the future to add new capabilities.
The feature Otellini stressed was cognitive recognition, a capability that lets the CPU figure out - in the context of all available data and your history of using the computer - what your current needs and wants may be, before you do. It is not a new idea: we first heard about it at IDF Fall in 2005, where Intel explained the concept of user-aware computing in its future technology keynote. Back then, the technology was forecast to become available within five to ten years, and it is worth noting that the topic is coming up again, even if under a different name.
The executive also mentioned that he does not believe the smartphone or tablet can be the centerpiece of computing, and that PCs will remain alive and continue to evolve.