The Next Big Feature For Intel CPUs is Cognitive Recognition
As transistor counts grow, the CPU will do more and understand what you want and need.
In an interesting interview with Intel's Paul Otellini over at PC World, which does not unveil any big secrets, the executive lays out Intel's view of ARM and its idea of the future of computing, as well as thoughts on what could be done with the rapidly increasing transistor count of processors.
Of course, as CPU dies shrink, we have come to expect lower power consumption and greater overall performance, resulting in better efficiency. That has been the case since the departure from Netburst in 2006, but we have also seen GPU features integrated on the die, and there is even more transistor budget in the future to add new capabilities.
The feature Otellini stressed was cognitive recognition: the capability of the CPU to figure out - in the context of all available data and your history of using the computer - what your current needs and wants may be, before you do. It is not a new idea: we first heard about it at IDF Fall 2005, where Intel explained the concept of user-aware computing in its future technology keynote. Back then, the technology was forecast to become available within five to ten years, and it is worth noting that the topic is coming up again, even if under a different name.
The executive also said he does not believe the smartphone or tablet can be the centerpiece of computing, and that PCs will remain alive and continue to evolve.

OPEN THE BAY DOORS, HAAAL!!!!
I'm sorry, master_chen. I'm afraid I can't do that.
I don't see the point.
Processors and chips have instruction sets and firmware, which are basically a kind of software too (just built in).
While I hate to admit it, I think he's wrong here. Far too many people have allowed the smartphone to become the centerpiece of their computing and their lives. It won't ever be that way for me, and people will still have PCs and such, but they are living off their phones today. That will only get worse.
Correction..
Why would you build this function into silicon? This is the kind of feature that should be implemented in HUMANS. I don't see the point.
I hope I'll never need a machine to tell me what to do next..
What I think he meant is that most of the more advanced computing capabilities will not be processed on smartphones. Even if smartphones become the "centerpiece of computing" for most people and PCs eventually die, most of the number-crunching will be done on servers and transferred to your device of choice. This will become more apparent as internet bandwidth and speed grow, but you can already see some services offering advanced capabilities, such as Dropbox and OnLive.
...now hit "submit" to continue or F1 if you need help :-)
But your keyboard is not recognized by the computer. :-)
my thoughts exactly
Skynet, Silicon Messiah, The Matrix and I, Robot are a few things that come to mind when discussing a chip recognizing anything beyond some numbers I throw at it. I certainly don't need my computer deciding it's time to play solitaire just because my mom usually plays a game of solitaire every morning on my PC.
In the end, I think this guy Paul is just bored and shooting crap out his rear end, because humans are extremely unpredictable from a computer standpoint...
Intel: Taking credit for software developer's work. Well done.
"I don't see the point."
Placing this function into the CPU makes it (the function) OS neutral.
Not really. In order to make any predictions about what the user wants to do next, the CPU would need to receive information from input devices and other resources managed by the OS, and it would also need ties into the OS to act on its predictions. Otherwise it would have no clue whether it is processing a keyboard press, a mouse movement or click, an on-screen button, etc., nor any clue about how to act on any of those.
Considering how complex the data-hoarding/mining algorithms may be, and how intimately the prediction algorithms may need to interact with software, this sort of feature is fundamentally impossible to implement purely in hardware. At best, Intel may introduce new instructions and multi-threaded or GPGPU libraries to accelerate those kinds of algorithms at the OS and application levels.
IMO what is meant by "cognitive" is that the computer will be better able to interpret ambiguous input or commands from the user, not tell the user what to do. Sorta like anticipating the user's needs, instructions, wants, etc. This could prove useful in the near future when voice/command recognition is more widely used.
The idea is that you could tell the computer to check a document for errors and send it to someone. The computer would recognize that you mean spell- and grammar-check it, export it to PDF or some other document type the recipient can accept (cognitive understanding can be applied to 'users' other than the host user), and then send the document the way the receiver would prefer (email, file transfer, FTP, etc.). It means high-level control of a machine instead of low-level control; it does not mean the computer is sentient, makes 'moral' decisions, or makes any decisions outside the realm of the task at hand.
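The "check a document for errors and send it" scenario above can be sketched as a tiny intent-to-steps planner. Everything here is invented for illustration: the step names, the recipient-preference table, and the keyword matching are hypothetical, not any real Intel or OS API.

```python
# Hypothetical table of how each recipient prefers to receive documents.
RECIPIENT_PREFS = {
    "alice": {"format": "pdf", "channel": "email"},
}

def plan(request: str, recipient: str) -> list[str]:
    """Translate a vague high-level request into a concrete list of steps."""
    steps = []
    # Naive keyword matching stands in for real language understanding.
    if "check" in request and "errors" in request:
        steps += ["spell_check", "grammar_check"]
    # Fall back to defaults if we know nothing about the recipient.
    prefs = RECIPIENT_PREFS.get(recipient, {"format": "docx", "channel": "email"})
    steps.append(f"export_as_{prefs['format']}")
    steps.append(f"send_via_{prefs['channel']}")
    return steps

print(plan("check this document for errors and send it", "alice"))
# ['spell_check', 'grammar_check', 'export_as_pdf', 'send_via_email']
```

The point of the sketch is that the "cognition" lives in the mapping from vague request to concrete steps, which is exactly the kind of logic that sits naturally in software rather than silicon.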
It also means the computer can be more self-aware. If you are doing a specific activity, the computer will be better able to cache resources for that activity and predict your intent. It does not act until the command is given, but it would be a better end-user experience for your computer to behave one way during the work day and another way during hours you would normally have as recreation. You could still do activities the computer is not expecting, but at least it would guess your needs correctly more of the time than not guessing at all.
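A minimal sketch of the usage-pattern prediction described above, counting which application is launched in each hour of the day and guessing the most likely one. The class, its methods, and the sample data are all made up for illustration; a real implementation would draw on far richer signals.

```python
from collections import Counter, defaultdict

class UsagePredictor:
    """Toy predictor: tallies app launches per hour of day and
    guesses the most frequent app for a given hour."""

    def __init__(self):
        self.by_hour = defaultdict(Counter)  # hour -> Counter of app launches

    def record(self, hour: int, app: str) -> None:
        self.by_hour[hour][app] += 1

    def predict(self, hour: int):
        counts = self.by_hour.get(hour)
        if not counts:
            return None  # no history for this hour: guess nothing
        return counts.most_common(1)[0][0]

predictor = UsagePredictor()
for _ in range(5):
    predictor.record(9, "solitaire")   # the morning solitaire habit
predictor.record(9, "email")
predictor.record(14, "spreadsheet")

print(predictor.predict(9))   # solitaire
print(predictor.predict(22))  # None
```

Note that the predictor only suggests; nothing launches until the user actually issues a command, which matches the "it does not act until the command is given" caveat above.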