
The Next Big Feature For Intel CPUs is Cognitive Recognition

Source: PC World | 34 comments

As transistor counts grow, the CPU will do more and understand what you want and need.

In an interesting interview with Intel's Paul Otellini over at PC World, which does not unveil any big secrets, the executive lays out Intel's view of ARM and its idea of the future of computing, as well as his thoughts on what could be done with the rapidly increasing transistor count of processors.

Of course, as CPU dies shrink, we have come to expect lower power consumption and greater overall performance, which together add up to better efficiency. That has been the case since the departure from NetBurst in 2006, but we have also seen GPU features integrated on the die, and there is even more transistor budget in the future to add new capabilities.

The feature Otellini stressed was cognitive recognition, a capability for the CPU to figure out - in the context of all available data and your history of using the computer - what your current needs and wants may be, before you do. It is not a new idea: we first heard about it at IDF Fall 2005, where Intel explained the concept of user-aware computing in its future technology keynote. Back then, the technology was forecast to become available within five to ten years, and it is worth noting that the topic is coming up again, even if under a different name.

The executive also mentioned that he does not believe the smartphone or tablet can be the centerpiece of computing, and that PCs will remain alive and continue to evolve.



Discuss
Comments
  • 22 Hide
    master_chen , September 16, 2012 1:06 PM
    Quote:
    The CPU will understand what you want and need.

    OPEN THE BAY DOORS, HAAAL!!!!
  • 16 Hide
    Prescott_666 , September 16, 2012 1:13 PM
    Why would you build this function into silicon? This is the kind of feature that should be implemented in software.

    I don't see the point.
  • 13 Hide
    master_chen , September 16, 2012 1:19 PM
    Quote:
    Why would you build this function into silicon? This is the kind of feature that should be implemented in software.

    I don't see the point.


    Processors and chips have instruction sets and firmware, which are basically a kind of software too (just built-in).
  • -8 Hide
    beayn , September 16, 2012 1:28 PM
    "The executive also mentioned that he does not believe that the smart phone or tablet can be the centerpiece of computing and that PCs will remain alive and continue to evolve. "

    While I hate to admit it, I think he's wrong here. Far too many people have allowed the smartphone to be their centerpiece of computing and their lives. It won't ever be that way for me, and people will still have PCs and such, but they are living off their phones today. That will only get worse.
  • 10 Hide
    pat , September 16, 2012 1:33 PM
    Prescott_666: Why would you build this function into silicon? This is the kind of feature that should be implemented in software. I don't see the point.


    Correction..

    Why would you build this function into silicon? This is the kind of feature that should be implemented in HUMAN. I don't see the point.

    I hope I'll never need a machine to tell me what to do next..

  • 13 Hide
    nightbird321 , September 16, 2012 1:54 PM
    I see this more as a power saver: the CPU can tell what processing capabilities the current user tasks need and power down everything else. No more music streaming keeping all the cores and caches powered up.
  • 1 Hide
    doron , September 16, 2012 2:08 PM
    beayn"The executive also mentioned that he does not believe that the smart phone or tablet can be the centerpiece of computing and that PCs will remain alive and continue to evolve. "While I hate to admit it, I think he's wrong here. Far too many people have allowed the smartphone to be their centerpiece of computing and their lives. It won't ever be that way for me, and people will still have PCs and such, but they are living off their phones today. That will only get worse.


    What I think he meant is that most of the more advanced computing capabilities will not be processed on smartphones. Even if smartphones become the "centerpiece of computing" for most people and PCs eventually die, most of the number-crunching will be done on servers and transferred to your device of choice. This will become more apparent as internet bandwidth and speed grow, but you can already see some services offering advanced capabilities, such as Dropbox and OnLive.
  • 17 Hide
    freggo , September 16, 2012 2:12 PM
    master_chen: OPEN THE BAY DOORS, HAAAL!!!!


    I'm sorry, master_chen. I'm afraid I can't do that.
  • 1 Hide
    freggo , September 16, 2012 2:14 PM
    pat: Correction.. Why would you build this function into silicon? This is the kind of feature that should be implemented in HUMAN. I don't see the point. I hope I'll never need a machine to tell me what to do next..


    ...now hit "submit" to continue or F1 if you need help :-)

  • -3 Hide
    A Bad Day , September 16, 2012 2:19 PM
    freggo: ...now hit "submit" to continue or F1 if you need help :-)


    But your keyboard is not recognized by the computer. :-)
  • 5 Hide
    demonhorde665 , September 16, 2012 3:12 PM
    pat: Correction.. Why would you build this function into silicon? This is the kind of feature that should be implemented in HUMAN. I don't see the point. I hope I'll never need a machine to tell me what to do next..

    my thoughts exactly

    Skynet, Silicon Messiah, The Matrix, and I, Robot are a few things that come to mind when discussing a chip recognizing anything past some numbers I throw at it. I certainly don't need my computer deciding it's time to play solitaire just because my mom usually plays a game of solitaire every morning on my PC.
  • 0 Hide
    bigdog44 , September 16, 2012 3:23 PM
    I agree with 'pat' above. We operate on at least two different levels, and this would make it difficult for most people to operate on more than one level, even when it is necessary to do otherwise. With regard to how it would be implemented: if this were an offshoot of memristors, then I can see it possibly being done completely in hardware, mimicking how the human brain habituates.
  • 5 Hide
    memadmax , September 16, 2012 3:28 PM
    It shouldn't be "what the user wants next"; it should be "what the software wants next." The CPU doesn't know, and will never know, what the user will want next, because it is very, very loosely coupled to the user. Instead, software prediction (branch prediction, on-the-fly loop optimization, etc.) done better is a better solution, and that has been worked on for several years now...

    In the end, I think this guy Paul is just bored and shooting crap out his rear end, because humans are extremely unpredictable from a computer standpoint...
  • 3 Hide
    Anonymous , September 16, 2012 3:30 PM
    There will never be enough transistors to implement something like this at the silicon level. This would be implemented at the OS level at the lowest, but likely at the application level. The most Intel could possibly do is add more cores to support doing this without slowing down the rest of the machine. They could also profile the workload and create new SSE/AVX instructions, but the workload won't be much different than existing database workloads, so there's not much to be done there.

    Intel: Taking credit for software developer's work. Well done.
  • 1 Hide
    x3nophobe , September 16, 2012 3:42 PM
    "Why would you build this function into silicon? This is the kind of feature that should be implemented in software.

    I don't see the point."

    Placing this function into the CPU makes it (the function) OS neutral.
  • 4 Hide
    InvalidError , September 16, 2012 4:16 PM
    x3nophobe: Placing this function into the CPU makes it (the function) OS neutral.

    Not really. In order to make any predictions about what the user wants to do next, the CPU would need to receive information from input devices and other resources managed by the OS, and it would also need ties into the OS to act upon its predictions; otherwise it would have absolutely no clue whether it is processing a keyboard press, a mouse movement or click, an on-screen button, etc., nor any clue about how to act on any of those.

    Considering how complex the data-hoarding/mining algorithms may be and how intimately the prediction algorithms may need to interact with software, this sort of feature is fundamentally impossible to implement in hardware. At best, they may implement new instructions, along with multi-threaded and GPGPU libraries, to optimize those types of algorithms at the OS and application levels.
  • 0 Hide
    fazers_on_stun , September 16, 2012 4:33 PM
    Quote:
    my thoughts exactly

    Skynet, Silicon Messiah, The Matrix, and I, Robot are a few things that come to mind when discussing a chip recognizing anything past some numbers I throw at it. I certainly don't need my computer deciding it's time to play solitaire just because my mom usually plays a game of solitaire every morning on my PC.


    IMO what is meant by "cognitive" is that the computer will be better able to interpret ambiguous input or commands from the user, not tell the user what to do. Sorta like anticipating the user's needs, instructions, wants, etc. This could prove useful in the near future when voice/command recognition is more widely used.
  • 1 Hide
    CaedenV , September 16, 2012 4:53 PM
    I think a lot of people are missing the point. The idea is not that the computer DOES what you want ahead of time; the idea is that the computer becomes more aware of itself and the user's habits, so that it can set aside resources appropriately and better understand the commands from the user (whatever the input). This is not so important for dedicated input systems such as touch, keys, and mouse, where there is a very rigid meaning behind each command, but it is absolutely necessary for better speech control and 'combined commands'.

    The idea is that you could tell the computer to check a document for errors and send it to someone. The computer would be able to recognize that you mean a spell and grammar check, export to PDF or some other document type that the other user can accept (because cognitive understanding can be implemented for other 'users' that are not the host user), and then send that document in the way the receiver would prefer (email, file transfer, FTP, etc.). It means high-level control of a machine instead of low-level control; it does not mean that the computer is sentient, or makes 'moral' decisions, or any decisions outside the realm of the task at hand.

    It also means that the computer can be more self-aware. If you are doing a specific activity, the computer will be better able to cache resources for said activity and predict the user's intent. It does not act until the command is given, but it would be a better end-user experience for your computer to act one way during the work day and another way during hours you would normally have as recreation. You could still do activities the computer is not expecting, but at least it would be more effective at guessing your needs more of the time than not guessing at all.
  • 1 Hide
    Anonymous , September 16, 2012 6:05 PM
    Yes, like when the dates/versions of the laptop OEM's customized Intel HD graphics drivers are woefully out of date: initiate an Intel OEM warning to the laptop's OEM - please update your customized Intel HD graphics drivers, or face losing certification for your product!
  • 2 Hide
    kanoobie , September 16, 2012 6:05 PM
    "My CPU is a neural net processor; a learning computer."