Intel confirms Microsoft's Copilot AI will soon run locally on PCs, next-gen AI PCs require 40 TOPS of NPU performance

(Image credit: Intel)

We've previously reported on industry rumors that Microsoft's Copilot AI service will soon run locally on PCs instead of in the cloud and that Microsoft would impose a requirement for 40 TOPS of performance on the Neural Processing Unit (NPU), but we had been unable to get an on-the-record verification of those rumors. That changed today at Intel's AI Summit in Taipei, where Intel executives, in a question-and-answer session with Tom's Hardware, said that Copilot elements will soon run locally on PCs. Company representatives also mentioned a 40 TOPS requirement for NPUs on next-gen AI PCs.

Microsoft has been largely silent about its plans for AI PCs and even allowed Intel to officially announce Microsoft's new definition of an AI PC. Microsoft’s and Intel’s new co-developed definition states that an AI PC will have an NPU, CPU, GPU, Microsoft’s Copilot, and a physical Copilot key directly on the keyboard. 

PCs meeting those requirements are already shipping, but that is just the first wave of the AI PC initiative. Intel divulged future AI PC requirements in response to my questions about potential memory criteria. 

"But to your point, there's going to be a continuum or an evolution, where then we're going to go to the next-gen AI PC with a 40 TOPS requirement in the NPU," said Todd Lewellen, the Vice President of Intel's Client Computing Group. "We have our next-gen product that's coming that will be in that category." 

"[...] And as we go to that next gen, it's just going to enable us to run more things locally, just like they will run Copilot with more elements of Copilot running locally on the client. That may not mean that everything in Copilot is running local, but you'll get a lot of key capabilities that will show up running on the NPU."

Currently, Copilot computation occurs in the cloud, but executing the workload locally will provide latency, performance, and privacy benefits. Notably, Intel's shipping Meteor Lake silicon offers up to 10 TOPS from its NPU, while AMD's competing Ryzen 'Hawk Point' platform has a 16 TOPS NPU; both fall shy of the 40 TOPS requirement. Qualcomm's oft-delayed X Elite chips, with 45 TOPS of NPU performance, will arrive in the market later this year.
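For a sense of where these headline TOPS figures come from, peak NPU throughput is typically quoted as MAC units × 2 operations per MAC (multiply plus accumulate) × clock frequency. The sketch below illustrates that arithmetic against the 40 TOPS threshold; the unit counts and clocks are hypothetical examples, not Intel, AMD, or Qualcomm specifications.

```python
def peak_tops(mac_units: int, clock_ghz: float, ops_per_mac: int = 2) -> float:
    """Peak throughput in TOPS: each MAC counts as two ops (multiply + add)."""
    return mac_units * ops_per_mac * clock_ghz * 1e9 / 1e12

NEXT_GEN_THRESHOLD = 40.0  # reported next-gen AI PC requirement, in TOPS

# Hypothetical NPU configurations (illustrative only, not vendor specs):
for name, macs, ghz in [("npu_a", 4096, 1.4), ("npu_b", 16384, 1.4)]:
    tops = peak_tops(macs, ghz)
    print(f"{name}: {tops:.1f} TOPS, meets 40 TOPS requirement: {tops >= NEXT_GEN_THRESHOLD}")
```

Note that these are peak figures; sustained throughput depends on precision (INT8 vs. FP16), memory bandwidth, and utilization, which is why quoted TOPS numbers are best treated as marketing ceilings rather than delivered performance.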

Lewellen explained that Microsoft is focused on the customer experience with the new platforms. As such, Microsoft insists that Copilot runs on the NPU instead of the GPU to minimize the impact on battery life. 

"We had a lot of discussions over the course of the last year [with Microsoft], and we asked, 'Why can't we just run it on the GPU?' They said they want to make sure that the GPU and the CPU are freed up to do all this other work. But also, we want to make sure it's a great battery life experience. If we started running Copilot and some of those workloads on the GPU, suddenly you're going to see a huge hit on the battery life side," Lewellen explained. 

While AMD holds a slight lead in NPU TOPS performance, and Qualcomm claims a much bigger advantage in its not-yet-shipped chips, Intel says its roadmap includes next-gen processors to address every segment of the AI market.

"We have our product roadmap and plan on where we're at in mobile with premium and mainstream, but then you also go down into entry. And so we have plans around entry. From a desktop standpoint, we have plans on the desktop side, what we would say [is an] AI PC. And then there's also the next-gen AI PC, the 40 TOPS requirements; we have all of our different steps in our roadmap on how we cover all the different segments."

Intel's Lunar Lake processors will come to market later this year with three times more AI performance on both the GPU and the NPU than the existing Meteor Lake chips. Intel is already sampling those chips to its partners in preparation for a launch later in the year. Those chips will face off with Qualcomm's X Elite and AMD's next-gen processors. 

In the meantime, Intel is working to expand the number of AI features available on its silicon. As we covered in depth earlier today, the company plans to support 300 new AI-enabled features on its Meteor Lake processors this year. 

Many of those features will be optimized specifically for Intel's silicon. The company told me that roughly 65% of the developers it engages with use Intel's OpenVINO framework, which means those applications are optimized specifically for Intel's NPU. The remaining developers use a 'mix of splits' between ONNX, DirectML, and WebNN, and Intel says it is happy to work with developers using any framework. 

However, the work with OpenVINO could give Intel a nice runway of Intel-specific AI features as it heads into the Lunar Lake generation. Those are the types of advantages the company is clearly looking to enable through its AI PC Accelerator Program. The company says it has seen plenty of enthusiasm from the developer community, particularly in light of Intel's stated goal of selling 100 million AI PCs by 2025, which represents a big market opportunity for new AI software.

However, Microsoft's Copilot will run on NPUs from all vendors through DirectML, and having more TOPS will undoubtedly result in higher performance. That means we can expect a TOPS war to unfold, both in silicon and in marketing, over the coming years. 

Update 3/27/2024 6:50am PT: corrected Intel Meteor Lake and Ryzen Hawk Point NPU TOPS specifications. 

Paul Alcorn
Managing Editor: News and Emerging Tech

Paul Alcorn is the Managing Editor: News and Emerging Tech for Tom's Hardware US. He also writes news and reviews on CPUs, storage, and enterprise hardware.

  • hotaru251
    tbh I am not looking forward to "ai" being requirement for future pc's. (let alone its highly likely just going to make data farming for them easier and a lot harder for users to disable)

    They should of been optional features via addon cards if anything. Let those who want them have them but not push em onto everyone.
    Reply
  • ezst036
    hotaru251 said:
    tbh I am not looking forward to "ai" being requirement for future pc's. (let alone its highly likely just going to make data farming for them easier and a lot harder for users to disable)

    They should of been optional features via addon cards if anything. Let those who want them have them but not push em onto everyone.
    This is likely running into a "TPM 3.0" moment on the one side, as Microsoft has its long history of bad faith practice because their customers let them get away with it.

    On the other side, it's a bad lesson that Microsoft has learned from Google that is urging them to go even further and include data mining and adware elements right into the OS itself. We are only at the beginning here. Over 70% of users (Android's phone market share) are just fine with all the spyware and data telemetry that's in Android.

    So if Google does it and their customers stay, why can't Microsoft put spyware and the equivalent data telemetry into Windows? Of course they can. They know they too have a customer base thick with apathy. Might as well cash in on it. They aren't going anywhere.

    And when Microsoft does switch Windows to a full time subscription model? Their users will howl to the moon about it, but:

    They will not move. Past history is the future predictor.

    $$$They $ will $ pay.$$$ And they'll enjoy paying the subscription. And they'll make excuses as to why now it's better that they pay monthly.
    Reply
  • josmat
    I'm already planning my migration to Linux... Definitely not interested in having AI in my computer, since I'm perfectly satisfied with my own intelligence to do the things I want to do with my computer.
    Reply
  • peachpuff
    josmat said:
    I'm already planning my migration to Linux... Definitely not interested in having AI in my computer, since I'm perfectly satisfied with my own intelligence to do the things I want to do with my computer.
    Linux desktop finally taking over in 2024...
    Reply
  • BX4096
    josmat said:
    I'm already planning my migration to Linux... Definitely not interested in having AI in my computer, since I'm perfectly satisfied with my own intelligence to do the things I want to do with my computer.
    Not really AI and has very little to do with intelligence. Since it's going to run locally, it's best to think of it as advanced automation and assistance software. I'm a power user and can see plenty of uses for it...as long as I don't have to send any of my queries to Microsoft.
    Reply
  • Math Geek
    josmat said:
    I'm already planning my migration to Linux... Definitely not interested in having AI in my computer, since I'm perfectly satisfied with my own intelligence to do the things I want to do with my computer.

    already done it. win 10 being based on data mining was the end for me. a nice secure sandboxed locked down vm lets me play the couple games on windows i want to play and everything else is easy to do in linux. a world without MS and google on my pc is a good one.
    Reply
  • Giroro
    Clippy ran locally, everybody hated it.
    Cortana ran locally, everybody hated it.

    .... Microsoft really needs to get rid of whoever has been pushing this same stupid idea for the last 30 years.

    Also, doesn't it take, like, "all the RAM you have" to run a LLM locally?
    Reply
  • dimar
    I think copilot key is stupid.
    Reply
  • jasonf2
    The magic of local integration and processing is more about cost of operation than anything else. As long as cloud services are necessary for Copilot to operate, Microsoft isn't just making license revenue but having to stand the dime for the cloud resources (servers, electricity and infrastructure), which is substantial. Once the PC itself localizes the software, the end user is then paying for the hardware and electricity and Microsoft is back in the infinitely scalable software licensing model. The necessary cloud infrastructure for OpenAI's current models to work today still makes profitability questionable. When they can pass the cost of operation back to the end user, Copilot becomes a money making machine via licensing. That is if anyone is really paying for it. As mentioned by others, uptake of Microsoft's prior virtual assistants has been mediocre at best. Corporate uptake, which is where the real licensing money for business software is, will be interesting with the potential privacy issues of a possible rogue AI having access to all of your corporate data and the ability to actually do stuff with it. Current models don't exactly have a stellar track record with "hallucinations" of grandeur and world domination. With that much being said as well, I don't really know how great I feel about full OS integration on my home computer.
    Reply
  • DavidLejdar
    In regard to mentioned concerns about data farming, I don't think that Microsoft will necessarily push it much. Companies like Alphabet or Meta, they seem to rely a lot on B2C (business to customer) revenue. Meanwhile, Microsoft had e.g. in 2022 only 12% from Windows, 8% from gaming, and 6% from search advertising. While 23% of revenue came from office products and services, and 34% came from server products and cloud services.

    In other words, Microsoft is very busy with B2B. Like, servers of a number of companies, they run on Windows, Office is heavily used in many an office, with licensing for it, and so on. And for Microsoft, there is way more to be gained by being like: "And how may we help you further?", than by being like: "We want all that data, even though at least in Europe, we are not really allowed to have it if there is not a specific consent to use of that data." - and at least here in Europe, many a company lawyer would not likely agree to any licensing agreement, which would make the company liable if they were to give Microsoft i.e. personal data of the company's customer (without consent thereto by these customers).

    And in that context, when a lot of the "new Office suite" can be run within a local network, or even on a laptop itself, then that actually helps to limit the data spread, when the user is going to use some of the new features anyhow.

    (Disclaimer: The company I work at, it has some collaboration with Microsoft going. So, that may make me biased, perhaps. But, just meant to point out, that Microsoft has a bit more going on, than to be interested in some: "We are going to sell an analysis of your most watched cat pictures to an advertiser, who then will have the ads AI-tailored to sell you whatever with a pic of cat.")
    Reply