Microsoft Snapdragon X Copilot+ PCs get local DeepSeek-R1 support — Intel, AMD in the works
Are we finally going to find use for those AI PCs?

Microsoft just announced that it will release NPU-optimized versions of DeepSeek-R1, allowing the model to take advantage of the AI-optimized hardware found in Copilot+ PCs. According to the Windows Blog, the feature will arrive first on Qualcomm Snapdragon X PCs, followed by Intel Core Ultra 200V (Lunar Lake) and other chips. The initial release features DeepSeek-R1-Distill-Qwen-1.5B, which an AI research team at UC Berkeley found to be the smallest model that delivers correct answers; larger variants with 7 billion and 14 billion parameters will arrive shortly thereafter.
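To get a rough sense of why the 1.5B distill is the one shipping first on NPUs, here is a back-of-the-envelope estimate (our own illustration, not a figure from Microsoft) of how much memory the weights alone would need for each distilled size at common quantization widths:

```python
# Rough weight-memory estimate for the DeepSeek-R1 distilled sizes the
# article mentions. Parameter counts are nominal; real runtime usage also
# includes activations and the KV cache, which this sketch ignores.
SIZES = {"1.5B": 1.5e9, "7B": 7e9, "14B": 14e9}

def weight_gib(params: float, bits_per_weight: int) -> float:
    """Memory for the weights alone, in GiB, at a given quantization width."""
    return params * bits_per_weight / 8 / 2**30

for name, params in SIZES.items():
    fp16 = weight_gib(params, 16)  # full 16-bit weights
    int4 = weight_gib(params, 4)   # aggressive 4-bit quantization
    print(f"{name}: ~{fp16:.1f} GiB at FP16, ~{int4:.1f} GiB at 4-bit")
```

Even heavily quantized, the 14B model wants several gigabytes for weights alone, while the 1.5B distill fits comfortably in the memory budget of a thin-and-light laptop's NPU pipeline.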
DeepSeek’s optimizations reportedly let it use 11x less compute than its Western competitors, making it a great model to run on consumer devices. Microsoft’s release also ties into Windows Copilot Runtime, so developers can call on-device DeepSeek APIs from within their apps.
Furthermore, Microsoft claims that this NPU-optimized version of DeepSeek will deliver “very competitive time to first token and throughput rates, while minimally impacting battery life and consumption of PC resources.” In other words, Copilot+ PC users can expect power and performance comparable to competing models like Meta’s Llama 3 and OpenAI’s o1 without sacrificing battery life.
That said, DeepSeek’s availability on Copilot+ PCs is geared more toward programmers and developers than consumers. Perhaps Microsoft is using it to encourage developers to build more apps that take advantage of AI PCs; many people still don’t see the need for them, and market research suggests users buy these devices mainly because they’re often the only option available nowadays.
Another thing that got us curious is Microsoft’s preferential treatment of Qualcomm Snapdragon X PCs at this time. While it launched the Copilot+ branding with these chips last July, the latest mainstream Intel and AMD laptops now also have built-in NPUs. AMD has even released instructions on how users can run DeepSeek on Ryzen AI CPUs and Radeon GPUs, with the company claiming that the Radeon RX 7900 XTX runs DeepSeek better than Nvidia’s RTX 4090.
Whatever the case, we’re still excited about the possibilities DeepSeek unlocks for AI. Since it’s open source, nearly anyone can download and run it locally, and others can build on the advancements and optimizations of the original model.
Jowi Morales is a tech enthusiast with years of experience working in the industry. He has been writing for several tech publications since 2021, covering tech hardware and consumer electronics.
-
hotaru251
The article said: "Microsoft just announced that it will release NPU-optimized versions of DeepSeek-R1"
....so they claim its stolen stuff from openai who its heavily invested in yet has no issue rushing it to its own products???
thats just hypocritical
-
heffeque
OpenAI stole from everyone, DeepSeek stole from OpenAI, and Microsoft seems to be OK with it.
It's not as if Microsoft hasn't stolen anything for themselves too...
Both of these seem true in this case:
"It's no crime to steal from a thief"
"There is no honor amongst thieves"
-
jlake3
The article said: "Another thing that got us curious is Microsoft’s preferential treatment for Qualcomm Snapdragon X PCs at this time."
Microsoft really wants to make fetch Qualcomm laptops happen.
-
DS426
hotaru251 said: "....so they claim its stolen stuff from openai who its heavily invested in yet has no issue rushing it to its own products??? thats just hypocritical"
Well, it's just an accusation at this time, but I can't really disagree with you. It's just funny to me how everyone is scrambling over this. Cooler minds will prevail, right??