Intel Confirms New Meteor Lake ASIC for Big AI Performance Boost

According to a report by Phoronix, Intel quietly confirmed the existence of a new ASIC, the Versatile Processing Unit (VPU), via a new Linux driver posted yesterday. This new unit is designed to accelerate AI inference for deep learning applications and will arrive in 14th Gen Meteor Lake processors.

For the uninitiated, AI inference refers to using an already-trained AI network to make predictions on new data, and it is a key part of all modern AI workflows. With the tech industry so heavily focused on AI algorithms, it only makes sense for Intel to build these "AI cores" into its chips to meet consumer and developer demand. You can think of this new unit as roughly analogous to Nvidia's Tensor cores.
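To make that distinction concrete, here is a minimal, illustrative sketch in plain NumPy (not Intel's software stack; the tiny network and its weights are made up). Inference is just the forward pass through weights that training has already fixed, which is exactly the multiply-accumulate-heavy work an accelerator like this is built to offload.

    import numpy as np

    # A toy "already trained" network: the weights below are frozen stand-ins for
    # values a training run would have produced. Inference only reads them.
    W1 = np.array([[0.2, -0.5], [0.8, 0.1], [-0.3, 0.7]])  # 3 inputs -> 2 hidden units
    b1 = np.array([0.1, -0.2])
    W2 = np.array([[1.2], [-0.9]])                          # 2 hidden units -> 1 output
    b2 = np.array([0.05])

    def infer(x):
        """Forward pass only: multiply-accumulate plus an activation, no weight updates."""
        hidden = np.maximum(x @ W1 + b1, 0.0)  # ReLU activation
        return hidden @ W2 + b2                # the prediction

    print(infer(np.array([0.5, -1.0, 2.0])))   # predict on a new, unseen input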

This quiet announcement comes six years after Intel acquired vision processor specialist Movidius. Movidius' designs were revolutionary at the time; its VPUs packed impressive performance per watt into a heterogeneous package combining several specialized processors, each dedicated to a specific task. With this specialized architecture, Movidius' silicon could pull off 4 TOPS in a 1.5W power envelope.
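For scale, those figures work out to roughly 2.7 TOPS per watt; a quick back-of-the-envelope check, using only the numbers quoted above:

    # Back-of-the-envelope efficiency math for the quoted Movidius figures.
    tops = 4.0   # trillions of operations per second
    watts = 1.5  # power envelope
    print(f"{tops / watts:.2f} TOPS per watt")  # prints 2.67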

In layman's terms, this chip had a level of power efficiency other companies in the industry could only dream of at the time -- including Nvidia.

This new chip was very likely built, at least in part, by the team behind Movidius. Unfortunately, we don't know how powerful this new Versatile Processing Unit will be in Intel's Meteor Lake processors. However, after more than five years of research and development, we expect Intel and Movidius' new ASIC to perform very well.

Thanks to the driver patch notes, all we currently know is some of the chip's internals. Intel's VPU will include a memory management unit (MMU) for translating VPU addresses to host DMA addresses and isolating user workloads, a RISC-based microcontroller, a Neural Compute Subsystem, and a Network on Chip (NoC).
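Drivers like this typically expose an accelerator to applications through a runtime layer; today's Movidius parts, for instance, are targeted through Intel's OpenVINO toolkit. As a hedged sketch only, assuming (unconfirmed) that the Meteor Lake VPU is surfaced the same way, enumerating it from Python could look like this:

    # Sketch under an assumption Intel has not confirmed: that the new VPU will be
    # exposed as an OpenVINO device. Current Movidius hardware shows up as "MYRIAD".
    from openvino.runtime import Core

    core = Core()
    print("Available devices:", core.available_devices)  # e.g. ['CPU', 'GPU', 'MYRIAD']

    for device in core.available_devices:
        # FULL_DEVICE_NAME is a standard OpenVINO property with a human-readable name.
        print(device, "->", core.get_property(device, "FULL_DEVICE_NAME"))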

Meteor Lake is still two CPU generations away, so it will be some time before Intel brings this product to market. The driver, however, explicitly targets client CPUs. That leaves open whether server chips will get this VPU (or perhaps a beefed-up variant), but it at least guarantees we'll see this AI-focused unit on the consumer side of the market.

Aaron Klotz
Freelance News Writer

Aaron Klotz is a freelance writer for Tom’s Hardware US, covering news related to computer hardware such as CPUs and graphics cards.

  • spongiemaster
    This new unit is designed to accelerate AI interference for deep learning applications

    Are these units getting shipped to a new company called Cyberdyne?
  • Exploding PSU
    spongiemaster said:
    Are these units getting shipped to a new company called Cyberdyne?

    "Disrupting the industry", literally
  • Thunder64
    Who writes this crap? "Interference"? Never heard of any "AI interference".
  • david germain
    is it meant to be 'AI Inference '? i tried looking it up and google keeps sending me to AI Inference not Interference.
  • bit_user
    Without a doubt, this new chip is built at least partially by the employees behind Movidius.
    I wouldn't say "without a doubt". Intel also bought Nervana, Mobileye, and Habana Labs. And they have a reputation for duplication of effort, like how they created their failed Larrabee dGPU attempt completely independent of the chipset graphics group.

    It would be interesting to know if it is related to Movidius, because at least the first couple generations used programmable VLIW cores. I expect they have more similarities than differences with the programmable cores in Intel's iGPUs. Movidius' big advantage in perf/W came from careful utilization of on-die SRAM - something that's going to face continual cost pressure, when shipped as part of every CPU Intel sells.

    BTW, I'm disappointed with the article's title. It says "Intel Confirms..." leading me to believe they'd made some official comment about it. I already knew about the patch. Furthermore "...Big AI Performance Boost" suggests there were specific performance claims made, as well.