Intel DNNL 1.2 Library Hints at Int8 Data Type Support in Xe GPUs


Intel has released version 1.2 of DNNL, the company's optimized library for high-performance deep learning operations on its CPUs and GPUs. The new version introduces support for the int8 data type on GPUs, which suggests that int8 acceleration will be a standard feature of the Xe architecture.

Phoronix has reported on the release of DNNL 1.2, previously called MKL-DNN, the deep neural network (DNN) variant of the company's Math Kernel Library (MKL). New with this release is support for int8 on GPUs, according to the site (we could not find the patch notes).

The current DNNL guide on data types still says that int8 is not supported on GPUs, with the following noteworthy (general) explanation:

“Considering that performance is the main purpose of the low precision data types support, DNNL implements this functionality only for the platforms that have hardware acceleration for these data types.”

In other words, Intel did not previously support int8 on GPUs because its integrated graphics has so far lacked native hardware acceleration for the data type. The new support therefore suggests that Intel is preparing int8 in anticipation of its upcoming Xe architecture.
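
To illustrate what GPU-side int8 support means in practice, here is a minimal sketch, using DNNL's public C++ API, of requesting an int8 (s8) tensor on the library's GPU engine; the tensor shape and layout are purely illustrative assumptions, and the calls only succeed on builds and hardware where the GPU runtime and data type are actually supported.

// Sketch only: compile with something like  g++ int8_gpu.cpp -ldnnl
#include <dnnl.hpp>

int main() {
    using namespace dnnl;

    // Create a GPU engine (device index 0). This throws if the library was
    // built without GPU support or no suitable device is present.
    engine eng(engine::kind::gpu, 0);

    // Describe an illustrative int8 (s8) activation tensor:
    // batch 1, 3 channels, 224x224 spatial size, NHWC layout.
    memory::dims dims = {1, 3, 224, 224};
    memory::desc md(dims, memory::data_type::s8, memory::format_tag::nhwc);

    // Allocate the memory object on the GPU engine; primitives built on top
    // of it only run fast where the hardware accelerates int8.
    memory int8_mem(md, eng);

    return 0;
}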

Intel has disclosed that Ponte Vecchio, which uses the Xe HPC architecture, has a matrix engine (similar to Nvidia's tensor cores) with 32x int8 acceleration. Intel adding int8 support now therefore strongly indicates that DG1's Xe LP architecture also supports int8, even if it may lack the full-blown matrix engine of the Xe HPC architecture.

Among general optimizations, DNNL 1.2 includes support for AVX-512 and Cascade Lake's DLBoost, which improve the performance of int8 operations (which are often used in deep learning inference applications). In DNNL 1.2, int8 performance on pre-AVX-512 hardware and for 3D spatial data has been improved. Performance of 1D backward convolutions has also increased, and the release introduces a variety of new primitives.

Intel at CES: DLBoost in Tiger Lake

In hindsight, evidence of int8 support was likely already provided at CES, as the company seems to be planning to bring DLBoost to its integrated graphics. 

During Intel's keynote, Lisa Pearce said: "So gaming is just a start. We also have advancements in media and display capabilities as well as AI improvements built in."

She then gave a demo of an application using Intel's OpenVINO toolkit for AI, and Intel's vice president of the PC Client Group, Gregory Bryant, remarked: "That's all done using AI acceleration on the GPU which is great."

  • jkflipflop98
    Xe has the horsepower to really upend the GPU market. Hopefully management doesn't screw it up by releasing the mid- and low-range cards first. But that's most likely what they'll do.
  • hannibal
    How would that be a screw-up? Most people buy low-end cards, some buy mid-range cards, and only a small minority buy high-end GPUs...
    What we should really be worried about is: are Intel's GPUs cheap enough to be competitive?
  • neojack
    @hannibal being a flagship would be very beneficial to them, since they enter the market late.

    being the "cheap" alternative wouldn't do XE any good.
    if intel do this, XE would be seen as the "cheap" alternative to radeons. wich are seen by most as the "cheap" alternative to geforce cards
  • Chung Leong
    neojack said:
    @hannibal being a flagship would be very beneficial to them, since they enter the market late.

    Customers aren't going to spend top dollar on unproven technology. Being forced to sell at massive discounts is not beneficial. It would be seen as evidence of failure.
  • bit_user
    Chung Leong said:
    Being forced to sell at massive discounts is not beneficial. It would be seen as evidence of failure.
    This is sound wisdom.

    Starting towards the lower-end is a lower-stakes proposition, for Intel. It gives them some breathing room and lets them adjust pricing until the product slots into a competitive segment.

    That's harder to do with a big GPU, and I think it probably nearly sank AMD's Vega. Without the crypto boom, Vega might even have been a financial failure (some early analysis by another website showed they probably lost money on each Vega 56 card at launch).

    Also, there's the reputational damage of launching a supposed high-end card that struggles to keep up with the mid-range.