
Intel's Xe-HPG GPU Uncovered: 5 Models With up to 512 EUs, 16GB

(Image credit: Intel)

We already know from an inadvertent leak by Intel itself that the company is preparing five notebook GPU models based on its Xe-HPG architecture, but its plans for desktop PCs have been far less clear. This week, Igor's Lab attempted to fill in some of the gaps with information obtained from a slide purportedly taken from an Intel DG2 presentation. 

Up to 512 EUs

If the unofficial information is to be believed, Intel is readying five discrete desktop GPU SKUs with configurations similar to their notebook counterparts. The top-of-the-range gaming SKU1 is said to feature 512 execution units (EUs), 16GB of GDDR6 memory with a 256-bit interface, and a 275W TDP. The other gaming parts are the SKU2, with 384 EUs and 12GB of GDDR6 memory on a 192-bit bus, and the SKU3, with 256 EUs and 8GB of memory on a 128-bit interface. All gaming GPUs are expected to come in a 43×37.5 mm BGA2660 package. 

Intel is also preparing low-end discrete parts (SKU4 and SKU5) with 128 or 96 EUs, 4GB of RAM, and a 64-bit memory bus. These lower-end models will come in a 29×29 mm BGA1379 package and will be aimed mostly at notebooks, though some low-end desktops could use these GPUs, too. 
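For context, here is a rough sketch of the peak memory bandwidth those bus widths would imply. The 16 Gbps per-pin data rate is purely an assumption for illustration; Intel has not confirmed which GDDR6 speed grade DG2 would use.

```python
# Peak GDDR6 bandwidth implied by the rumored bus widths.
# The 16 Gbps per-pin data rate is an assumption, not a confirmed spec.
def gddr6_bandwidth(bus_width_bits, data_rate_gbps=16.0):
    """Peak bandwidth in GB/s: (bus width in bytes) * (per-pin data rate)."""
    return bus_width_bits / 8 * data_rate_gbps

for sku, bus in [("SKU1", 256), ("SKU2", 192), ("SKU3", 128), ("SKU4/5", 64)]:
    print(f"{sku} ({bus}-bit): {gddr6_bandwidth(bus):.0f} GB/s")
```

At an assumed 16 Gbps, the 256-bit SKU1 would land at 512 GB/s, in the same ballpark as contemporary high-end cards, while the 64-bit parts would be limited to 128 GB/s.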

Intel's Xe-HPG architecture is expected to inherit energy-efficient blocks from the Xe-LP architecture, clock speed optimizations designed for Xe-HP/Xe-HPC GPUs for data centers and supercomputers, high-speed internal interfaces, hardware-accelerated ray-tracing support, and a GDDR6-powered memory subsystem. Overall, the Xe-HPG will resemble Intel's existing GPUs to some degree but will run faster and support additional capabilities. Intel's Xe-HPG GPUs are set to be produced by TSMC. 

Launching in 2022?

Given that Xe-HPG is set to inherit the relatively small EUs of Xe-LP, it is surprising that Intel's top-of-the-range desktop configuration for DG2 features only 512 EUs. Also unexpected: Intel will reportedly launch its lower-end DG2 GPUs later in 2021, with the higher-end gaming SKUs not available until early 2022. That timing looks odd, as the company recently started an Xe-HPG marketing campaign.

Intel, of course, does not comment on unreleased products, so everything here is unofficial and should be taken with a grain of salt.

  • zodiacfml
    tough luck, could have been great mining GPUs
    Reply
  • PCWarrior
    Direct performance comparisons should not be made with the iGPU versions, as these don't have dedicated VRAM, nor do they have the same VRAM size or speed. The 1660 Super and the 1660 had exactly the same specs except for VRAM speed (GDDR5 vs GDDR6). The Super version was 15-20% faster. And the GT 1030 GDDR5 vs DDR4 showed a 100-120% difference. Factor in drivers, optimizations, etc., and we really are in the dark about how well these will perform.

    In any case here is a fun exercise. Compared to Tigerlake we have:
    (i) 5.33x the EUs (512 vs 96)
    (ii) likely 1.33x higher frequency (1800MHz vs 1350MHz)
    (iii) likely 2.1x higher “IPC” due to the various VRAM differences.
    So, in total we have something around 15x faster than the iGPU on Tigerlake.

    Now the 3090 is ~20x faster than the Iris Plus G7. It follows that the 3090 is 33.3% faster than DG2. The 3090 is also ~1.5x faster than the 3070. Hence, DG2 is 12.5% faster than the 3070.

    I would say that at worst I expect DG2 to perform about the same as an RTX 3070/RTX 2080Ti and at best as a 3070Ti. Not bad but it needs to come out this year. In Autumn 2022 we expect to have the 4000 series and significant performance improvements, so this tier of performance will probably be relegated to the xx60-xx60Ti tier for $330-$400 MSRP. In any case with the current situation of gpus pricing and availability due to mining, scalping and chip shortage, MSRP goes out of the window anyway and any gpu available at a sane price is a winner.
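    The back-of-the-envelope arithmetic above can be sketched as follows; every ratio is an assumption taken from the comment, not a measurement:

```python
# All ratios below are the comment's assumptions, not measured data.
eu_ratio = 512 / 96        # ~5.33x the EUs of Tiger Lake's 96-EU iGPU
clock_ratio = 1800 / 1350  # ~1.33x assumed clock speed
ipc_ratio = 2.1            # assumed "IPC" uplift from dedicated GDDR6

dg2_vs_igpu = eu_ratio * clock_ratio * ipc_ratio  # ~14.9x, i.e. "around 15x"

rtx3090_vs_igpu = 20.0     # assumed: RTX 3090 ~20x the Iris Plus G7
rtx3090_vs_3070 = 1.5      # assumed: RTX 3090 ~1.5x the RTX 3070

rtx3070_vs_igpu = rtx3090_vs_igpu / rtx3090_vs_3070  # ~13.3x
dg2_vs_3070 = dg2_vs_igpu / rtx3070_vs_igpu          # ~1.12x the 3070

print(f"DG2 vs iGPU: {dg2_vs_igpu:.1f}x, DG2 vs RTX 3070: {dg2_vs_3070:.2f}x")
```

    Multiplying the ratios through gives roughly 14.9x the Tiger Lake iGPU and about 1.12x an RTX 3070, matching the ~15x and ~12.5% figures quoted above.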
    Reply
  • Chung Leong
    I wonder if Intel would release DG2 cards into the consumer market. Launching a new line of products costs a lot of money. And no guarantee of success. Selling the hardware to one of the streaming services (Stadia or Luna) would be the safer option. That's a controlled environment. Game compatibility is less of an issue.
    Reply
  • spongiemaster
    PCWarrior said:
    Direct performance comparisons should not be made with the iGPU versions, as these don't have dedicated VRAM, nor do they have the same VRAM size or speed. The 1660 Super and the 1660 had exactly the same specs except for VRAM speed (GDDR5 vs GDDR6). The Super version was 15-20% faster. And the GT 1030 GDDR5 vs DDR4 showed a 100-120% difference. Factor in drivers, optimizations, etc., and we really are in the dark about how well these will perform.

    In any case here is a fun exercise. Compared to Tigerlake we have:
    (i) 5.33x the EUs (512 vs 96)
    (ii) likely 1.33x higher frequency (1800MHz vs 1350MHz)
    (iii) likely 2.1x higher “IPC” due to the various VRAM differences.
    So, in total we have something around 15x faster than the iGPU on Tigerlake.

    Now the 3090 is ~20x faster than the Iris Plus G7. It follows that the 3090 is 33.3% faster than DG2. The 3090 is also ~1.5x faster than the 3070. Hence, DG2 is 12.5% faster than the 3070.

    I would say that at worst I expect DG2 to perform about the same as an RTX 3070/RTX 2080Ti and at best as a 3070Ti. Not bad but it needs to come out this year. In Autumn 2022 we expect to have the 4000 series and significant performance improvements, so this tier of performance will probably be relegated to the xx60-xx60Ti tier for $330-$400 MSRP. In any case with the current situation of gpus pricing and availability due to mining, scalping and chip shortage, MSRP goes out of the window anyway and any gpu available at a sane price is a winner.
    The iGPU is based on DG1, while the high-performing discrete parts are based on DG2. Are the two comparable? At the very least, DG2 is supposed to support hardware ray tracing while the DG1-based GPUs don't. One would think there is more that differentiates the two to necessitate the different codenames.
    Reply