Intel Arc A380 Reviewed, Specs of Full GPU Lineup Revealed

The Intel Arc logo (Image credit: Intel)

A preliminary review of the entry-level Intel Arc Alchemist A380 graphics card has arrived via SiSoftware, but hidden inside it is the bigger news: we now have concrete specs for the full DG2 GPU family from a reliable source. SiSoftware's verdict on the A380 itself is less than stellar, but the specs of the other cards hold out hope for Intel's discrete desktop gaming GPUs.

| | A300 Series | A500 Series | A700 Series |
| --- | --- | --- | --- |
| Execution Units | 128 | 384 | 512 |
| Shaders | 1,024 | 3,072 | 4,096 |
| Memory Size | 6GB | 12GB | 16GB |
| Memory Speed | 14 Gbps | 16 Gbps | 16 Gbps |
| Memory Bus Width | 96-bit | 192-bit | 256-bit |
| API Support | OpenCL 3.0, DirectX 12 Ultimate | OpenCL 3.0, DirectX 12 Ultimate | OpenCL 3.0, DirectX 12 Ultimate |

SiSoftware has yet to review the other cards in the family, and it's worth noting that its A380 testing covers OpenCL compute only. Conducted under Windows 10, it examines workloads such as hashing, cryptographic analysis, and image processing rather than DirectX gaming performance.

The A380 regularly comes in third place in SiSoftware's tests, beaten by cards such as the GTX 1660 Ti and RX 6500 XT. That puts it roughly in line with an earlier leak that placed the A380 neck-and-neck with the GTX 1650 Super, as well as one that saw it beaten by an RTX 3070 in OpenCL. Some of this can be laid at the door of the A380's lack of support for FP64 double-precision floating point math, though it compensates with tensor cores used for matrix multiplication and lower-precision work. We have yet to see how the A500 and A700 series compare, but they should comfortably outpace the entry-level A380.
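
To illustrate the FP64 point, here is a minimal sketch, not from the review, of how an OpenCL application can query a device for the double-precision support the A380 reportedly lacks. Picking the first GPU on the first platform is a simplification for brevity:

```c
/* Sketch: query an OpenCL GPU for FP64 (double-precision) support.
   Device selection is simplified to the first GPU on the first platform. */
#include <stdio.h>
#include <CL/cl.h>

int main(void) {
    cl_platform_id platform;
    cl_device_id device;
    cl_device_fp_config fp64_config = 0;
    char name[256];

    if (clGetPlatformIDs(1, &platform, NULL) != CL_SUCCESS ||
        clGetDeviceIDs(platform, CL_DEVICE_TYPE_GPU, 1, &device, NULL) != CL_SUCCESS) {
        fprintf(stderr, "No OpenCL GPU found\n");
        return 1;
    }

    clGetDeviceInfo(device, CL_DEVICE_NAME, sizeof(name), name, NULL);
    /* A zeroed capability bitfield means the device offers no FP64 at all. */
    clGetDeviceInfo(device, CL_DEVICE_DOUBLE_FP_CONFIG,
                    sizeof(fp64_config), &fp64_config, NULL);

    printf("%s: FP64 %s\n", name, fp64_config ? "supported" : "not supported");
    return 0;
}
```

Benchmarks that lean on double-precision kernels will fall back to slower emulation or skip those sub-tests entirely on such a device, which helps explain the A380's weaker showing in some of SiSoftware's numbers.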

The full specs of the A380 card tested: 128 compute units processing 1,024 threads at 2.45 GHz, 16 tensor cores, 6GB of GDDR6 with 192 GB/s of bandwidth, 32 render output units, 64 texture mapping units, and a power draw of just 75W. The memory bandwidth in particular is poor next to Nvidia's cards (the 1660 Ti and RTX 3050 manage 288 and 224 GB/s respectively), though the power draw is low compared to those two cards' 120W and 130W.
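
Those bandwidth figures follow directly from bus width and memory speed: peak bandwidth in GB/s is the bus width in bits, divided by 8, times the effective data rate in Gbps. Note that the quoted 192 GB/s implies an effective 16 Gbps rate on the A380's 96-bit bus, rather than the 14 Gbps listed in the table above. A quick sanity check, using the Nvidia cards' published bus widths and data rates:

```c
/* Sanity check: peak VRAM bandwidth = bus width (bits) / 8 * data rate (Gbps).
   A380 rate of 16 Gbps is inferred from the article's 192 GB/s figure;
   Nvidia bus widths and rates are the cards' published specs. */
#include <stdio.h>

int main(void) {
    struct { const char *card; int bus_bits; double gbps; } cards[] = {
        { "Arc A380",    96,  16.0 }, /*  96/8 * 16 = 192 GB/s */
        { "GTX 1660 Ti", 192, 12.0 }, /* 192/8 * 12 = 288 GB/s */
        { "RTX 3050",    128, 14.0 }, /* 128/8 * 14 = 224 GB/s */
    };
    for (int i = 0; i < 3; i++)
        printf("%-12s %6.0f GB/s\n", cards[i].card,
               cards[i].bus_bits / 8.0 * cards[i].gbps);
    return 0;
}
```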

Transfer of data across the PCIe bus also appears slow, with the A380 managing a download figure of 3.06GB/s and an upload of 2.88GB/s, while the Nvidia cards’ figures are much closer to 12GB/s in both directions. 
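
For the curious, here is a rough sketch of how a host-to-device ("upload") figure like this can be measured with a blocking OpenCL buffer write. The 256MB buffer size, single warm-up pass, and simple wall-clock timing are our assumptions for illustration, not SiSoftware's methodology:

```c
/* Sketch: time a blocking host-to-device OpenCL buffer write.
   Assumptions: first GPU on the first platform, 256 MB test buffer. */
#include <stdio.h>
#include <stdlib.h>
#include <time.h>
#include <CL/cl.h>

#define BUF_SIZE (256u << 20) /* 256 MB */

int main(void) {
    cl_platform_id platform;
    cl_device_id device;
    cl_int err;

    clGetPlatformIDs(1, &platform, NULL);
    clGetDeviceIDs(platform, CL_DEVICE_TYPE_GPU, 1, &device, NULL);

    cl_context ctx = clCreateContext(NULL, 1, &device, NULL, NULL, &err);
    cl_command_queue q = clCreateCommandQueueWithProperties(ctx, device, NULL, &err);
    cl_mem dev_buf = clCreateBuffer(ctx, CL_MEM_READ_WRITE, BUF_SIZE, NULL, &err);
    void *host_buf = calloc(1, BUF_SIZE);

    /* Warm-up transfer so lazy allocation doesn't skew the timed run. */
    clEnqueueWriteBuffer(q, dev_buf, CL_TRUE, 0, BUF_SIZE, host_buf, 0, NULL, NULL);

    struct timespec t0, t1;
    clock_gettime(CLOCK_MONOTONIC, &t0);
    /* Blocking write: returns only once the host buffer is on the device. */
    clEnqueueWriteBuffer(q, dev_buf, CL_TRUE, 0, BUF_SIZE, host_buf, 0, NULL, NULL);
    clock_gettime(CLOCK_MONOTONIC, &t1);

    double secs = (t1.tv_sec - t0.tv_sec) + (t1.tv_nsec - t0.tv_nsec) / 1e9;
    printf("Host-to-device: %.2f GB/s\n", BUF_SIZE / secs / 1e9);

    free(host_buf);
    clReleaseMemObject(dev_buf);
    clReleaseCommandQueue(q);
    clReleaseContext(ctx);
    return 0;
}
```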

This is a low-end card, however, and its modest power draw may attract builders of small form-factor, low-powered systems who aren't after remarkable hashing abilities, as long as the price is competitive. Gamers looking for one of the best graphics cards will want to wait for the inevitable DirectX figures, where FP64 double precision, which the A380 doesn't support, is an optional feature and should weigh less heavily on the scores.

Ian Evenden
Freelance News Writer

Ian Evenden is a UK-based news writer for Tom’s Hardware US. He’ll write about anything, but stories about Raspberry Pi and DIY robots seem to find their way to him.

  • InvalidError
    If the A380 performs on par with the 1650S, then that makes it worse than the RX6500 when it isn't hobbled by its VRAM size, bandwidth and x4 PCIe. Wonder what this is going to translate to in street prices.
  • digitalgriffin
    InvalidError said:
    If the A380 performs on par with the 1650S, then that makes it worse than the RX6500 when it isn't hobbled by its VRAM size, bandwidth and x4 PCIe. Wonder what this is going to translate to in street prices.

    It has to be <$200 to be viable. Intel is unproven in the field of long-term support on hardware ventures. Larrabee/Knights Corner, Itanium, and XPoint come to mind. (Yes yes, I know Itanium had a long life, all things considered, but Intel pretty much admitted defeat after adopting AMD's x64 extensions. Any SERIOUS resource money stopped after x64 extensions were added to x86.)
  • InvalidError
    digitalgriffin said:
    It has to be <$200 to be viable.
    The RX6500's festival of cut corners retails for ~$270 while the A380 has 50% more VRAM bandwidth, 50% more VRAM, full encode/decode acceleration and most likely at least 4.0x8 PCIe. I'd say until the RX6500's effective retail price drops, there is plenty of room for the A380 above $200 since it lacks at least four of the RX6500's worst shortcomings.

    Given the choice between A380 and the RX6500 at $200, I'd probably pick the A380 mainly for the extra VRAM and not having to worry about 4.0x4 becoming a major bottleneck later. 50W lower TDP doesn't hurt either.
  • digitalgriffin
    InvalidError said:
    The RX6500's festival of cut corners retails for ~$270 while the A380 has 50% more VRAM bandwidth, 50% more VRAM, full encode/decode acceleration and most likely at least 4.0x8 PCIe. I'd say until the RX6500's effective retail price drops, there is plenty of room for the A380 above $200 since it lacks at least four of the RX6500's worst shortcomings.

    Given the choice between A380 and the RX6500 at $200, I'd probably pick the A380 mainly for the extra VRAM and not having to worry about 4.0x4 becoming a major bottleneck later. 50W lower TDP doesn't hurt either.
    Well, I'm a cheap bastard. I thought $650 was too much for a 6800 XT. And to convince me, Intel really needs to take a loss on first gen. Their long-term support history is horrid. The 1660S had an MSRP of $229. Undercutting it isn't too much to ask for.

    Introducing GeForce GTX 1660 and 1650 SUPER GPUs, and New Gaming Features For All GeForce Gamers | GeForce News | NVIDIA
  • hotaru.hino
    digitalgriffin said:
    It has to be <$200 to be viable. Intel is unproven in the field of long-term support on hardware ventures. Larrabee/Knights Corner, Itanium, and XPoint come to mind. (Yes yes, I know Itanium had a long life, all things considered, but Intel pretty much admitted defeat after adopting AMD's x64 extensions. Any SERIOUS resource money stopped after x64 extensions were added to x86.)
    Xeon Phi and Itanium were niche products anyway. One could argue Intel was late to the party regarding Phi since at that point, NVIDIA had a ~3-year head start in the GPGPU market. However, even though you say Intel gave up on Itanium the moment x64 was out, they still supported it until 2021. For a "dead on arrival" product line, that sure is a long support time.

    I don't think Intel threw in the towel with XPoint, considering it still performs much better than flash memory in terms of IOPS, which is a much more useful spec than raw bandwidth. But if it's going to die, it'll be because Intel doesn't want to share the technology.

    Intel still has an extensive NIC lineup. And if anything, Intel can "convince" system builders to use their GPUs anyway.
  • InvalidError
    digitalgriffin said:
    Their long-term support history is horrid. The 1660S had an MSRP of $229. Undercutting it isn't too much to ask for.
    It is a lot to ask for when GDDR6 currently costs about twice as much as it did back then, shipping costs 6-10X as much, most other input costs have gone up 10-20% and then you have to add the US import tariffs that didn't exist back then on top. If Nvidia wanted to re-launch the 1660S today, Nvidia would likely need to take a hit to its gross profit margin for those GPUs to hit MSRPs below $300.
  • digitalgriffin
    hotaru.hino said:
    Xeon Phi and Itanium were niche products anyway. One could argue Intel was late to the party regarding Phi since at that point, NVIDIA had a ~3-year head start in the GPGPU market. However, even though you say Intel gave up on Itanium the moment x64 was out, they still supported it until 2021. For a "dead on arrival" product line, that sure is a long support time.

    I don't think Intel threw in the towel with XPoint, considering it still performs much better than flash memory in terms of IOPS, which is a much more useful spec than raw bandwidth. But if it's going to die, it'll be because Intel doesn't want to share the technology.

    Intel still has an extensive NIC lineup. And if anything, Intel can "convince" system builders to use their GPUs anyway.

    Itanium budgets disappeared after x64 extensions took off. The only things keeping it afloat were the likes of Sun SPARC systems and HP, which had clients dependent upon it.
  • digitalgriffin
    InvalidError said:
    It is a lot to ask for when GDDR6 currently costs about twice as much as it did back then, shipping costs 6-10X as much, most other input costs have gone up 10-20% and then you have to add the US import tariffs that didn't exist back then on top. If Nvidia wanted to re-launch the 1660S today, Nvidia would likely need to take a hit to its gross profit margin for those GPUs to hit MSRPs below $300.

    You're claiming prices went up over 50%. Sorry, I'm not buying that. That's like the AIBs claiming they had to raise prices because aluminum went through the roof. They use like $3.00 in aluminum on most heatsinks. That's not an exaggeration. Even if you doubled the price, it would not justify $50 price hikes.
  • InvalidError
    digitalgriffin said:
    You're claiming prices went up over 50%. Sorry, I'm not buying that. That's like the AIBs claiming they had to raise prices because aluminum went through the roof. They use like $3.00 in aluminum on most heatsinks. That's not an exaggeration. Even if you doubled the price, it would not justify $50 price hikes.
    The doubling of VRAM prices adds ~$30, VRM component shortages adds ~$10, the sextupling of shipping costs adds ~$10, the 20-30% increase in raw material costs and machining costs for the HSF adds ~$5, then you have to add everyone else in the supply chain's increased sick days and other likely permanent post-pandemic costs adding 10-20%. Finally, you have the 25% import tax for GPUs made in China.

    The import tax alone is a $60 price increase for a $220 GPU before accounting for any of the cost increases.
  • hotaru.hino
    digitalgriffin said:
    Itanium budgets disappeared after x64 extensions took off. The only things keeping it afloat were the likes of Sun SPARC systems and HP, which had clients dependent upon it.
    Then who was making the Itanium chips after 2005?

    The whole product line had actual hardware refreshes until 2017. You don't create new hardware from an "afloat" budget.