The Intel Arc Alchemist architecture brought a third competitor to the market for the best graphics cards, and while it can't top the GPU benchmarks in terms of raw performance, there's certainly a strong value proposition. At the same time, there are areas where the drivers still need tuning; Minecraft with ray tracing is one example that comes immediately to mind, since it's one of the games in our standard test suite. But Intel hopes to encourage further adoption with its latest announcement: a $40 price cut to the Intel Arc A750, bringing it down to just $250.
It's difficult to say exactly how many people have purchased Intel Arc graphics cards, mobile or desktop, since they first became available in the spring of 2022. I've seen claims that Intel dedicated GPUs accounted for up to 4% of total sales in Q4 2022, but so far Arc GPUs don't show up as individual entries on the Steam Hardware Survey (which will be updated with January data shortly, so maybe that will change). Certainly, dropping the price of the A750 by 14% can't hurt.
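For reference, the 14% figure follows directly from the prices in the announcement: a $40 cut that brings the card down to $250 implies a $290 starting price. A quick sanity check:

```python
# Back out the A750's previous price from the figures quoted above:
# a $40 cut bringing the card down to $250 implies $290 before the cut.
new_price = 250
cut = 40
old_price = new_price + cut  # $290
cut_pct = cut / old_price * 100
print(f"{cut_pct:.0f}% price cut")  # prints "14% price cut"
```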
It's not just about lowering the price, either. Since launching with the 3490 drivers back in October 2022, Intel has delivered three WHQL drivers and at least four beta drivers. The latest beta, version 4090, became available last week, and improving performance and compatibility has been a key target for all of the driver updates.
DirectX 9 performance, an area that Intel hadn't really focused on prior to the Arc launch, has been one of the biggest beneficiaries of the newer drivers. Intel claims that, across a test suite of thirteen games, average framerates at 1080p have improved by 43%, and 99th percentile fps has improved by 60%. At 1440p, the average fps increased by 35% while 99th percentile fps improved by 52%.
Granted, the DX9 improvements aren't so much about making games that ran poorly suddenly run well. The worst-performing game in the suite, Stellaris, looks to have run at about 75 fps with the launch drivers, whereas it's now getting more like 130 fps. And Half-Life 2 went from just under 400 fps to about 600 fps. Even so, the overall experience has improved, and framerate consistency and frame times are much more stable.
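To put those two examples in percentage terms (using the approximate before/after framerates from the text; these are illustrative per-game numbers, not Intel's official suite-wide averages):

```python
# Approximate before/after framerates quoted above for two DX9 titles.
games = {
    "Stellaris": (75, 130),
    "Half-Life 2": (400, 600),
}
for name, (before, after) in games.items():
    gain = (after - before) / before * 100
    print(f"{name}: {before} -> {after} fps (+{gain:.0f}%)")
# Stellaris: 75 -> 130 fps (+73%)
# Half-Life 2: 400 -> 600 fps (+50%)
```

Both titles land well above the 43% suite-wide average Intel quotes, which is consistent with these being among the bigger individual gains.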
It's also interesting that Intel continues to position the Arc A750 as an RTX 3060 competitor, mostly ignoring (in its charts) AMD's own RX 6600. That's probably because AMD has a much stronger value proposition, with the RX 6600 regularly selling for $225, give or take. Our testing puts the RX 6600 slightly below the A750 (and RTX 3060), but the price cut does help make up for the higher power use of Intel's GPU.
Intel is also continuing to push its XeSS (Xe Super Sampling) AI upscaling algorithm as an alternative to Nvidia's DLSS and AMD's FSR technologies. The adoption rate isn't nearly as high, but considering how new Intel is to the dedicated GPU arena, getting 35 games to support XeSS in the first six months or so is pretty decent.
Another feature of the Arc GPUs that's more than just "pretty decent" is the video encoding and decoding support. Arc was the first modern GPU to offer full AV1 support, and the quality of the Quick Sync Video encoding goes head to head with Nvidia's best (with AMD trailing on previous generation GPUs, though we still need to look at the latest RDNA 3 chips).
But it's not all sunshine and flowers. Our own testing of Bright Memory Infinite (using the Bright Memory Infinite benchmark on Steam) and Minecraft shows there's still room for improvement. Another interesting quirk we've discovered: with a Samsung Odyssey Neo G8 32-inch monitor, the Arc DisplayPort connection tops out at 4K and 120 Hz, while Nvidia's RTX 20-series and later GPUs (using DP 1.4a) all support 4K and 240 Hz via Display Stream Compression.
Ultimately, lowering the price of the A750 by $40 probably won't change the minds of millions of gamers, but it does make the overall package more attractive. Intel has also added Nightingale and The Settlers: New Allies to the software bundle for anyone who purchases a new Arc graphics card or system equipped with an Arc GPU. As we've noted before, Intel may not have the fastest cards on the planet, but the value proposition is certainly worth considering.
Which does bring up an interesting question: What's happened with the Arc A580? That's supposed to have the same 8GB of GDDR6 as the A750, but with 24 Xe-cores instead of 28 Xe-cores (3,072 shaders vs. 3,584 shaders on the A750). It also has a lower TBP of 175W compared to 225W and a Game clock of 1700 MHz, or at least that's the theory. With the new price on the A750, the space for an A580 continues to shrink, but maybe Intel could still release something in the $199–$219 range. We're still waiting...
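The shader counts in that spec comparison follow from how Alchemist is organized: each Xe-core contains 128 FP32 ALUs (16 Vector Engines of 8 ALUs each), so the Xe-core counts map directly to shader counts:

```python
# Each Alchemist Xe-core contains 128 FP32 ALUs ("shaders"):
# 16 Vector Engines per Xe-core, 8 FP32 lanes per Vector Engine.
SHADERS_PER_XE_CORE = 16 * 8  # 128
for name, cores in (("A580", 24), ("A750", 28)):
    print(f"{name}: {cores} Xe-cores -> {cores * SHADERS_PER_XE_CORE} shaders")
# A580: 24 Xe-cores -> 3072 shaders
# A750: 28 Xe-cores -> 3584 shaders
```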
The full slide deck from Intel is included below, for reference.
I doubt the desktop A5xx would be economically viable without a DG2-256 refresh to save ~150 mm² per die and almost double the yield per wafer.
People wanting a worthwhile sub-$200 current-gen GPU upgrade like me are likely far more numerous than AMD, Nvidia and Intel would like, just as the Steam survey suggests. I'm not surprised at all that Intel's A380 appears to be the most popular option. If my GTX1050 decided to blow up today, I would probably get an A380 too unless my friend who still does crypto-mining in winter for heating offers me a really good price for one of his GPUs.
I hope Intel throws serious money and effort at their GPU products. The rewards could be quite respectable.
Anyway, good job driver team, those are massive improvements.
For example, Intel is touting a major driver bottleneck breakthrough in its newest update; its presentation graphs show massive improvements in frame-time consistency, with variance reduced by more than 50% across most of the board. That can be expected to have substantial repercussions across all titles that hit those bottlenecks, possibly even including titles running through legacy API translation.