Intel Reveals First Arc A750 Desktop GPU Benchmarks
Intel shows 13% performance lead over RTX 3060
Intel has shown the first Arc A750 desktop graphics card performance numbers in a new video, pitting its upcoming GPU against the GeForce RTX 3060 in a selection of five games. The RTX 3060 is one of the best graphics cards, delivering mainstream gaming performance roughly in the middle of our GPU benchmarks hierarchy.
All the usual caveats apply for these Intel-provided benchmarks (the choice of games and settings was almost certainly made to paint the Arc A750 in the best light possible), but the results do show promise. We're still waiting for the official worldwide launch of the Intel Arc Alchemist GPUs, which should happen in the next few months.
Ryan Shrout, formerly of PC Perspective and now part of the Intel Graphics marketing team, takes us through the upcoming GPU and shows a short segment of the Cyberpunk 2077 benchmark sequence. All testing was performed at 2560x1440 and 'high' settings, a reasonable target for a mainstream GPU. You can see the results halfway through the video, but we've pulled Intel's numbers from the end to create the following table.
Game | Arc A750 (fps) | RTX 3060 (fps) | % Improvement |
---|---|---|---|
Five Game Geomean | 94.5 | 83.7 | 13% |
Borderlands 3 | 76.4 | 67.3 | 14% |
Control | 64.6 | 56.9 | 14% |
Cyberpunk 2077 | 59.9 | 52.5 | 14% |
F1 2021 | 192.0 | 164.0 | 17% |
Fortnite | 132.8 | 125.1 | 6% |
The GeForce RTX 3060 is no slouch when it comes to gaming performance. It can easily handle 1080p at maxed-out settings in most games, and 1440p at high quality should typically deliver 60 fps or more. All of that basically agrees with Intel's numbers, but the more important bit is how the Arc A750 stacks up.
Across the five games, Intel's GPU delivered 13% higher performance on average (geometric mean), with a lead ranging from 6% in Fortnite to as much as 17% in F1 2021. That's a pretty good showing for a graphics card that we hope will cost somewhere in the $300–$350 range.
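For reference, that 13% figure is the ratio of the two geometric means rather than a simple average of the per-game gains. Here's a minimal sketch (Python) that reproduces Intel's geomean row from the table above:

```python
# Sanity check of Intel's "Five Game Geomean" row, using the per-game fps
# values from the table above.
from math import prod

a750 = {"Borderlands 3": 76.4, "Control": 64.6, "Cyberpunk 2077": 59.9,
        "F1 2021": 192.0, "Fortnite": 132.8}
rtx3060 = {"Borderlands 3": 67.3, "Control": 56.9, "Cyberpunk 2077": 52.5,
           "F1 2021": 164.0, "Fortnite": 125.1}

def geomean(values):
    """Geometric mean: the nth root of the product of n values."""
    values = list(values)
    return prod(values) ** (1 / len(values))

g_a750 = geomean(a750.values())      # ~94.5 fps
g_3060 = geomean(rtx3060.values())   # ~83.7 fps
print(f"A750 geomean: {g_a750:.1f} fps, RTX 3060 geomean: {g_3060:.1f} fps")
print(f"A750 advantage: {g_a750 / g_3060 - 1:.1%}")  # ~13%
```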
Of course, these are Intel's own benchmarks in a handful of games. We know from independent testing of the Arc A380 desktop GPU that drivers for Arc are still developing, with performance that can vary greatly between games. While AMD and Nvidia have optimized their drivers for many games over the past couple of decades, Intel's integrated graphics drivers often felt like an afterthought. Things have improved, but there's still plenty of work to do.
None of Intel's benchmarks used ray tracing effects, despite Control, Cyberpunk 2077, and Fortnite supporting DXR (DirectX Raytracing). We'd really love to see some ray tracing benchmarks for Arc, if only to satisfy our own curiosity. The maximum number of RTUs (Ray Tracing Units) for Arc A-series graphics cards is 32, and the A750 likely has 24. The RTX 3060 only has 28 RT cores, but those are Nvidia's second-generation RT cores and we still don't know how those compare to Intel's RTUs.
Graphics Card | Intel Arc A770 | Intel Arc A750 | Intel Arc A550 | Intel Arc A380 |
---|---|---|---|---|
Architecture | ACM-G10 | ACM-G10 | ACM-G10 | ACM-G11 |
Process Technology | TSMC N6 | TSMC N6 | TSMC N6 | TSMC N6 |
Transistors (Billion) | 21.7 | 21.7 | 21.7 | 7.2 |
Die size (mm^2) | 406 | 406 | 406 | 157 |
Xe Cores | 32 | 24 | 16 | 8 |
GPU Shaders | 4096 | 3072 | 2048 | 1024 |
Matrix Cores | 512 | 384 | 256 | 128 |
RTUs | 32 | 24 | 16 | 8 |
Boost Clock (MHz) | 2000? | 2000? | 2000? | 2000? |
VRAM Speed (Gbps) | 16 | 16 | 16 | 15.5 |
VRAM (GB) | 16 | 12 | 8 | 6 |
VRAM Bus Width (bits) | 256 | 192 | 128 | 96 |
TFLOPS FP32 (Boost) | 16.4? | 12.3? | 8.2? | 4.1? |
TFLOPS FP16 (Tensor) | 131.1? | 98.3? | 65.5? | 32.8? |
Bandwidth (GBps) | 512 | 384 | 256 | 186 |
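The question-marked entries above follow directly from the shader counts and an assumed 2.0GHz boost clock: FP32 throughput is shaders x 2 FLOPS per clock x clock speed, and memory bandwidth is bus width in bytes x data rate. A minimal sketch of that arithmetic (the 2.0GHz clock, like the table's question marks, is an estimate rather than a confirmed spec):

```python
# Derive the estimated spec-sheet numbers for the rumored Arc A-series lineup.
# Shader counts, bus widths, and VRAM speeds come from the table above; the
# 2.0GHz boost clock is an assumption (hence the question marks in the table).
ASSUMED_BOOST_GHZ = 2.0

cards = {
    # name: (shaders, bus_width_bits, vram_gbps)
    "Arc A770": (4096, 256, 16.0),
    "Arc A750": (3072, 192, 16.0),
    "Arc A550": (2048, 128, 16.0),
    "Arc A380": (1024,  96, 15.5),
}

for name, (shaders, bus_bits, vram_gbps) in cards.items():
    fp32_tflops = shaders * 2 * ASSUMED_BOOST_GHZ / 1000  # 2 FLOPS per shader per clock
    bandwidth_gb_s = bus_bits / 8 * vram_gbps              # bytes per transfer * data rate
    print(f"{name}: ~{fp32_tflops:.1f} TFLOPS FP32, {bandwidth_gb_s:.0f} GB/s")
# Arc A750: ~12.3 TFLOPS FP32, 384 GB/s -- RTX 3060 territory on paper.
# (The FP16 tensor figures in the table are simply 8x the FP32 numbers.)
```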
Intel hasn't detailed the official specs for all the Arc desktop GPUs it plans to launch, but we expect the A750 will have 24 Xe cores and 3072 GPU shaders, with 12GB of GDDR6 memory on a 192-bit bus, probably clocking in the 2.0–2.3GHz range. On paper, that should at least put it in the same performance realm as the RTX 3060, which would also put it into direct competition with the Radeon RX 6650 XT.
Assuming Intel can get performance in most games up to a level similar to what's shown in this video, the A750 could be a welcome addition to the desktop market if it's priced accordingly. The RX 6650 XT outperformed the RTX 3060 by 15% at 1080p ultra in our GPU testing, and you can currently find the Radeon RX 6650 XT for $349.
That's effectively a price ceiling for the Arc A750 in our minds, at least if Intel wants gamers to show interest. The Arc A380, meanwhile, is supposed to have a $129–$139 MSRP when it arrives in the US, hopefully by next month. That shows Intel is willing to be aggressive on pricing, but the clock is definitely ticking with Nvidia Ada and AMD RDNA 3 slated to launch later this year.
Jarred Walton is a senior editor at Tom's Hardware focusing on everything GPU. He has been working as a tech journalist since 2004, writing for AnandTech, Maximum PC, and PC Gamer. From the first S3 Virge '3D decelerators' to today's GPUs, Jarred keeps up with all the latest graphics trends and is the one to ask about game performance.
Roland Of Gilead
This is getting interesting now! The entry-level A380 seemed a little lackluster, but IMO, decent for what is a first discrete GPU on the low end from Intel (Cough... Larrabee!)
This on the other hand is more impressive. It's a BIG if that the comparisons were done with the same detail settings etc., but IF so, then this is a great first attempt at a mainstream SKU. Raja Koduri has brought some major culture change in Intel's GPU business. This is clear to see.
Hopefully this means more competition in the discrete GPU sector, and more FPS for us end users at decent prices, without the gouging that currently exists.
JarredWaltonGPU
The comparisons were definitely done with the same hardware and settings. That's not the question. Intel says as much in the disclosure slide at the end of the video. The question is how these results apply to the thousands of other games that are currently available. I suspect Intel's driver team specifically optimized performance in these five games for the benchmarks. Other games probably perform okay as well. But even AMD and Nvidia have some games where they underperform, and they've been doing "real" GPU drivers a lot longer than Intel. I wouldn't be surprised if a random sampling of 30 demanding games revealed weaker performance from Intel in 75% of them.

keith12 said:
This is getting interesting now! The entry level A380 seemed a little lackluster, but IMO, decent for what is a first discreet GPU on the low end from Intel (Cough..Larrabee!)
This on the other hand is more impressive. It's a BIG if, that the comparisons were done with same details settings etc, but IF so, then this is a great first attempt mainstream SKU. Raja Koduri has brought some major culture change in Intel's GPU business. This is clear to see.
Hopefully this means more competition in the discrete GPU sector, and more FPS for us end users at decent prices, without the gouging that currently exists.
cyrusfox
Driver development is a marathon, not a sprint. Intel has quite a way to go, even with all that time working on DG1 till now. Gamers Nexus' video on the A750 reveal did a good job talking about drivers: with an infinite number of games, which do you choose to work on?

JarredWaltonGPU said:
I suspect Intel's driver team specifically optimized performance in these five games for the benchmarks. Other games probably perform okay as well. But even AMD and Nvidia have some games where they underperform, and they've been doing "real" GPU drivers a lot longer than Intel.
Arc on DX12 & Vulkan: expect great performance; anything earlier... not so much. Also, ReBAR is needed for the experience to be playable on most titles.
Roland Of Gilead
JarredWaltonGPU said:
The comparisons were definitely done with the same hardware and settings. That's not the question. Intel says as much in the disclosure slide at the end of the video. The question is how these results apply to the thousands of other games that are currently available. I suspect Intel's driver team specifically optimized performance in these five games for the benchmarks. Other games probably perform okay as well. But even AMD and Nvidia have some games where they underperform, and they've been doing "real" GPU drivers a lot longer than Intel. I wouldn't be surprised if a random sampling of 30 demanding games revealed weaker performance from Intel in 75% of them.
At this point, I'd totally agree with you.
I guess the point I was trying to make was that, with Raja on board and Intel very obviously making a 'real go' at this GPU thing (;)), there's more to come. I'd imagine he is taking the 'Intel' GPU driver issue seriously. It won't be good enough if they keep to their current output of drivers. They will step up, and I'm sure over the course of time they'll have driver certification on par with, and as regular as, AMD/Nvidia!
Hopefully, this is an early taste of what's to come with driver optimization and regular rollouts of 'Game ready' drivers.
Edit: @JarredWaltonGPU, in terms of general raster performance, would this card's performance not roughly translate to similar results across the majority of games? (Incidental variances aside.)
I realise that you can't be certain, as each game engine might respond differently, along with architecture differences. But, in general terms?
-Fran-
I think someone else put it best somewhere I can't remember: "it's about Intel just diving into the pool and trying to stay afloat and swim like crazy, otherwise AMD and nVidia will be on the way back by the time they feel ready."
I think the best overall news is the tentative pricing of the A380. At ~$120 it's not a terrible deal. Could be better, for sure, but given the features and potential, it's certainly an interesting proposition. AMD and nVidia won't have a new low-end card for like a year anyway, at best.
Regards.
aetolouee
Arc on DX12 & Vulcan expect great performance, anything earlier... not so much
Would be interesting to see some benchmarks on DX11 and older games turned into Vulkan games with DXVK.
JarredWaltonGPU
I'm not sure why anyone would have a lot of faith in Raja. He was the man behind Vega, more or less, and while it wasn't bad it also wasn't great. But drivers are always a concern, with any GPU, and Intel just has a really poor track record. It's improving, but as recently as the Deathloop FSR 2.0 patch that game was having problems on all Intel GPUs. In theory, a driver just translates the high level DirectX calls into code that the GPU can run, but in practice I guess with infinite ways of accomplishing certain tasks, shader compilers and such can be a problem.

keith12 said:
At this point, I'd totally agree with you.
I guess the point I was trying to make was, that with Raja on board and Intel very obviously making a 'real go' at this GPU thing (;)), that there's more to come. I'd imagine he is taking the 'Intel' GPU driver issue seriously. It won't be good enough they keep to their current output of drivers. They will step up and I'm sure over the course of time, have driver certification on par and as regularly as AMD/Nvidia!
Hopefully, this is an early taste of what's to come with driver optimization and regular rollouts of 'Game ready' drivers.
Edit: @JarredWaltonGPU , in terms of general raster performance, would the performance of this card not roughly translate to the same/similar performance across the majority of games? (Incidental variances aside).
I realise that you can't be certain as each gam engine might respond differently, along with architecture differences. But, in general terms?
It's funny, because AMD's CPUs have to be 100% compatible with x86/x86-64, but somehow for graphics it's not quite so clear cut.
On paper, assuming 2.0GHz and higher clocks, Arc A-series GPUs should be pretty competitive. But there are so many architectural nuances that the paper specs can end up being completely meaningless. Like with the ReBAR thing on Arc A380 that people have discovered. I can't see a good reason why lack of ReBAR support on a platform could hurt performance as badly as it does, unless there's just a lot of poor memory management and other stuff going on in the drivers.
Put another way, the A750 that Intel showed benchmarks for has:
- 7% more memory bandwidth than the RTX 3060 (both are 12GB, 192-bit bus, but Intel is clocked at 16Gbps instead of Nvidia's 15Gbps)
- Possibly 3.5% less compute performance than RTX 3060, depending on GPU clocks (and Nvidia's FP32 + FP32/INT32 pipeline split factors in)
If the architectures were perfectly comparable, we'd expect a very slight advantage for Intel if it clocks higher than 2.0GHz, like 2.25GHz for example. But there's absolutely no way that ends up being true, which means drivers and other elements come into play.
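For anyone who wants to verify those two percentages, here is a minimal sketch of the arithmetic. The RTX 3060 figures (3584 shaders, 1.777GHz boost, 15Gbps GDDR6 on a 192-bit bus) are Nvidia's published specs; the Arc A750's 2.0GHz clock remains an assumption:

```python
# Rough check of the two bullet points above. RTX 3060 numbers are Nvidia's
# published specs; the Arc A750's 2.0GHz boost clock is an assumption.
a750_bw = 192 / 8 * 16.0           # 384 GB/s (192-bit bus at 16Gbps)
rtx_bw  = 192 / 8 * 15.0           # 360 GB/s (192-bit bus at 15Gbps)
a750_tf = 3072 * 2 * 2.000 / 1000  # ~12.3 TFLOPS FP32 (assumed 2.0GHz boost)
rtx_tf  = 3584 * 2 * 1.777 / 1000  # ~12.7 TFLOPS FP32 (official boost clock)

print(f"Bandwidth: {a750_bw:.0f} vs {rtx_bw:.0f} GB/s ({a750_bw / rtx_bw - 1:+.1%})")
print(f"FP32 compute: {a750_tf:.2f} vs {rtx_tf:.2f} TFLOPS ({a750_tf / rtx_tf - 1:+.1%})")
# Bandwidth: 384 vs 360 GB/s (+6.7%)
# FP32 compute: 12.29 vs 12.74 TFLOPS (-3.5%)
```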
jp7189
Let's also not forget that game optimization is a two-way street. Game studios have years of experience optimizing for Nvidia and AMD, which means those games will run well even before driver optimization. Intel has a lot to prove before they will merit the same treatment.

JarredWaltonGPU said:
I'm not sure why anyone would have a lot of faith in Raja. He was the man behind Vega, more or less, and while it wasn't bad it also wasn't great. But drivers are always a concern, with any GPU, and Intel just has a really poor track record. It's improving, but as recently as the Deathloop FSR 2.0 patch that game was having problems on all Intel GPUs. In theory, a driver just translates the high level DirectX calls into code that the GPU can run, but in practice I guess with infinite ways of accomplishing certain tasks, shader compilers and such can be a problem.
It's funny, because AMD's CPUs have to be 100% compatible with x86/x86-64, but somehow for graphics it's not quite so clear cut.
On paper, assuming 2.0GHz and higher clocks, Arc A-series GPUs should be pretty competitive. But there are so many architectural nuances that the paper specs can end up being completely meaningless. Like with the ReBAR thing on Arc A380 that people have discovered. I can't see a good reason why lack of ReBAR support on a platform could hurt performance as badly as it does, unless there's just a lot of poor memory management and other stuff going on in the drivers.
Put another way, the A750 that Intel showed benchmarks for has:
7% more memory bandwidth than the RTX 3060 (both are 12GB, 192-bit bus, but Intel is clocked at 16Gbps instead of Nvidia's 15Gbps)
Possibly 3.5% less compute performance than RTX 3060, depending on GPU clocks (and Nvidia's FP32 + FP32/INT32 pipeline split factors in)
If the architectures were perfectly comparable, we'd expect a very slight advantage for Intel if it clocks higher than 2.0GHz, like 2.25GHz for example. But there's absolutely no way that ends up being true, which means drivers and other elements come into play.
-Fran-
Intel should definitely do more of this:
https://www.youtube.com/watch?v=8ENCV4xUjj0
That helps so much in giving nuance to why the GPUs behave like they do. I'd say AMD tries to do this with Mr. Hallock on a smaller scale, I guess?
Anyway, we need more of that, for sure. Good on Intel and kudos.
Regards.
cryoburner
Yeah, looking at the A380 review that Gamers Nexus recently posted, that lower-end card performed rather competitively in F1 2021, outperforming the RX 6400 and not being too far behind the GTX 1650 in that game. But in most other games it performed worse than the 6400, and in the case of GTA5 it performed a lot worse, getting almost half the performance. Where the RX 6400 provided a mostly 60fps experience at 1080p, the A380 only managed a mostly 30fps experience in that game. They suggested that the card tended to underperform in games utilizing older APIs like DX11.

JarredWaltonGPU said:
The question is how these results apply to the thousands of other games that are currently available. I suspect Intel's driver team specifically optimized performance in these five games for the benchmarks. Other games probably perform okay as well. But even AMD and Nvidia have some games where they underperform, and they've been doing "real" GPU drivers a lot longer than Intel. I wouldn't be surprised if a random sampling of 30 demanding games revealed weaker performance from Intel in 75% of them.
F1 2021 was the only game in this chart that GN tested on the A380, but it happened to be the one where it performed the best relative to the competition, which is probably indicative of these games being a best-case scenario, cherry-picked to show the A750 at its most competitive to the 3060. If this card sees similarly unpredictable performance in some games though, like getting half the performance in GTA5, then that's likely to be a major turn-off. Perhaps that's part of why these cards haven't been released yet though, as Intel attempts to get the drivers in a somewhat more optimized state. I question whether that's something they can manage in a matter of months though.
None of Intel's benchmarks used ray tracing effects, despite Control, Cyberpunk 2077, and Fortnite supporting DXR (DirectX Raytracing). We'd really love to see some ray tracing benchmarks for Arc, if only to satisfy our own curiosity.
Logic would dictate that if Intel is keeping quiet about RT performance, it most likely compares unfavorably against at least Nvidia's implementation. AMD wasn't really talking much about the performance of their RT implementation during the run-up to the launch of the RX 6000 series either, and it predictably ended up under-performing compared to the competition. It's also possible that the cards may perform better at some RT effects than at others though. And perhaps their RT implementation just requires more driver work before they are willing to show it.