Tom's Hardware Verdict
The Intel Arc A750 makes the RTX 3050 look pathetic and even gives some tough competition to the 3060, though AMD's RX 6600 makes Intel's offering a little less enticing. Still, it's good to see competition from Intel in the GPU space.
Pros
- Strong 1080p performance for a reasonable price
- Minimalist design eschews gaudy RGB lighting
- Excellent media capabilities
Cons
- Potential driver pitfalls
- Only 8GB of VRAM
- Not particularly power efficient
All aboard the Intel Arc! Or Ark? Intel's first true dedicated graphics cards are here, and we've got the Arc A750 review along with the Arc A770 review. Intel has set its sights on the value midrange market with the A750, hoping to earn a spot among the best graphics cards. We've thoroughly tested the card and have come away far more impressed than we were with the budget-friendly Arc A380.
This is a companion review to the A770, so there's a lot of additional detail in that article. You can also check out our deep dive into the Intel Arc Alchemist architecture. For the A750, we'll skip straight to the important bits and render our verdict.
Intel Arc A750 Specifications
Here's the quick rundown of the full Intel Arc desktop card lineup. The A580 hasn't launched yet, so we're missing some details like its price, but everything else is accounted for. Well, almost everything: the A750 and A770 don't actually go on sale until next week, on October 12 — right alongside the Nvidia RTX 4090, which promises gobs more performance for slightly more than five times the price of the Arc A750. It's going to be a tough decision, we know!
Graphics Card | Arc A770 16GB | Arc A770 8GB | Arc A750 | Arc A580 | Arc A380 |
---|---|---|---|---|---|
Architecture | ACM-G10 | ACM-G10 | ACM-G10 | ACM-G10 | ACM-G11 |
Process Technology | TSMC N6 | TSMC N6 | TSMC N6 | TSMC N6 | TSMC N6 |
Transistors (Billion) | 21.7 | 21.7 | 21.7 | 21.7 | 7.2 |
Die size (mm^2) | 406 | 406 | 406 | 406 | 157 |
Xe-Cores | 32 | 32 | 28 | 24 | 8 |
GPU Shaders | 4096 | 4096 | 3584 | 3072 | 1024 |
Matrix Cores | 512 | 512 | 448 | 384 | 128 |
Ray Tracing Units | 32 | 32 | 28 | 24 | 8 |
Boost Clock (MHz) | 2100 | 2100 | 2050 | 1700 | 2000 |
VRAM Speed (Gbps) | 17.5 | 16 | 16 | 16 | 15.5 |
VRAM (GB) | 16 | 8 | 8 | 8 | 6 |
VRAM Bus Width (bits) | 256 | 256 | 256 | 256 | 96 |
L2 Cache (MB) | 16 | 16 | 16 | 16 | 6 |
ROPs | 128 | 128 | 128 | 128 | 32 |
TMUs | 256 | 256 | 224 | 192 | 64 |
TFLOPS FP32 | 17.2 | 17.2 | 14.7 | 10.4 | 4.1 |
TFLOPS FP16 (INT8 TOPS) | 138 (275) | 138 (275) | 118 (235) | 84 (167) | 33 (66) |
Bandwidth (GB/s) | 560 | 512 | 512 | 512 | 186 |
TDP (watts) | 225 | 225 | 225 | 175 | 75 |
Launch Date | October 2022 | October 2022 | October 2022 | ? | June 2022 |
Launch Price | $349 | $329 | $289 | ? | $139 |
The Intel Arc A750 follows the familiar pattern of taking the same GPU and core design of a more expensive model and then trimming down some features. The A750 disables four of the potential 32 Xe-Cores and has a slightly lower boost clock, giving it about 85% of the theoretical compute of the A770. It also has half the memory of the 16GB A770 Limited Edition, clocked at 16 Gbps instead of 17.5 Gbps, so that's 9% less memory bandwidth. Memory capacity is our bigger concern.
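If you want to sanity check those ratios, here's a minimal back-of-the-envelope sketch. It assumes the usual shaders-times-two-FLOPS-per-clock convention and uses only the bus widths, clocks, and memory speeds from the table above; Python is used purely for illustration:

```python
# Back-of-the-envelope check of the A750 vs. A770 theoretical specs,
# using only the numbers from the spec table above.

def fp32_tflops(shaders, boost_mhz):
    # Each shader can do one FMA (2 FLOPS) per clock.
    return shaders * 2 * boost_mhz * 1e6 / 1e12

def bandwidth_gb_s(bus_width_bits, vram_gbps):
    # Bus width (bits) times per-pin data rate (Gbps), divided by 8 bits/byte.
    return bus_width_bits * vram_gbps / 8

a770 = fp32_tflops(4096, 2100)  # ~17.2 TFLOPS
a750 = fp32_tflops(3584, 2050)  # ~14.7 TFLOPS
print(f"A750 compute vs. A770: {a750 / a770:.0%}")             # ~85%

a770_bw = bandwidth_gb_s(256, 17.5)  # 560 GB/s
a750_bw = bandwidth_gb_s(256, 16.0)  # 512 GB/s
print(f"A750 bandwidth deficit: {1 - a750_bw / a770_bw:.0%}")  # ~9%
```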
8GB of VRAM was great back in 2016 when the GTX 1070 and 1080 launched. However, six years later, we're not quite as keen on only having 8GB of memory. Granted, the RX 6600-series cards from AMD are all packing 8GB, and everything from the RTX 3050 through the RTX 3070 Ti — with the exception of the RTX 3060 — also has 8GB. But one look at where games are heading with VRAM use, and we can't help but think the A770 16GB is probably worth the extra $60. The extra memory could definitely help big brother leave his little sibling sucking wind.
Looking at the direct competition, which, based on current GPU prices, would be the AMD RX 6600 or RX 6650 XT and the Nvidia RTX 3050, things are a bit messy. Let's just get this out of the way and say that the RTX 3050 ends up hopelessly outclassed. That was already true with the RX 6600, and the Arc A750 can pour some salt into the wound. But the AMD cards aren't going to roll over so easily.
In short, AMD promises good performance and excellent value for people who aren't worried about ray tracing or fancy-schmancy AI upscaling technologies. On the other hand, Intel and Nvidia offer much better ray tracing performance, along with matrix cores that can boost machine learning and AI performance.
With 28 Xe-Cores and a nominal 2050 MHz boost clock — and we say "nominal" because, as you'll see later, the Arc A750 easily exceeded that mark in our testing — Intel offers 14.7 teraflops of graphics compute performance. For deep learning and AI workloads, the A750 can perform 118 teraflops of FP16 calculations, or double that for INT8 work with 235 teraops of compute. Don't worry too much about the loss in precision, as companies like Google and Facebook have shown that 8-bit precision is sufficient for much of that work.
By comparison, AMD's RX 6650 XT has 10.8 teraflops of compute, while the RTX 3050 putters along with a meager 9.0 teraflops. Of course, that's on paper, and you can't just look at theoretical numbers to determine a winner — we've seen AMD's RDNA 2 chips punch well above their theoretical specs over the past two years. And that's why we run the benchmarks.
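For the curious, here's a rough sketch of where those paper numbers come from. The 8x FP16 and 2x INT8 multipliers are inferred from Intel's published XMX figures in the table above, and the competitor shader counts and boost clocks (roughly 2,048 at 2,635 MHz for the RX 6650 XT and 2,560 at 1,777 MHz for the RTX 3050) are assumptions pulled from AMD's and Nvidia's public specs rather than anything in this review:

```python
# Rough derivation of the "paper" throughput numbers quoted above.

def fp32_tflops(shaders, boost_mhz):
    # shaders * 2 ops per clock (FMA) * clock speed
    return shaders * 2 * boost_mhz * 1e6 / 1e12

# Intel Arc A750: 3,584 shaders at the nominal 2,050 MHz boost clock.
a750_fp32 = fp32_tflops(3584, 2050)  # ~14.7 TFLOPS
a750_fp16 = a750_fp32 * 8            # XMX FP16 rate: ~118 TFLOPS (8x FP32, per the table)
a750_int8 = a750_fp16 * 2            # XMX INT8 rate: ~235 TOPS (2x FP16)

# Competition (shader counts and boost clocks assumed from public specs):
rx_6650_xt = fp32_tflops(2048, 2635)  # ~10.8 TFLOPS
rtx_3050 = fp32_tflops(2560, 1777)    # ~9.1 TFLOPS

print(f"A750: {a750_fp32:.1f} FP32 TFLOPS, {a750_fp16:.0f} FP16 TFLOPS, {a750_int8:.0f} INT8 TOPS")
print(f"RX 6650 XT: {rx_6650_xt:.1f} TFLOPS | RTX 3050: {rtx_3050:.1f} TFLOPS")
```

As the article notes, these are theoretical peaks; architectural efficiency, drivers, and memory behavior decide how much of that paper advantage shows up in actual games.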
- MORE: Best Graphics Cards
- MORE: GPU Benchmarks and Hierarchy
- MORE: All Graphics Content
Jarred Walton is a senior editor at Tom's Hardware focusing on everything GPU. He has been working as a tech journalist since 2004, writing for AnandTech, Maximum PC, and PC Gamer. From the first S3 Virge '3D decelerators' to today's GPUs, Jarred keeps up with all the latest graphics trends and is the one to ask about game performance.
cknobman Performance numbers better than expected.
Power usage and temperatures are less than desired.
tennis2 TBH, not an unexpected outcome for their first product. The DX12 emulation was a strange choice, forward-thinking sure, but not at that much cost to older games they know reviewers are still testing on. Was wishing/hoping Intel's R&D budget could've gotten a little closer to market parity (I'm sure they did also for pricing) but I don't know what their R&D budget was for this project. Seems like their experience in IGP R&D could've been better extrapolated into discrete cards, but apparently not.
My biggest concern is future support. They said they're committed to dGPUs, but this product line clearly didn't live up to their expectations. Unless we're all being horribly lied to on GPU pricing, it doesn't seem like Intel is making much/any money on the A750/770. Certainly not as much as they'd hoped. If next gen is a flop also.....who knows, maybe they call it quits. Then what? Would they still provide driver updates? For how long?
I do wonder what % of games released in the past 2 years (say top 100 from each year) are DX12....
JarredWaltonGPU
Intel will continue to do integrated graphics for sure. That means they'll still make drivers. But will they keep up with changes on the dGPU side if they pull out? Probably not.
I don't really think they're going to ax the GPU division, though. Intel needs high density compute, just like Nvidia needs its own CPU. There are big enterprise markets that Intel has been locked out of for years due to not having a proper solution. Larrabee was supposed to be that option, but when it morphed into Xeon Phi and then eventually got axed, Intel needed a different alternative. And x86 compatibility on something like a GPU (or Xeon Phi) is going to be more of a curse than a blessing.
I really do want Intel to stay in the GPU market. Having a third competitor will be good. Hopefully Battlemage rights many of the wrongs in Alchemist.
InvalidError About the same performance per dollar as far more mature options in the same pricing brackets, not really worth bothering with unless you wish to own a small piece of computing history.
Giroro So what's the perf/$ chart look like without Ray Tracing results included?
I mean I love Control and everything, but I've been done with it for years. I googled "upcoming ray tracing games" and the top result was still that original list from 2019.
There's so few noteworthy RT games, that I'm surprised that Intel and the next gen cards are even bothering to support it.
Also, I'm not really understanding how the hypothetical system cost that was discussed would be factored into the math.
InvalidError
Chicken-and-egg problem: game developers don't want to bother with RT because most people don't have RT-capable hardware, and hardware designers limit emphasis on RT for cost-saving reasons since very little software will be using it in the foreseeable future.
As more affordable yet sufficiently powerful RT hardware becomes capable of pushing 60+ FPS at FHD or higher resolutions, we'll see more games using it.
It was the same story with pixel/vertex shaders and unified shaders. It took a while for software developers to migrate from hard-wired T&L to shaders, but give it a few years and now fixed-function T&L hardware is deprecated.
Give it another 5-7 years and we'll likely get new APIs designed with RT as the primary render flow.
drajitsh
@JarredWaltonGPU
Admin said: The Intel Arc A750 goes after the sub-$300 market with compelling performance and features, with a slightly trimmed down design compared to the A770. We've tested Intel's new value oriented wunderkind and found plenty to like.
Hi, I have some questions and a request
Does this support PCIe 3.0 x16?
For the low-end GPU reviews, could you also test with a low-end CPU like my Ryzen 5700G? That would tell me three things: support for AMD platforms, support for PCIe 3.0, and whether the card is usable with a low-end CPU.
krumholzmax REALLY THIS IS PLENTY GOOD? Drivers not working market try to AMD and NVIDIA BETTER AND COST LEST _ WHY SO BIG CPU ON CARD 5 Years ago by performance. Who will buy it? Other checkers say all about this j...
boe rhae
krumholzmax said: REALLY THIS IS PLENTY GOOD? Drivers not working market try to AMD and NVIDIA BETTER AND COST LEST _ WHY SO BIG CPU ON CARD 5 Years ago by performance. Who will buy it? Other checkers say all about this j...
I have absolutely no idea what this says. -
ohio_buckeye I don't need a card at the moment since I've got a 6700xt, but the new intel cards are interesting. If they stick with them, I might consider purchasing one on my next upgrade, if they're decent, to help a 3rd player stay in the market.