To hear Intel tell it, the A750 is the better value of the two new Arc graphics cards. I'm not convinced that's the real story. Yes, by a pure FPS per dollar metric, the A750 beats the A770. But that might not be the best way of looking at things.
Your graphics card doesn't exist in a vacuum — it has to go into a PC. So the cost of the rest of the PC should certainly be considered, or at least some fraction of the cost. Then there are things like the extra VRAM on the A770 LE, which definitely helped smooth out performance in at least a few of the benchmarks, not to mention driver concerns.
But let's go ahead and put together a value chart. I'll use FPS per dollar for the graphics card, with performance being the geometric mean of the 1080p and 1440p standard results and the 1080p medium ray tracing result. That eliminates any GPU that doesn't support ray tracing, and I also dropped cards that aren't currently being sold brand-new at retail. (A short sketch of the math follows the table.) Here are the results:
Rank | Graphics Card | Retail Price (Sept. 2022) | Overall Perf (fps) | Perf/$ |
---|---|---|---|---|
1 | Intel Arc A750 | $289 | 68.7 | 0.2378 |
2 | Radeon RX 6600 | $248 | 57.7 | 0.2326 |
3 | Radeon RX 6650 XT | $300 | 69.6 | 0.2319 |
4 | Intel Arc A770 | $349 | 78.1 | 0.2239 |
5 | GeForce RTX 2060 | $237 | 52.8 | 0.2223 |
6 | GeForce RTX 3060 Ti | $400 | 88.8 | 0.2221 |
7 | Radeon RX 6600 XT | $310 | 67.9 | 0.2191 |
8 | Radeon RX 6700 XT | $420 | 84.0 | 0.2000 |
9 | GeForce RTX 2060 Super | $320 | 61.7 | 0.1928 |
10 | Radeon RX 6750 XT | $470 | 89.1 | 0.1896 |
11 | Intel Arc A380 | $140 | 26.3 | 0.1879 |
12 | GeForce RTX 3070 | $520 | 97.3 | 0.1871 |
13 | Radeon RX 6800 XT | $600 | 111.4 | 0.1857 |
14 | GeForce RTX 3060 | $369 | 68.0 | 0.1840 |
15 | Radeon RX 6800 | $550 | 99.9 | 0.1817 |
16 | GeForce RTX 3050 | $290 | 49.5 | 0.1708 |
17 | GeForce RTX 3070 Ti | $599 | 102.3 | 0.1707 |
18 | GeForce RTX 2070 | $380 | 64.6 | 0.1701 |
19 | Radeon RX 6900 XT | $700 | 117.1 | 0.1673 |
20 | GeForce RTX 3080 12GB | $750 | 124.7 | 0.1662 |
21 | GeForce RTX 3080 | $730 | 116.7 | 0.1599 |
22 | Radeon RX 6500 XT | $160 | 24.8 | 0.1548 |
23 | GeForce RTX 3080 Ti | $821 | 123.9 | 0.1509 |
24 | GeForce RTX 3090 | $950 | 127.0 | 0.1337 |
25 | Radeon RX 6950 XT | $950 | 125.8 | 0.1324 |
26 | GeForce RTX 3090 Ti | $1,100 | 133.9 | 0.1217 |
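For anyone who wants to check the math, here's a minimal sketch of how the chart's two computed columns are derived (Python; the per-test fps numbers in the example are hypothetical placeholders, not our actual benchmark data):

```python
from math import prod

def overall_perf(fps_results):
    """Geometric mean of the per-test results: 1080p standard,
    1440p standard, and 1080p medium ray tracing in this article."""
    return prod(fps_results) ** (1 / len(fps_results))

def perf_per_dollar(fps_results, price):
    """The chart's Perf/$ column: overall perf divided by retail price."""
    return overall_perf(fps_results) / price

# Hypothetical per-test fps values for a card priced at $289.
# With the A750's actual (rounded) chart values, 68.7 / 289 is roughly 0.238.
print(round(perf_per_dollar([110.0, 72.0, 40.0], 289), 4))
```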
And there you have it: the Intel Arc A750 is the best value graphics card right now, at least by some rather questionable math. What if you factor in a system cost of $200? That puts the RTX 3060 Ti into the top slot, with the Arc A770 in second place. A $400 system cost would make the RX 6800 XT the top value, followed by the 3060 Ti and the 3080 12GB; Intel's Arc A770 lands in seventh place, with the A750 in 13th.
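To make those scenarios concrete: factoring in a system cost just means adding a fixed amount to the denominator before dividing. A quick extension of the sketch above (the function name and parameter are mine, not anything the chart defines), using the chart's numbers for the two cards that top the $200 scenario:

```python
def system_perf_per_dollar(overall_perf, card_price, system_cost):
    """Value metric when some share of the rest of the PC is counted too."""
    return overall_perf / (card_price + system_cost)

# Chart values with a $200 system cost added to each card's price:
print(system_perf_per_dollar(88.8, 400, 200))  # RTX 3060 Ti: ~0.148
print(system_perf_per_dollar(78.1, 349, 200))  # Arc A770:    ~0.142
```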
The point isn't that only one of the above value ranking scenarios is the right way of doing things, but that there are lots of possibilities. If keeping costs down is your most important criterion for a graphics card, and assuming you don't already have something, AMD's Navi 23 GPUs are tough to beat. Still, by some metrics, the A750 certainly warrants consideration.
The real benefit of the Intel Arc A750 entering the market is that it will pressure AMD and Nvidia to lower prices on other cards. Maybe we all would have appreciated it more last year, when even older cards like the GTX 1050 Ti were selling for $300 or more, but more competition in the GPU space is a good thing.
Intel doesn't get a free pass on its drivers, though, which is something any gamer seriously considering an Arc GPU needs to factor into the buying decision. Arc could be the best value proposition and then some, but if Intel doesn't keep improving the drivers, you could end up with a lemon.
Hopefully that won't be the case. Intel says it's already working hard on the next generation of Arc GPUs, codenamed Battlemage. However, it has also learned a lesson about promising things too early, so there's no firm release date for Battlemage yet. Maybe the first parts will arrive next year and tackle the high-end segment as well, but if performance increases, it's a safe bet pricing will too.
For now, we can't help but be a bit excited about Arc. New GPU architectures normally only come along once every two years, and we're due for AMD and Nvidia updates in the very near future. Intel might be the 800-pound gorilla of the CPU world, but when it comes to graphics, it's definitely the underdog. We'll see if it can carve out some territory in the budget and midrange segments, and maybe even keep AMD and Nvidia a bit more "honest" about pricing.
- MORE: Best Graphics Cards
- MORE: GPU Benchmarks and Hierarchy
- MORE: All Graphics Content
Jarred Walton is a senior editor at Tom's Hardware focusing on everything GPU. He has been working as a tech journalist since 2004, writing for AnandTech, Maximum PC, and PC Gamer. From the first S3 Virge '3D decelerators' to today's GPUs, Jarred keeps up with all the latest graphics trends and is the one to ask about game performance.
- cknobman: Performance numbers better than expected. Power usage and temperatures are less than desired.
- tennis2: TBH, not an unexpected outcome for their first product. The DX12 emulation was a strange choice: forward-thinking, sure, but not at that much cost to older games they know reviewers are still testing on. I was wishing/hoping Intel's R&D budget could've gotten a little closer to market parity (I'm sure they were too, for pricing's sake), but I don't know what their R&D budget was for this project. Seems like their experience in iGPU R&D could've been better extrapolated into discrete cards, but apparently not.
  My biggest concern is future support. They said they're committed to dGPUs, but this product line clearly didn't live up to their expectations. Unless we're all being horribly lied to on GPU pricing, it doesn't seem like Intel is making much (if any) money on the A750/A770, and certainly not as much as they'd hoped. If next gen is a flop too... who knows, maybe they call it quits. Then what? Would they still provide driver updates? For how long?
  I do wonder what percentage of games released in the past two years (say, the top 100 from each year) are DX12.
- JarredWaltonGPU (replying to tennis2): Intel will continue to do integrated graphics for sure. That means they'll still make drivers. But will they keep up with changes on the dGPU side if they pull out? Probably not.
  I don't really think they're going to ax the GPU division, though. Intel needs high-density compute, just like Nvidia needs its own CPU. There are big enterprise markets that Intel has been locked out of for years due to not having a proper solution. Larrabee was supposed to be that option, but when it morphed into Xeon Phi and then eventually got axed, Intel needed a different alternative. And x86 compatibility on something like a GPU (or Xeon Phi) is going to be more of a curse than a blessing.
  I really do want Intel to stay in the GPU market. Having a third competitor will be good. Hopefully Battlemage rights many of the wrongs in Alchemist.
- InvalidError: About the same performance per dollar as far more mature options in the same pricing brackets; not really worth bothering with unless you wish to own a small piece of computing history.
- Giroro: So what does the perf/$ chart look like without the ray tracing results included?
  I mean, I love Control and everything, but I've been done with it for years. I googled "upcoming ray tracing games" and the top result was still that original list from 2019.
  There are so few noteworthy RT games that I'm surprised Intel and the next-gen cards are even bothering to support it.
  Also, I'm not really understanding how the hypothetical system cost that was discussed would be factored into the math.
- InvalidError (replying to Giroro): Chicken-and-egg problem: game developers don't want to bother with RT because most people don't have RT-capable hardware, and hardware designers limit their emphasis on RT for cost-saving reasons since very little software will use it in the foreseeable future.
  As more affordable yet sufficiently powerful RT hardware becomes capable of pushing 60+ FPS at FHD or higher resolutions, we'll see more games using it.
  It was the same story with pixel/vertex shaders and unified shaders: it took a while for software developers to migrate from hard-wired T&L to shaders, and now fixed-function T&L hardware is deprecated.
  Give it another 5-7 years and we'll likely get new APIs designed with RT as the primary render flow.
- drajitsh (@JarredWaltonGPU): Hi, I have some questions and a request. Does this card support PCIe 3.0 x16? And for low-end GPUs like this, could you also test on a lower-end platform like my Ryzen 5700G? That would tell me three things: support for AMD platforms, support for PCIe 3.0, and how the card fares with a low-end CPU.
- krumholzmax: REALLY THIS IS PLENTY GOOD? Drivers not working market try to AMD and NVIDIA BETTER AND COST LEST _ WHY SO BIG CPU ON CARD 5 Years ago by performance. Who will buy it? Other checkers say all about this j...
- boe rhae (quoting krumholzmax): I have absolutely no idea what this says.
- ohio_buckeye: I don't need a card at the moment since I've got a 6700 XT, but the new Intel cards are interesting. If Intel sticks with them, I might consider buying one on my next upgrade, if they're decent, to help a third player stay in the market.