Intel Arc A750 Limited Edition Review: RTX 3050 Takedown

But AMD's RX 6600 still stands strong


Intel sells its own Arc graphics cards under "Limited Edition" branding. The A750 LE looks nearly identical to the Arc A770 LE, just without the RGB lighting. Some people might miss those lights, though the Intel Arc logo on top of the card still adds a white glow to your PC. Under the hood, of course, there are a few other changes that we've already covered.

The A750 LE has the same industrial design, with two custom 85mm fans. It weighs 1060g and measures 268x110x38mm — if you're keeping score, that's 20g less than the A770 LE, probably mostly due to the missing RGB lighting and diffuser.


For power, the A750 has dual PEG connectors, one 8-pin (rated for 150W) and one 6-pin (rated for 75W). The review samples have black and gray connectors, but we're told the retail products will have the same color on both. Total board power (TBP) is rated at 225W, so the two connectors alone can deliver that, with the potential 75W from the PCIe x16 slot providing headroom for overclocking.
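The power budget works out neatly against the PCIe spec limits. A minimal sketch, using the spec maximums (150W per 8-pin, 75W per 6-pin, 75W from the x16 slot) rather than any measured figures:

```python
# Power-budget sketch for the Arc A750 LE, using PCIe spec maximums.
EIGHT_PIN_W = 150  # 8-pin PEG connector limit
SIX_PIN_W = 75     # 6-pin PEG connector limit
SLOT_W = 75        # PCIe x16 slot limit
TBP_W = 225        # Intel's rated total board power

connector_budget = EIGHT_PIN_W + SIX_PIN_W  # cables alone
total_budget = connector_budget + SLOT_W    # including the slot

print(connector_budget)          # 225 -> covers the rated TBP by itself
print(total_budget - TBP_W)      # 75  -> headroom left for overclocking
```

In other words, the connectors exactly match the 225W TBP, leaving the slot's 75W as margin.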

Intel uses a large vapor chamber for the heatsink, which helps to keep the GPU and memory cool and tends to be more expensive than heatpipe designs. Still, we have to point out that Intel lists the RTX 3060 as the direct competition, a card with a 170W TBP. It's also made on an arguably inferior Samsung 8N (tuned 10nm for Nvidia's use) process, whereas Intel uses TSMC N6. Then again, the GA106 chip in the 3060 only measures 276mm^2 compared to Intel's 406mm^2. So despite targeting relatively similar performance, Intel's card uses more power and is almost certainly more expensive to make.

All of Intel's Arc GPUs support three DisplayPort 2.0 UHBR 10 outputs, plus a single HDMI 2.0b port — you can get a DisplayPort to HDMI 2.1 adapter if you want HDMI 2.1, though even that wouldn't support the full 48 Gbps maximum bandwidth of a proper HDMI 2.1 port. Intel's DP2.0 ports max out at 40 Gbps — still enough for 8K 60 Hz with DSC (Display Stream Compression).
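The bandwidth arithmetic shows why DSC is required for 8K 60 Hz on a 40 Gbps link. A rough sketch, ignoring blanking intervals and link encoding overhead (so the real uncompressed requirement is even higher), and assuming a typical DSC target of 12 bpp:

```python
# Back-of-the-envelope check: 8K60 vs. a 40 Gbps DP 2.0 UHBR 10 link.
# Ignores blanking and 128b/132b encoding overhead.
WIDTH, HEIGHT, HZ = 7680, 4320, 60
BPP_RGB = 24   # 8 bits per channel, uncompressed
BPP_DSC = 12   # assumed DSC target (2:1 compression)
LINK_GBPS = 40

raw_gbps = WIDTH * HEIGHT * HZ * BPP_RGB / 1e9
dsc_gbps = WIDTH * HEIGHT * HZ * BPP_DSC / 1e9

print(round(raw_gbps, 1))  # 47.8 -> exceeds the 40 Gbps link
print(round(dsc_gbps, 1))  # 23.9 -> fits comfortably with DSC
```

Even before overhead, uncompressed 8K60 overshoots the link, while DSC brings it well under the limit.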

I personally like the look and design of Intel's Arc Limited Edition cards. They're understated and don't scream for attention, and the cooling setup looks reasonably capable. However, if you love RGB lighting, you'll probably be more interested in the third-party cards from the likes of ASRock and Gunnir, and Acer is also joining the graphics card market.

Is that because Acer has always wanted to make a graphics card, or is it because Intel needs a big OEM partner to help move Arc GPUs? Perhaps a little of both, though we suspect the latter was a bigger factor.

Jarred Walton

Jarred Walton is a senior editor at Tom's Hardware focusing on everything GPU. He has been working as a tech journalist since 2004, writing for AnandTech, Maximum PC, and PC Gamer. From the first S3 Virge '3D decelerators' to today's GPUs, Jarred keeps up with all the latest graphics trends and is the one to ask about game performance.

  • cknobman
    Performance numbers better than expected.
    Power usage and temperatures are less than desired.
    Reply
  • tennis2
    TBH, not an unexpected outcome for their first product. The DX12 emulation was a strange choice, forward-thinking sure, but not at that much cost to older games they know reviewers are still testing on. Was wishing/hoping Intel's R&D budget could've gotten a little closer to market parity (I'm sure they did also for pricing) but I don't know what their R&D budget was for this project. Seems like their experience in IGP R&D could've been better extrapolated into discrete cards, but apparently not.

    My biggest concern is future support. They said they're committed to dGPUs, but this product line clearly didn't live up to their expectations. Unless we're all being horribly lied to on GPU pricing, it doesn't seem like Intel is making much/any money on the A750/770. Certainly not as much as they'd hoped. If next gen is a flop also.....who knows, maybe they call it quits. Then what? Would they still provide driver updates? For how long?

    I do wonder what % of games released in the past 2 years (say top 100 from each year) are DX12....
    Reply
  • JarredWaltonGPU
    tennis2 said:
    TBH, not an unexpected outcome for their first product. The DX12 emulation was a strange choice, forward-thinking sure, but not at that much cost to older games they know reviewers are still testing on. Was wishing/hoping Intel's R&D budget could've gotten a little closer to market parity (I'm sure they did also for pricing) but I don't know what their R&D budget was for this project. Seems like their experience in IGP R&D could've been better extrapolated into discrete cards, but apparently not.

    My biggest concern is future support. They said they're committed to dGPUs, but this product line clearly didn't live up to their expectations. Unless we're all being horribly lied to on GPU pricing, it doesn't seem like Intel is making much/any money on the A750/770. Certainly not as much as they'd hoped. If next gen is a flop also.....who knows, maybe they call it quits. Then what? Would they still provide driver updates? For how long?

    I do wonder what % of games released in the past 2 years (say top 100 from each year) are DX12....
    Intel will continue to do integrated graphics for sure. That means they'll still make drivers. But will they keep up with changes on the dGPU side if they pull out? Probably not.

    I don't really think they're going to ax the GPU division, though. Intel needs high density compute, just like Nvidia needs its own CPU. There are big enterprise markets that Intel has been locked out of for years due to not having a proper solution. Larrabee was supposed to be that option, but when it morphed into Xeon Phi and then eventually got axed, Intel needed a different alternative. And x86 compatibility on something like a GPU (or Xeon Phi) is going to be more of a curse than a blessing.

    I really do want Intel to stay in the GPU market. Having a third competitor will be good. Hopefully Battlemage rights many of the wrongs in Alchemist.
    Reply
  • InvalidError
    About the same performance per dollar as far more mature options in the same pricing brackets, not really worth bothering with unless you wish to own a small piece of computing history.
    Reply
  • Giroro
    So what's the perf/$ chart look like without Ray Tracing results included?

    I mean I love Control and everything, but I've been done with it for years. I googled "upcoming ray tracing games" and the top result was still that original list from 2019.
    There's so few noteworthy RT games, that I'm surprised that Intel and the next gen cards are even bothering to support it.

    Also, I'm not really understanding how the hypothetical system cost that was discussed would be factored into the math.
    Reply
  • InvalidError
    Giroro said:
    There's so few noteworthy RT games, that I'm surprised that Intel and the next gen cards are even bothering to support it.
    Chicken-and-egg problem: game developers don't want to bother with RT because most people don't have RT-capable hardware, hardware designers limit emphasis on RT for cost-saving reasons since very little software will be using it in the foreseeable future.

    As more affordable yet sufficiently powerful RT hardware becomes capable of pushing 60+FPS at FHD or higher resolutions, we'll see more games using it.

    It was the same story with pixel/vertex shaders and unified shaders. It took a while for software developers to migrate from hard-wired T&L to shaders; give it a few years and now fixed-function T&L hardware is deprecated.

    Give it another 5-7 years and we'll likely get new APIs designed with RT as the primary render flow.
    Reply
  • drajitsh
    Admin said:
    The Intel Arc A750 goes after the sub-$300 market with compelling performance and features, with a slightly trimmed down design compared to the A770. We've tested Intel's new value oriented wunderkind and found plenty to like.

    Intel Arc A750 Limited Edition Review: RTX 3050 Takedown : Read more
    @JarredWaltonGPU
    Hi, I have some questions and a request.
    Does this support PCIe 3.0 x16?
    For low-end GPUs, could you also test with a low-end system like my Ryzen 5700G? That would tell me three things: support for AMD, support for PCIe 3.0, and how the card fares with a low-end CPU.
    Reply
  • krumholzmax
    REALLY THIS IS PLENTY GOOD? Drivers not working market try to AMD and NVIDIA BETTER AND COST LEST _ WHY SO BIG CPU ON CARD 5 Years ago by performance. Who will buy it? Other checkers say all about this j...
    Reply
  • boe rhae
    krumholzmax said:
    REALLY THIS IS PLENTY GOOD? Drivers not working market try to AMD and NVIDIA BETTER AND COST LEST _ WHY SO BIG CPU ON CARD 5 Years ago by performance. Who will buy it? Other checkers say all about this j...

    I have absolutely no idea what this says.
    Reply
  • ohio_buckeye
    I don't need a card at the moment since I've got a 6700 XT, but the new Intel cards are interesting. If Intel sticks with them and they're decent, I might consider purchasing one on my next upgrade to help a third player stay in the market.
    Reply