For our power, temperature, clock speed, and fan speed testing, we use Powenetics hardware and software. We capture in-line GPU power consumption by collecting data while looping Metro Exodus at 1440p ultra (the setting we use for midrange and higher GPUs). We also test with the FurMark stress test at 1600x900. Our power testing PC uses an open testbed, which is required for all the extra wires and the riser card, and it's built around the same Core i9-9900K we've used for the past several years.
As noted earlier, we had to install a second graphics card in the PC to get the Arc GPUs to POST, but we disabled that card in Device Manager, and our equipment doesn't capture power use from the other PCIe slot. GPU-Z, meanwhile, monitors fan speeds, clock speeds, and other aspects of the cards.
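For those curious how the per-game averages reported below are derived from the raw capture, here's a minimal sketch of that kind of post-processing. It assumes a hypothetical CSV log named power_log.csv with a "watts" column, not the actual Powenetics capture format:

```python
# Hypothetical example: average per-sample GPU power readings from a CSV log.
# The file name and column names are assumptions for illustration; the real
# Powenetics tooling and log format may differ.
import csv

def average_power(path: str) -> float:
    """Return the mean of the 'watts' column in a simple CSV power log."""
    readings = []
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            readings.append(float(row["watts"]))
    if not readings:
        raise ValueError("no samples found in log")
    return sum(readings) / len(readings)

if __name__ == "__main__":
    print(f"Average board power: {average_power('power_log.csv'):.1f} W")
```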
Despite having slower clocks and less memory, the Arc A750 used only slightly less power than the A770 Limited Edition in Metro Exodus, and in FurMark it actually used more. That's probably down to binning: the A770 gets the best chips from a wafer, while the chips used in the A750 may be harvested parts with defects or other less desirable characteristics.
The A750 used 232W on average in FurMark and 210W in Metro Exodus. That might not seem like a big deal on its own, but look at Nvidia's RTX 3060 Ti and 3070. Both easily surpass the Arc GPUs in performance, yet the 3070 uses about the same amount of power and the 3060 Ti uses less. It's probably a case of Intel not being quite as familiar with GPU tuning as Nvidia and AMD, at least for this first generation of dedicated GPUs.
As noted in the A770 LE review, idle power draw on the Arc GPUs was also quite a bit higher than the competition's. Where the RTX 3060 used 15W and the RX 6650 XT only needed 7.3W on average while idle, we measured 45.1W on the A750 LE. Intel says it's aware of the issue and is working to address it, so hopefully a future driver update can curb the idle power use. Related to this, the fans on the Arc cards generally don't stop fully, or at least don't stay stopped, because of the higher idle power draw.
Clock speeds for the A750 are well above the advertised 2050 MHz boost clock, with FurMark averaging over 2.3 GHz and Metro getting close to 2.4 GHz. It's also interesting that the A750 clocked higher than the A770 in Metro.
Binning of the chips also appears to be a factor in the temperatures and fan speeds we recorded. The A750 ran slightly hotter than the A770 in both FurMark and Metro while also running higher fan speeds. Perhaps our particular A750 is a relatively poor sample, or our A770 is a cherry-picked sample, but it certainly looks like the A770 behaves better across most workloads.
We also test noise levels with an SPL (sound pressure level) meter aimed between the two fans at a distance of about 10cm. That helps minimize the impact of the CPU cooling fans (the extra graphics card had its fans stopped and wasn't a factor).
We measured idle noise of 31.3 dB(A), which is basically the limit of our SPL meter (anything below 30 dB(A) is out of range). The Arc GPUs support stopping their fans completely when the chips are cool enough, but idle power is a bit high (around 50W right now), which results in the fans almost always spinning. Under load, the A750 got a bit louder than the A770, at 47.8 dB(A) compared to 46.0 dB(A).
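As a rough aside (our own back-of-the-envelope figuring, using the standard sound pressure level definition rather than any perceptual loudness model), that 1.8 dB(A) gap corresponds to roughly 23% higher sound pressure:

$$\frac{p_{\text{A750}}}{p_{\text{A770}}} = 10^{(47.8 - 46.0)/20} = 10^{0.09} \approx 1.23$$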
- MORE: Best Graphics Cards
- MORE: GPU Benchmarks and Hierarchy
- MORE: All Graphics Content
Jarred Walton is a senior editor at Tom's Hardware focusing on everything GPU. He has been working as a tech journalist since 2004, writing for AnandTech, Maximum PC, and PC Gamer. From the first S3 Virge '3D decelerators' to today's GPUs, Jarred keeps up with all the latest graphics trends and is the one to ask about game performance.
cknobman Performance numbers better than expected.
Power usage and temperatures are less than desired.
tennis2 TBH, not an unexpected outcome for their first product. The DX12 emulation was a strange choice, forward-thinking sure, but not at that much cost to older games they know reviewers are still testing on. Was wishing/hoping Intel's R&D budget could've gotten a little closer to market parity (I'm sure they did also for pricing) but I don't know what their R&D budget was for this project. Seems like their experience in IGP R&D could've been better extrapolated into discrete cards, but apparently not.
My biggest concern is future support. They said they're committed to dGPUs, but this product line clearly didn't live up to their expectations. Unless we're all being horribly lied to on GPU pricing, it doesn't seem like Intel is making much/any money on the A750/770. Certainly not as much as they'd hoped. If next gen is a flop also.....who knows, maybe they call it quits. Then what? Would they still provide driver updates? For how long?
I do wonder what % of games released in the past 2 years (say top 100 from each year) are DX12....
JarredWaltonGPU
Intel will continue to do integrated graphics for sure. That means they'll still make drivers. But will they keep up with changes on the dGPU side if they pull out? Probably not.
I don't really think they're going to ax the GPU division, though. Intel needs high density compute, just like Nvidia needs its own CPU. There are big enterprise markets that Intel has been locked out of for years due to not having a proper solution. Larrabee was supposed to be that option, but when it morphed into Xeon Phi and then eventually got axed, Intel needed a different alternative. And x86 compatibility on something like a GPU (or Xeon Phi) is going to be more of a curse than a blessing.
I really do want Intel to stay in the GPU market. Having a third competitor will be good. Hopefully Battlemage rights many of the wrongs in Alchemist.
InvalidError About the same performance per dollar as far more mature options in the same pricing brackets, not really worth bothering with unless you wish to own a small piece of computing history.
Giroro So what's the perf/$ chart look like without Ray Tracing results included?
I mean I love Control and everything, but I've been done with it for years. I googled "upcoming ray tracing games" and the top result was still that original list from 2019.
There are so few noteworthy RT games that I'm surprised Intel and the next gen cards are even bothering to support it.
Also, I'm not really understanding how the hypothetical system cost that was discussed would be factored into the math.
InvalidError
Chicken-and-egg problem: game developers don't want to bother with RT because most people don't have RT-capable hardware, while hardware designers limit their emphasis on RT for cost-saving reasons since very little software will be using it in the foreseeable future.
As more affordable yet sufficiently powerful RT hardware becomes capable of pushing 60+ FPS at FHD or higher resolutions, we'll see more games using it.
It was the same story with pixel/vertex shaders and unified shaders. It took a while for software developers to migrate from hard-wired T&L to shaders, but give it a few years and now fixed-function T&L hardware is deprecated.
Give it another 5-7 years and we'll likely get new APIs designed with RT as the primary render flow.
drajitsh
@JarredWaltonGPU
Admin said: The Intel Arc A750 goes after the sub-$300 market with compelling performance and features, with a slightly trimmed down design compared to the A770. We've tested Intel's new value oriented wunderkind and found plenty to like.
Intel Arc A750 Limited Edition Review: RTX 3050 Takedown : Read more
Hi, I have some questions and a request.
Does this card support PCIe 3.0 x16?
For a low-end GPU like this, could you also test with a low-end CPU like my Ryzen 5700G? That would tell me three things: support for AMD, support for PCIe 3.0, and how usable it is with a low-end CPU.
krumholzmax REALLY THIS IS PLENTY GOOD? Drivers not working market try to AMD and NVIDIA BETTER AND COST LEST _ WHY SO BIG CPU ON CARD 5 Years ago by performance. Who will buy it? Other checkers say all about this j...
boe rhae
I have absolutely no idea what this says.
ohio_buckeye I don't need a card at the moment since I've got a 6700xt, but the new intel cards are interesting. If they stay around with them, I might consider a purchase of one on my next upgrade if they are decent, to help a 3rd player stay in.