Intel Demos Arc A770 GPU, Leaves Old APIs In The Dust

Intel Arc A770 (Image credit: Linus Tech Tips)

Intel executives Ryan Shrout and Tom Petersen have dropped some nuggets regarding the chipmaker's Arc Alchemist graphics cards in the latest installment of Linus Tech Tips. The flagship Arc A770, which will compete against the best graphics cards, made a brief appearance, offering a sneak peek of the graphics card's performance.

Petersen explained that Intel has implemented a three-tier strategy for game optimization on its Arc Alchemist graphics cards. Tier 1 covers titles that work exceptionally well with Arc. Tier 2 encompasses titles that are less optimized but built on modern APIs, such as DirectX12 and Vulkan; these games should perform reasonably well. The biggest concern for gamers is tier 3: games built on DirectX11 or earlier APIs, where Arc graphics cards underperform.

For reference, the top five games on Steam, according to Steam Charts, use DirectX11 and fall into Intel's tier 3 classification. The only good news from all this tier talk is that Intel will reportedly price Arc graphics cards based on internal testing of tier 3 games.

We've already seen a few of the cherry-picked tier 1 titles in Intel-provided benchmarks for the Arc A750. The list consists of F1 2021, Cyberpunk 2077, Control, Borderlands 3, and Fortnite. Petersen enthusiastically stated that "we're gonna kill everyone in price to performance" with tier 1 games. The jury is still out on that one, though. For example, the Arc A380 sells for $192 in China, while the Radeon RX 6400 retails for $149.99 and outperforms the Arc A380. It'll be interesting to see what Intel charges for the other Arc models.

Linus estimated that the Arc A770 was pumping out between 50 and 60 FPS in Cyberpunk 2077 on the High preset at 1440p (2560 x 1440). The graphics card also pushed frame rates up to 180 FPS in F1 2021 at "highish" settings at 1440p. Unfortunately, Intel didn't allow Linus to show off any tier 2 games. However, the celebrity YouTuber did provide a comparison between DirectX11 and DirectX12 in Shadow of the Tomb Raider, allowing us to see how much performance Arc leaves on the table with older APIs.

The Arc A770 delivered around 80 FPS in the DirectX12 benchmark. On DirectX11, however, the graphics card could only manage 40 FPS, losing half of its performance on the older API. That's a substantial gap, and it comes down to poor optimization. Sadly, that's the current reality for Arc adopters. We can only hope that Intel's engineers can work out the kinks in the Arc driver software by the time the graphics cards debut.

During the overclocking segment, Linus' video revealed some details on the Arc A770. The graphics card runs at a 2.5 GHz clock speed with a GPU power limit of up to 190W. Applying the 20% GPU Performance Boost profile raised the power limit to 285W and the temperature limit to 125 degrees Celsius. Linus reported smoother gameplay but once again didn't provide any numbers. Next, he tried to push the Arc A770 with the more aggressive 30% profile, but the graphics card crashed.

Intel has confirmed that it has no plans to release an Arc A780, so the Arc A770 will carry the flagship banner. The Arc A770 has the potential to match the GeForce RTX 3070 or Radeon RX 6700 XT. Desktop Arc will hit the shelves in Q3 of this year, so we'll find out soon enough.

Zhiye Liu
RAM Reviewer and News Editor

Zhiye Liu is a Freelance News Writer at Tom’s Hardware US. Although he loves everything that’s hardware, he has a soft spot for CPUs, GPUs, and RAM.

  • wifiburger
    meh, it's going to take Intel forever to optimize those drivers
    That's mostly why I stick with Nvidia, you're guaranteed consistent performance regardless of old,new game.

    Look at AMD, they still haven't matched Nvidia massive team that do per game optimization & testing.

    The only good thing about Intel GPUs will be that it will force AMD to stop matching NVIDIA gpu prices.
    And I would bet you, that's the sole reason why Intel is going after the GPU market, to drop AMD profits.
  • UnCertainty08
    You are correct. That's why I stick to Nvidia as well. Nothing worse than paying top dollar for hardware that doesn't work because of software issue. Also part of the reason I stick to intel CPU's.
  • cryoburner
    wifiburger said:
    meh, it's going to take Intel forever to optimize those drivers
    That's mostly why I stick with Nvidia, you're guaranteed consistent performance regardless of old,new game.
But as the article (and video) pointed out, Intel will supposedly be pricing the cards based on their performance in titles utilizing older APIs where they don't perform as well. So you will theoretically get the performance you are paying for in DX11 and DX9 titles, but get additional performance in DX12/Vulkan. So in titles supporting those newer APIs, particularly ones Intel has optimized for, you might potentially get performance comparable to a higher-tier graphics card than you paid for. "Inconsistent" performance wouldn't be bad if the inconsistencies tend to err on the side of giving you additional performance for your money.

    Another thing to consider is that if their performance in older, higher-overhead APIs is holding the cards back at launch, that leaves them with a lot of room to potentially improve performance in the future. It's possible that the cards could age a lot better than similarly-priced models from the competition. That is, so long as Intel keeps updating and optimizing the drivers for these first-generation cards for years to come. Even without optimizing much for older games though, as newer titles focus on newer APIs, and game developers begin optimizing specifically for the hardware, the performance situation is likely to improve.

    Like I said previously, Intel tends to price their hardware competitively when entering into new markets, and the big price mark-ups on competing cards leaves them with a lot of flexibility to impress in terms of pricing. Of course, there are still many unknowns, like how raytraced effects will perform, and whether the hardware holds potential for that to improve in the future. And also whether features that have become the norm on AMD and Nvidia cards will be present, and function as expected. There could potentially be quirks that make one think twice about trialing these cards. So I would fully expect Intel to price them competitively if they hope to establish a presence in the market.
  • LastStanding
As soon as Intel (and Apple) releases a GPU that matches, or bests, NVIDIA's current best flagship on the market, all the DX9-11 non-supported titles complaints from gamers will not even matter anymore.

    It's not like all these old relic titles' devs/pubs are just going to go back and add support to Intel (nor Apple's M architecture) new infrastructure. 🙄
  • LuxZg
    Actually, as soon as performance in older titles is "good enough" it will stop to matter. Those top 5 Steam titles sound harsh, until you realize one is CS GO (350+ FPS with 3070) other is Dota (250+). So for 1440p even "bad" performance could be good enough for most gamers. And even if the issues persist to Battlemage generation, it will be even less pronounced. Sure some will check FPS and benches and complain, but many will just play. Specially in OEM builds, as most "real gamers" (geeks, techies, the vocal crowd) won't even look at OEM builds. But many casual gamers, yeah, they won't care. I just keep wondering about the prices, because as article says, they aren't delivering vs eg 6400. Prices keeping to tumble, and with next gen in a few months, will they price 770 cards at 300$ or less?

    Oh and BTW, that must be a typo in article with sentence comparing 770 to 3070 and 6500 XT (?)
  • The Historical Fidelity
    wifiburger said:
    meh, it's going to take Intel forever to optimize those drivers
    That's mostly why I stick with Nvidia, you're guaranteed consistent performance regardless of old,new game.

    Look at AMD, they still haven't matched Nvidia massive team that do per game optimization & testing.

    The only good thing about Intel GPUs will be that it will force AMD to stop matching NVIDIA gpu prices.
    And I would bet you, that's the sole reason why Intel is going after the GPU market, to drop AMD profits.
    I’ve been an nvidia money giver my entire computing life starting with a GeForce 4400 TI, then FX 5800, twin GT 7800s, GTX 260, GTX 470, twin GTX 680s, GTX 980, GTX 1080. For this generation though, I risked buying an AMD 6900xt and I’ve been super happy with the card and the drivers.
  • KyaraM
    So. I read a pretty interesting article about the A380 today in comparison to the GTX 1650. Sadly, it's a German article, so I will essentially just summarize the article and then link it below.

    So what they did was testing both cards in the Tier 1 games, and then OC the A380. In contrast to this article, though, they gave actual performance improvements. Clock speeds improved by a modest 150MHz over stock, power draw from 35 to 55W. All they did was raise the slider "GPU Performance Boost" to 55%. The boost to performance was pretty steep, 37% according to the website.

View: https://imgur.com/gallery/sxeZCgP

    However, as you can see, the A380 suddenly matches the 1650 quite well with the same settings, while supposedly still being more frugal. If estimated release prices are correct at $175... and if the same can be done with the A770...

    Link to the article
    https://www.notebookcheck.com/Eine-uebertaktete-Intel-Arc-A380-tritt-im-Gaming-Vergleich-gegen-die-Nvidia-GeForce-GTX-1650-an.635575.0.html
    Edit: none of the above is my own stuff, they belong to the authors of the article and the dude who performed the test.
  • jp7189
    "temperature limit to 125 degrees Celsius" ?! :oops:
  • cyrusfox
    jp7189 said:
    "temperature limit to 125 degrees Celsius" ?! :oops:
    All the coolers we have seen are overbuilt, you would have to starve it of air to get it to reach that damaging temp.

I really hope we can get a water block on one of the top end cards. I only do water on my GPUs whether they need it or not.
  • Eximo
    LuxZg said:
    Oh and BTW, that must be a typo in article with sentence comparing 770 to 3070 and 6500 XT (?)

    The link is correct to the 6700XT review, the text is wrong.