Intel Demos Arc A770 GPU, Leaves Old APIs In The Dust

Intel executives Ryan Shrout and Tom Petersen have dropped some nuggets regarding the chipmaker's Arc Alchemist graphics cards in the latest installment of Linus Tech Tips. The flagship Arc A770, which will compete against the best graphics cards, made a brief appearance, offering a sneak peek of the graphics card's performance.

Petersen explained that Intel has implemented a three-tier strategy for game optimizations on its Arc Alchemist graphics cards. Tier 1 includes titles that work exceptionally well with Arc. Tier 2 encompasses titles that are less optimized but based on modern APIs, such as DirectX 12 and Vulkan; these games should perform pretty well. The biggest concern for gamers is the tier 3 category: Arc graphics cards underperform in tier 3 titles, which are games built on DirectX 11 or earlier versions of the API.

During the overclocking segment, Linus' video revealed some details on the Arc A770. The graphics card runs at a 2.5 GHz clock speed with a GPU power limit of up to 190W. Applying the 20% GPU Performance Boost profile raised the power limit to 285W and the temperature limit to 125 degrees Celsius. Linus reported smoother gameplay but once again didn't provide any numbers. Next, he tried to push the Arc A770 with the more aggressive 30% profile, but the graphics card crashed.

Zhiye Liu
News Editor and Memory Reviewer

Zhiye Liu is a news editor and memory reviewer at Tom’s Hardware. Although he loves everything that’s hardware, he has a soft spot for CPUs, GPUs, and RAM.

  • wifiburger
    meh, it's going to take Intel forever to optimize those drivers
    That's mostly why I stick with Nvidia; you're guaranteed consistent performance regardless of whether a game is old or new.

    Look at AMD, they still haven't matched Nvidia's massive team that does per-game optimization & testing.

    The only good thing about Intel GPUs will be that they will force AMD to stop matching NVIDIA's GPU prices.
    And I would bet you that's the sole reason Intel is going after the GPU market: to cut into AMD's profits.
    Reply
  • UnCertainty08
    You are correct. That's why I stick to Nvidia as well. Nothing worse than paying top dollar for hardware that doesn't work because of software issues. Also part of the reason I stick to Intel CPUs.
    Reply
  • cryoburner
    wifiburger said:
    meh, it's going to take Intel forever to optimize those drivers
    That's mostly why I stick with Nvidia; you're guaranteed consistent performance regardless of whether a game is old or new.
    But as the article (and video) pointed out, Intel will supposedly be pricing the cards based on their performance in titles utilizing older APIs where they don't perform as well. So you will theoretically get the performance you are paying for in DX11 and DX9 titles, but get additional performance in DX12/Vulkan. So in titles supporting those newer APIs, particularly ones Intel has optimized for, you might potentially get performance comparable to a higher-tier graphics card than you paid for. "Inconsistent" performance wouldn't be bad if the inconsistencies tend to err on the side of giving you additional performance for your money.

    Another thing to consider is that if their performance in older, higher-overhead APIs is holding the cards back at launch, that leaves them with a lot of room to potentially improve performance in the future. It's possible that the cards could age a lot better than similarly-priced models from the competition. That is, so long as Intel keeps updating and optimizing the drivers for these first-generation cards for years to come. Even without optimizing much for older games though, as newer titles focus on newer APIs, and game developers begin optimizing specifically for the hardware, the performance situation is likely to improve.

    Like I said previously, Intel tends to price their hardware competitively when entering into new markets, and the big price mark-ups on competing cards leaves them with a lot of flexibility to impress in terms of pricing. Of course, there are still many unknowns, like how raytraced effects will perform, and whether the hardware holds potential for that to improve in the future. And also whether features that have become the norm on AMD and Nvidia cards will be present, and function as expected. There could potentially be quirks that make one think twice about trialing these cards. So I would fully expect Intel to price them competitively if they hope to establish a presence in the market.
    Reply
  • LastStanding
    As soon as Intel (and Apple) releases a GPU that matches, or bests, NVIDIA's current flagship on the market, all the complaints from gamers about non-supported DX9-11 titles will not even matter anymore.

    It's not like the devs/pubs of all these old relic titles are just going to go back and add support for Intel's (or Apple's M architecture's) new infrastructure. 🙄
    Reply
  • LuxZg
    Actually, as soon as performance in older titles is "good enough", it will stop mattering. Those top 5 Steam titles sound harsh until you realize one is CS:GO (350+ FPS with a 3070) and another is Dota (250+). So for 1440p, even "bad" performance could be good enough for most gamers. And even if the issues persist into the Battlemage generation, they will be even less pronounced. Sure, some will check FPS and benchmarks and complain, but many will just play. Especially in OEM builds, as most "real gamers" (geeks, techies, the vocal crowd) won't even look at OEM builds. But many casual gamers, yeah, they won't care. I just keep wondering about the prices, because as the article says, they aren't delivering vs., e.g., the 6400. Prices keep tumbling, and with the next gen a few months away, will they price the 770 cards at $300 or less?

    Oh and BTW, that must be a typo in the article, in the sentence comparing the 770 to the 3070 and 6500 XT (?)
    Reply
  • The Historical Fidelity
    wifiburger said:
    meh, it's going to take Intel forever to optimize those drivers
    That's mostly why I stick with Nvidia; you're guaranteed consistent performance regardless of whether a game is old or new.

    Look at AMD, they still haven't matched Nvidia's massive team that does per-game optimization & testing.

    The only good thing about Intel GPUs will be that they will force AMD to stop matching NVIDIA's GPU prices.
    And I would bet you that's the sole reason Intel is going after the GPU market: to cut into AMD's profits.
    I’ve been an nvidia money giver my entire computing life starting with a GeForce 4400 TI, then FX 5800, twin GT 7800s, GTX 260, GTX 470, twin GTX 680s, GTX 980, GTX 1080. For this generation though, I risked buying an AMD 6900xt and I’ve been super happy with the card and the drivers.
    Reply
  • KyaraM
    So. I read a pretty interesting article about the A380 today in comparison to the GTX 1650. Sadly, it's a German article, so I will essentially just summarize the article and then link it below.

    They tested both cards in the Tier 1 games, and then overclocked the A380. In contrast to this article, though, they gave actual performance improvements. Clock speeds improved by a modest 150MHz over stock, and power draw rose from 35 to 55W. All they did was raise the "GPU Performance Boost" slider to 55%. The boost to performance was pretty steep: 37% according to the website.

    https://imgur.com/gallery/sxeZCgP

    However, as you can see, the A380 suddenly matches the 1650 quite well with the same settings, while supposedly still being more frugal. If estimated release prices are correct at $175... and if the same can be done with the A770...

    Link to the article
    https://www.notebookcheck.com/Eine-uebertaktete-Intel-Arc-A380-tritt-im-Gaming-Vergleich-gegen-die-Nvidia-GeForce-GTX-1650-an.635575.0.html
    Edit: none of the above is my own work; it all belongs to the authors of the article and the dude who performed the test.
    Reply
  • jp7189
    "temperature limit to 125 degrees Celsius" ?! :oops:
    Reply
  • cyrusfox
    jp7189 said:
    "temperature limit to 125 degrees Celsius" ?! :oops:
    All the coolers we have seen are overbuilt; you would have to starve it of air to get it to reach that damaging temperature.

    I really hope we can get a water block on one of the top-end cards. I only do water on my GPUs whether they need it or not.
    Reply
  • Eximo
    LuxZg said:
    Oh and BTW, that must be a typo in article with sentence comparing 770 to 3070 and 6500 XT (?)

    The link correctly points to the 6700 XT review; the text is wrong.
    Reply