This year, Intel made a noteworthy entry into the low-to-mid range discrete GPU market with its A750 and A770 cards, but don't expect the company to start challenging the RTX 4090 or Radeon RX 7900, both of which demand enormous wattage, any time soon. According to Intel Graphics Head Raja Koduri, the company is squarely focused on squeezing as much performance as it can out of a 200 to 225W power limit -- often with a single power connector -- hopefully enough to compete with the best graphics cards in that class.
"My priority at this point is getting that core audience, with one power connector," Intel Graphics Head Raja Koduri said in an interview with Gadgets360, an Indian tech site. "And that can get you up to 200W – 225W. If you nail that, and something a little above and a little below, all that falls into the sweet spot."
Modern high-end graphics cards consume incredible amounts of power as companies like AMD and Nvidia squeeze every last bit of performance out of their GPUs. This naturally pushes the prices of those flagship cards out of reach for many gamers. Intel, however, which only released its first mainstream discrete GPUs this year, is apparently focused more on power efficiency and affordability than on raw speed. And Raja believes that plenty of consumers just want something affordable that doesn't require a huge power supply and a ton of cooling.
The company's recently released Arc A750 and A770 cards, priced between $289 and $349, fall somewhere between the RTX 3060 and RTX 3060 Ti, both of which cost more, on the GPU benchmark hierarchy. Intel's cards already land in the 200 to 225W power range, though for what it's worth, they still require two power connectors where Nvidia manages with a single connector on some of its cards.
Now, while a plan to build a high-end graphics board that consumes around 200W may sound like an unachievable dream given today's standards for gaming-grade graphics cards (maximum performance at whatever cost), it is worth considering from a GPU architecture point of view. After all, Raja Koduri is a GPU architect, not the engineer who implements those graphics processors in silicon or the one who figures out how to build a graphics card that lets a GPU run at its maximum.
Building a discrete GPU architecture that delivers decent performance at, say, 4K resolution within a circa-200W power envelope would be an achievement by itself; the Nvidia and AMD cards that play smoothly at 4K today use far more power. Designing a GPU to hit a set power envelope is a distinct challenge, and succeeding at it would be an undisputed accomplishment.
If this sweet-spot architecture scales both up and down in terms of power, then it becomes possible to build something considerably more powerful or considerably less power hungry. In the former case, Intel would compete against the mightiest GPUs from AMD and Nvidia. Whether such an architecture is part of the public Intel Arc roadmap remains to be seen, but at least Raja Koduri has expressed the goal.
Addressing mass-market buyers is perhaps Intel's best course of action for its Arc discrete GPUs for now, as the company is only just entering the standalone GPU market and has yet to gain market share. To that end, its main goals at this point (probably) are to build GPUs that deliver good performance at low power for notebooks (where Intel holds an indisputable lead on the CPU side) and for desktops aimed at mainstream gamers who do not tend to spend $1,000 on a graphics card. That, though, is a business goal rather than an architectural one.
In the meantime, Koduri expresses confidence in Intel's Arc roadmap in general, and in the next generation of the company's standalone GPUs, codenamed Battlemage and due in 2023, in particular. "The interest level is very, very high," the Intel graphics and accelerated computing boss said. "And [we're working on] landing more partners in India who can ship good volumes here at good price points. So, expect to see a lot more Arc in 2023 and more variations of Arc."
So far, Intel is lagging on GPU perf/W. They're not going to magically leapfrog the efficiency of AMD and Nvidia in a single generation (if ever).
That's interesting. I wonder if it's a response to any pushback they've gotten from Taiwanese board partners. Or, maybe they just want to diversify their supply chain.
So, what are some Indian GPU card makers?
So far these dedicated cards are just like their integrated ones; 10 years behind. And the drivers? Embarrassing.
Less talk, more do.
Arc cards don't appear to be 10 years behind. Close to parity in the Low-Mid market.
I suspect that Intel is focused on OEM partners who want to sell basic gaming boxes at a basic price. No drama and low RMA rates. The whole system costing less than a 4070 and available off the shelf as an impulse purchase at a big box store.
Core i5 (OEM cooler) plus Arc 770 on an H-chipset with an easy-to-reach power target for a low-end PSU. That's a real sweet spot for an integrated system provider.
You're also fine to post qualifications, corrections, disagreement, etc. but such direct attacks are out-of-bounds.
As Nvidia struggles with "Bad Grandpa" Jensen's overwhelming greed and diminishing sanity, Intel is smart enough to understand that it's hard to sell a 4-slot computer part that's physically incapable of fitting inside most computers. Forget about even trying to find a way to power that toaster.
Bonus points if Intel makes a card that can run in an HP Piece of Crap™ without the whole system catching fire. Although, HP and Dell have gotten so downmarket and proprietary that I'd be surprised if they even have a real PCIe slot or a single spare power connector.
If you bought a Dell or HP that didn't already come with a PCIe power cable, then the PSU is probably too small and has no extra connectors for one. I think it's common for people with Dells to upgrade the PSU to one from an Alienware when they want to install a beefier graphics card.