Firm Estimates Intel's GPU Unit Losses at $3.5 Billion, Suggests Selling It Off

Intel Arc Alchemist GPUs
(Image credit: Intel)

The head of Jon Peddie Research, a leading graphics market analysis firm that has been around for nearly 40 years, suggests that Intel might axe its Accelerated Computing Systems and Graphics Group (AXG). The division has been bleeding money for years and has failed to deliver a competitive product for any market segment that it serves. Forget the best graphics cards; Intel just needs to ship fully functional GPUs.

$3.5 Billion Lost

Jon Peddie estimates that Intel has invested about $3.5 billion in discrete GPU development, an investment that has yet to pay off. In fact, Intel's AXG has officially lost $2.1 billion since its formal establishment in Q1 2021. Given the track record of Pat Gelsinger, Intel's chief executive, who has scrapped six businesses since early 2021, JPR suggests that AXG might be next.

"Gelsinger is not afraid to make tough decisions and kill pet projects if they don't produce — even projects he may personally like," Peddie wrote in a blog post. "[…] The rumor mill has been hinting that the party is over and that AXG would be the next group to be jettisoned. That rumor was denied by Koduri."

When Intel disclosed its plans to develop discrete graphics solutions in 2017, it said its GPUs would address computing, graphics, media, imaging, and machine intelligence capabilities for client and datacenter applications. As an added bonus, the Core and Visual Computing Group was also meant to address the emerging edge computing market.

Five years into its discrete GPU journey, the company has released two low-end standalone GPUs aimed at cheap PCs and some datacenter applications; launched its low-power Xe-LP architecture for integrated GPUs; delivered oneAPI, which can be used to program CPUs, GPUs, FPGAs, and other compute units; cancelled its Xe-HP architecture for datacenter GPUs; postponed (multiple times) shipments of its Ponte Vecchio compute GPU for AI and HPC applications, most recently in part due to the late arrival of the Intel 4 node; and delayed the launch of its Xe-HPG ACM-G11 gaming GPU by about a year.
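As a concrete, if simplified, illustration of the oneAPI point above: oneAPI's core programming model is SYCL, implemented by Intel's DPC++ compiler, and the same C++ kernel can run on a CPU, a GPU, or an FPGA depending on which device the runtime selects. The vector-addition sketch below is our own minimal example, not an official Intel sample, and assumes a SYCL 2020 toolchain such as Intel's oneAPI DPC++ compiler (built with icpx -fsycl):

    #include <sycl/sycl.hpp>
    #include <iostream>
    #include <vector>

    int main() {
        // Let the runtime pick whatever device is available: GPU, CPU, or FPGA (emulator).
        sycl::queue q{sycl::default_selector_v};
        std::cout << "Running on: "
                  << q.get_device().get_info<sycl::info::device::name>() << "\n";

        constexpr size_t n = 1024;
        std::vector<float> a(n, 1.0f), b(n, 2.0f), c(n);

        {   // Buffers manage host<->device data movement automatically.
            sycl::buffer<float> bufA(a.data(), sycl::range<1>{n});
            sycl::buffer<float> bufB(b.data(), sycl::range<1>{n});
            sycl::buffer<float> bufC(c.data(), sycl::range<1>{n});

            q.submit([&](sycl::handler &h) {
                sycl::accessor A(bufA, h, sycl::read_only);
                sycl::accessor B(bufB, h, sycl::read_only);
                sycl::accessor C(bufC, h, sycl::write_only, sycl::no_init);
                // The same kernel source is compiled for whichever device the queue targets.
                h.parallel_for(sycl::range<1>{n}, [=](sycl::id<1> i) {
                    C[i] = A[i] + B[i];
                });
            });
        }   // Buffer destructors copy the results back to the host vectors.

        std::cout << "c[0] = " << c[0] << "\n";  // Expected: 3
        return 0;
    }

This single-source portability is the entire pitch behind Intel's XPU strategy, one codebase targeting Xeon CPUs, Arc and Ponte Vecchio GPUs, or FPGAs, which is why oneAPI stands out among the deliverables listed above.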

Considering how late to market Intel's Arc Alchemist 500- and 700-series GPUs already are, and that they will have to compete against AMD's next-generation Radeon RX 7000 and Nvidia's GeForce RTX 40-series products, it is highly likely that they will fail, which will only increase Intel's losses.

To Axe or Not to Axe

Given AXG's track record, Intel has spent $3.5 billion without any tangible success so far, Jon Peddie asserts. For Intel, discrete GPUs are a completely new market that requires heavy investment, so the losses are not surprising. Meanwhile, Intel's own Habana Gaudi2 deep learning processor shows tangible performance advantages over Nvidia's A100 in AI workloads, one of the markets that Intel's Ponte Vecchio is meant to address. This success might tip the scales toward axing AXG.

"It is a 50–50 guess whether Intel will wind things down and get out," said Peddie. "If they don't, the company is facing years of losses as it tries to punch its way into an unfriendly and unforgiving market."

Strategic Importance of GPUs

While it might make sense for Intel to disband its AXG group and cancel discrete GPU development to cut its losses, it should be noted that Intel pursues several strategically important directions with its AXG division in general and discrete GPU development in particular. The list of development directions includes the following:

  • AI/DL/ML applications
  • HPC applications
  • Competitive GPU architecture and IP to address client discrete and integrated GPUs as well as custom solutions offered by IFS
  • Datacenter GPUs for rendering and video encoding
  • Edge computing applications with discrete or integrated GPUs
  • Hybrid processing units for AI/ML and HPC applications

Discrete GPU development per se has generated only losses for Intel so far (we wonder how much money the Xe-LP iGPU architecture has earned for Intel after two years on the market), but without a competitive GPU-like architecture that can scale from a low-end laptop to a supercomputer, Intel will not be able to address many new growth opportunities.

Habana Gaudi2 looks to be a competitive DL solution, but it cannot be used for supercomputing applications. Moreover, without further evolution of Intel's Xe-HPC datacenter GPU architecture, the company will not be able to build hybrid processing units for AI/ML and HPC applications (e.g., Falcon Shores). Without such XPUs, Intel's ZettaFLOPS by 2027 plan starts to look increasingly unrealistic.

While Intel's discrete GPU endeavor has not lived up to expectations, Intel needs an explicitly parallel compute architecture for loads of upcoming applications. GPUs have proven to be the best architecture for highly parallel workloads, no matter whether they require low compute precision like AI/DL/ML applications or full FP64 precision like supercomputing applications.

If Intel pulls the plug on standalone GPU development, it will have to completely redesign its roadmap, both in terms of products and architectures. For example, it will have to find a provider of a competitive GPU architecture for its client processors, as a small in-house iGPU development team will hardly be able to deliver an integrated graphics solution competitive with those offered by AMD and Apple in their client system-on-chips (SoCs).

Summary

Intel's discrete GPU endeavor may have already cost the company about $3.5 billion, has not borne fruit so far, and will likely generate further losses. Killing the AXG division seems like an increasingly attractive management decision. However, GPUs and derivative hybrid architectures are strategically important for many markets Intel serves today and applications it will have to serve in the coming years, so disbanding the AXG group seems counterproductive. A lot likely hinges on Intel's graphics driver woes, but fixing the drivers isn't a quick solution.

What will Pat Gelsinger do? Perhaps we will find out sooner rather than later. "Perhaps the clouds will lift by the end of this quarter," muses Jon Peddie.

Anton Shilov
Freelance News Writer

Anton Shilov is a Freelance News Writer at Tom’s Hardware US. Over the past couple of decades, he has covered everything from CPUs and GPUs to supercomputers, and from modern process technologies and the latest fab tools to high-tech industry trends.

  • jkhoward
    If Intel ever sells their graphics division I will never buy another one of their products. We absolutely need a third player in the market. The first round never goes well, and I have faith they’ll get it right the second time just like AMD RDNA 2.
  • JamesJones44
    It would be a bad move IMO. Datacenter will become more and more GPU/ML centric as ML continues to grow. If they want to get out of the consumer GPU market, that's fine, but I think you might as well close up shop at Intel if you just let the datacenter market go from the GPU side; they have to try to break in if they are going to survive long term.
  • Alvar "Miles" Udell
    Intel was the company who said, "You know, over at AMD, Raja put their GPU division so far behind that Vega and Navi were incomparable to nVidia counterparts...so let's hire him!", and now they're reaping what they sowed.

    If anything they should give him the axe, poach someone proper from nVidia and/or AMD, and get themselves back on track.
  • escksu
    Haha... When I first read that Raja was going to Intel, my first thought was wow!! Good luck to Intel. They actually hired him after the mess he created at AMD.

    Today, we still have yet to see any decent discrete GPU from Intel. We have yet to see a single Bitcoin miner. Basically nothing... Only delay after delay.

    However, I do not think Intel should axe their graphics division. But they should realign its focus. CPUs are no longer adequate for high-end computing, and specialized processors like AI accelerators and GPUs are needed. So, I would say Intel should just axe the discrete GPUs (basically no gaming GPUs) and focus only on the compute units.
  • escksu
    jkhoward said:
    If Intel ever sells their graphics division I will never buy another one of their products. We absolutely need a third player in the market. The first round never goes well, and I have faith they’ll get it right the second time just like AMD RDNA 2.

    I don't think they will sell their CPU division. I think they will axe the discrete gpu unit instead and focus on hpc only. At their current stage, it really makes no sense to compete with AMD or Nvidia in gaming GPU.
  • sweepyjoelschl
    escksu said:
    Haha... When I first read that Raja was going to Intel, my first thought was wow!! Good luck to Intel. They actually hired him after the mess he created at AMD.

    Today, we still have yet to see any decent discrete GPU from Intel. We have yet to see a single Bitcoin miner. Basically nothing... Only delay after delay.

    However, I do not think Intel should axe their graphics division. But they should realign its focus. CPUs are no longer adequate for high-end computing, and specialized processors like AI accelerators and GPUs are needed. So, I would say Intel should just axe the discrete GPUs (basically no gaming GPUs) and focus only on the compute units.
    Exactly. That guy was always a lot of talk and bluster but never produced anything much worthwhile. I will have completely lost faith in Gelsinger if he throws in the towel on yet another endeavor. First Optane is axed and now possibly this. The KEY is getting the right people running these divisions instead of those that are all talk like Raja. AMD got rid of him for a reason. I am still unsure if Gelsinger is just talk and bluster. On paper he seems to have the technical chops but he has PRODUCED ZERO RESULTS. Plus foolish Intel paid him a boatload UP FRONT instead of making it contingent upon success. Meanwhile, the brilliant Dr. Lisa Su has hired the proper people and is executing to perfection.
  • blppt
    Alvar Miles Udell said:
    Intel was the company who said, "You know, over at AMD, Raja put their GPU division so far behind that Vega and Navi were incomparable to nVidia counterparts...so let's hire him!", and now they're reaping what they sowed.

    If anything they should give him the axe, poach someone proper from nVidia and/or AMD, and get themselves back on track.

    I mean, we don't know what kind of restraints AMD put on his team considering that was well before Zen saved their bacon, but yeah, it seemed kinda odd that they wouldn't try poaching Nvidia's guys given the success they were having at the time.

    Nvidia's last dud was probably the GTX 480, and that's a long time ago.
  • InvalidError
    Last I looked, Intel IGPs accounted for over 60% of all PC graphics. Since its discrete GPUs use the same architecture as its Xe IGPs, Intel's graphics division isn't going anywhere. And then you have the whole AI/ML/HPC/etc. being the largest big-money growth markets with steady future prospects; Intel cannot really pass on that.
  • bit_user
    sweepyjoelschl said:
    Exactly. That guy was always a lot of talk and bluster but never produced anything much worthwhile.
    I used to think this, until RDNA and RDNA2. Their success made me wonder how much influence he had on them and whether I judged him too harshly. Unless there was a separate team behind RDNA operating in parallel, Raja definitely had a hand in it.

    sweepyjoelschl said:
    I will have completely lost faith in Gelsinger if he throws in the towel on yet another endeavor. First Optane is axed and now possibly this.
    As sad as I was to see Optane go, I honestly don't know if the physics allow it ever to be cost-competitive. It was probably conceived in a world of SLC and MLC planar flash, but now it has to compete against TLC and QLC with hundreds of layers. There could be physical limitations which prevent 3D XPoint from ever being competitive in terms of cost per bit. And we know it doesn't have the endurance to serve as a cheap DRAM alternative, contrary to their early messaging.

    sweepyjoelschl said:
    On paper he seems to have the technical chops but he has PRODUCED ZERO RESULTS.
    Not sure about that. Everything Intel launched or is struggling with pre-dates him. We won't hear about projects originated or incubated under him for a couple more years.

    One thing we do know he's done is IFS. And that seems like an idea whose time has come.

    sweepyjoelschl said:
    Meanwhile, the brilliant Dr. Lisa Su has hired the proper people and is executing to perfection.
    She's indeed doing a terrific job, but she incorrectly gets credit for Zen, which actually originated under her predecessor.
  • bit_user
    InvalidError said:
    Last I looked, Intel IGPs accounted for over 60% of all PC graphics. Since its discrete GPUs use the same architecture as its Xe IGPs, Intel's graphics division isn't going anywhere.
    Exactly. A lot of the HW & software IP is shared across these product lines. Intel needs to maintain competitiveness in their iGPU solutions, especially to fend off competition from Apple & Nvidia-equipped MediaTek SoCs.

    InvalidError said:
    And then you have the whole AI/ML/HPC/etc. being the largest big-money growth markets with steady future prospects; Intel cannot really pass on that.
    Partly agree. In terms of AI/ML, they have Habana Labs, which seems to be executing well. It does those things better than GPUs, meaning they don't need to stay in the GPU market for those reasons.

    However, Intel cannot be credible in the HPC space without a GPU-like accelerator. So, they really need to decide whether they want to stay in that market.

    Finally, it's not clear how much AMX has contributed to their delays in Sapphire Rapids. If they had instead leaned on Habana and Arctic Sound for that functionality, maybe it would've launched by now and they wouldn't be hemorrhaging server market share as badly.