Intel reportedly plans to launch Arc 'Battlemage' GPUs before the holidays — Second-gen Arc prepares for takeoff this fall
Intel's Battlemage should arrive in time for the shopping spree.
The rollout of Intel's first-generation Arc Alchemist GPUs back in 2022 was a slow process that took several quarters, which ultimately hurt sales and adoption rates and relegated the cards to the budget end of the best graphics cards. With its second-generation Arc Battlemage products, the company doesn't want to repeat the same mistake and intends to introduce them this year, ahead of the critical holiday season. This is according to a report from ComputerBase, citing sources at Embedded World.
Intel aims to launch its second-generation Arc 'Battlemage' graphics processors this fall, ideally capturing the holiday sales surge that kicks off around Black Friday, according to the report. This timing would let Intel capitalize on the lucrative holiday shopping season, a critical sales period in the retail calendar, especially in the U.S.
The timeline suggests that Intel's Battlemage launch should occur by November at the latest, giving Intel's add-in-board partners enough time to ramp up production of their graphics cards and ship actual products to retailers worldwide. However, the initial launch could occur earlier; exact details are not yet fully nailed down.
Intel is in the process of bringing up two new Battlemage discrete graphics processing units, known as Battlemage-G10 and Battlemage-G21, based on recent leaks. The naming convention suggests that Battlemage-G10 will be the more powerful of the two, targeting the midrange market, while Battlemage-G21 will cater to entry-level systems that still need a standalone graphics processor.
Neither Battlemage-G10 nor Battlemage-G21 GPUs are finalized yet, as G21 is rumored to be in the pre-qualification stage. This stage involves testing the chip's functionality, reliability, and performance, but it does not guarantee readiness for mass production. However, if these pre-qualification tests are successful and the GPUs meet the necessary criteria for performance, power, and yields, they could move forward to mass production. It's unclear whether Battlemage-G10 is at a similar stage or further out; note that the previous Alchemist generation launched the smaller ACM-G11 first, in the Arc A380.
Intel's first-generation Arc Alchemist only barely managed to compete against mainstream GPUs from AMD and Nvidia: the Arc A750 matched up decently against the RTX 3060 12GB and RX 6600 XT, both of which were over a year old at the time. In fact, Nvidia and AMD were in the midst of launching their newer-generation parts that would widen the performance deficit. This was largely because Alchemist was delayed by several quarters over driver readiness. With Battlemage, Intel will have more or less stable drivers and an established driver development team, so the new family of chips shouldn't face the same challenges.
What remains to be seen is which market segments Intel will be able to address with its Battlemage GPUs. Intel potentially faces a similar problem as last time: Nvidia's Blackwell-based RTX 50-series and AMD's RDNA 4-powered RX 8000-series could also start showing up before the end of the year. Battlemage should deliver plenty of improvements, and hopefully Intel will be better prepared for the fight against AMD and Nvidia.
Anton Shilov is a contributing writer at Tom's Hardware. Over the past couple of decades, he has covered everything from CPUs and GPUs to supercomputers, and from modern process technologies and the latest fab tools to high-tech industry trends.
hotaru251 I've seen their driver issues are much better than at launch, so I "might" give these a try, assuming they can be around a 4060/4070, as I do need to replace a machine's 1060 sometime and really don't want to give Nvidia more $.
Crazyy8 If they can get better performance than my RTX 4060 Ti 16GB for more than $100-$150 less, I'm sold.
Eximo If the original rumors are true and the top card can do 4070-like performance but with 16GB memory, it will be quite comparable to the 7800 XT, which is $500.
The A770 launched at $350, but they knew they were in a bad position. So a $500 launch price would be somewhat anemic in terms of pricing, but also maybe not possible given the size the B770 chip may have to be. Even with a node shrink to match Nvidia and AMD, it may be cost-prohibitive to sell it for less than like $600.
7800 XT: 346 mm², TSMC N5/N6
A770: 406 mm², TSMC N6
4070: 294.5 mm², TSMC N4
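For readers who want to put rough numbers on those die areas, here is a minimal Python sketch; it is an editorial illustration rather than anything from Intel or the posters in this thread, it uses the common first-order dies-per-wafer approximation, and every figure in it is an assumption you can swap out (the 7800 XT entry flattens the GCD and MCD chiplets into one combined area purely for comparison).

```python
import math

def gross_dies_per_wafer(die_area_mm2: float, wafer_diameter_mm: float = 300.0) -> int:
    """Standard first-order approximation: wafer area divided by die area,
    minus an edge-loss term proportional to the wafer circumference."""
    wafer_area = math.pi * (wafer_diameter_mm / 2) ** 2
    edge_loss = (math.pi * wafer_diameter_mm) / math.sqrt(2 * die_area_mm2)
    return int(wafer_area / die_area_mm2 - edge_loss)

# Die areas quoted above (mm^2); the 7800 XT figure lumps the GCD and MCD
# chiplets together only to keep the comparison to a single number.
dies_mm2 = {
    "RX 7800 XT (GCD + MCDs)": 346.0,
    "Arc A770 (ACM-G10)": 406.0,
    "RTX 4070 (AD104)": 294.5,
}

for name, area in dies_mm2.items():
    print(f"{name}: ~{gross_dies_per_wafer(area)} gross die candidates per 300 mm wafer")
```

The takeaway is simply that a bigger die yields fewer candidates per wafer and therefore costs more per chip, which is the crux of the pricing argument in this thread.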
thisisaname
Eximo said: "If the original rumors are true and the top card can do 4070-like performance but with 16GB memory, it will be quite comparable to the 7800 XT, which is $500..."
If it matches the 4070 and it's sold for $600, there is little reason not to buy a 4070, other than that it's not Nvidia?
aberkae
hotaru251 said: "I've seen their driver issues are much better than at launch, so I 'might' give these a try..."
Intel's software, in drivers and XeSS, is improving at a faster rate than AMD's, so there is hope for them, especially when AMD is playing a stagnation love game with Nvidia. AMD is fortifying its position in the midrange with RDNA 4 and the PlayStation 5 Pro (rumored), while Nvidia will likely swell GPU pricing even further with Blackwell. The latter, GPU prices swelling further, will benefit them with RDNA 5 by filling in the gap.
Eximo
thisisaname said: "If it matches the 4070 and it's sold for $600, there is little reason not to buy a 4070, other than that it's not Nvidia?"
That is why the size of the chip matters a lot. If it is anywhere close to the size of the 4070, then it will cost roughly the same to produce. The 4070 has already dropped to around $540.
Right now the A770 cost a lot to make and was sold at probably barely a profit if not a loss.
For the 4070, that is roughly 300 mm² of die area. A 300 mm TSMC wafer would pump out 942 chips at 100% yields. TSMC made recent claims to aim for 80% yields. Samsung is noted as saying they currently do about 60% yield on their best node. TSMC is said to be charging around $20,000 for a wafer (up to as much as $25,000). So about $36 a chip at 75% yield.
Nvidia is known for about a 60% profit margin overall, so we can assume they sell the GPUs at roughly $60-100; 16GB of GDDR6 memory is about $55. Then we need a PCB and heatsink, assembly, packaging, marketing, and shipping, plus the retailer's profit margin. So call it maybe $300 minimum to get one to a store. That accords somewhat well with the A770 being sold at zero profit or a small loss, since it's a bigger chip on a slightly older node.
So that $400 price point is possible, but then you have to ask what is in it for Intel, since they aren't a huge part of this profit chain, only the markup they have to pass along to their board partners. Basically, they have to start turning a profit at some point. Maybe they can afford to battle at the low end another generation, but low prices only work with wide adoption.
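As a companion to the arithmetic above, here is a minimal sketch of the same cost chain (chip, memory, board, logistics, margins). The function name and all of the inputs are placeholders, not figures confirmed by Nvidia, Intel, TSMC, or this article; the point is only to show how a per-chip cost and the various margins stack up into a rough retail floor.

```python
def estimated_retail_floor(
    chip_cost: float,              # assumed cost to produce one good GPU die
    chip_margin: float,            # assumed GPU vendor margin on the die (0.6 = 60%)
    memory_cost: float,            # assumed cost of the GDDR6 kit
    board_and_cooler: float,       # assumed PCB, heatsink, assembly, packaging
    logistics_and_marketing: float,
    retail_margin: float,          # assumed partner/retailer markup on the whole BOM
) -> float:
    """Rough floor price for a finished card under the stated assumptions."""
    chip_price = chip_cost / (1.0 - chip_margin)  # what the GPU vendor charges for the die
    bom = chip_price + memory_cost + board_and_cooler + logistics_and_marketing
    return bom * (1.0 + retail_margin)

# Placeholder inputs in the spirit of the post above; adjust to taste.
floor = estimated_retail_floor(
    chip_cost=36.0,
    chip_margin=0.60,
    memory_cost=55.0,
    board_and_cooler=80.0,
    logistics_and_marketing=40.0,
    retail_margin=0.15,
)
print(f"Estimated retail floor: ~${floor:.0f}")
```

With those placeholder inputs the floor lands around $300, in line with the back-of-the-envelope figure above; raise the per-chip cost or memory price and the floor climbs quickly.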
jlake3 Hmmm... I'll believe it when I see it.
Machine-translating the linked article, they say "'Before Black Friday' is the goal", and Intel's official line is "new series would be 'hopefully' before the CES 2025 and thus in 2024" (Emphasis theirs). That's more of a noncommittal aspiration than a roadmap.
From the article: "Neither Battlemage-G10 nor Battlemage-G21 GPUs are finalized yet, as G21 is rumored to be in the pre-qualification stage. This stage involves testing the chip's functionality, reliability, and performance, but it does not guarantee readiness for mass production."
Given that the silicon is in pre-qualification, and another rumor I'd heard was that as of a month or two ago AIBs hadn't been briefed at all on what to expect with Battlemage, it seems unlikely there's going to be a high-volume retail launch by Black Friday. To get the chip design completed and qualified, boards designed, boards validated, tooling set up, chips fabricated, chips sent to board partners, boards assembled, finished cards shipped by boat, and everything distributed to retailers in that time seems like it's going to need everything to go perfectly.
Might be a paper launch, might be a very low volume of air-shipped cards, might be reference design only, or might be more than one of the above, but I would be really wary of something being off with a launch before CES.
why_wolf I guess we'll see. But until Intel can start spitting out GPU chips at their own fabs instead of relying on TSMC, they are stuck with the same economic factors that AMD & Nvidia face. So there's not as much room to offer lower prices other than taking smaller margins, which Intel does not like. They want those huge Nvidia margins.
But if they release new cards and the pricing structure just follows what AMD does (marginally cheaper than the equivalent Nvidia card), then sales will be anemic unless AMD leaves a big hole in the market for Intel to scoot through. Nvidia of course will be sailing high with some new xx90-tier card at the opening, so it's not direct competition for the lower-tier cards Intel is making.
All this assumes, of course, that Battlemage can achieve at least 4xxx-series performance without being some 1,000-watt factory-overclocked monstrosity.
thisisaname
Eximo said: "That is why the size of the chip matters a lot. If it is anywhere close to the size of the 4070, then it will cost roughly the same to produce..."
Nicely put. Any idea how much Intel would charge itself if it could make its own chips?
The mid-to-low end is a bad place to make profits; they need to be able to compete at the 4090 level (or, better yet, against Nvidia's next-generation top-end card) to pick up some sweet, sweet profits.
Ogotai
aberkae said: "Intel's software in drivers and XeSS is improving at a faster rate than AMD's."
Of course they are; the drivers still have so much more room to improve than AMD's or Nvidia's :ROFLMAO: :ROFLMAO: :ROFLMAO: :ROFLMAO: :ROFLMAO: :ROFLMAO: :ROFLMAO: :ROFLMAO:
Just like with Arc, the drivers will make or break Battlemage.