40% of the Haswell die is wasted on on-board video, wth?

VarmintCong

Honorable
Jun 1, 2013
12
0
10,510
Looking at the Haswell die, roughly 35% of its area is onboard graphics, and the real savings could be even bigger, since other parts of the die could also shrink if the video circuitry were gone.

So Intel could make a graphics-free processor for at least 35% lower cost, probably more, since a smaller die would improve yield and make better use of the wafer edges.

So let's say a graphics-free CPU could be 45% cheaper; a hypothetical "i5-4570K-P" would be around $110!
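Here's a rough back-of-the-envelope sketch of where that kind of number could come from. Every figure in it is an assumption for illustration (a ~180 mm^2 die, 300 mm wafers, a made-up wafer cost and defect density, and a simple Poisson yield model), not anything from Intel:

```python
import math

def dies_per_wafer(die_area_mm2, wafer_diameter_mm=300):
    # Classic approximation: wafer area / die area, minus an
    # edge-loss term for the partial dies around the wafer rim.
    radius = wafer_diameter_mm / 2
    return (math.pi * radius ** 2 / die_area_mm2
            - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

def poisson_yield(die_area_mm2, defects_per_cm2):
    # Fraction of dies that come out defect-free under a Poisson model.
    return math.exp(-(die_area_mm2 / 100) * defects_per_cm2)

WAFER_COST = 5000        # assumed cost of a processed 300 mm wafer, $
DEFECT_DENSITY = 0.10    # assumed defects per cm^2 on a mature process

for label, area_mm2 in [("full die (CPU + GPU)", 180.0),
                        ("GPU cropped off (-35%)", 180.0 * 0.65)]:
    gross = dies_per_wafer(area_mm2)
    good = gross * poisson_yield(area_mm2, DEFECT_DENSITY)
    print(f"{label:24s} {area_mm2:6.1f} mm^2  good dies/wafer: {good:6.1f}"
          f"  cost per good die: ${WAFER_COST / good:6.2f}")
```

On those made-up numbers the graphics-free die comes out roughly 40% cheaper per good die, which is the ballpark behind the 45% guess.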

Why don't they do this? I assume because they don't have to, since AMD is dropping the ball.
 
Jan 5, 2013
450
0
10,860
The only problem with that is that they would have to create a whole new manufacturing line for those CPUs, which would cost a lot of money, and they obviously think it wouldn't be worth the effort and expense. Honestly, it would only cater to us gamers, who are a very small minority of the market; integrated graphics are used by the general public (laptops, low-end desktops), and they are perfectly fine with that level of graphics power.
 

ihog

Distinguished


But who would buy an i5 4670K or an i7 4770K to only use the IGP?
 

USAFRet

Titan
Moderator


Lots and lots of people. Haswell is not some uber-superior CPU; it is simply the new mainstream part, and as such it will become the mainstream choice at Best Buy and TigerDirect. Buyers there will use just the onboard graphics.
 
Jan 5, 2013
450
0
10,860


I entirely agree about the K series, but the people who buy them are still a minority, so the initial startup cost to Intel would probably outweigh the benefit. And yes, it would be awesome if they did this; sadly, from a business standpoint it's probably not worth it.
 


I would, for an HTPC without a discrete GPU, to save power while still getting decent gaming performance.

As for the OP: Haswell. LGA1155 is dead; LGA1150 will carry Haswell and Broadwell, and then there will be a new socket.
 

VarmintCong

Honorable
Jun 1, 2013
12
0
10,510


It wouldn't cost a thing, because every circuit needed is already in the die, so they wouldn't have to change their processes at all until the packaging stage. Besides, they already sell dual-core versions of Ivy Bridge, and they even sell video-disabled versions.

So it's clearly not a cost issue; I think it's simply that a $100 quad-core 3570K or 4570K would cut into sales of the chip with video. Desktop buyers would just buy that $100 CPU and a $100 video card instead.
 

VarmintCong

Honorable
Jun 1, 2013
12
0
10,510


The industry sold 60 million discrete video cards last year. I'm guessing a good percentage of those were paired with Intel CPUs.
 

USAFRet

Titan
Moderator
So it's clearly not a cost issue,

No, it is clearly a cost/profit issue. Intel has determined that it is not worth it to develop and sell a graphics-free version.
If it were profitable, they would probably do so.

They didn't get to be big by being stupid. You may not agree with their decisions, but they are 'big'.
 

VarmintCong

Honorable
Jun 1, 2013
12
0
10,510


I didn't say profit, I said cost. They could make a video-free chip for half the cost and sell it for, say, $20 less, making even more profit than they do on the current chip.

So why not do this? If they made a die half the size and still charged $200 for it, the US government might accuse them of being a monopoly. This way, they can just say the chip is expensive to make because it's so big, which is true.

It could also be that because Intel has massive fab overcapacity, they don't save any money by making smaller chips.
 

VarmintCong

Honorable
Jun 1, 2013
12
0
10,510


There are dozens of process steps: steppers, etchers, deposition, plating, CMP, cleans, etc. The only thing that would need to change is the photomasks (the glass plates for the photo tool), redrawn with the graphics block cropped off. Everything else would run the same as the current chip. Packaging would of course be different. The design engineers wouldn't have to do anything; no redesign required.

It's trivial compared to, say, making a dual-core version of Haswell, and a tiny fraction of the cost of a die shrink. And it's a one-time cost, since you could run the wafers on all the same tools with the same recipes, except for the stepper (the photo tool).

What really undercuts your theory is that Intel sold a P version of Ivy Bridge, which had the full die but graphics disabled. So there were no cost savings from a 40% smaller die, yet they decided to sell it as a separate CPU anyway.
 

Snowsniper2

Honorable
Feb 27, 2012
226
0
10,710


People who don't wish to game on their machine or want low power consumption?


 

8350rocks

Distinguished


It's quite a bit more complicated than that... and the 3350P still had onboard graphics; they were just disabled. That doesn't mean the graphics weren't physically on the die.
 

Pherule

Distinguished
Aug 26, 2010
591
0
19,010
I have an i5 2500K. I also have a dedicated graphics card. I'm very glad my CPU has on-die graphics because it's a semi-decent fallback should the dedicated card fail.

This argument is stupid. It's well worth paying extra IMO for the on-die graphics.
 


No, not really. They would need to add a separate production line (assuming no excess capacity), which would be very expensive to set up. Initial costs would be over $1 billion for the investment in machinery and the plant.

If Intel actually has excess capacity, it would still take time and money to re-tool: likely several million dollars to retool the machinery. Additionally, shrinking the processor after removing the graphics core takes some time and money as well. Lastly, you would need to hire people to operate and supervise the production process.

All of that would be for a CPU that lacks a graphics core and appeals to a very small fraction of their customer base. OEMs would not want to buy it, because they want to cut costs: the included graphics core means they can build a PC or laptop without a discrete graphics card, which saves them money. For gaming-oriented rigs, they only have to add a graphics card and pass the cost (plus a profit margin) on to the customer who is looking for a gaming machine.

OEMs represent the biggest client base for Intel; hardware enthusiasts represent the smallest. While the actual material and time cost to manufacture a smaller CPU (i.e., one with no iGPU) would be lower, that does not take into account the cost of adding a new production line and equipment, or re-tooling existing equipment, for the new CPU derivative. A small production run means those costs can only be spread out among fewer CPUs, so the price of each CPU will be inflated by the costs Intel would need to recover even just to break even.

For simplicity's sake, let's say that in the first year it costs $150 million (direct expenses) to re-tool the machines, redesign the CPU to remove the iGPU, and operate the production line, and that there are 3 million enthusiasts (a much smaller group than the OEMs) interested in buying this "iGPU-less" CPU. That means each CPU carries $50 of direct costs that Intel will include in the selling price. This does not include indirect expenses (administrative costs, warehousing, distribution and selling, etc.), which are not insignificant, and remember to tack on, say, a 25%-35% profit margin for Intel. All of this adds to the final price of the CPU.
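To make that arithmetic explicit, here's a quick sketch using the same made-up figures ($150 million of direct costs, 3 million units) plus a couple of purely illustrative per-unit numbers that are my assumptions, not Intel's:

```python
FIXED_DIRECT_COSTS = 150_000_000   # assumed: re-tooling, redesign, running the line ($/year)
UNITS_PER_YEAR = 3_000_000         # assumed enthusiast demand for the iGPU-less chip

fixed_cost_per_cpu = FIXED_DIRECT_COSTS / UNITS_PER_YEAR   # the $50/chip above
print(f"Amortized fixed cost per CPU: ${fixed_cost_per_cpu:.2f}")

# Purely illustrative per-unit figures (not Intel's numbers):
unit_build_cost = 60.0    # assumed silicon + packaging + test per chip
indirect_per_cpu = 20.0   # assumed share of admin, warehousing, distribution

for margin in (0.25, 0.35):
    total_cost = fixed_cost_per_cpu + unit_build_cost + indirect_per_cpu
    price = total_cost * (1 + margin)
    print(f"With a {margin:.0%} margin: cost ${total_cost:.2f} -> price ${price:.2f}")
```

Even with those made-up per-unit figures, the amortized fixed cost alone eats up most of the savings you would hope to get from dropping the iGPU.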

Additionally, for a company the size of Intel, do you think they would bother with a 3 million unit production run per year?