
Die Fabrication

May 4, 2010 6:01:33 PM

Why are CPU and GPU manufacturers shrinking their die fabrication process in steps, like 65nm to 55nm, 55nm to 45nm, etc., rather than going from 65nm straight to, say, 1 or 2nm? If they are ultimately going to shrink it, why don't they shrink it by a large amount all at once? Are there any practical difficulties? Expert opinions awaited...


May 4, 2010 6:52:24 PM

So basically you are asking why technology, and specifically die sizes, does not progress faster?
May 4, 2010 7:16:17 PM

I'm no expert on the technical side of things, but I'm guessing it's not that simple trying to fit more and more features onto a smaller die.

That being said, I'm sure (from a cynical point of view at least) it has a lot to do with money. By reducing things in steps instead of one huge leap, they are getting consumers to purchase everything in between. They know how enthusiastic people can be when it comes to computer technology, so they are ensuring strong prospects for future business. Simply put, why show your hand straight away when you can make money by drawing out the process?

Similarly, I've often heard that the technology exists to manufacture light bulbs that never fail. In reality, though, no manufacturer will ever build them, since nobody would need to buy light bulbs again and they would immediately put themselves out of business. It's the same principle.

In reality it's probably a combination of both of the above points.

May 4, 2010 8:51:38 PM

Technology can't just advance like that; engineers need a ton of research to reduce transistor sizes and make them work. I don't think we have any technology that could even attempt to make transistors at 1 or 2nm.
May 4, 2010 9:24:21 PM

My understanding is that these points are established by the International Technology Roadmap for Semiconductors (ITRS).

Like every industry, they learned that some sort of standard structure is beneficial for long-term research and development.

So yes, somebody could just decide to do a 35nm node, but there is a lot more industry support if you stay on the roadmap.
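A side note on why the roadmap steps look the way they do: a full node step has historically shrunk linear feature sizes by roughly 1/√2, which halves the area per transistor. A quick sketch in Python (the starting node and step count are illustrative, not taken from any actual roadmap):

```python
# Each full process node shrinks linear feature size by ~1/sqrt(2),
# halving the area per transistor. Illustrative numbers only.

def node_sequence(start_nm, steps, shrink=0.5 ** 0.5):
    """Generate successive process nodes from a starting feature size."""
    nodes = [float(start_nm)]
    for _ in range(steps):
        nodes.append(nodes[-1] * shrink)
    return nodes

# Starting from 90nm, four full-node shrinks land near the familiar
# 65/45/32/22nm ladder (half-nodes like 55nm fall in between).
for n in node_sequence(90, 4):
    print(f"{n:.0f}nm")
```

Half-node steps like 65nm to 55nm are smaller optical shrinks of an existing process, which is why they show up between the full nodes.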
May 7, 2010 4:35:29 AM

Well, I feel that the manufacturers are using the same basic technology to shrink the fabrication process; otherwise, those products would have compatibility issues with current-generation hardware. I find this strange because the manufacturers themselves benefit from a smaller fabrication process as much as the customers do. It reduces the cost per die and lets them use cheaper heatsinks, since a smaller die generates less heat. Customers also benefit from lower power consumption.
May 7, 2010 5:12:20 AM

They probably could, if you were OK with paying $800,000 for a CPU. It's not about what's the smallest transistor that's scientifically possible to produce; it's about what's the smallest that's possible to MASS produce and get working for an affordable price. There is a huge difference between what you can make one at a time in a lab under an electron microscope, and what you can hook up a machine to make millions of with a low failure rate.

Also ... let's say they did come out with a CPU using 2nm technology tomorrow. What would you do with it - play Crysis at 1.5 million FPS? There's no program written, and none anywhere near being written (at least in a mass-market sense) that would be even close to making use of that kind of power.

In other words, going by huge leaps and bounds is not only difficult and expensive, but it's pointless unless the other technology that's tied to it has time to improve also.
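On the "low failure rate" point above: a common first-order way to reason about manufacturing yield is a Poisson defect model, where the chance a die has zero random defects drops exponentially with die area times defect density. A rough sketch (the defect densities below are assumed for illustration, not real process data):

```python
import math

def die_yield(die_area_cm2, defects_per_cm2):
    """Poisson model: fraction of dies expected to be defect-free."""
    return math.exp(-die_area_cm2 * defects_per_cm2)

# A mature process has driven defect density down; a brand-new node
# starts with far more defects, so most dies are scrap at first.
mature = die_yield(1.0, 0.2)    # assumed mature-process defect density
bleeding = die_yield(1.0, 3.0)  # assumed bleeding-edge defect density
print(f"mature: {mature:.0%}, bleeding-edge: {bleeding:.0%}")
# → mature: 82%, bleeding-edge: 5%
```

That gap between 82% and 5% good dies is the difference between a profitable product and an $800,000 CPU.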
May 7, 2010 5:16:02 AM

ramx said:
Well, I feel that the manufacturers are using the same basic technology to shrink the fabrication process; otherwise, those products would have compatibility issues with current-generation hardware. I find this strange because the manufacturers themselves benefit from a smaller fabrication process as much as the customers do. It reduces the cost per die and lets them use cheaper heatsinks, since a smaller die generates less heat. Customers also benefit from lower power consumption.

Each time they shrink the die, the manufacturing process changes, which in turn means new equipment, and that is not cheap. So you are wrong about these die shrinks costing less: engineers first have to build the equipment to produce the product, and that cost gets passed on to the consumer.
May 7, 2010 11:36:57 AM

ramx said:
It reduces the cost per die and lets them use cheaper heatsinks, since a smaller die generates less heat.

We saw this happen recently. The stock Intel heatsink supplied with the 65 nm Core 2 CPUs is about twice the size of the one supplied with the 45 nm CPUs.
May 7, 2010 6:39:01 PM

I think the engineers could look a few steps ahead and design a much smaller die. If they need new equipment even to shrink the fabrication process in small steps, they could probably modify that equipment to achieve a bigger die shrink. As for performance, there is no upper limit; nobody complains about higher performance.
May 7, 2010 10:53:24 PM

capt_taco said:

Also ... let's say they did come out with a CPU using 2nm technology tomorrow. What would you do with it - play Crysis at 1.5 million FPS? There's no program written, and none anywhere near being written (at least in a mass-market sense) that would be even close to making use of that kind of power.



I think this is false; just because you reduce the process does not mean you automatically get extra performance.

Intel could eventually make a Pentium 4 at 2nm and change nothing about it other than the process, and it wouldn't really be any faster than the Pentium 4s they had years back.

You're familiar with the concept of supercomputers? The kinds of scientific computations they must do, and the kinds of work server farms must endure, would easily max out a proper futuristic 2nm CPU.
May 8, 2010 12:09:19 AM

protokiller said:
I think this is false; just because you reduce the process does not mean you automatically get extra performance.

Intel could eventually make a Pentium 4 at 2nm and change nothing about it other than the process, and it wouldn't really be any faster than the Pentium 4s they had years back.

You're familiar with the concept of supercomputers? The kinds of scientific computations they must do, and the kinds of work server farms must endure, would easily max out a proper futuristic 2nm CPU.

They could ramp up the clock frequency though. Finally, they can hit 10 GHz like planned :D 
May 8, 2010 12:44:46 AM

uncfan_2563 said:
They could ramp up the clock frequency though. Finally, they can hit 10 GHz like planned :D 



Yes, they probably could hit 10GHz and add 20MB+ of L2 cache!

My point in response to Capt. Taco was just that even if we had CPUs that were 2000x as powerful as the ones we have today, we could max them out very easily doing things like folding or putting them in complex servers.

May 8, 2010 1:23:46 AM

They don't skip because it's incredibly hard just to maintain their current pace, let alone skip a whole bunch of nodes in between. At every new step, new technologies and processes have to be developed. A new process is incredibly complex and poses massive hurdles that have to be overcome.

Point is, they don't skip very far because they *can't*. Even if they wanted to throw all their might at going from 32nm to 12nm, it would take many, many years, anyway.
May 9, 2010 1:48:16 AM

protokiller said:
Yes, they probably could hit 10GHz and add 20MB+ of L2 cache!

My point in response to Capt. Taco was just that even if we had CPUs that were 2000x as powerful as the ones we have today, we could max them out very easily doing things like folding or putting them in complex servers.


Right, that's why I said there's no mass market use for them. I'm sure there are plenty of ways to use it, but just none that millions of people are going to want. That's why you don't see many people buying supercomputers.

I'm also assuming that if they went to something like 2nm technology, at the very least they'd also be putting thousands of times more transistors on the chip for a corresponding performance increase. Otherwise, that would be a big waste of time.
May 9, 2010 1:51:19 AM

capt_taco said:
Right, that's why I said there's no mass market use for them. I'm sure there are plenty of ways to use it, but just none that millions of people are going to want. That's why you don't see many people buying supercomputers.

I'm also assuming that if they went to something like 2nm technology, at the very least they'd also be putting thousands of times more transistors on the chip for a corresponding performance increase. Otherwise, that would be a big waste of time.



Heh I think there isn't a mass market need for most of the processors out there when you consider the average home user just browses the web and checks email while watching the occasional youtube video. But that's a different subject entirely.
May 9, 2010 1:52:54 AM

For most users a Core 2 processor is plenty of power, and they have no need for newer chips.
May 9, 2010 1:58:00 AM

saaiello said:
For most users a Core 2 processor is plenty of power, and they have no need for newer chips.



Yep, hell, I'd even say a Pentium 4 or newer Celeron is more than enough for most people.

I remember how mad I was a few years back when my high school bought a bunch of new computers with Core 2 Duos in them... just for taking attendance. That's right, their lives would be spent checking bubbles for the tardy or absent kids. What a waste of CPUs that could have done so much more.
May 9, 2010 7:52:35 AM

It's technologically infeasible. You can't guarantee that existing materials can even be used to make a microchip at such a process node as 1-2nm, and you can't magically pick materials that will work out of the air. It would be like asking why Henry Ford didn't make cars that could do 0-100kph in 0.5 seconds.

R&D looks ahead several process nodes. You'll find that most semiconductor companies are working on 16nm prototypes, or at least paper designs, but probably have nothing even on paper for anything smaller.

The second most important (and essentially irrelevant due to the previous) reason why this won't happen is due to cost. Companies try to follow Moore's Law because it makes them the most money. Moore's Law states that the number of transistors that can be placed on an integrated circuit at the lowest cost per transistor doubles approximately every two years. You could certainly more than double the number every two years but doing so would mean higher cost per transistor, and the same is true if you do less than double.
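That doubling cadence is easy to put in numbers. A minimal sketch, where the starting transistor count and time span are illustrative assumptions rather than any real product roadmap:

```python
def transistors(start_count, years, doubling_period=2.0):
    """Project transistor count at a Moore's Law doubling cadence."""
    return start_count * 2 ** (years / doubling_period)

# Ten years at a two-year doubling period is five doublings, i.e. 32x:
# an assumed 300 million transistors grows to about 9.6 billion.
print(f"{transistors(300e6, 10):.2e}")  # → 9.60e+09
```

The economic point stands either way: deviate from that cadence in either direction and the cost per transistor goes up.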