
Future of die shrinks- 130nm ---> ???? processor size

October 13, 2007 1:00:16 PM

I was just wondering something: back in the day, process sizes were 130nm or something of that order, and now Intel is getting ready to launch its 45nm processors. Eventually, what happens?
Will it just plateau out at 15nm, for example?


I understand that each time they drop to a smaller process they can get more chips out of one wafer and reduce heat, but I'm guessing they need ever-smaller transistors to do it.

Thanks, and sorry if this is very basic.
October 13, 2007 2:22:55 PM

I am guessing that 12-15nm is the smallest die shrink they will pursue on silicon. Any added transistor count at that point will come from "gluing" dies together the way Intel did with its quad-core to improve yields. When the thermals get too high with that method, they will move to multiple sockets with their own HSFs, the way servers are set up. When all of that is outgrown, games will probably ship on multiple HD-DVDs offering "almost real" looking game environments, a direction Crytek seems to be heading in.

I am guessing that once games hit "movie quality" people will be less inclined to upgrade as much, tapering off high-end demand. Systems today are WAY more than enough for office use NOW, which is the largest market for computers anyway.

The only growth in semiconductors at any scale that can sustain AMD and Intel with their current business model will be entertainment-based, and I don't see that model changing soon.

To put it simply: when you go and buy a DVD burner now, do you even pay attention to how fast it burns? My burner can burn at 8x and I always set it manually to 4x because I want it to burn properly. So if they released a 200x burner with 8 optical lasers reading multiple tracks, etc., they don't really have my interest unless I can get it for $45. I struggle to fill a 45-cent DVD with data, so I'd say they are facing stagnation.

Microsoft struggles with bloatware to keep the industry going at its current pace, and everyone knows this. RAM fabs were drooling at the prospect of everyone needing 4GB of DDR2, but the consumer found that XP with 1GB of RAM is more than enough for what they "need". As a result, RAM companies have excess inventory.

The greatest challenge in this industry is finding talent that can "use" this hardware by developing games and software that people perceive a need for.

So, if you wanted to, you could mathematically work backwards to find the stagnation point for computer hardware.

Take the mainstream screen size, relative to the average home, that a consumer would buy. Double it, for good measure.

Quadruple the resolution of today's HD-DVD, also for good measure.

Reasonably figure out the texture detail that would be required to "trick" the eye at 10 feet, at the above resolution, into thinking it was a movie.

At that point you can figure out the bandwidth, the processing required to push all those pixels, and so on.

There are hundreds of people on this site who could calculate this at their desk in a few minutes and come out with very precise figures. That would NOT be me; I am still, by most accounts and in my own opinion, a noob.
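Just to give a feel for the bandwidth step, here is a rough back-of-the-envelope sketch in Python. The figures are assumptions picked for illustration (a 3840x2160 frame, i.e. four times the pixels of 1080p, 24-bit colour, 60 fps), not numbers from the post above:

```python
# Back-of-the-envelope: raw bandwidth needed to push a "quadrupled HD" picture.
# Assumed figures (not from the post): 3840x2160 frame, 24-bit colour, 60 fps.

width, height = 3840, 2160        # assumed: double 1080p in each dimension
bytes_per_pixel = 3               # assumed: 24-bit colour, no alpha
frames_per_second = 60            # assumed refresh rate

bytes_per_frame = width * height * bytes_per_pixel
bytes_per_second = bytes_per_frame * frames_per_second

print(f"Per frame:  {bytes_per_frame / 2**20:.1f} MiB")
print(f"Raw stream: {bytes_per_second / 2**30:.2f} GiB/s "
      f"({bytes_per_second * 8 / 1e9:.1f} Gbit/s)")
```

That works out to roughly 12 Gbit/s uncompressed, before any texture or geometry work, which gives a feel for how far off the "trick the eye" target still is.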

Now here is my point, and it is the "clincher". I just downloaded a crappy 750MB demo for COD4, and at the last part of the demo I started to "sense" that my brain was interpreting it as movie quality. And don't get me wrong, COD4 is by NO MEANS movie quality, LOL. But the lighting and filtering engines game developers are using are getting very good.

Now, COD4 is nowhere near as detailed as Crysis. I saw a trailer that looked like a Rambo movie, where his gun barrel WAS PUSHING VEGETATION AROUND. I was like "holy crap."

So, I think the market will "settle" at 64 cores, a 64-bit OS, and 64GB of RAM, just because they are "classic" scaling figures, and some things make sense, like each core having 1GB of its "own" RAM. This is ONLY a guess that I am TOTALLY pulling out of my a33. But I can ALMOST guarantee that this would be enough for as close to a theatrical gaming experience as needed.

I just can't see average-level computing "needing" more power than what is required for "near" movie graphics. So the day a game is released and you think it is a new movie coming out is the day processors stagnate the way sound cards and DVD burners have.
October 13, 2007 4:32:52 PM

nawfal said:
I was just wondering something: back in the day, process sizes were 130nm or something of that order, and now Intel is getting ready to launch its 45nm processors. Eventually, what happens?
Will it just plateau out at 15nm, for example?


I understand that each time they drop to a smaller process they can get more chips out of one wafer and reduce heat, but I'm guessing they need ever-smaller transistors to do it.

Thanks, and sorry if this is very basic.

I think the hard wall limit for a gate is 1.5nm, but that is of course only if they solve the leakage and tunneling problems. Not getting below 15nm, however, isn't a problem, as 3D lithography will allow thousands of cores to be stacked on top of each other, like a building with elevators connecting each core. 3D lithography could also be used to separate high-heat sections by putting them on different layers. Future CPU cores may look more like an elevation map, as opposed to the 2D ones of today.

The materials of a CPU will also be important, as a superconducting material would allow us to use the entire die at a reasonable GHz with no leakage. For example, on 45nm some structures need to be up to 130mm apart to reduce heat.
Here are some good articles on the subject.
http://news.zdnet.com/2100-9584-5112061.html
http://www-03.ibm.com/press/us/en/photos.wss?topic=407
http://www.scienceblog.com/cms/big-blue-crows-over-3d-chip-advance-13006.html
October 13, 2007 5:39:50 PM

Thanks for that, so it will basically plateau out.

falken699 - I often thought too that there would be some point where we wouldn't need upgrades anymore. I think that point has come for the average desktop user; they don't need more than 2GHz. Will gamers be enough to push the industry forward? We will see.
October 13, 2007 6:03:25 PM

nawfal - I hope we are wrong; sometimes I find the hardware more fun than the games!!!

But yes, I do believe a "good enough" plateau will be reached with computers. Just look at the HD-DVD vs. Blu-ray battle... Meanwhile the average consumer is perfectly happy with regular DVD on a $49 player. Oh, and the movies still suck, Hi-Def or not, plus there's all the DRM BS the consumer has to pay for (HDCP) and new players costing $$$.

Three letters are the biggest fear in the tech industry:

MEH.

"But you can simulate in 4x real-time 15 Hiroshima bombs detonating in this simulation in the background while playing Solitaire!"

MEH.
October 13, 2007 6:35:30 PM

By the time they reach 20nm (at least a decade and a half from now), quark-based processors will be accepted as worthy of R&D by the big boys, and a few years later they will be the shiny new option, making nm size obsolete. Also, keep your eye out for light-based processors.

Intel and AMD (and others like VIA) know about the limits set by physics and the smallest nm size that is still viable, and as such they are already planning for the next revolution/evolution of processors. So the real question is not how small, but what?

Quark-based CPUs?
or
Light-based CPUs?

Although light-based CPUs may use quark technology for their gates.

Ciao.
October 13, 2007 7:01:11 PM

WOW! born, are them things there turbocharged? :p
October 13, 2007 7:24:35 PM

Not turbocharged, but a necessary evolution.

Gates that keep getting smaller and smaller have to use different materials and technologies, and quark- and/or light-based CPUs solve this and many other problems, including heat buildup.

Also, light-based CPUs can communicate faster than electron-based CPUs due to far less signal loss or leakage.

We will still have to wait at least 15 years or so.

PlayStation 7?
October 13, 2007 9:26:14 PM

You will ALWAYS need to upgrade eventually; it's called evolution. The PC will evolve just like the human does. It hasn't stopped in 30 years and I doubt it will in the next 30. Around 1980 people would say something like "imagine what it will be like to play computer games as if they were real", and the next person would reply "yeah right, like that will ever happen"... Get my point?
October 13, 2007 9:48:25 PM

Yes, but eventually graphics will get lifelike - and then there isn't much more improvement to be had.

Anyone who played Pac-Man in the '80s knew that graphics had a long way to go. Now we are getting much closer.
October 13, 2007 10:00:02 PM

When falken is happy with his usual screen size x2, I'll be using the full-wall display. Pretty much all high-tech stuff hits rock-bottom prices after a few years, and then it's low tech.
I do agree with what he says about the DVD stuff, though.
As for tech stagnation, nah, I can't see it. Although I think audio is getting there. I've got 5.1 and 7.1 setups; 5.1 is manageable but 7.1 is just hassle. Every now and then I will set the speakers up in the living room for the best sound and seating positions to watch a film, but it's just hassle.
October 13, 2007 11:48:04 PM

I would guess that within the century computers will be fast enough to produce a 3D image with more detail than the eye can process, in better than real time. Sound will be produced through 1000+ channels commonly built into wall panels, floor tiles, and the ceiling, giving a truly 3D audio experience. The projector for the image will be built into a floor tile. Computers will be able to calculate the most complex physics problems people encounter in daily life, down to the movement of an eyebrow hair as sweat drips across your forehead or the movement of each fly in a forest scene, let alone the ability to interact in whatever way you choose with every item in the digital space, and those items' interactions with each other. Fire and water will look exactly as they do in real life. The only calculations that I feel will still be out of reach, even for the supercomputers of the time, will be the human mind and huge environmental simulations on the scale of a planet or larger, in which real life is predicted exactly, down to where each raindrop will hit the ground. (This may take 150 years.) Though AI may not be perfect, its programmed intelligence will still be good enough to beat all but the most well-trained person at any activity, and even then it will be a struggle for the person, who will need many restarts.

What more could a person want than better-than-real-life graphics and sound, AI smart enough to beat everyone, next-to-real simulations, and enough space to store every piece of knowledge known to man? However, that 3D video at 1,000,000 x 1,000,000 x 1,000,000 pixels really eats storage space fast. They had better get working on some new storage solution. At least it does not have to be portable, as wireless data transfer will be able to beam your info anywhere within an AU in about 8 minutes.
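Just to put rough numbers on how fast that kind of volumetric video would eat storage, here is a quick sketch; the per-voxel size and frame rate are assumptions picked for illustration, not figures from the post:

```python
# Rough sanity check of the storage appetite of a 1,000,000^3 voxel video.
# Assumed figures (not from the post): 24-bit colour per voxel, 60 frames/sec.

voxels_per_frame = 1_000_000 ** 3   # 1,000,000 x 1,000,000 x 1,000,000
bytes_per_voxel = 3                 # assumed: 24-bit colour
frames_per_second = 60              # assumed volumetric frame rate

frame_bytes = voxels_per_frame * bytes_per_voxel
stream_bytes = frame_bytes * frames_per_second

print(f"One frame:  {frame_bytes / 2**60:.2f} EiB")   # exbibytes
print(f"One second: {stream_bytes / 2**60:.0f} EiB")
```

Under those assumptions a single frame is already a couple of exbibytes, so yes, some new storage solution would be in order.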

Once all this technology can fit into a 12" cube, I feel that no one will need to upgrade ever again.
October 14, 2007 5:02:45 AM

When they infuse my brain into a 3.5" bay and shove it in a server rack with all the other people of my city, and let me play Harvest Moon, Diablo II + Acts VI-X, and Commander Keen all day, I'll be satisfied.
October 14, 2007 2:41:58 PM

Not a lot has happened in the PC desktop market over the past 20 years. All we've seen is an exponential evolution of the same processes. Heck, PCs today still do the same things they did 10 years ago, only now we have more choices.

October 15, 2007 12:59:44 PM

Are quark-based CPUs in the same area as quantum computing? I read something about QC the other day that was interesting. They're making some breakthroughs there.
October 15, 2007 1:51:13 PM

Actually, I remember back in the days of the 486 CPU, manufacturers were saying that CPU manufacturing tech would scale down to a minimum of 80nm, because below that threshold there would be quantum effects such as electrons leaving their path due to the small size of the nanowire. Companies have worked through that problem by experimenting with various materials, and every day we hear about new tech on the subject. Still, with 45nm here (almost), 32nm CPUs announced (IBM), and 16nm slated for 2010, a lot remains to be seen. God, I wish I had a time machine so I could leap forward a decade or two!!
December 12, 2008 12:37:09 AM

In reply to bornking:

Light may be possible, but quarks definitely not, due to the fact that they cannot exist individually. Look it up on this wiki page:
http://en.wikipedia.org/wiki/Quark
I know it's on Wikipedia, but I've looked into it further and this is right.

Maybe you're going one step too far. Why not try something like protons, which have charge and spin, meaning that they can be manipulated and moved; they are also far smaller than a nanometre (nm). A proton can be manipulated as in nuclear magnetic resonance (NMR) spectroscopy, where the spins of the protons in a material sample (proton properties at http://en.wikipedia.org/wiki/Proton) are aligned in the same direction using a strong magnetic field, and the protons are then flipped using an RF pulse of the correct frequency before relaxing back to their ground state (see http://en.wikipedia.org/wiki/NMR). By the way, a proton is an ionised hydrogen atom, i.e. a hydrogen atom without the electron. For that matter, the electron can be, and is being, used today in a modern spectroscopy called electron spin resonance, or electron paramagnetic resonance (see http://en.wikipedia.org/wiki/Electron_spin_resonance). Maybe analogues of these techniques could lead to new technologies, as a lot of research has already been carried out.
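For anyone curious, here is a minimal sketch of what that "correct frequency" means in NMR: the proton resonance (Larmor) frequency scales with the magnetic field. The field strengths in the example are arbitrary illustrations, not values from the post:

```python
# Minimal sketch of the proton Larmor (resonance) frequency: f = gamma * B,
# where gamma/(2*pi) for protons is about 42.577 MHz per tesla.
# The field strengths below are just illustrative examples.

PROTON_GAMMA_MHZ_PER_TESLA = 42.577

def larmor_frequency_mhz(field_tesla: float) -> float:
    """Proton resonance frequency in MHz for a given magnetic field in tesla."""
    return PROTON_GAMMA_MHZ_PER_TESLA * field_tesla

for b in (1.5, 3.0, 11.7):          # example magnet strengths (tesla)
    print(f"{b:4.1f} T -> {larmor_frequency_mhz(b):6.1f} MHz")
```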
December 12, 2008 1:14:30 AM

^Wow. Necroed.

But just to let you know, Intel and USC have created a way to put lasers that constantly emit light on silicon.
December 13, 2008 8:58:25 AM

They'll figure out a way; they are very smart. Intel is getting back into metal gates, etc.