Future of computers: Integration or specialization

wickedmonster

Distinguished
Aug 25, 2006
70
0
18,630
With the recent ATI/AMD merger, there's talk of combining the CPU and GPU. But with AMD's Torrenza platform, there's also talk of using special chips for certain functions like math, graphics, or physics...

My question for the audience: do you think computers of the future will have many specialized chips, or one omnipotent MEGA-CORE processor?

My bet is on the latter. The math coprocessor died decades ago, and I think the same will happen with the GPU. In the future, computer makers will combine them into one powerful CPU. You can see it already with the Cell processor. The reason I think this is that computers today use too much energy. I remember when, not too long ago, 200W PSUs were considered large. Now 500W is common. At this rate, 1000W is not out of the question. By integrating all those functions onto one processor, you save energy. Of course performance will not be as good, but the power requirements for specialized hardware will be crazy!
 

slicessoul

Distinguished
Apr 18, 2006
771
0
18,980
IMO, it'll be INTEGRATION.

A few decades ago, the sound card, IDE or RAID controller, LAN, modem, and other cards were all just add-on cards, and mainboards back then were huge. As the years passed, IDE started being integrated onto the mainboard, then sound, and then some makers tried to integrate the GPU, but the quality was terrible because it shared system memory.
Today, mainboards are much smaller but packed with features. Sound, LAN, IDE/RAID, and other functions are all integrated on one mainboard. All you need to do is buy a processor, HDD, memory, video card, and power supply, and you can run your PC.

SPECIALIZATION will develop more slowly than INTEGRATION. Most people want a multi-purpose device. For example: why buy a fax machine, a scanner, and a printer separately when an all-in-one printer is cheaper and less hassle?

Integration of the CPU+GPU is a really bad idea; people won't have a choice then. Still, it's good that they merged, so AMD can move to a smaller production process for its processors and fit in more cores with less wattage and less heat.
 

1Tanker

Splendid
Apr 28, 2006
4,645
1
22,780
IMO, it'll be INTEGRATION.

A few decades ago, the sound card, IDE or RAID controller, LAN, modem, and other cards were all just add-on cards, and mainboards back then were huge. As the years passed, IDE started being integrated onto the mainboard, then sound, and then some makers tried to integrate the GPU, but the quality was terrible because it shared system memory.
Today, mainboards are much smaller but packed with features. Sound, LAN, IDE/RAID, and other functions are all integrated on one mainboard. All you need to do is buy a processor, HDD, memory, video card, and power supply, and you can run your PC.

SPECIALIZATION will develop more slowly than INTEGRATION. Most people want a multi-purpose device. For example: why buy a fax machine, a scanner, and a printer separately when an all-in-one printer is cheaper and less hassle?

Integration of the CPU+GPU is a really bad idea; people won't have a choice then. Still, it's good that they merged, so AMD can move to a smaller production process for its processors and fit in more cores with less wattage and less heat.
I hope not, because if everything is integrated into the "motherboard" (or whatever it would be called by then) and the mobo dies... you lose everything. At least now, if your mobo dies, or you want a better one, you can swap it out and still keep your current level of graphics, sound, etc.
 

slicessoul

Distinguished
Apr 18, 2006
771
0
18,980
Low end markets will take advantage of integration while mid and high end will stay with specialisation
I don't really agree with this.
IMO, integration will be applied from the low end all the way to the high end. With integration there will be more choices. People like to choose depending on their budget.

Specialization will stay in the high-end market.
Take a server mainboard, for example: it's a high-end mainboard because it's specialized and expensive, but on that server mainboard you can still find integration of a GPU (maybe not a high-end GPU), LAN, modem, memory, processor, and HDD.
Low-end market example: you build a $1000 PC (maybe less) and choose a mainboard that in general already has integrated LAN and sound, or even a modem. You buy your HDD, GPU, memory, processor, and power supply, and you can use your PC for games and as a server too.
 

Action_Man

Splendid
Jan 7, 2004
3,857
0
22,780
IMO, integration will be applied from the low end all the way to the high end.

Well, you'd be wrong. You only have to look at ATI's and Nvidia's lineups.

With integration there will be more choices.

Um, integration = less choice.

Specialization will stay in the high-end market.
Take a server mainboard, for example: it's a high-end mainboard because it's specialized and expensive, but on that server mainboard you can still find integration of a GPU (maybe not a high-end GPU), LAN, modem, memory, processor, and HDD.

People don't use server mobos for desktops.
 

slicessoul

Distinguished
Apr 18, 2006
771
0
18,980
Um, integration = less choice.
Maybe I should have said that integration with possibilities of expansion would make more choices?

Well, you'd be wrong. You only have to look at ATI's and Nvidia's lineups.
Yes, maybe I'm wrong, so can you fill me in on what I should look at in those lineups?

People don't use server mobos for desktops.
A specialized system is only for special occasions; everything about it is special, including the price, don't you think?
 
There's no silver bullet with this... the "future" will end up being much like it is today, with a mixed market of integrated and specialized products, processors, and platforms designed to meet the needs of a market niche and based on functionality. I don't think it's reasonable to say that the processor used in a multimedia cell phone or PDA will be the same processor used by a 3D CAD/video-rendering workstation, or that the same processor would be used in a desktop office machine as in a server in the data center.

I imagine it will be a mix of integrated and specialized processors and chips. The potential for combining the integrated and specialized procs into a platform is what's going to allow AMD/ATI, Intel, IBM, Texas Instruments, etc to create and market new products and services. I think there is far more flexibility and room for innovation with a mix of both.
 

wolfman140

Distinguished
Jun 6, 2006
297
0
18,780
I agree with a market mix... Machines like Dells or eMachines may be all integrated, but I think separate-component computers will live on strong. Someone already made the good point: if something dies, is your whole CPU/mobo combo dead? Geeks like us always want to upgrade or change our video cards and hardware, and limiting us by having it all integrated would be commercially suicidal for the entire industry. So no, I highly doubt integration is the future of all computers.
I bet they may sell integrated computers like that to businesses for workstations or audio/video, that kind of thing (as they already sort of do). But for the rest of us home PC people, I'm very confident it will remain a platform where you can change/swap GPUs, networking cards, etc.
 

m25

Distinguished
May 23, 2006
2,363
0
19,780
The funny thing is that integration and specialization are, in some ways, the same thing; the more specialized a PC is, the more integrated it becomes. The variety of choices will continue to grow as it has done until now; even if they fit a GPU into the CPU, you will then have a much larger variety of CPUs with different graphics: the CPU range multiplied by the GPU range.
The market evolves by demand, not by scheduled roadmaps (those only apply to the technology). We have all this variety of products now because they're all needed.
 

Cabletwitch

Distinguished
Feb 3, 2006
103
0
18,680
Interesting topic.

Certainly, integration will be used for business machines, as offices and the like have no need for fancy graphics or surround sound, let alone physics. The budget machines on sale in most shops will always feature a degree of integration, as this is how they stay so cheap.

As for mid-level to high-end, I'd like to believe specialisation will play a major role. After all, it's easier as a designer/manufacturer to concentrate on specific hardware than to try and cover a wide range of abilities and tasks.

I'd like to see the CPU taking a more administrative role in future computing, with discrete systems handling most of the work. A good example of this was the old Amiga series: the CPU in those was quite limited, and it was the chipset and co-processors that did most of the work. It also had the upshot of being able to do true multitasking without the CPU having to do all the work.

Now that we have faster interconnects for peripherals, it would be daft to try and integrate everything. Flexibility has always been a major part of the PC, and integrated systems usually lack this.

This post might seem a bit disjointed, and I'll probably come back and edit it later when I'm actually awake. But it's a good topic for debate, granted.
 

Action_Man

Splendid
Jan 7, 2004
3,857
0
22,780
Maybe I should have said that integration with possibilities of expansion would make more choices?

Specialisation with expansion provides more choices.

Yes, maybe I'm wrong, so can you fill me in on what I should look at in those lineups?

If you have to ask that question... well, I won't be a total prick about it. Look at the memory used, the clocks, the die sizes.

A specialized system is only for special occasions; everything about it is special, including the price, don't you think?

No.
 

nobly

Distinguished
Dec 21, 2005
854
0
18,980
Good post, good topic ---
My money is on heterogeneous multi-core processors --- Cell is a preview of things to come, in my opinion. AMD and Intel processors will segment along application lines rather than just market lines (i.e. mobile, DT, server).
What I mean by this is that some processors will be built to be real video powerhouses, with cores specifically designed to handle video/audio streams. Other processors will be 'gaming processors' with multi-core functionality for number-crunching physics, AI, etc., while others may be XML crunchers (for web and database applications).
Once we get to the point of 16 or more cores, the mix of specialized units can change the overall purpose of the processor. So what I see is a processor that can do everything, but you can buy various processors that are better at one thing than another just by increasing the core count for that one thing.
Jack
I agree in part.
I think people are realizing that once we start jumping up the core count, you're going to have lots of cores just sitting around doing nothing. Even in software, designing for multiple cores has its limits (and we don't want to promote crap coding practices either).

I see having tons of cores beginning to hit diminishing returns. Software can be threaded, but again, can we code for 100 cores? If I had a quad-core now, would it really be used? OK, how about an 8-core? Not really. I could probably use dual cores right now, but quad/octo is a little much for me.
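
A rough way to put numbers on that diminishing-return factor is Amdahl's law: if only part of a program can be threaded, extra cores stop helping pretty quickly. Here's a minimal Python sketch, where the 90% parallel fraction is purely an illustrative assumption:

# Amdahl's law: speedup on n cores when a fraction p of the work can be parallelized.
def amdahl_speedup(p: float, n: int) -> float:
    return 1.0 / ((1.0 - p) + p / n)

p = 0.90  # assume 90% of the workload threads perfectly (illustrative only)
for cores in (2, 4, 8, 32, 100):
    print(f"{cores:>3} cores -> {amdahl_speedup(p, cores):.2f}x speedup")
# Even at 100 cores, a 90%-parallel program tops out below a 10x speedup.

So unless the serial part of the work shrinks too, piling on identical cores buys less and less.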

It also boils down to the question of what a core is. Should we consider a core to be a full working processor that does everything (like today's cores)? Or should the Cell's cores be considered cores even without its primary core?

The way I see it, we're going to end up with more cores than we know what to do with. Then it'll be specialization within that core count: some cores will excel at integer functions, some at floating point, some will be great at general instructions, some will be specialized for 2D/3D, etc.

It'll be like the Cell processor, but able to do much, much more and not as specialized. Overall it would be a much more powerful processor, but it would still retain its 'general purpose' label, like today's processors.
To be clearer: it'll be like today, but we'll have 32 cores specialized into perhaps 4 packages of 8 cores each, which can handle much, much more than if you just had 32 identical cores sitting there waiting to do the same thing.
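
As a purely hypothetical sketch of that "4 packages of 8 cores" idea (none of these pool names come from any real product), the scheduler's job would basically be routing each task to the package that suits it, with a general-purpose pool as the fallback:

from concurrent.futures import ThreadPoolExecutor

# Hypothetical: four specialized "packages" of 8 cores each, modeled here as worker pools.
POOLS = {
    "integer": ThreadPoolExecutor(max_workers=8),
    "float":   ThreadPoolExecutor(max_workers=8),
    "media":   ThreadPoolExecutor(max_workers=8),
    "general": ThreadPoolExecutor(max_workers=8),
}

def submit(task, kind="general"):
    # Route the task to the matching specialized pool, or fall back to general purpose.
    return POOLS.get(kind, POOLS["general"]).submit(task)

# Example: a physics-style number-crunching job goes to the "float" package.
job = submit(lambda: sum(x * x for x in range(100_000)), kind="float")
print(job.result())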

I'd rather the GPU be separate, since it's a very visual piece of hardware. We're still striving to get to the point where GPUs are powerful enough to replicate the real world. And of course, we're always wanting a bigger screen to see that real/fake world on. GPUs still have a long way to go, and I think they should be separate in order to maintain upgradability/expansion.

Shrug, my 2 cents and ideas.
 

sojrner

Distinguished
Feb 10, 2006
1,733
0
19,790
With the recent ATI/AMD merger, there's talk of combining the CPU and GPU. But with AMD's Torrenza platform, there's also talk of using special chips for certain functions like math, graphics, or physics...

My question for the audience: do you think computers of the future will have many specialized chips, or one omnipotent MEGA-CORE processor?

My bet is on the latter. The math coprocessor died decades ago, and I think the same will happen with the GPU. In the future, computer makers will combine them into one powerful CPU. You can see it already with the Cell processor.
Yet even with all the potential of Cell, they still have a dedicated GPU in the PS3. I think that much of the "power" in an architecture like Cell (one main core with many specialized cores) is largely specialized in its own way. It may work great on some things, but for graphics/games it looks to be wasted. JMO, of course.

The reason I think this is that computers today use too much energy. I remember when, not too long ago, 200W PSUs were considered large. Now 500W is common. At this rate, 1000W is not out of the question. By integrating all those functions onto one processor, you save energy. Of course performance will not be as good, but the power requirements for specialized hardware will be crazy!

And that is why integrated will not replace specialized, stand-alone components. I agree that power use is high. There are already systems with 1kW PSUs, and that is insane to me as well. BUT I am not prepared to sacrifice quality for efficiency. I do not like integrated sound; you can hear so much noise from the mobo it's not even funny (even the venerable SoundStorm chip that was so great picks up tons of noise on the Asus mobo I have). I do not like integrated graphics because of the quality... well, just look at the GMA 950 from Intel and tell me you would be satisfied with that. (I would not even drop my 9700 Pro, a 4-year-old GPU, for that "modern" chip!)

In all fairness... integrated networking is hit or miss, but when it hits, it's great (thinking of the Nvidia controllers on that one).

The status quo is not changing any time soon. People who want or need performance and quality will not sacrifice them for integration and better power. The trick for power is: can you be efficient and increase performance at the same time, like Intel did with Core 2?
 

clue69less

Splendid
Mar 2, 2006
3,622
0
22,780
At this rate, 1000W is not out of the question.

You can find enthusiast power supplies at 1100 watts now.

WRT your main topic, I think you'd need to define what you mean by "future". If you think of the early '80s as the dawn of PCs, then we're looking at about 25 years of history for PC technology as most people know it now. I'm expecting to see significant evolution in integration and specialization in just the next 5 to 10 years. So if you're talking about the next 25 years as the future, that means one thing, whereas a look a couple of hundred years into the future is almost scary. So many variables, so many assumptions to be made...
 

joset

Distinguished
Dec 18, 2005
890
0
18,980
"Integration or specialization"

Perhaps the issue should be more like "Specialization vs. General Purpose" instead of "Integration".
In my opinion, specialization will co-exist with GP in all targets; it'll be, mostly, a matter of degree (features/performance/cost/...); specific niches (like CAD/simulation/multimedia/...) will be addressed... well, more specifically, as happens nowadays.
Cell is, perhaps, just a brief example of what might come; it's a close relationship between a GP processor and several specialized FP units. Take Intel's Core, for instance: its co-processing units are buried so deep in the microarchitecture that one can hardly call them "co-processors" (valid for most of today's CPUs).
"Integration" seems to be the trend: today's CPUs perform more specific tasks and are much more powerful and feature-rich than a decade ago, when specialized units were required to perform equivalent tasks.
But, among many other factors, the degree of integration also depends upon manufacturing processes, where power issues might dominate, not to mention the impact on platform re-design. Still, if Centrino-like integration were to set a new trend in specialized and GP clustering, I believe that, power-related issues aside, the closer the better.

As a side note, Intel and IBM (to my knowledge) have already addressed "stacked-die" CPUs as a viable improvement; AMD/ATi, more than ever, might be on the same track. It's also worth noting that integration is not limited to RAM/CPU/GPU and their respective interconnects: it encompasses ASICs, FPGAs, and comm chips as well...

My specific view on the subject, of course.


Cheers!
 

ThaCriminal

Distinguished
Aug 17, 2006
43
0
18,530
IMO, it'll be INTEGRATION.

A few decades ago, the sound card, IDE or RAID controller, LAN, modem, and other cards were all just add-on cards, and mainboards back then were huge. As the years passed, IDE started being integrated onto the mainboard, then sound, and then some makers tried to integrate the GPU, but the quality was terrible because it shared system memory.
Today, mainboards are much smaller but packed with features. Sound, LAN, IDE/RAID, and other functions are all integrated on one mainboard. All you need to do is buy a processor, HDD, memory, video card, and power supply, and you can run your PC.

SPECIALIZATION will develop more slowly than INTEGRATION. Most people want a multi-purpose device. For example: why buy a fax machine, a scanner, and a printer separately when an all-in-one printer is cheaper and less hassle?

Integration of the CPU+GPU is a really bad idea; people won't have a choice then. Still, it's good that they merged, so AMD can move to a smaller production process for its processors and fit in more cores with less wattage and less heat.
I hope not, because if everything is integrated into the "motherboard" (or whatever it would be called by then) and the mobo dies... you lose everything. At least now, if your mobo dies, or you want a better one, you can swap it out and still keep your current level of graphics, sound, etc.

All you have to do is get a new one, and you've got a whole new set of integrated components :p
 

Cabletwitch

Distinguished
Feb 3, 2006
103
0
18,680
IMO, it'll be INTEGRATION.

A few decades ago, the sound card, IDE or RAID controller, LAN, modem, and other cards were all just add-on cards, and mainboards back then were huge. As the years passed, IDE started being integrated onto the mainboard, then sound, and then some makers tried to integrate the GPU, but the quality was terrible because it shared system memory.
Today, mainboards are much smaller but packed with features. Sound, LAN, IDE/RAID, and other functions are all integrated on one mainboard. All you need to do is buy a processor, HDD, memory, video card, and power supply, and you can run your PC.

SPECIALIZATION will develop more slowly than INTEGRATION. Most people want a multi-purpose device. For example: why buy a fax machine, a scanner, and a printer separately when an all-in-one printer is cheaper and less hassle?

Integration of the CPU+GPU is a really bad idea; people won't have a choice then. Still, it's good that they merged, so AMD can move to a smaller production process for its processors and fit in more cores with less wattage and less heat.
I hope not, because if everything is integrated into the "motherboard" (or whatever it would be called by then) and the mobo dies... you lose everything. At least now, if your mobo dies, or you want a better one, you can swap it out and still keep your current level of graphics, sound, etc.

All you have to do is get a new one, and you've got a whole new set of integrated components :p

What, instead of simply replacing the one part that failed? I wouldn't assume for one second that things like RAM and the CPU are ever going to be integrated, certainly not for the mid-range to high-end. Otherwise, there would be a lot less of a market for upgrades. I'm also fairly certain that people will NOT want to buy a whole new system simply because the graphics chip has been improved or the sound system is better. Even in low-end systems, it's easier to have specialisation, as the consumer then has the choice of upgrading.

Many people I know have bought all-in-one systems in the past, and they were pissed off when they found they couldn't upgrade once the machine was obsolescent.
 

crazypyro

Distinguished
Mar 4, 2006
325
0
18,780
This topic is subjective in that established companies will try to integrate future specialized hardware into their current products to keep control of the market. We are currently seeing this with ATi and nVidia trying to do physics via multi-GPU configurations. They're utilizing their technology to integrate the PPU into the GPU, which has yet to be tested to see whether this method is better than Ageia's PPU. Now AMD/ATi are working towards the CPU-GPU; how the two will be intertwined to work together has yet to be seen, as does how much performance we can get out of it. And just recently I read an article on Ars Technica (here) about a dedicated AI processor. And of course we can't forget BigFoot Technologies' Killer NIC dedicated network card.

It's obvious that dedicated hardware is better than a multifunction device such as Cell, due to the master core being fought over by the other integrated cores. It is inevitable that computers will move towards integration as the public demands smaller, slimmer, and more stylish products. I know several people who bought computers, TVs, and other electronics based on looks over actual performance. I agree with one poster's comments: integration will rule the low-to-mid-end markets and OEM platforms, but specialized hardware will rule for performance seekers and task-heavy users. If only we can move things off the old PCI slots and onto PCI-X or PCI-e, so we can run our PPUs, X-Fis, AIPUs (artificial intelligence processing units), and our Killer NIC cards.

just my thoughts
 

wickedmonster

Distinguished
Aug 25, 2006
70
0
18,630
If Intel and AMD integrate the GPU+CPU, Nvidia won't survive even if they're faster. There won't be enough money to support the R&D required to make GPUs. Profit margins are low enough as it is. If Nvidia dies, so does discrete graphics.
 

Cabletwitch

Distinguished
Feb 3, 2006
103
0
18,680
Sure they will. Again, how many people will want to buy a new processor every time a new graphics revision comes out? The only GPUs that will EVER get integrated, if at all, will be the extremely low-end ones, in which case they will feature in embedded processors only and won't come near the mainstream home-user market.

Plus, of course, the more you integrate into a CPU, the more points of failure you have on the silicon. I don't think either AMD or Intel would take kindly to an increase in failure rates simply because a GPU failed, would you?

That's assuming the cost of each wafer is still anywhere near worth producing. Adding more to the die design means larger cores, which means fewer chips per wafer, thus reducing output. More wafers = more cost, and prices will rise in order to offset this extra expenditure. Not a fun prospect, really.
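
To put a rough number on the wafer argument: dies per wafer is roughly the wafer area divided by the die area, minus the dies lost around the circular edge. A quick back-of-the-envelope Python sketch, where the 300 mm wafer, the die areas, and the $5000 wafer cost are all just illustrative assumptions (and yield is ignored):

import math

def dies_per_wafer(wafer_diameter_mm: float, die_area_mm2: float) -> int:
    # Usable dies ~= wafer area / die area, minus an edge-loss correction.
    radius = wafer_diameter_mm / 2.0
    wafer_area = math.pi * radius ** 2
    edge_loss = math.pi * wafer_diameter_mm / math.sqrt(2.0 * die_area_mm2)
    return int(wafer_area / die_area_mm2 - edge_loss)

wafer_cost = 5000.0  # assumed cost of one processed 300 mm wafer (illustrative)
for die_area in (140.0, 280.0):  # e.g. a CPU die vs. the same die with a GPU bolted on
    n = dies_per_wafer(300.0, die_area)
    print(f"{die_area:5.0f} mm^2 die: ~{n} dies per wafer, ~${wafer_cost / n:.0f} each before yield")

Doubling the die area more than halves the good dies per wafer, which is exactly the cost pressure described above.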

Then, of course, there is the heat issue. GPUs run quite a bit hotter than most CPUs; mine can apparently peak close to 120 Celsius before throttling back. That's quite a bit hotter than even a Prescott. Not a pretty scenario. Oh yes, then you have the graphics RAM. Main RAM isn't really fast enough, and it also deprives the system of resources. This is again a bad scenario.
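
On the memory point, peak bandwidth is roughly the bus width times the effective transfer rate, and the gap between shared system RAM and dedicated graphics RAM in this era is large. The figures below are typical-looking assumptions for illustration, not exact product specs:

def bandwidth_gb_s(bus_width_bits: int, transfers_mt_s: int) -> float:
    # Peak bandwidth in GB/s = (bus width in bytes) * (millions of transfers per second) / 1000.
    return bus_width_bits / 8 * transfers_mt_s / 1000.0

system_ram   = bandwidth_gb_s(128, 800)   # dual-channel DDR2-800, shared by the whole system
graphics_ram = bandwidth_gb_s(256, 1600)  # 256-bit GDDR3 at 1.6 GT/s effective, dedicated to the GPU
print(f"System RAM:   ~{system_ram:.1f} GB/s")
print(f"Graphics RAM: ~{graphics_ram:.1f} GB/s")

An integrated GPU would be left fighting the CPU for the smaller of those two numbers.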

The last problem will be the core size. Integrate a GPU into a CPU, and the resulting package will be quite a hefty piece of kit. Look at the pin count on today's processors, then pretty much double it. Fancy building that socket and wiring it up? Thought not. There's only so much you can do with pin density, and adding more isn't the solution.

All in all, nVidia doesn't have to worry. It supplies the mobile market with GPUs, and the mainstream will never accept an integrated GPU/CPU solution as the be-all and end-all. Its sales will remain strong, and it will be able to compete. Innovation, not integration, is the key.
 

joset

Distinguished
Dec 18, 2005
890
0
18,980
(...) Plus, of course, the more you integrate into a CPU, the more points of failure you have on the silicon. I don't think either AMD or Intel would take kindly to an increase in failure rates simply because a GPU failed, would you?

That's assuming the cost of each wafer is still anywhere near worth producing. Adding more to the die design means larger cores, which means fewer chips per wafer, thus reducing output. More wafers = more cost, and prices will rise in order to offset this extra expenditure. Not a fun prospect, really.

Then, of course, there is the heat issue. GPUs run quite a bit hotter than most CPUs; mine can apparently peak close to 120 Celsius before throttling back. That's quite a bit hotter than even a Prescott. Not a pretty scenario. Oh yes, then you have the graphics RAM. Main RAM isn't really fast enough, and it also deprives the system of resources. This is again a bad scenario.

The last problem will be the core size. Integrate a GPU into a CPU, and the resulting package will be quite a hefty piece of kit. Look at the pin count on today's processors, then pretty much double it. Fancy building that socket and wiring it up? Thought not. There's only so much you can do with pin density, and adding more isn't the solution.

All in all, nVidia doesn't have to worry. It supplies the mobile market with GPUs, and the mainstream will never accept an integrated GPU/CPU solution as the be-all and end-all. Its sales will remain strong, and it will be able to compete. Innovation, not integration, is the key.

You've focused on some very good points which, taken as dead ends, would literally hamper any innovation at all levels.
Integration has been part of the computing lexicon ever since the first integrated circuit popped up; integration is part of innovation as a whole and does not contradict the "divide & conquer" approach. Both co-exist to varying degrees.
Actually, I think integration has been the subtle key, the underlying factor that drove "supercomputers" down to our hand-helds and, in the medium term, into our own bodies.
As I mentioned in a previous post, miniaturization, or the ability to transition from one process node to a smaller one (or from one technology to another, scale-wise), has allowed integration to happen smoothly and, despite some marketing hype, to be taken for granted, hardware- and software-wise (SSE units, FP units, and integrated power management are mere examples)...
Some more or less consistent claims concerning hard integration have been put forward by Intel and AMD (among others); Intel made some references to it at IDF 2005, and it would seem somewhat naive not to take it seriously, although not for the immediate future:

http://www.anandtech.com/printarticle.aspx?i=2368
http://www.anandtech.com/printarticle.aspx?i=2511
on stacked dies and on-chip North Bridge & Voltage Regulator;

or, in AMD's Analyst Day Platform Annoucements:

http://www.anandtech.com/printarticle.aspx?i=2768
on the Torrenza platform;

or, for a more down to earth, (yet expensive!) integration:

http://www.dailytech.com/article.aspx?newsid=1920
on the DRC co-processor, just to give some examples.

Perhaps it would be helpful to think of integration from the outside in, beginning with the platform itself; at least on the DT side of things, it's amazing to see the integration level of a plain, off-the-shelf mainboard...


Cheers!
 

Cabletwitch

Distinguished
Feb 3, 2006
103
0
18,680
:D Agreed. Miniaturisation and integration aren't always a bad thing, once the process technology has reached a point where it becomes viable. The big problem is when an attempt is made to combine two different technologies that have so far existed as separate devices onto one chunk of silicon.

Integration where certain technologies are used as PART of a larger device is certainly a good thing. Some nice examples are the onboard networking, sound, and RAID you get on most modern motherboards. Those certainly help by reducing the number of aftermarket cards needed, but usually you're stuck with whatever is supplied. Even so, these are still discrete devices, permanently attached rather than all combined onto one piece of silicon.

I've lost track of where I was going, give me a minute to recover :D

.......

Ahh yes. Rather than attempt to combine everything onto as few chips as possible, devices should remain discrete, in my opinion. Going back to the CPU/GPU scenario, it would be an arse if the graphics core developed a fault. What then? Either you'd have to buy a new combo chip for lots more money, or spend money to replace it with a dedicated GFX card, which would have been the better choice in the first place. Other devices like the onboard LAN are easy to replace if they fail, given that a standard 10/100 NIC can cost as little as £5 in some instances and give you decent performance.

I would certainly be in favour of the money and research going into improving communications between components and peripherals, rather than trying to cram everything into the smallest space possible. I'm all for multi-core CPUs, but the rest of the system should stay discrete and separate.