Future CPUs - just traffic cops?

riksta

Distinguished
Mar 30, 2005
42
0
18,530
I guess this is a spin-off of the Ageia topic, but what gives? I recently heard somewhere that an AI card is also in development for games. And just last week there was news of someone programming a GPU to do video encoding (I think?) that performed the task five times faster than an Athlon FX-53.
All those unused PCI slots may have some use after all. Will the future CPU just act as an information traffic cop, sending graphics tasks to the GPU, physics to the PPU, and AI to the... AIPU? It may well be that IBM/Sony's Cell solves all this with its SPEs, which I believe can operate as separate or cooperative processing units and take on any role, like several Samuel L. Jacksons jumping from Jackie Brown to Star Wars in the blink of an eye. Or are we going to end up back in the '80s with too many directions and choices?
 

dorion

Distinguished
Nov 10, 2005
92
0
18,630
CPU
GPU
SPU (or whatever you call a sound processor)
PPU
AIPU (I think this is funny: think "eye-pew")

And many more, all specialized. If they can be power efficient, it might be better than our current setup.
 

mpjesse

Splendid
You have made a very astute observation. I often stay up at night thinking about what the future holds in computing. This will definitely keep me up tonight.

My first and immediate thoughts are these:

I think you're on to something. So many tasks are being offloaded to other devices that you have to wonder where future CPUs fit in. Currently, though, this is only happening in gaming. The only other major thing (as you mentioned) that's been offloaded to GPUs is video encoding. ATI's R520 chips have built-in encoding that is (or is about) to be enabled by a new driver, and yes, it is up to 5x faster than an Athlon FX. Given that GPUs are now more complex than CPUs and have dedicated memory (at ridiculous speeds), I think this was bound to happen eventually.

That said, I think trying to apply a GPU to any other use (say audio processing or AI) may not work. Trying to get a GPU to do complex predictions (like AI) would be like trying to get an artist to think logically. Now I'm off on a tangent. Anyway, what I mean is that I think GPUs will always be confined to video and graphics tasks. The only exception I can see is maybe a video card manufacturer ADDING a separate AI chip or physics chip.

Over the years a lot of things have been integrated into both CPUs and north/south bridges. The nForce 4 chipset, for example, does almost everything that separate chips used to do in the past: audio, LAN, RAID, IDE/SATA, PCI, PCI-e, video (in some cases), and a ton of other crap.

CPUs can't claim too much, but MMX, SSE/2/3, 3DNow!, the no-execute bit, and a slew of other things have been thrown into CPUs as well.

In the '80s and '90s we had separate cards and chips for everything mentioned in the above two paragraphs. Although it looks like it, I don't think we're going backwards. As I said before, specialized cards and processors have been confined mostly to gaming (thus far).

Another good observation is Sony's Cell and its SPEs. Sony may be on to something there... even if their processors are only ever used in the context of gaming and digital media.

Who knows what the future may hold? Did anyone around here imagine it would be possible to have 1/2 TB hard drives, 512MB of video memory, 52X CD burners, 16X DVD burners, 6Mbit cable broadband, 800MHz GDDR RAM, and 4GB of memory 8 years ago? I sure didn't, at least not this soon. (The biggest thing I never thought would happen so quickly is megabit broadband.)

Great topic. :)

-mpjesse
 

hergieburbur

Distinguished
Dec 19, 2005
1,907
0
19,780
Great thoughts. I think everything we have been seeing lately, i.e. PCI Express, AMD's HyperTransport and integrated memory controller, and vastly improved GPUs, all points to your comments. Engineers have been trying for years to eliminate the FSB bottleneck in PCs, and it seems to me that these innovations are all an attempt to distribute tasks and alleviate bus communications while improving available bandwidth. I think in the future, CPUs will be not much more than ALUs, memory controllers, and, as you say, "traffic cops". That would truly make the CPU the "brain" of the PC, as it was intended to be. I think the Sony/IBM Cell architecture is a big step in that direction with the way it distributes tasks.

I can't wait to see what we see in the next few years, especially with some of the new materials coming out.
 

hergieburbur

Distinguished
Dec 19, 2005
1,907
0
19,780
With the correct programming, GPUs can be used for pretty much anything a CPU can. They are very similar devices, with GPUs having an instruction set optimized for handling graphics. I think that is the future of Cell processors: adaptable multicore processors that lend themselves to wherever the need is greatest. That will, however, probably be a programmer's nightmare!
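
To make that a bit more concrete, here's a rough sketch of what a general-purpose (non-graphics) GPU task could look like, assuming a C-style GPU kernel toolkit along the lines of NVIDIA's CUDA. Right now you'd have to express this through pixel shaders instead, so treat the whole thing, names and all, as an illustration rather than working code for today's cards:

// Sketch: a general-purpose task on the GPU -- no triangles or pixels involved.
#include <cstdio>
#include <cuda_runtime.h>

// Each GPU thread handles one array element.
__global__ void scale_and_add(const float* a, const float* b, float* out, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) out[i] = 2.0f * a[i] + b[i];
}

int main() {
    const int n = 1 << 20;                     // a million elements
    float *a, *b, *out;                        // buffers visible to both CPU and GPU
    cudaMallocManaged(&a, n * sizeof(float));
    cudaMallocManaged(&b, n * sizeof(float));
    cudaMallocManaged(&out, n * sizeof(float));
    for (int i = 0; i < n; i++) { a[i] = 1.0f; b[i] = 2.0f; }

    int threads = 256;
    int blocks = (n + threads - 1) / threads;  // enough blocks to cover every element
    scale_and_add<<<blocks, threads>>>(a, b, out, n);
    cudaDeviceSynchronize();                   // wait for the GPU to finish

    printf("out[0] = %f\n", out[0]);           // expect 4.0
    cudaFree(a); cudaFree(b); cudaFree(out);
    return 0;
}

The point is just that nothing in there knows about triangles or pixels; it's plain data-parallel arithmetic, exactly the kind of thing a CPU would otherwise grind through one element at a time.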
 

riksta

Distinguished
Mar 30, 2005
42
0
18,530
I think using Cell as the one and only all-round CPU would initially be a programming nightmare. If you look at the PS3, the Cell's SPEs are essentially co-processors alongside its IBM Power core (the PPE), and I believe that Power core will probably do the majority of the work in early PS3 titles, especially from third-party game houses and in games ported across consoles and PC, because those share a similar architecture. I think Sony are keeping the real development tools, the ones that really access the Cell's full potential, to themselves for in-house game development (Gran Turismo, etc.). Let's face it, in the gaming industry money is made off games and not consoles. I don't mean to dwell on the games industry, but along with porn and the military, it is one of the main driving forces in the IT industry, as far as innovation goes.
 

mpjesse

Splendid
"Our test implementation currently only supports 8-bit mixing due to limitations of frame buffer depth and blending operations on the hardware at our disposal. However, recent GPUs support extended resolution frame-buffers and accumulation could be performed using 32-bit floating-point arithmetic using pixel shaders. With performance comparable to optimized (and often non-portable) software implementations, GPU pre-mixing can be implemented using multi-platform 3D graphics APIs. When possible, using the GPU for audio processing will reduce the load on the main CPU and help balance the load between CPU, GPU and APU."

That came from the article you posted. It's interesting that they were able to do this... I wonder if there was any real performance benefit?
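
Just to picture what they're describing: pre-mixing is basically summing samples across voices, so as a data-parallel job it's one accumulator per output sample. Here's a hedged sketch of that idea, again assuming a C-style GPU kernel language rather than the pixel-shader blending the article actually used; the voice layout, names, and normalization are my own inventions for illustration:

// Sketch of the article's idea: mix many voices on the GPU, accumulating in
// 32-bit float instead of 8-bit frame-buffer blending. Layout and names are
// invented for illustration, not taken from the article.
#include <cstdio>
#include <cuda_runtime.h>

// One thread per output sample; each thread sums that sample across all voices.
__global__ void mix_voices(const float* voices, float* mix,
                           int num_voices, int samples_per_voice) {
    int s = blockIdx.x * blockDim.x + threadIdx.x;
    if (s >= samples_per_voice) return;
    float acc = 0.0f;                           // full float precision, no 8-bit clamp
    for (int v = 0; v < num_voices; v++)
        acc += voices[v * samples_per_voice + s];
    mix[s] = acc / num_voices;                  // naive normalization
}

int main() {
    const int V = 64, S = 48000;                // 64 voices, one second at 48 kHz
    float *voices, *mix;
    cudaMallocManaged(&voices, V * S * sizeof(float));
    cudaMallocManaged(&mix, S * sizeof(float));
    for (int i = 0; i < V * S; i++) voices[i] = 0.5f;

    int threads = 256, blocks = (S + threads - 1) / threads;
    mix_voices<<<blocks, threads>>>(voices, mix, V, S);
    cudaDeviceSynchronize();
    printf("mix[0] = %f\n", mix[0]);            // expect 0.5
    cudaFree(voices); cudaFree(mix);
    return 0;
}

Each thread keeps its sum in full 32-bit float, which is exactly the precision problem they said 8-bit frame-buffer blending runs into.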

-mpjesse
 

riksta

Distinguished
Mar 30, 2005
42
0
18,530
I guess if you factor dual or even quad SLI into this picture, things could get interesting performance-wise on the GP-GPU front?
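
For what it's worth, graphics SLI splits frames between cards automatically, but for GP-GPU work you'd more likely address each card yourself and hand each one a slice of the job. A rough sketch of that, with the device API, kernel, and sizes all assumed for illustration (CUDA-style again):

// Sketch: hand each GPU a slice of one big job. Everything here (device API,
// kernel, sizes) is assumed for illustration.
#include <cuda_runtime.h>

__global__ void crunch(float* data, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) data[i] = data[i] * data[i];     // stand-in for real work
}

int main() {
    int devices = 0;
    cudaGetDeviceCount(&devices);
    if (devices < 1) return 1;
    if (devices > 16) devices = 16;             // cap to the size of bufs below

    const int n = 1 << 22;
    int chunk = n / devices;                    // one slice of the job per GPU
    float* bufs[16] = {0};

    // Kernel launches are asynchronous, so the GPUs end up working in parallel.
    for (int d = 0; d < devices; d++) {
        cudaSetDevice(d);                       // direct the next calls at GPU d
        cudaMalloc(&bufs[d], chunk * sizeof(float));
        int threads = 256, blocks = (chunk + threads - 1) / threads;
        crunch<<<blocks, threads>>>(bufs[d], chunk);
    }
    // Wait for every GPU to finish, then clean up (a real program would copy
    // results back first).
    for (int d = 0; d < devices; d++) {
        cudaSetDevice(d);
        cudaDeviceSynchronize();
        cudaFree(bufs[d]);
    }
    return 0;
}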