knownalien

Distinguished
Jan 23, 2003
371
0
18,780
Recently I was reading at Tom's how AMD had this vision of the future CPU, but they talked of an APU (Accelerated Processing Unit, I think). Since ATI merged with AMD, do I understand correctly that the divisions between a GPU and CPU are to be crossed? Put another way, the current computer has a CPU and a video card that sits in a PCIe slot (sometimes two) with its own RAM and GPU. These GPUs are very powerful at number crunching and dwarf what CPUs can do. Games rely on your CPU and GPU primarily. OS, HD, and RAM are important too, but the CPU and GPU are the most important. In my mind, we will one day buy a motherboard where the "video card" is integrated, but its GPU will be something that can be purchased and upgraded. Possibly there will be slots for several types of 'PUs that can all be upgraded. Is this what I understand to be the future?
 

BaronMatrix

Splendid
Dec 14, 2005
6,655
0
25,790
Recently I was reading at Tom's how AMD had this vision of the future CPU, but they talked of an APU (Accelerated Processing Unit, I think). Since ATI merged with AMD, do I understand correctly that the divisions between a GPU and CPU are to be crossed? Put another way, the current computer has a CPU and a video card that sits in a PCIe slot (sometimes two) with its own RAM and GPU. These GPUs are very powerful at number crunching and dwarf what CPUs can do. Games rely on your CPU and GPU primarily. OS, HD, and RAM are important too, but the CPU and GPU are the most important. In my mind, we will one day buy a motherboard where the "video card" is integrated, but its GPU will be something that can be purchased and upgraded. Possibly there will be slots for several types of 'PUs that can all be upgraded. Is this what I understand to be the future?

Well, standalone GPUs aren't going anywhere for a long time. Because of the size of the GPU chip and its RAM, it will be difficult, if not impossible, to get both into one socket before 32nm.

There will be levels of this: certain APUs will go into a separate socket, and certain ones will be on the die with the CPU. When the GPU is on the same die, it will be more for FP work than for graphics. Once DDR4 comes out for CPUs, it may be possible to share RAM that is fast enough.
 

ajfink

Distinguished
Dec 3, 2006
1,150
0
19,280
Fusion is putting a GPU core on a processor; Torrenza is having specialized sockets for co-processors.

Fusion will increase the FPU power of processors -significantly-. Video encoding times will plummet, F@H speeds will skyrocket, and cheaper laptops will be had by everyone.

But this is still a few years off, and technical challenges have to be overcome, so I suppose it's still somewhat hypothetical rather than definite.
 
Recently I was reading at Tom's how AMD had this vision of the future CPU, but they talked of an APU (Accelerated Processing Unit, I think). Since ATI merged with AMD, do I understand correctly that the divisions between a GPU and CPU are to be crossed? Put another way, the current computer has a CPU and a video card that sits in a PCIe slot (sometimes two) with its own RAM and GPU. These GPUs are very powerful at number crunching and dwarf what CPUs can do. Games rely on your CPU and GPU primarily. OS, HD, and RAM are important too, but the CPU and GPU are the most important. In my mind, we will one day buy a motherboard where the "video card" is integrated, but its GPU will be something that can be purchased and upgraded. Possibly there will be slots for several types of 'PUs that can all be upgraded. Is this what I understand to be the future?
It will be more or less like an integrated GPU is today at the low end. The difference will be that, when using a standalone GPU, the CPU can utilize the integrated GPU's FPUs. This makes integrated-GPU systems cheaper while at the same time allowing reuse of the integrated GPU once a standalone GPU is added. This goes back to an article I read in which AMD's CTO Phil Hester stated the CPU will use the FPUs of other cores as its own by way of extensions.
http://www.anandtech.com/cpuchipsets/showdoc.aspx?i=2565
Note both "On-chip Coprocessors" and "FPU Extensions to AMD64". The article is dated October 14th, 2005, so the "coming soon" is already here with the 1207-pin Socket F, and the future goals would seem to fit the timetable of Fusion. We think of integrated GPUs as crappy GPUs, but even today's integrated GPUs can easily deliver FP throughput on the scale of a CPU.
 

knownalien

Distinguished
Jan 23, 2003
371
0
18,780
I just think about how many parts on a video card could easily be fitted onto an ATX mobo. There are a few things that really need to die off on today's mobos, like floppy drive sockets and serial connections. The RAM chips on a video card are tiny compared to the gigantic system RAM. Right now, when people think of "integrated" graphics they usually think "very slow." And it's true, but I think that cards can use some real estate on mobos. I think of the things many of us do not use . . . like all 8 Serial ATA ports.

In any event, I think one thing has to happen, which I think will be a good thing: we have to move to a better standardized cooling method. Air has so many problems, like noise and dust. A good water-cooling system augmented by really good heatpipes could, I think, usher in some really progressive designs. I know from my own water-cooling setup that the biggest headache is monitoring the fluid to make sure it doesn't get too low or grow some type of algae.
 

BaronMatrix

Splendid
Dec 14, 2005
6,655
0
25,790
I just think about how many parts on a video card could easily be fitted onto an ATX mobo. There are a few things that really need to die off on today's mobos, like floppy drive sockets and serial connections. The RAM chips on a video card are tiny compared to the gigantic system RAM. Right now, when people think of "integrated" graphics they usually think "very slow." And it's true, but I think that cards can use some real estate on mobos. I think of the things many of us do not use . . . like all 8 Serial ATA ports.

In any event, I think one thing has to happen, which I think will be a good thing: we have to move to a better standardized cooling method. Air has so many problems, like noise and dust. A good water-cooling system augmented by really good heatpipes could, I think, usher in some really progressive designs. I know from my own water-cooling setup that the biggest headache is monitoring the fluid to make sure it doesn't get too low or grow some type of algae.


Yeah, the problem with, say, RAM on the mobo is that the design has to be different from Intel's. If anything, they would put a small amount of eDRAM on the die and connect to main RAM that way. Since 4GB PCs will be commonplace by the end of next year (MS will be pushing x64 heavily next year), it will be easy to give a GPU 512MB of DDR3 in '08.

I think water cooling will become more commonplace too, as heatpipes are already showing up on the major boards. GPU companies are actually starting to ship with water cooling. There are also solutions that help keep the tubing clean.

Fortunately, though, Sun and IBM are supposedly going to make HT-compatible chips that will plug into Opteron sockets. This will make it more feasible to adopt Fusion and Torrenza designs, at least at the server level.

From there it's just a matter of producing inexpensive platforms for the desktop using ccHT.
 

Retardicus

Distinguished
Aug 21, 2006
49
0
18,530
As far as I know, AMD's acquisition of ATI is a direct response to Intel's Gesher CPU. I think they talked about it back at the '06 Spring IDF. Gesher is supposed to be Intel's first CPU with on-die graphics cores, a 32nm chip scheduled for 2010.

Yeah, the problem with, say, RAM on the mobo is that the design has to be different from Intel's. If anything, they would put a small amount of eDRAM on the die and connect to main RAM that way.

There is no way Intel or AMD would waste on-die CPU real estate on eDRAM for main graphics memory. That just doesn't make sense from a cost standpoint, and it's not needed for performance either. You couldn't even fit that much eDRAM at 32nm.

I'm pretty sure Intel explained how their method works: the CPU and graphics cores communicate with main memory through the integrated memory controller on the die. Whether the graphics system gets its own channel of FB RAM, I dunno.

I would imagine that AMD would do some sort of variation on that methodology.
 

Retardicus

Distinguished
Aug 21, 2006
49
0
18,530
I vaguely remember Gesher in an article; I need to go back and study up on this... thanks for the explanation.

I did a quick search to see if I could dig up where I got this info, but couldn't find it. Maybe I pulled it out of my ass, I can't remember. Either way, it seems to make sense.
 

Nitro_mule

Distinguished
Aug 15, 2006
4
0
18,510
I realise that the application-specific unit/stream processor (whatever you like to call it) is quite interesting, and it is quite obvious that there is going to be a gold rush in the hardware computing industry if we want to satisfy our ever-increasing hunger for more performance.

To date, many on- or off-die solutions have been designed or planned, such as IBM's Cell, ATI's or nVidia's stream processors, AMD's Fusion/Torrenza, Intel's Terascale, or exotic FPGA stuff like the Nallatech Revolarc or Starbridge Systems' hypercomputer.

However, if programmers ever want to capitalize on these powerful systems destined for the mass consumer market, there will have to be some sort of standardisation, right? Just like the x86 ISA and its extensions unify CPUs from Intel and AMD. For the Cell in the PS3 it's pretty straightforward: there is only one hardware configuration, and even then, game programmers still complain that it will take some time to exploit Cell's full potential.

My point is, we're on the brink of setting new standards that could be as big as x86 was back in the old days. It may prove to be an interesting fight between the different companies involved...
 

BaronMatrix

Splendid
Dec 14, 2005
6,655
0
25,790
As far as I know, AMD's acquisition of ATI is a direct response to Intel's Gesher CPU. I think they talked about it back at the '06 Spring IDF. Gesher is supposed to be Intel's first CPU with on-die graphics cores, a 32nm chip scheduled for 2010.

Yeah, the problem with, say, RAM on the mobo is that the design has to be different from Intel's. If anything, they would put a small amount of eDRAM on the die and connect to main RAM that way.

There is no way Intel or AMD would waste on-die CPU real estate on eDRAM for main graphics memory. That just doesn't make sense from a cost standpoint, and it's not needed for performance either. You couldn't even fit that much eDRAM at 32nm.

I'm pretty sure Intel explained how their method works: the CPU and graphics cores communicate with main memory through the integrated memory controller on the die. Whether the graphics system gets its own channel of FB RAM, I dunno.

I would imagine that AMD would do some sort of variation on that methodology.

The ATi Xenos chip in the Xbox 360 has 10MB of eDRAM.
 

evilr00t

Distinguished
Aug 15, 2006
882
0
18,980
As far as I know, AMD's acquisition of ATI is a direct response to Intel's Gesher CPU. I think they talked about it back at the '06 Spring IDF. Gesher is supposed to be Intel's first CPU with on-die graphics cores, a 32nm chip scheduled for 2010.

Yeah, the problem with, say, RAM on the mobo is that the design has to be different from Intel's. If anything, they would put a small amount of eDRAM on the die and connect to main RAM that way.

There is no way Intel or AMD would waste on-die CPU real estate on eDRAM for main graphics memory. That just doesn't make sense from a cost standpoint, and it's not needed for performance either. You couldn't even fit that much eDRAM at 32nm.

I'm pretty sure Intel explained how their method works: the CPU and graphics cores communicate with main memory through the integrated memory controller on the die. Whether the graphics system gets its own channel of FB RAM, I dunno.

I would imagine that AMD would do some sort of variation on that methodology.

The ATi Xenos chip in the Xbox 360 has 10MB of eDRAM.
Backed up with 512MB of 700MHz GDDR3 super-fast texture memory, yes.
In case you don't understand what I meant: the eDRAM is NOT used for textures, it's used for the frame buffer.
 

JMecc

Distinguished
Oct 26, 2006
382
0
18,780
GPUs are good at doing vector/matrix math, something unused by most programs. It is very useful for graphics and scientific computation, though, so people doing a lot of vector calcs in computer science, genetics, math... want to use the GPU for those calcs. The CPU is still a better general-purpose processor, though, so AMD wants to get the best of both worlds by using a [hopefully customizable] mix of some GPU and some CPU for use by the main system.

Jo
 

halbhh

Distinguished
Mar 21, 2006
965
0
18,980
Though we have to wait to see it, my impression from reading that great article is that you'll have a *lot* of "APU" sockets on the motherboard; like, 4 would be small. 6, 8...??

So, if you want gaming, you drop in, say, 6 graphics APU chips and 2 CPU chips in your 8 sockets....

This is why the standalone card might disappear eventually.

Also, this is why AMD's 4x4 might someday evolve into something attractive to many.
 