
Integration of the CPU on the GPU, what can it yield?

Tags:
  • CPUs
  • GPUs
  • AMD
January 6, 2007 5:37:19 PM

Hi,

Many times we have talked about integrating the GPU onto the CPU, but we haven't really discussed the reverse: integrating the CPU onto the GPU (I mean a CPU and GPU on the same die, but with more of the real estate given to the GPU). What do you think its suitable applications are? AMD have already stated they are doing this. Do you think AMD can market such a product for desktops?


January 6, 2007 6:10:23 PM

Defeats the object of general purpose processors IMO.

Oh yeah and PC gamers are a small part of the market, although AMD don't seem to care about stuff like that *COUGH* 4x4.

Great for a console maybe...?
January 6, 2007 6:25:00 PM

AMD is putting the GPU in a CPU die.

I think you're referring to putting the CPU on the GPU PCB, or on the GPU die?

In which case the short answer would be latency issues, although as a stream processor or a plug-in system for server admins and diagnostics, I believe it's been done. I'm not 100% sure, though, so don't quote me.
January 6, 2007 7:20:40 PM

You just made me think of something else.
Last time I checked, I couldn't plug a monitor into my CPU. This will require some sort of motherboard support anyway.

I think it's a bad idea. My experience with integration is that it is cheaper but doesn't perform as well. Obviously this depends on certain things, but put it this way: if you just bought a Sony Bravia, would you care if it had an integrated DVD player? We already have integrated graphics too. It serves its purpose but isn't that great.

Motherboards are more integrated these days. We never used to have integrated sound and network and all the other things that come along with a mobo now. Those components aren't very expensive, though, and they aren't deciding factors in a build.

Putting the two most expensive parts in a system on one chip could prove a very bad idea.
January 6, 2007 8:46:26 PM

Quote:
You just made me think of something else.
Last time I checked, I couldn't plug a monitor into my CPU. This will require some sort of motherboard support anyway.

I think it's a bad idea. My experience with integration is that it is cheaper but doesn't perform as well. Obviously this depends on certain things, but put it this way: if you just bought a Sony Bravia, would you care if it had an integrated DVD player? We already have integrated graphics too. It serves its purpose but isn't that great.

Motherboards are more integrated these days. We never used to have integrated sound and network and all the other things that come along with a mobo now. Those components aren't very expensive, though, and they aren't deciding factors in a build.

Putting the two most expensive parts in a system on one chip could prove a very bad idea.


When you put things in that perspective, one thing comes to mind... bottleneck.

Dedicated hardware can only go so far before it becomes cost-inefficient; that's where integration comes into play, at the sacrifice of performance.

Trade-offs! What an itch.
January 6, 2007 9:37:19 PM

Quote:
Hi,

Many times we have talked about integrating the GPU onto the CPU, but we haven't really discussed the reverse: integrating the CPU onto the GPU (I mean a CPU and GPU on the same die, but with more of the real estate given to the GPU). What do you think its suitable applications are? AMD have already stated they are doing this. Do you think AMD can market such a product for desktops?


There's a reason CPUs are of a different architecture than GPUs.

GPUs are best at floating-point calculations, while CPUs have a much broader instruction set, are designed to multi-task well, and handle a large number of different I/O devices. GPUs are designed really just to accept graphics instructions.

In other words, the GPU lacks a lot of the functionality necessary to just "drop in" CPU features. The entire processor would have to be reworked. The CPU, on the other hand, is capable of doing most of what the GPU can do, albeit less efficiently. This is why Intel, IBM, and AMD are 'borrowing' architecture from GPUs and integrating it into CPUs for faster floating-point processing. The Cell processor looks more like a GPU than a CPU from an architectural point of view, as does the new 80-core processor from Intel.

The added CPU-like functionality would slow down a GPU, which is why they've remained separate until now.
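To put that difference in concrete terms (just my own rough C++ sketch, nothing from any vendor's documentation): the first loop below is the kind of uniform, branch-free floating-point work a GPU's wide arrays of stream processors are built for, while the second is the branchy, pointer-chasing control flow that a general-purpose, out-of-order CPU handles far better.

Code:
#include <cmath>
#include <cstddef>
#include <vector>

// GPU-friendly work: the same independent floating-point operation applied
// to every element, no branches -- exactly what wide shader arrays eat up.
void scale_and_bias(std::vector<float>& v, float scale, float bias) {
    for (std::size_t i = 0; i < v.size(); ++i)
        v[i] = v[i] * scale + bias;
}

// CPU-friendly work: data-dependent branching and pointer chasing, where
// out-of-order execution, branch prediction, and big caches earn their keep.
// 'next' holds the index of the following node, with -1 marking the end.
float walk_list(const std::vector<int>& next, const std::vector<float>& val) {
    float total = 0.0f;
    for (int i = 0; i != -1; i = next[i]) {
        if (val[i] > 0.0f)
            total += std::sqrt(val[i]);
        else
            total -= val[i];
    }
    return total;
}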
January 7, 2007 11:30:32 AM

I meant placing the CPU on-die with the GPU.

Look: http://www.dailytech.com/article.aspx?newsid=3471

Notice the graphics-centric part of the diagram. I remember AMD saying that such a product will be targeted at mobile phones and PDAs. What other applications can you think of? Won't the CPU be a bottleneck?
January 7, 2007 11:42:25 AM

Actually, it's pretty slick. You get the massive floating-point power of an ATi chip coupled with the out-of-order and general processing capabilities of an x86. Physics simulations and other compute-intensive programs would benefit tremendously from this. Basically you get the strong parts of an Athlon and a Cell with none of the drawbacks.
January 7, 2007 11:55:08 AM

I think it's an amazing idea, and at worst it would lead to better integrated graphics.
January 7, 2007 12:14:20 PM

But won't a CPU bottleneck the whole rendering process? Not much. As of now, Fusion is set to be released in 2009 for desktops, so it will have to support DX 10.1, which allows for fewer calls for reflections and refractions. For example, if you have 6 GPU cores and 2 CPU cores, here is how I think everything could be divided up (rough sketch below):

1 CPU core for preparing data for the GPU
1 CPU core for processing AI and game code
1 or 2 GPU cores for physics
4 or 5 GPU cores for graphics
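Here is a very rough C++ sketch of that split, purely to show the idea. The GpuCoreGroup type and the four worker functions are hypothetical stubs I made up, not anything AMD has announced:

Code:
#include <thread>

// All of these are hypothetical stand-ins for illustration only -- they map
// the 2 CPU core / 6 GPU core split described above onto parallel workers.
struct GpuCoreGroup { int first_core; int count; };

void prepare_gpu_data()         {} // CPU core 0: batch up work for the GPU
void run_ai_and_game_code()     {} // CPU core 1: AI and game logic
void run_physics(GpuCoreGroup)  {} // 1-2 GPU cores: physics kernels
void render_frame(GpuCoreGroup) {} // remaining GPU cores: graphics

int main() {
    GpuCoreGroup physics  {0, 2};  // 2 GPU cores for physics
    GpuCoreGroup graphics {2, 4};  // 4 GPU cores for rendering

    // Both x86 cores and both GPU partitions work on the frame in parallel.
    std::thread t_prep(prepare_gpu_data);
    std::thread t_game(run_ai_and_game_code);
    std::thread t_phys(run_physics, physics);
    std::thread t_gfx(render_frame, graphics);

    t_prep.join(); t_game.join(); t_phys.join(); t_gfx.join();
}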
January 7, 2007 12:26:48 PM

Might actually be a good thing for other devices that aren't PCs, like consoles or portable devices. But it just doesn't seem practical in the world of PCs; the worlds of GPUs and CPUs are totally different. New GPU architectures come out every 6 months, while CPU architectures change every 2-3 years or so. Also, what advantage would you be getting from something like that? Like everyone else said, it would more likely just bottleneck that CPU/GPU device.
January 7, 2007 2:09:05 PM

Ya, except this has absolutely nothing to do with graphics.
January 7, 2007 4:44:25 PM

Quote:
Hi,

Many times we have talked about integrating the GPU onto the CPU, but we haven't really discussed the reverse: integrating the CPU onto the GPU (I mean a CPU and GPU on the same die, but with more of the real estate given to the GPU). What do you think its suitable applications are? AMD have already stated they are doing this. Do you think AMD can market such a product for desktops?

The GPU portion of AMD's CPU will, IMO, in the mainstream be no more powerful than an integrated GPU. Granted, an integrated GPU in 2009 should be equal in performance to an ATI X1600. The big difference here is that once a standalone GPU is bought, the CPU can then use all those FPUs of the integrated GPU as its own. Currently your integrated GPU just goes to waste, and in some cases was a major disadvantage. Remember the integrated GPUs getting OCed when a standalone GPU was installed.

Motherboards will have to be made for the new APUs, and a new socket will be designed to accept the APU. The new motherboards will need nothing more than a video output for displaying the APU's video. This will lead to some cost reductions and major reuse of the integrated GPU, so I'm all for the change.

The history of the CPU has seen many such changes. The mid-'80s was marked by the CPU and its math co-processor, and both now make up today's CPU. If you look back farther, in the '70s it took 3 chips to make up the CPU's functions. In the future the CPU will have to take on more of the motherboard's functions or be bottlenecked. It's just that simple.

The only thing that could stop the adoption of motherboard components into the CPU is if the motherboard moves to fiber optics, or light-spectrum fiber optics, for bandwidth and latency. For this to occur, the CPU must move to fiber optics, or at least be able to convert electrical signals to light for transmission. IBM is currently working on a storage structure for light, which may be needed for this evolution to occur.

The CPU will change and change, but the one thing we can be sure of is that AMD has already tested a simplistic APU to prove its positive effects to management. Can AMD market the APU? I would say that if anyone wants to buy a new computer after 2009, it will be an APU, be it an Intel, AMD, or maybe even an Nvidia APU.
January 7, 2007 5:10:51 PM

It will be expensive for one thing.
January 8, 2007 10:45:26 AM

Elbert, please allow me to correct you: AMD Fusion will scale from the ultra low-end up to the midrange, just so you know.
January 8, 2007 3:32:30 PM

Quote:
According to Speed, Fusion will be targeted at mainstream and low-end computing - as long as graphics are concerned - initially


http://www.tgdaily.com/2006/10/25/fusion_details/

I do doubt that Fusion will be limited to the mid and low end, however. The number-crunching potential of a GPU-like architecture on a CPU is almost unfathomable, which would put Fusion in a league of its own for scientific computing.
January 8, 2007 5:05:15 PM

Personally, I think we are entering the next era of the co-processor. There is only so much that multiple general-purpose cores will be able to do efficiently. Eventually, managing the cores will become the bottleneck.

Enter the co-processor. Stream processing is a good, logical choice for a co-processor task these days. Give the co-processor lots of cache and independent access to main memory and you have something programmers (of applications, operating systems, drivers, etc.) can take advantage of. You could just offload a task to the subsystem and let it do its job. Add in some synchronization logic and away you go. You now have a highly parallel, multi-core main processor with a streaming engine, all on the same die. All of this can peacefully coexist with GPUs and other off-chip extensions. Everybody wins.

This is by far most appealing to the mobile marketplace, since it could pack so much power into such a small space. Eventually it could remove the need for extra silicon for sound and video processing, for example. Imagine a 1080p HDTV-capable cell phone or PDA.

The trick to all of this is being the one who drives the standards. Without standards this kind of stuff will never happen. If you set the standards, then you set the pace for innovation. What I'm thinking of is SIMD on steroids.
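A minimal sketch of that offload-and-synchronize idea, with an ordinary worker thread standing in for the hypothetical on-die stream engine (none of this is a real co-processor API):

Code:
#include <future>
#include <iostream>
#include <numeric>
#include <vector>

// Stand-in for a streaming kernel the co-processor would run: a simple
// element-wise multiply-accumulate (dot product) over large buffers.
float stream_kernel(const std::vector<float>& a, const std::vector<float>& b) {
    return std::inner_product(a.begin(), a.end(), b.begin(), 0.0f);
}

int main() {
    std::vector<float> a(1 << 20, 0.5f), b(1 << 20, 2.0f);

    // "Offload" the streaming work; std::async plays the role of the on-die
    // stream engine, which would have its own queue and sync hardware.
    auto pending = std::async(std::launch::async, stream_kernel,
                              std::cref(a), std::cref(b));

    // The general-purpose cores carry on with application/OS/driver work...

    // ...and synchronize only when the streamed result is actually needed.
    std::cout << "result = " << pending.get() << '\n';
}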
January 8, 2007 5:06:42 PM

It'll be a boon in the laptop sector, that's for sure. But in my humble, uninformed opinion, Fusion will most likely have Cell-like yields.
January 8, 2007 5:08:35 PM

Hello Mr Ninja.

I see you lost your holiday cap. Home yet, or still enjoying St Lucia?
January 8, 2007 5:16:19 PM

Still enjoying the sunny weather. And how about you guys back in the states? I hear there is a heat wave in the northeast?
January 8, 2007 5:24:55 PM

Quote:
Still enjoying the sunny weather. And how about you guys back in the states? I hear there is a heat wave in the northeast?


Oh yeah. Low 60s in upstate NY yesterday. I miss the snow, but I love the fact it's driving crude oil prices down!!! 2-3 more weeks before gas prices should start dropping. But no snow :cry:  I'm back to FL today. I missed the snow :x
January 8, 2007 5:30:25 PM

Quote:
Elbert, please allow me to correct you: AMD Fusion will scale from the ultra low-end up to the midrange, just so you know.

Just so you know, you should reply to my post when you're speaking to me. Correcting me is a little hard, as I never stated it wouldn't. I gave only one example in my last post of a Fusion CPU which, IMO, should be in the mainstream in 2009. If you were referring to an earlier post, please quote it. Work on how you present yourself in the forum. All I ask is that you reply to the correct person and, if possible, quote.
January 8, 2007 7:28:23 PM

Quote:
Work on how you present yourself in the forum.


Jesus Christ. You know, there are some people here who still amaze me.
January 8, 2007 8:08:20 PM

Quote:
Work on how you present yourself in the forum.


Jesus Christ. You know, there are some people here who still amaze me.
I would ask how your day has been, but if this amazed you it must have been bad.
January 9, 2007 10:21:12 AM

Quote:
Elbert, please allow me to correct you: AMD Fusion will scale from the ultra low-end up to the midrange, just so you know.

Just so you know, you should reply to my post when you're speaking to me. Correcting me is a little hard, as I never stated it wouldn't. I gave only one example in my last post of a Fusion CPU which, IMO, should be in the mainstream in 2009. If you were referring to an earlier post, please quote it. Work on how you present yourself in the forum. All I ask is that you reply to the correct person and, if possible, quote.

Sorry, I'll work on it.
January 9, 2007 1:44:52 PM

Do you think it would be appropriate to use such a chip alongside a regular CPU? The CPU integrated with the GPU could offload some tasks from the main CPU.
Another point is that the CPU won't be much of a bottleneck at high resolutions. Read the article 'GeForce 8800 needs the fastest CPU'; as I remember, it shows that at low resolutions the CPU might be a bottleneck, but at high resolutions AMD and Intel processors perform almost the same.
January 9, 2007 2:56:48 PM

Quote:
Do you think it would be appropriate to use such a chip alongside a regular CPU? The CPU integrated with the GPU could offload some tasks from the main CPU.
Another point is that the CPU won't be much of a bottleneck at high resolutions. Read the article 'GeForce 8800 needs the fastest CPU'; as I remember, it shows that at low resolutions the CPU might be a bottleneck, but at high resolutions AMD and Intel processors perform almost the same.


The DirectX API is CPU-intensive, and while DirectX 10 takes some strides to fix this, most games are not written with DirectX 10 in mind right now (it's too new). In this case, it does not matter where the GPU sits, on the bus or in a CPU slot; it will still be limited by the speed of the CPU in many cases.

Each call to the OS or any API requires CPU time, and the more function calls you need to make, the more reliance on the CPU you have. It can easily take thousands of API calls to render a single frame of a scene in a 3D application. This translates to a lot of CPU usage, even though many of the API calls just pass information to the memory on the GPU. The fix for this requires a new architecture that reduces the need for the CPU. As I said, DirectX 10 does some of this, but expect this to be an ongoing issue as the need for speed increases.
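A toy illustration of why the call count matters (the two functions below are stubs of my own standing in for real API entry points such as Direct3D's, not actual DirectX code):

Code:
#include <cstdio>

// Each call here stands in for a user-to-driver transition that a real
// graphics API call would cost; we just count how many happen per frame.
static long api_calls = 0;
void set_state_for(int /*object_or_batch*/) { ++api_calls; }
void draw(int /*primitive_count*/)          { ++api_calls; }

int main() {
    const int objects = 5000;

    // Naive path: one state change plus one draw per object, every frame.
    api_calls = 0;
    for (int i = 0; i < objects; ++i) {
        set_state_for(i);
        draw(1);
    }
    std::printf("naive:   %ld API calls per frame\n", api_calls);

    // Batched path: group objects that share state and draw them together.
    // Cutting calls like this is the direction DirectX 10 pushes, because
    // every call burns CPU time regardless of where the GPU itself sits.
    api_calls = 0;
    const int batch_size = 100;
    for (int b = 0; b < objects / batch_size; ++b) {
        set_state_for(b);
        draw(batch_size);
    }
    std::printf("batched: %ld API calls per frame\n", api_calls);
}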
January 9, 2007 3:43:03 PM

Quote:
Elbert, please allow me to correct you: AMD Fusion will scale from the ultra low-end up to the midrange, just so you know.

Just so you know, you should reply to my post when you're speaking to me. Correcting me is a little hard, as I never stated it wouldn't. I gave only one example in my last post of a Fusion CPU which, IMO, should be in the mainstream in 2009. If you were referring to an earlier post, please quote it. Work on how you present yourself in the forum. All I ask is that you reply to the correct person and, if possible, quote.

Sorry, I'll work on it.
OK, I may have been a little harsh, and for that I'm sorry. :(
January 9, 2007 4:25:47 PM

It could be really slick if it works out. It'll be a hell of a lot cheaper to make one die at one fab than it will be for AMD and ATi to manufacture separate dies and connect them across a bus. The issues I foresee are die size, cooling, and memory connections.

For die size, you've essentially got a quad-core-sized chip with half of the area given over to the GPU. The 65nm process shrink for the GPU side will help with that a lot (since they're still on 80nm or 90nm, I forget which). But it's still going to be big, and cooling is going to be a pain. Water cooling will probably become the norm at that point.

Hooking it up to memory is going to be trickier. I could honestly see them using a PCIe x16 slot for a riser card and some fast DIMMs. Regular DDR isn't going to cut it, though most tests have shown that the core clock is more of a limiter than the memory clock for graphics. Future revisions of HyperTransport will probably take care of this.

You just trade upgradability around a bit. You have to get a new CPU if you want a new GPU, but it'll be cheaper than buying both separately, and most people will be fine with that. Plus you could crank up the video RAM if you're running a little low, independent of the graphics core.