
CUDA to kill x86

June 19, 2008 6:09:43 PM

Probably not the right place to put this, but oh well...

Does it look to you, like it does to me, that we're getting ready to see the end of x86?

CUDA offers much more processing power, cheaply.

CUDA is, I believe, an open language.

I read that CUDA can do C/C++, which is what all the games are written in.

What's to keep someone from coming up with a good CUDA OS (maybe Apple, if they're smart) and leaving MS and Intel eating dust?
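
For what it's worth, "CUDA can do C" refers to "C for CUDA": ordinary C/C++ with a few extensions, not a whole new language. Here is a minimal sketch (the kernel name and sizes are just illustrative, not from any real project):

// Minimal "C for CUDA" sketch: plain C plus a few extensions
// (__global__, the <<<...>>> launch syntax). Names are illustrative.
#include <cstdio>

__global__ void add_one(float *data, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;  // one thread per element
    if (i < n) data[i] += 1.0f;
}

int main() {
    const int n = 1024;
    float *d;
    cudaMalloc(&d, n * sizeof(float));              // allocate on the GPU
    cudaMemset(d, 0, n * sizeof(float));
    add_one<<<(n + 255) / 256, 256>>>(d, n);        // launch 1024 threads
    cudaDeviceSynchronize();
    float h;
    cudaMemcpy(&h, d, sizeof(float), cudaMemcpyDeviceToHost);
    printf("first element: %f\n", h);               // expect 1.000000
    cudaFree(d);
    return 0;
}

Note, though, that all the interesting work happens in the parallel kernel; the serial parts still run on a host CPU.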


June 19, 2008 6:17:02 PM

In your wet dream
June 19, 2008 6:23:01 PM

pulasky said:
In your wet dream


You seem to be overestimating how excited I get about computer architecture.
June 19, 2008 6:27:09 PM

Kill it...
No.

Move towards merging a CPU and GPU...
Yes.

The GPU's mass of simple cores is great for many number-crunching tasks, but it will never replace the conventional CPU. I would predict that, as technology progresses, we will see either a GPU with an integrated CPU or vice versa. The CPU is too multi-purpose to kill off completely.
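
To make the split concrete, here is a rough sketch (plain C++, function names made up): the first loop is embarrassingly parallel and maps well to a GPU's simple cores, while the second has a serial dependency chain that extra cores can't help with.

// Illustrative contrast: GPU-friendly vs. CPU-style work. Names are made up.
#include <vector>

// Embarrassingly parallel: every element is independent, so thousands of
// simple GPU cores could each take one element.
void scale(std::vector<float> &v, float k) {
    for (size_t i = 0; i < v.size(); ++i)
        v[i] *= k;
}

// Serial dependency chain: each step needs the previous result, so extra
// cores don't help; this is where a fast conventional CPU wins.
float chain(const std::vector<float> &v) {
    float acc = 0.0f;
    for (size_t i = 0; i < v.size(); ++i)
        acc = acc * 0.5f + v[i];   // depends on the previous iteration
    return acc;
}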
June 19, 2008 8:46:20 PM

outlw6669 said:
Kill it...
No.

Move towards merging a CPU and GPU...
Yes.

The GPU's mass of simple cores is great for many number-crunching tasks, but it will never replace the conventional CPU. I would predict that, as technology progresses, we will see either a GPU with an integrated CPU or vice versa. The CPU is too multi-purpose to kill off completely.


Intel did this in 1999. No one was interested, so they canceled the project. It was a CPU, GPU, and memory controller on the same die.
June 20, 2008 3:55:28 AM

shadowduck said:
Intel did this in 1999. No one was interested, so they canceled the project. It was a CPU, GPU, and memory controller on the same die.


That may very well be so, but I doubt the GPU architecture of the time would have allowed for the efficient number crunching that makes the current generations so attractive. I am by no means a true expert in the matter, but I believe a current-generation blend of CPU and GPU on a chip could greatly improve a system's ability to power through simple tasks. This would probably be especially useful in the supercomputer arena.
June 20, 2008 3:57:21 AM

Yes, it could back then too. The price was obviously just much higher.


This is the main reason AMD bought ATI. They are already working on a CPU with an onboard IGP. Another option they are working on is extending Hybrid CrossFire from motherboard IGP + video card to CPU IGP + video card.
June 20, 2008 4:15:39 AM

Quote:
They are already working on a CPU with an onboard IGP. Another option they are working on is extending Hybrid CrossFire from motherboard IGP + video card to CPU IGP + video card.


Now this will be very interesting. I can see immediate implications for their portable platforms with this approach. It will be interesting to see how they manage to implement a platform whose entire video subsystem is contained on the CPU.

It would also be interesting to see the possibilities of a system with three general-purpose CPU cores and a dedicated stream-processing node. If there were a unified instruction set for those stream processors that could quickly crunch masses of data, I could see great increases in general computing power; something like the sketch below.
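
As a hedged sketch of what that stream-style crunching looks like in CUDA today (the kernel name and sizes are illustrative): one trivial operation, SAXPY, replicated across every element of a big array.

// Stream-style number crunching: SAXPY (y = a*x + y) over a large array.
__global__ void saxpy(int n, float a, const float *x, float *y) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) y[i] = a * x[i] + y[i];   // same simple op on every element
}

// Launched with enough 256-thread blocks to cover the whole array:
//   saxpy<<<(n + 255) / 256, 256>>>(n, 2.0f, d_x, d_y);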

And finally...
Who doesn't need all the extra power they can get, Hybrid CrossFire or not, when trying to run Crysis on Very High? ;)
June 20, 2008 9:46:38 AM

Err, why do they have to have the GPU on the same die as the CPU? Surely that would increase temperatures ridiculously... Just have a number of cards with CPUs and GPUs, chuck a bunch of fans between them, and... voila! Supercomputer :D
June 21, 2008 12:55:58 AM

The other thing they are doing is changing the 4x4 interface (remember: two dual-core Athlon FX chips in place of one quad core) by licensing the socket design out.

Current options in development are socketed GPUs and socketed co-processors for math applications, 3D video, FPU work, and a few other things.

So you might have a dual- or tri-socket system: one socket houses a quad-core CPU, another a GPU, and a third a dedicated FPU for number crunching.
June 21, 2008 1:00:56 AM

shadowduck said:
The other thing they are doing is changing the 4x4 interface (remember: two dual-core Athlon FX chips in place of one quad core) by licensing the socket design out.

Current options in development are socketed GPUs and socketed co-processors for math applications, 3D video, FPU work, and a few other things.

So you might have a dual- or tri-socket system: one socket houses a quad-core CPU, another a GPU, and a third a dedicated FPU for number crunching.

Makes sense. The average Joe doesn't need much number crunching, or even a decent GPU, for that matter. All-in-one packages just don't work in reality.
June 21, 2008 1:03:27 AM

As another poster pointed out, the embedded space is where this tech is going to realize its full value.

However, for a value system, a dual-socket setup with one CPU and one GPU might make a lot of sense.
June 21, 2008 1:04:48 AM

shadowduck said:
As another poster pointed out, the embedded space is where this tech is going to realize its full value.

However, for a value system, a dual-socket setup with one CPU and one GPU might make a lot of sense.

Isn't that basically the same as it is now? :p 
June 21, 2008 1:06:10 AM

Almost, except instead of an IGP, the video card would be moved to a socket with on-die RAM. System RAM would not be touched.
June 21, 2008 4:52:09 PM

So what we'd have would be something like a five-socket Socket F motherboard with one basic CPU, up to four GPUs, and just two or four memory slots.

What you're saying is that the basic functions will be carried out by the CPU and all the advanced processing will be done by the GPUs.

But my basic premise is this: it would seem that the basic functions could be emulated by the GPUs.
Like I said, I'm pretty sure CUDA can run C/C++, and virtually everything is programmed in C/C++ these days.
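
As a rough sketch of what that emulation might look like (the kernel name is made up): plain serial C can run as-is inside a CUDA kernel; the question is how well, since a single-threaded launch only exercises one of the GPU's many simple cores.

// Illustrative: plain serial C *can* run on the GPU, but a one-thread
// launch uses just one core and leaves the rest of the chip idle.
__global__ void serial_sum(int *out, int n) {
    int acc = 0;
    for (int i = 0; i < n; ++i)   // plain serial loop, single thread
        acc += i;
    *out = acc;
}

// serial_sum<<<1, 1>>>(d_out, 1000);  // 1 block, 1 thread: the other
// cores sit idle while this runs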
June 21, 2008 4:52:53 PM

double post
June 22, 2008 12:45:05 AM

Yes, that's correct, groo.