NVIDIA Enables Data Processing on the GPU

exit2dos

Distinguished
Jul 16, 2006
2,646
0
20,810
GPU computing with CUDA is a new approach to computing where hundreds of on-chip processor cores simultaneously communicate and cooperate to solve complex computing problems up to 100 times faster than traditional approaches. This breakthrough architecture is complemented by another first: the NVIDIA C-compiler for the GPU. This complete development environment gives developers the tools they need to solve new problems in computation-intensive applications such as product design, data analysis, technical computing, and game physics.
http://www.nvidia.com/object/IO_37226.html

Also,
it's actually built from the ground up as a highly multithreaded, general-purpose stream processor, with the GPU functionality layered over it in software. This is the reverse of existing general-purpose GPU (GPGPU) approaches. So with the G80, a programmer can write a stream program in a regular high-level language (HLL) that compiles directly to the stream processor, without the additional overhead that goes along with translating HLL programs into a graphics-specific language like OpenGL's GLSL.
http://arstechnica.com/news.ars/post/20061108-8182.html


For info on "Stream Processing", check here:
http://arstechnica.com/news.ars/post/20060918-7763.html
Stream processing is quite similar to SIMD processing, but where SIMD processors use single instructions to operate on vectors, stream processors use kernels to operate on streams.

An input stream is an array of data elements that can be operated on in parallel. Input streams are fed into a stream processor one stream at a time, where they're operated on by collections of instructions called kernels. A kernel is a sequence of instructions that is applied to each element in a stream. Thus a kernel function acts like a small loop that iterates once for each stream element.
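To make the kernel-as-loop idea concrete, here's a plain-C sketch (the function names are made up for illustration, not from either article): a kernel is applied to every element of an input stream, which on a stream processor would happen across many elements at once rather than sequentially.

```c
#include <stddef.h>

/* The "kernel": the sequence of instructions applied to one stream element. */
static float brighten(float x) {
    return x * 1.5f;
}

/* Applying the kernel to an input stream. On a CPU this is an ordinary loop;
 * a stream processor would run the same kernel on many elements in parallel. */
void run_kernel(const float *in, float *out, size_t n) {
    for (size_t i = 0; i < n; i++)
        out[i] = brighten(in[i]);
}
```

The point is that the programmer writes only the per-element kernel; the hardware decides how many elements to process simultaneously.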
 
G

Guest

Guest
The major breakthrough IMO is that you can actually use C code. That is uber sweet, and people WILL use it, because pretty much every programmer did at least a little C at some point.

It's pretty cool, and I think it has the potential to put the Cell processor to shame in some key areas, especially with its (probable) ease of use...
 

exit2dos

Distinguished
Jul 16, 2006
2,646
0
20,810
Would be nice if the OS could be designed to use the GPU processing power generically. Could really cost-justify a SLI rig. But, I guess the CPU and GPU will merge by the time that happens.
 
G

Guest

Guest
Well, GPUs aren't that multi-purpose/general yet; heavily branched code will still run faster on a CPU, because CPUs are designed to be general-purpose and to handle anything you throw at them.

The GPU could really speed up encoding, simulation, and other heavily parallel, lightly branched stuff.

Still, it would be great if, as you say, those applications were tightly integrated and the load were just balanced between the CPU and GPU directly through the drivers/OS thread scheduler, whatever. The sequential portions of the code would run on the CPU, and the heavy-duty parallel code would be thrown directly at the GPU.

Like you said SLI/TRI-sli/quad-sli could come in handy for some!
 

papi4baby

Distinguished
Nov 1, 2006
215
0
18,680
I'm surprised that Sony didn't make Nvidia hold off until 6 months to a year after the PS3 was released, like MS did with ATI.

What the heck are you talking about? The G80 and the RSX in the PS3 have nothing in common except that they were both designed by Nvidia. The PS3's GPU is based more on the G70.
 
G

Guest

Guest
It comes from the following question (quoted more or less accurately):

With so much power in the RSX and Cell processor, when do you think the PC will catch up with the PS3?

The answer came from the G80, one week before the launch of the PS3!

I bet the PS3 will make a ton of money for Nvidia, so Sony could always pressure them to wait a bit so the PS3 is more attractive at launch (with the huge price tag). I don't think MS did that with ATI, though.

I always find it funny how console fans praise the huge specs of the hardware, and how it doesn't stay on top for more than, umm, 3-6 months =).
 

shinigamiX

Distinguished
Jan 8, 2006
1,107
0
19,280
This time it stayed ahead for -1 week:p
But hey, the PS3 premium system costs about the same as an 8800GTX by itself, so it's up to user preference I guess...
 

Doughbuy

Distinguished
Jul 25, 2006
2,079
0
19,780
Yeah, I hate console people who think their games look just as good as the computer versions... I'm looking forward to handing some Xbox 360 players their asses in some FPS games...

The only code that would be worth running on a GPU is anything that takes advantage of the floating-point throughput a GPU excels at, i.e. scientific uses (folding...). Not much practical use as of now, but it's a step in the right direction.
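The classic example of the kind of floating-point, data-parallel work GPUs excel at is SAXPY (y = a*x + y); here's a plain-C version just to illustrate the shape of such code (not from any post above):

```c
#include <stddef.h>

/* SAXPY: y[i] = a * x[i] + y[i]. Every element is independent of every
 * other, so a GPU can chew through thousands of them in parallel.
 * Branch-heavy code can't be split up this way, which is why it stays
 * faster on the CPU. */
void saxpy(size_t n, float a, const float *x, float *y) {
    for (size_t i = 0; i < n; i++)
        y[i] = a * x[i] + y[i];
}
```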
 

Frozen_Fallout

Distinguished
Jan 8, 2003
433
0
18,780
This time it stayed ahead for -1 week:p
But hey, the PS3 premium system costs about the same as an 8800GTX by itself, so it's up to user preference I guess...

One thing, though, is that consoles don't depend on great hardware as much as on programming for the console itself. Look at the Xbox when it came out: it had comparatively crappy hardware, yet it ran Doom 3. Now most games look better on PC, but consoles are good for just gaming and nothing else. I personally don't like consoles as much, but I also don't game as much as I do other stuff on my computer.
 

shinigamiX

Distinguished
Jan 8, 2006
1,107
0
19,280
Yea, you're perfectly right - the PS2 is also a great example: made in 2000, with a 294MHz CPU and 32MB of RAM, among other antiquated specifications, it can still produce aesthetically pleasing graphics. I bet a PC from the same period wouldn't even come close.
I still like PCs better though.
 
G

Guest

Guest
Is the next user title at 2,750 posts?

Let's see... and oh, this post belongs in the GPGPU section! DUH!

About console hardware not being that important: they extract every single ounce of performance, because they don't have to worry about interoperability, so they can work on efficiency. And all the arguments for consoles (better value, good content) are true; just don't come and say that PC gaming is dead and that consoles are the uber new hardware and look better. For Christ's sake, you're playing on a television: it CAN'T look better, though it can look bigger.
 

Frozen_Fallout

Distinguished
Jan 8, 2003
433
0
18,780
Yea, you're perfectly right - the PS2 is also a great example: made in 2000, with a 294MHz CPU and 32MB of RAM, among other antiquated specifications, it can still produce aesthetically pleasing graphics. I bet a PC from the same period wouldn't even come close.
I still like PCs better though.

Yeah, I remember when FF X came out, the graphics were sooo good. But computers will always be able to bump up the graphics if you have the hardware.
 
G

Guest

Guest
malloc? Anyone?

C might be for Crap. Who other than programmers ever did C? (If it doesn't stand for crap.)
 
:oops: I'm sorry; being from an area where "doing a little C" has different connotations, none of which have anything to do with programming, I shall take my infantile attempts at humour with me and go lurk elsewhere.
 
G

Guest

Guest
Well, I had a multithreading class; the code had to be pure C, and we compiled it on a Unix server with 16 procs...

Anyhow, C has a really similar syntax; you just can't use "new", you have to allocate the memory yourself.
Same stuff, just low-level, no OO.
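A minimal sketch of what "allocating the memory yourself" looks like in plain C (the function name here is made up for illustration):

```c
#include <stdlib.h>

/* Allocate and fill an array of n doubles with the values i * 0.5.
 * There is no 'new' in C: you request raw bytes with malloc, check the
 * result yourself, and hand the pointer back for the caller to free. */
double *make_halves(size_t n) {
    double *buf = malloc(n * sizeof *buf);  /* C's answer to 'new double[n]' */
    if (buf == NULL)
        return NULL;                        /* malloc reports failure via NULL, not an exception */
    for (size_t i = 0; i < n; i++)
        buf[i] = i * 0.5;
    return buf;                             /* caller must free() this; no destructors */
}
```

No constructors, no OO, no garbage collector: every malloc needs a matching free, which is the part C++'s "new" hides from you.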