Thoughts on GPGPU?

sgtmattbaker

Distinguished
Aug 21, 2009
What do you think about GPGPU? It sounds to me like it is going to be pretty big for anyone doing multimedia or scientific work. Do you think it will be good for pure number crunching, or does the workload have to have a multimedia component (images, etc.)? Aside from the special hardware on video cards (say, for H.264 playback), if you had a processor with a bunch of cores, wouldn't it basically be the same thing, assuming the code is written to actually utilize all the cores? Why would people start to develop for GPGPU if they aren't currently writing many programs that use multiple cores? What would make them change and start multi-threading their applications?
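
To make the question concrete, here is the kind of program I am talking about: a plain vector add, written the way the standard CUDA samples do it (just my own sketch, nothing official):

#include <stdio.h>
#include <stdlib.h>
#include <cuda_runtime.h>

/* One GPU thread per array element -- thousands of lightweight threads
   instead of a handful of heavyweight CPU threads. */
__global__ void vecAdd(const float *a, const float *b, float *c, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)
        c[i] = a[i] + b[i];
}

int main(void)
{
    const int n = 1 << 20;               /* one million floats */
    size_t bytes = n * sizeof(float);

    float *ha = (float *)malloc(bytes);
    float *hb = (float *)malloc(bytes);
    float *hc = (float *)malloc(bytes);
    for (int i = 0; i < n; ++i) { ha[i] = 1.0f; hb[i] = 2.0f; }

    float *da, *db, *dc;
    cudaMalloc((void **)&da, bytes);
    cudaMalloc((void **)&db, bytes);
    cudaMalloc((void **)&dc, bytes);
    cudaMemcpy(da, ha, bytes, cudaMemcpyHostToDevice);
    cudaMemcpy(db, hb, bytes, cudaMemcpyHostToDevice);

    /* Launch enough 256-thread blocks to cover all n elements; the same
       source scales to however many stream processors the card has.
       (Error checking omitted to keep the sketch short.) */
    int threads = 256;
    int blocks = (n + threads - 1) / threads;
    vecAdd<<<blocks, threads>>>(da, db, dc, n);

    cudaMemcpy(hc, dc, bytes, cudaMemcpyDeviceToHost);
    printf("hc[0] = %f\n", hc[0]);       /* expect 3.000000 */

    cudaFree(da); cudaFree(db); cudaFree(dc);
    free(ha); free(hb); free(hc);
    return 0;
}

On a multi-core CPU you would spawn a few threads and split the loop between them; here you launch one thread per element and let the hardware schedule them. That difference in model is really what I am asking about.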

http://gizmodo.com/5252545/giz-explains-gpgpu-computing-and-why-itll-melt-your-face-off
That article shows a bar graph of the capabilities of GPGPU. I did not see a source, but do you think it is accurate?

Does anyone have an idea of what the standard will be for GPGPU? From the limited amount of information I have seen, it seems like OpenCL is going to be big for almost everyone, and then Microsoft is going to refuse to get along and push DirectX 11 (DirectCompute) instead. I don't know what is going to happen with CUDA, since Nvidia was part of the OpenCL group.
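
For what it's worth, the kernel languages themselves already look nearly identical. Here is the same vector add from my sketch above written as an OpenCL kernel (kernel source only; I am leaving out the host-side setup, which is much longer):

/* The same vector add as an OpenCL kernel. The source is handed to the
   driver as a string via clCreateProgramWithSource() and compiled at run
   time, which is how one program can target any vendor's hardware. */
const char *vecAddSrc =
    "__kernel void vec_add(__global const float *a,  \n"
    "                      __global const float *b,  \n"
    "                      __global float *c,        \n"
    "                      int n)                    \n"
    "{                                               \n"
    "    int i = get_global_id(0);                   \n"
    "    if (i < n)                                  \n"
    "        c[i] = a[i] + b[i];                     \n"
    "}                                               \n";

If the compute code really is this close across APIs, I would guess the fight ends up being over host APIs, drivers, and tooling rather than the kernels themselves.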
 

LePhuronn

Distinguished
Apr 20, 2007
Right now I think GPGPU as a whole is a solution looking for a problem, but already there are a few little bits and pieces that are proving to be very, very useful for me, specifically CUDA-based video encoders and media playback.

In time, when the approaches to GPGPU are standardised, or a single API/development language can be compiled down to any GPGPU hardware, we'll see a lot more of it about, with "proper" uses rather than specialist or gimmicky (but nonetheless worthwhile) apps like Folding@Home and H.264 video encoders.

I think there's more potential in massively threaded GPU applications than in multi-core CPU applications, as it appears far more straightforward to increase the number of stream processors in a GPU than it is to add cores to a CPU.
 

sgtmattbaker

Distinguished
Aug 21, 2009
I see what you are talking about. Still, I think it is cool that they are laying out hardware for applications to catch up with; that is better than the other way around, in my opinion. Think of what this could do for science if supercomputers start using this stuff.
 
