
Nvidia's CUDA is Already 5 Years Old

Source: Nvidia | 23 comments

Nvidia's CUDA, Compute Unified Device Architecture, just turned five years old in a relatively quiet ceremony held at SC11 and on Nvidia's blog.

The platform, which enables developers to exploit Nvidia GPUs (as well as x86 CPUs) for general-purpose computing, was introduced on November 15, 2006 with the GeForce 8 series. Since then, Nvidia claims to have sold more than 350 million CUDA-enabled GPUs. The CUDA toolkit has been downloaded more than 1 million times, and more than 500 universities around the globe are teaching CUDA classes.

CUDA was, from the very beginning, designed to drive GPUs into high-performance computing applications in military, academic and industrial environments. While it was somewhat slow to start, Nvidia has been successful: three of the five fastest supercomputers in the world now integrate Tesla accelerator cards, the primary delivery vehicle for CUDA-based acceleration. CUDA applications are written in C++-like code with specific extensions, and CUDA was the first generally available high-level platform to offer easy access to the processing horsepower of widely available and relatively affordable GPUs.
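
To give a sense of what that "C++ with extensions" model looks like in practice, here is a minimal, illustrative vector-addition sketch (our own example, not taken from Nvidia's materials). The __global__ qualifier and the <<<blocks, threads>>> launch syntax are the CUDA-specific extensions; everything else is ordinary C/C++ using the CUDA runtime API.

    // Illustrative sketch only: adds two vectors on the GPU via the CUDA runtime API.
    #include <cstdio>
    #include <cuda_runtime.h>

    // __global__ marks a kernel that runs on the GPU; each thread handles one element.
    __global__ void vecAdd(const float* a, const float* b, float* c, int n)
    {
        int i = blockIdx.x * blockDim.x + threadIdx.x;
        if (i < n)
            c[i] = a[i] + b[i];
    }

    int main()
    {
        const int n = 1 << 20;
        const size_t bytes = n * sizeof(float);

        // Host (CPU) buffers.
        float *hA = new float[n], *hB = new float[n], *hC = new float[n];
        for (int i = 0; i < n; ++i) { hA[i] = 1.0f; hB[i] = 2.0f; }

        // Device (GPU) buffers, plus explicit copies across the bus.
        float *dA, *dB, *dC;
        cudaMalloc((void**)&dA, bytes);
        cudaMalloc((void**)&dB, bytes);
        cudaMalloc((void**)&dC, bytes);
        cudaMemcpy(dA, hA, bytes, cudaMemcpyHostToDevice);
        cudaMemcpy(dB, hB, bytes, cudaMemcpyHostToDevice);

        // The <<<blocks, threads>>> triple-angle-bracket syntax launches the kernel.
        vecAdd<<<(n + 255) / 256, 256>>>(dA, dB, dC, n);

        cudaMemcpy(hC, dC, bytes, cudaMemcpyDeviceToHost);
        printf("c[0] = %f\n", hC[0]);   // expected: 3.0

        cudaFree(dA); cudaFree(dB); cudaFree(dC);
        delete[] hA; delete[] hB; delete[] hC;
        return 0;
    }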

CUDA, which is still positioned against open high-level platforms, especially OpenCL, survived a looming battle with Intel's canceled Larrabee graphics card and floating-point accelerator, but has frequently been criticized for not being as easy to deploy as Nvidia claims. For example, while basic access to the GPU via CUDA is considered to be relatively easy, the remaining 5 to 10 percent of performance that is hidden in a GPU can only be accessed via detailed knowledge of the architecture of the GPU, especially its memory architecture.
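
To illustrate the sort of architecture-specific tuning that last 5 to 10 percent typically involves, here is a rough, hypothetical sketch (again our own example, not Nvidia's): a naive matrix-multiply kernel that re-reads every operand from slow off-chip global memory, and a tiled variant that stages sub-blocks in fast on-chip __shared__ memory, a change that only makes sense once you understand the GPU's memory hierarchy.

    // Illustrative sketch: the same matrix multiply written naively and with
    // shared-memory tiling. Both compute C = A * B for n x n row-major matrices.
    #define TILE 16

    // Naive kernel: every thread streams its row of A and column of B
    // straight from global memory, so the same data is fetched many times.
    __global__ void matMulNaive(const float* A, const float* B, float* C, int n)
    {
        int row = blockIdx.y * blockDim.y + threadIdx.y;
        int col = blockIdx.x * blockDim.x + threadIdx.x;
        if (row >= n || col >= n) return;

        float sum = 0.0f;
        for (int k = 0; k < n; ++k)
            sum += A[row * n + k] * B[k * n + col];
        C[row * n + col] = sum;
    }

    // Tiled kernel: each block cooperatively loads TILE x TILE sub-blocks into
    // on-chip __shared__ memory and reuses them, cutting global-memory traffic.
    __global__ void matMulTiled(const float* A, const float* B, float* C, int n)
    {
        __shared__ float As[TILE][TILE];
        __shared__ float Bs[TILE][TILE];

        int row = blockIdx.y * TILE + threadIdx.y;
        int col = blockIdx.x * TILE + threadIdx.x;
        float sum = 0.0f;

        for (int t = 0; t < (n + TILE - 1) / TILE; ++t) {
            int aCol = t * TILE + threadIdx.x;
            int bRow = t * TILE + threadIdx.y;
            As[threadIdx.y][threadIdx.x] = (row < n && aCol < n) ? A[row * n + aCol] : 0.0f;
            Bs[threadIdx.y][threadIdx.x] = (bRow < n && col < n) ? B[bRow * n + col] : 0.0f;
            __syncthreads();                       // wait until the whole tile is loaded

            for (int k = 0; k < TILE; ++k)
                sum += As[threadIdx.y][k] * Bs[k][threadIdx.x];
            __syncthreads();                       // don't overwrite tiles still in use
        }

        if (row < n && col < n)
            C[row * n + col] = sum;
    }

Launched with a TILE x TILE thread block, the tiled variant is typically noticeably faster than the naive one; that gap is the kind of performance the article says stays hidden without knowledge of the memory architecture.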

In June of this year, Nvidia rolled out, with a delay of more than two years, an x86 CUDA compiler that runs CUDA code on multi-core Intel and AMD processors.

Comments
  • 12
    AbdullahG, November 19, 2011 7:37 PM
    plznote: Hasn't improved much, has it?

    From Nvidia:
    Quote:
    Heart attacks are the leading cause of death worldwide. Harvard Engineering, Harvard Medical School and Brigham & Women's Hospital have teamed up to use GPUs to simulate blood flow and identify hidden arterial plaque without invasive imaging techniques or exploratory surgery.


    Quote:
    The National Airspace System manages the nationwide coordination of air traffic flow. Computer models help identify new ways to alleviate congestion and keep airplane traffic moving efficiently. Using the computational power of GPUs, a team at NASA obtained a large performance gain, reducing analysis time from ten minutes to three seconds.


    Adobe, Microsoft, and others use CUDA in several of their apps. CUDA also comes in handy in game and graphics development. The benefits also go beyond that. Just because "games" and such don't take advantage of CUDA doesn't mean it hasn't helped...

  • 4
    plznote, November 19, 2011 7:40 PM
    You missed my point.
    I didn't say that it was useless.
    I said that it hasn't improved much (hasn't gotten better).
  • 2
    AbdullahG, November 19, 2011 7:59 PM
    My bad. It was just phrased in a way saying CUDA has not improved much; I didn't know you meant CUDA itself hasn't improved...
  • 2
    clonazepam, November 19, 2011 9:29 PM
    I wish CUDA could help me quit smoking. Other than that, I think it's a great platform.
  • 1
    badmilk69, November 20, 2011 12:27 AM
    Just two words: Octane Render.
  • 2
    TheGuardian, November 20, 2011 12:31 AM
    Cuda also makes it possible for smaller companies or individuals to pump out higher quality animations. Using Cycles and only one of my 570s, I can render a frame in about 1/10th the time. A scene that would have taken 10hrs to render, now only takes an hour.
  • 0
    Onikage, November 20, 2011 1:46 AM
    Yeah, it all happened with the 8000 series. I've had an 8800 GTS 512 from MSI since 2008 and it still works.
  • -4
    alidan, November 20, 2011 2:07 AM
    I'd still rather have an open alternative to CUDA than CUDA... Nvidia has wronged me in the past and I will never forgive them...
  • 2
    lordstormdragon, November 20, 2011 5:59 AM
    alidan: I'd still rather have an open alternative to CUDA than CUDA... Nvidia has wronged me in the past and I will never forgive them...


    They haven't personally wronged you, Alidan. Quit your crying.

    There already are several open solutions. OpenCL for one is coming along nicely. But this requires that the hardware manufacturers meet THEIR specs. With CUDA, the software meets Nvidia's specs.

    There are simply things OpenCL cannot do that CUDA can. And OpenCL is excellent of course, but AMD cards and Nvidia cards are simply built differently. It's apples and oranges when it comes to GPU compute performance. In gaming it would be apples and apples, but raw computations are a different monster...

    If you've ever used iRay (mental images) or Vray-RT for GPU rendering, you'd know the difference. Since you haven't, it'll give you a point to research.

  • 1
    hetneo, November 20, 2011 7:35 AM
    My Cuda is 45 years old, Nvidia is lagging behind XD
  • 1
    mark0718, November 20, 2011 9:43 AM
    "the remaining 5 to 10 percent of performance that is hidden in a GPU can only be accessed via detailed knowledge of the architecture of the GPU, especially its memory architecture."

    Since when is 10% a big deal?
  • 5
    mike fe, November 20, 2011 10:45 AM
    "in a relatively quite ceremony"

    Shouldn't that be quiet instead of quite?
  • 2
    danwat1234, November 20, 2011 11:21 AM
    Running CUDA on SETI@home and Folding@home. It has proven to be great for scientific and engineering work.
  • 2
    upgrade_1977, November 20, 2011 5:01 PM
    10% vs. a 200 to 2000% increase in performance over CPUs isn't a bad trade-off.
  • 3
    CaedenV, November 20, 2011 5:07 PM
    Quite a difference in Adobe's Mercury engine between CUDA being on and off!
  • -1
    youssef 2010, November 20, 2011 5:23 PM
    but disabling PhysX when a non-Nvidia GPU is present is a cheap move. Glad we have someone like GenL to fix it
  • 1
    _Cubase_, November 20, 2011 9:22 PM
    CaedenV: Quite a difference in Adobe's Mercury engine between CUDA being on and off!


    Amen to that! CUDA has been a dream for us video editors using Premiere! Not needing to render much (if at all) has enabled my CUDA cards to pay for themselves over and over.
  • 0
    psp09, November 21, 2011 3:27 AM
    Happy birthday to you CUDA !!!
  • 1
    bystander, November 21, 2011 3:36 AM
    AbdullahG: From Nvidia: [...] Adobe, Microsoft, and others use CUDA in several of their apps. CUDA also comes in handy in game and graphics development. The benefits also go beyond that. Just because "games" and such don't take advantage of CUDA doesn't mean it hasn't helped...


    I'm pretty certain that PhysX uses CUDA, so in reality, games do take advantage of it.