
Nvidia GPUs Speed Up Win 7 & Snow Leopard

Source: Tom's Hardware US | 30 comments

Nvidia Tesla product manager Sumit Gupta said the company's GPUs speed up Windows 7 and Apple's "Snow Leopard" because they're cool like that.

Last week we reported that Nvidia released its OpenCL driver and software development kit (SDK) to those enrolled in the OpenCL Early Access Program. The company released the driver early to gather feedback before distributing the beta. Nvidia said the new driver runs on its CUDA architecture, allowing it to take advantage of the GPU's parallel computing capabilities. However, Sumit Gupta, product manager for Nvidia's Tesla products, went into more detail in an interview Friday, explaining how Nvidia GPUs will accelerate software in the Windows 7 and OS X Snow Leopard operating systems.

"The really interesting thing about OpenCL and DirectX is that OpenCL is going to form part of the Apple operating system (Snow Leopard) and DirectX (version 11) will form part of Windows 7," Gupta told CNET. "And what that essentially means to consumers is, if your laptop has an Nvidia GPU or ATI (AMD) GPU, it will run the operating system faster because the operating system will essentially see two processors in the system. For the first time, the operating system is going to see the GPU both as a graphics chip and as a compute engine," he said. Additionally, consumers using Windows 7 will see the GPU as a CPU in Task Manager.

But aren't GPUs meant for rendering graphics? Primarily, yes; however, they have taken on a new responsibility in recent years as the technology has improved, letting them help with general computing tasks usually handled by the system's main processor. Think of the GPU as a "helper" now, offering its many parallel processing units to compute a portion of a task carried out by the CPU, so the two work "in concert" rather than as separate entities. This parallel processing should speed up both operating systems, but the benefit isn't a holy grail provided by Nvidia alone: ATI GPUs offer a general-purpose computing environment as well.

According to AMD, ATI Stream is a set of hardware and software technologies that let AMD graphics processors work in parallel with the system's CPU to accelerate many applications beyond graphics; Nvidia's CUDA works in much the same way. In addition, Nvidia's CUDA supports many computational interfaces, including PhysX, Java, Python, OpenCL, and DirectX Compute; ATI's Stream supports DirectX, Havok's physics middleware, and OpenCL.

So the question is this: if GPUs are taking on general processing (in addition to graphics processing), are CPUs on their way out? No. "If you're running an unpredictable task, the CPU is the jack of all trades," Gupta said. "It is really good at these unpredictable tasks. The GPU is a master of one task. And that is a highly parallel task."

Gupta also described a scenario of how the CPU and GPU would work together. When a consumer launches Google Picasa, the program runs entirely on the CPU. However, once the consumer loads an image and applies a filter, the filtering work would be handed to the GPU. "The CPU is one aspect but not necessarily the most important aspect anymore," he said.
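
As a concrete but entirely hypothetical sketch of that split, the C program below keeps the application logic on the CPU and hands a trivial per-pixel "brighten" filter to the GPU through OpenCL. The kernel, image size, and buffer handling are invented for illustration and do not reflect how Picasa or any shipping application is actually written.

/* Hypothetical sketch of the CPU/GPU split described above: the application
 * runs on the CPU, but the per-pixel filter is offloaded to the GPU via
 * OpenCL. Names, sizes, and the brighten() kernel are illustrative; error
 * checking is omitted for brevity. */
#include <stdio.h>
#include <CL/cl.h>

#define N (1024 * 1024)   /* stand-in for a one-megapixel greyscale image */

/* OpenCL C kernel: one work-item per pixel, a textbook data-parallel job. */
static const char *src =
    "__kernel void brighten(__global uchar *pix, uchar amount) {\n"
    "    size_t i = get_global_id(0);\n"
    "    pix[i] = (uchar)min((uint)pix[i] + (uint)amount, (uint)255);\n"
    "}\n";

int main(void)
{
    static unsigned char image[N];   /* loaded and decoded on the CPU side */
    cl_platform_id plat;
    cl_device_id dev;
    clGetPlatformIDs(1, &plat, NULL);
    clGetDeviceIDs(plat, CL_DEVICE_TYPE_GPU, 1, &dev, NULL);

    cl_context ctx = clCreateContext(NULL, 1, &dev, NULL, NULL, NULL);
    cl_command_queue q = clCreateCommandQueue(ctx, dev, 0, NULL);

    /* Copy the image into memory the GPU can reach. */
    cl_mem buf = clCreateBuffer(ctx, CL_MEM_READ_WRITE | CL_MEM_COPY_HOST_PTR,
                                N, image, NULL);

    /* Build the kernel and launch one work-item per pixel on the GPU. */
    cl_program prog = clCreateProgramWithSource(ctx, 1, &src, NULL, NULL);
    clBuildProgram(prog, 1, &dev, NULL, NULL, NULL);
    cl_kernel k = clCreateKernel(prog, "brighten", NULL);

    cl_uchar amount = 32;
    size_t global = N;
    clSetKernelArg(k, 0, sizeof(buf), &buf);
    clSetKernelArg(k, 1, sizeof(amount), &amount);
    clEnqueueNDRangeKernel(q, k, 1, NULL, &global, NULL, 0, NULL, NULL);

    /* The CPU is free for other work while the GPU filters; this blocking
     * read waits for the result and copies it back. */
    clEnqueueReadBuffer(q, buf, CL_TRUE, 0, N, image, 0, NULL, NULL);
    printf("first pixel after filtering: %u\n", (unsigned)image[0]);

    clReleaseMemObject(buf);
    clReleaseKernel(k);
    clReleaseProgram(prog);
    clReleaseCommandQueue(q);
    clReleaseContext(ctx);
    return 0;
}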

The bad news for AMD and Nvidia is that Intel has taken notice of the general-purpose computing trend and plans to release a graphics chip that will handle parallel computing as well. "Since the graphics pipeline is becoming more and more programmable, the graphics workload is making its way to be more and more suited to general purpose computing--something the Intel Architecture excels at and Larrabee will feature," an Intel spokesperson told CNET.

There's quite a lot to look forward to with the release of Windows 7 and Apple's OS X Snow Leopard, especially if GPU-based parallel computing does indeed speed up applications.

Top Comments
  • 12 Hide
    apache_lives , April 27, 2009 11:21 PM
    This news just in: Ram and CPU power also speed up PC's and Macs too! who knew!
Other Comments
  • 0 Hide
    blarneypete , April 27, 2009 10:51 PM
    So will the GPU be relegated to things like Photoshop filters, or will it be able to help crunch other things - like encoding an H.264/AVC video?
  • 8 Hide
    crisisavatar , April 27, 2009 10:59 PM
    Snow Leopard is a pretty shitty name I can't believe apple is running with it.
  • 6 Hide
    A Stoner , April 27, 2009 11:08 PM
    I have been wondering why I never have access to see exactly how much of my GPU is being taxed. I hope this means I will be able to do so in the future.
  • -2 Hide
    Anonymous , April 27, 2009 11:28 PM
    I wonder what the effect would be if we ran both Nvidia and ATI high end graphics cards in a system?

    Which card would we connect the monitor to?

    Would the system use both GPUs for processing?

    It'd be kind of interesting to see what the results would be. Too bad we couldn't connect the cards together as you can with Crossfire or SLI.
  • 7 Hide
    boomhowar , April 27, 2009 11:48 PM
    A Stoner: I have been wondering why I never have access to see exactly how much of my GPU is being taxed. I hope this means I will be able to do so in the future.


    gpu-z will show your gpu load under "sensors" tab. it does for my 4850.
  • -5 Hide
    jsloan , April 27, 2009 11:54 PM
    wow!
  • 1 Hide
    jsloan , April 28, 2009 12:00 AM
    Renegade_Warrior: I wonder what the effect would be if we ran both Nvidia and ATI high end graphics cards in a system? Which card would we connect the monitor to? Would the system use both GPUs for processing? It'd be kind of interesting to see what the results would be. Too bad we couldn't connect the cards together as you can with Crossfire or SLI.


    not really, the gpu doesn't run the x86 instruction set, but for certain tasks the software developer can write code that offloads those computations to the gpu if one is present. i recently wrote some code that does just that; microsoft and nvidia and ati make it very easy to do. if there is no gpu then your cpu is stuck with the task, but if there is, the gpu gets it and the cpu is free to do other things. this will allow some amazing windows eye candy tweaks, much like the effects we see in games, to become available to desktop program rendering. i did a simple texture filter, and wow it worked; microsoft visual studio made it so easy to do, just a simple function. think of water, cloud, smoke, etc. effects in windows. wow
  • 1 Hide
    Anonymous , April 28, 2009 12:50 AM
    Snow Leopard is a pretty shitty name I can't believe apple is running with it.

    Hahah well, there is a real leopard that is called a Snow Leopard, it's also known as an Ounce. The Toronto zoo had one when I was there last.

    Now for my self I find the name Twitter pretty bad hahahah
  • -7 Hide
    computabug , April 28, 2009 1:30 AM
    Cool, so does that mean theoretically I only need either a gpu or cpu to boot up and use the most basic functions of my computer? :o  Well there are the physical limitations like no vga/dvi port...
  • 7 Hide
    mbbs20 , April 28, 2009 2:48 AM
    ok so if ATI can also take use of it then y is the title talks only about Nvidia
  • 2 Hide
    Anonymous , April 28, 2009 3:11 AM
    is this going to be limited to cards specifically made for DirectX 11? Is my GTX 260 SLI setup going to be unable to take advantage of parallel processing?
  • 0 Hide
    sneakypete , April 28, 2009 4:10 AM
    crisisavatar: Snow Leopard is a pretty shitty name I can't believe apple is running with it.

    Yeah, and Windows 7 is so brilliant. Who would have thunk?
  • 5 Hide
    the_one111 , April 28, 2009 4:31 AM
    sneakypete: Yeah, and Windows 7 is so brilliant. Who would have thunk?

    At least Windows 7 doesn't sound like it wants to kill you...
  • 2 Hide
    DXRick , April 28, 2009 5:21 AM
    blarneypete: So will the GPU be relegated to things like Photoshop filters, or will it be able to help crunch other things - like encoding an H.264/AVC video?


    Photoshop CS4 already uses CUDA (for some Nvidia cards) for some filters.
  • 0 Hide
    IronRyan21 , April 28, 2009 7:28 AM
    Quote:
    Snow Leopard is a pretty shitty name I can't believe apple is running with it.


    ROFL True!
  • -1 Hide
    amnotanoobie , April 28, 2009 10:59 AM
    @ok so if ATI can also take use of it then y is the title talks only about Nvidia
  • 0 Hide
    amnotanoobie , April 28, 2009 11:03 AM
    mbbs20: ok so if ATI can also take use of it then y is the title talks only about Nvidia


    It's because ATi hasn't released something for OpenCL yet. Anyone who has seen how to even initialize CAL would see that it isn't really ready for primetime; it needs a bit more polishing (and ease of use!). From what I read elsewhere, OpenCL is almost based on CUDA (or the code looks almost the same).
  • 0 Hide
    gamerk316 , April 28, 2009 11:58 AM
    GPUs have a major advantage when it comes to anything having to do with math, thanks in part to a wider data bus (ie: a 256bit data bus for a GPU with 256bit registers can load 8 32-bit integers in a single operation, while a 64bit CPU can load only 2 32-bit integers in a single operation). Hence why PhysX runs so much faster on a GPU than on a CPU.

    I wonder though, if dual cards will suddenly lead to a noticeable performance increase in general computing as a result...
  • 0 Hide
    Fadamor , April 28, 2009 1:22 PM
    mbbs20: ok so if ATI can also take use of it then y is the title talks only about Nvidia

    Heh I was wondering when an AMD aficionado would complain. Surprisingly, I had to scroll pretty far down the list to find the first one. Even the Nvidia rep quoted in the article says both Nvidia AND AMD products are going to be able to do this.

    Who cares what the title said? You read the article so you got all the information, right?