Nvidia GPUs Speed Up Win 7 & Snow Leopard
Nvidia Tesla product manager Sumit Gupta said the company's GPUs will speed up Windows 7 and Apple's Snow Leopard, because they're cool like that.
Last week we reported that Nvidia released its OpenCL driver and software development kit (SDK) to those enrolled in the OpenCL Early Access Program. The company released the driver early to gather feedback before distributing the beta. Nvidia said that the new driver runs on the CUDA architecture, enabling it to take advantage of the GPU's parallel computing capabilities. However, Sumit Gupta, product manager for Nvidia's Tesla products, went into more detail in an interview Friday, explaining how Nvidia GPUs will accelerate software in Windows 7 and Apple's OS X Snow Leopard.
"The really interesting thing about OpenCL and DirectX is that OpenCL is going to form part of the Apple operating system (Snow Leopard) and DirectX (version 11) will form part of Windows 7," Gupta told CNET. "And what that essentially means to consumers is, if your laptop has an Nvidia GPU or ATI (AMD) GPU, it will run the operating system faster because the operating system will essentially see two processors in the system. For the first time, the operating system is going to see the GPU both as a graphics chip and as a compute engine," he said. Additionally, consumers using Windows 7 will see the GPU as a CPU in Task Manager.
But aren't GPUs meant for rendering graphics? Primarily, yes, but in recent years the technology has improved enough for them to take on a new responsibility: helping tackle general computing work usually handled by the system's main processor. Think of the GPU as a "helper" now, lending its highly parallel hardware to compute a portion of a task so that it and the CPU work in concert rather than as separate entities. This parallel processing should speed up both operating systems, but the benefit isn't a holy grail provided by Nvidia alone: ATI GPUs offer a general-purpose computing environment as well.
According to AMD, its ATI Stream technology is a set of hardware and software technologies that let AMD graphics processors work in parallel with the system's CPU to accelerate many applications beyond graphics; Nvidia's CUDA works in the same manner. In addition, CUDA is compatible with many computational interfaces, including PhysX, Java, Python, OpenCL, and DirectX Compute, while ATI's Stream supports DirectX, Havok's physics middleware, and OpenCL.
So the question is this: if GPUs are taking on the role of general processing (in addition to graphics processing), are CPUs on their way out? No. "If you're running an unpredictable task, the CPU is the jack of all trades," Gupta said. "It is really good at these unpredictable tasks. The GPU is a master of one task. And that is a highly parallel task."
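To illustrate the "master of one task" point (our sketch, not Gupta's example), here is the same element-wise addition written two ways: as a serial CPU loop, and as an OpenCL C kernel that the runtime launches once per array element across thousands of GPU threads:

/* CPU version: one thread walks the whole array serially. */
void add_cpu(const float *a, const float *b, float *c, int n)
{
    for (int i = 0; i < n; i++)
        c[i] = a[i] + b[i];
}

/* GPU version (OpenCL C): each work-item handles one element. */
const char *add_kernel_src =
    "__kernel void add_gpu(__global const float *a,  \n"
    "                      __global const float *b,  \n"
    "                      __global float *c)        \n"
    "{                                               \n"
    "    int i = get_global_id(0);                   \n"
    "    c[i] = a[i] + b[i];                         \n"
    "}                                               \n";

The GPU version wins only because every element is independent; branchy, unpredictable work stays on the CPU, just as Gupta says.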
He went on to describe a scenario in which the CPU and GPU work together: when a consumer launches Google Picasa, the program runs entirely on the CPU, but once the consumer loads an image and applies a filter, the filtering should run on the GPU. "The CPU is one aspect but not necessarily the most important aspect anymore," he said.
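Picasa's actual filter code isn't public, but as a hypothetical example, a per-pixel brightness filter written as an OpenCL C kernel might look like this. Every pixel is independent, so a 10-megapixel photo becomes 10 million parallel GPU tasks:

/* Hypothetical per-pixel brightness filter in OpenCL C. */
const char *brighten_src =
    "__kernel void brighten(__global const uchar4 *in,  \n"
    "                       __global uchar4 *out,       \n"
    "                       float gain)                 \n"
    "{                                                  \n"
    "    int i = get_global_id(0);                      \n"
    "    /* Scale the RGBA pixel, clamping to 0-255. */ \n"
    "    float4 px = convert_float4(in[i]) * gain;      \n"
    "    out[i] = convert_uchar4_sat(px);               \n"
    "}                                                  \n";

While the GPU chews through the pixels, the CPU stays free to run the rest of the application, which is the division of labor Gupta is describing.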
The bad news for AMD and Nvidia is that Intel is taking notice of the general-purpose computing trend and plans to release a graphics chip that handles parallel computing as well. "Since the graphics pipeline is becoming more and more programmable, the graphics workload is making its way to be more and more suited to general purpose computing--something the Intel Architecture excels at and Larrabee will feature," an Intel spokesperson told CNET.
There's quite a lot to look forward to with the release of Windows 7 and Apple's OS X Snow Leopard, especially if parallel computing does indeed speed up applications.
blarneypete So will the GPU be relegated to things like Photoshop filters, or will it be able to help crunch other things, like encoding an H.264/AVC video?
A Stoner I have been wondering why I never have access to see exactly how much of my GPU is being taxed. I hope this means I will be able to do so in the future.
Renegade_Warrior I wonder what the effect would be if we ran both Nvidia and ATI high-end graphics cards in a system?
Which card would we connect the monitor to?
Would the system use both GPUs for processing?
It'd be kind of interesting to see what the results would be. Too bad we couldn't connect the cards together as you can with CrossFire or SLI.
boomhowar Quoting A Stoner: "I have been wondering why I never have access to see exactly how much of my GPU is being taxed. I hope this means I will be able to do so in the future."
GPU-Z will show your GPU load under the "Sensors" tab. It does for my 4850.
jsloan Quoting Renegade_Warrior: "I wonder what the effect would be if we ran both Nvidia and ATI high-end graphics cards in a system? ... Too bad we couldn't connect the cards together as you can with CrossFire or SLI."
Not really; the GPU doesn't run the x86 instruction set, but for certain tasks the developer can write code so that, if a GPU is around, those computations get offloaded to it. I recently wrote some code that does just that; Microsoft, Nvidia, and ATI make it very easy to do. If there is no GPU, your CPU is stuck with the task, but if there is, the GPU gets it and the CPU is free to do other things. This will allow some amazing eye-candy tweaks, much like we see in games, to become available to desktop program rendering. I did a simple texture filter, and wow, it worked; Microsoft Visual Studio made it so easy, just a simple function call. Think of water, cloud, smoke, etc. effects in Windows. Wow.
Snow Leopard is a pretty shitty name. I can't believe Apple is running with it.
Hahah, well, there is a real leopard called a Snow Leopard; it's also known as an Ounce. The Toronto Zoo had one when I was there last.
Now for myself, I find the name Twitter pretty bad hahahah
computabug Cool, so does that mean theoretically I only need either a GPU or a CPU to boot up and use the most basic functions of my computer? :o Well, there are the physical limitations, like no VGA/DVI port...