
Nvidia to Hit x86 CPUs With CUDA Capability

We've heard rumors that Nvidia has been dipping its toe into the x86 CPU market, and today the graphics company made an announcement related to that – but it's not what you think.

Nvidia CEO Jen-Hsun Huang revealed that the company will bring its CUDA programming language to "any computer, or any server in the world," with the help of the Portland Group (PGI).

Specifically, this means that systems without Nvidia GPUs will be able to process CUDA code, giving the company its answer to Microsoft's DirectCompute and the more open OpenCL.

Nvidia says that CUDA without a GPU will run best on multi-core CPUs, and that it will be ideal for servers.

(Source: Electronista.)

Marcus Yam
Marcus Yam served as Tom's Hardware News Director during 2008-2014. He entered tech media in the late 90s and fondly remembers the days when an overclocked Celeron 300A and Voodoo2 SLI comprised a gaming rig with the ultimate street cred.
  • bmadd
What's the point of CUDA without a GPU? Multi-core or not.
    Reply
  • joytech22
    "What's the point of CUDA without a GPU? Multi-core or not."

    Well, now you can run CUDA code WITHOUT the need for an Nvidia graphics card, which will also allow them to compete with OpenCL and DirectCompute.

    And maybe it's cheaper to use what you have already (for example, a supercomputer with, say, 100 CPUs) – it would be cheaper to simply use those 100 CPUs instead of spending more cash on GPUs.

    It's all in the text.
    Reply
  • liveonc
    They've got the best hardware; now they need the whole software community to be able to lend a helping hand. It's okay – two steps ahead, they plan to have a GPU that can run without a CPU. A bigger CUDA crowd will be beneficial.
    Reply
  • turboflame
    joytech22: "...allow them to compete with OpenCL and DirectCompute."
    Except that OpenCL and DirectCompute are compatible with all GPUs. CUDA is useless without GPU acceleration.
    Reply
  • bmadd
    "Well now you can run CUDA code WITHOUT the need for a Nvidia graphics card and will also allow them to compete with OpenCL and Directcompute. Think about it.. it's all in the text."

    CUDA was touted as Nvidia's answer, giving exceptional processing power over x86, after Jen-Hsun Huang bashed x86 for so long.

    Now they port THEIR pride and joy to the thing they bagged for so long?
    Reply
  • joytech22
    What I meant was: now that CUDA is supported on CPUs as well as GPUs, it allows more people to use the language without the need to spend thousands. It's just a money saver for some, and it allows others to mess around with CUDA without strict hardware requirements. It should work, just not as fast.
    Reply
  • iam2thecrowe
    Cuda is aussie slang for two in the pink and one in the stink....americans may know this as "shocker" http://www.urbandictionary.com/define.php?term=cuda
    Reply
  • bmadd
    CUDA also happens to be the best. The bee's knees. The duck's nuts. In AU, anyway.
    Reply
  • alidan
    joytech22: "Well now you can run CUDA code WITHOUT the need for a Nvidia graphics card and will also allow them to compete with OpenCL and Directcompute. And maybe it's cheaper to use what you have already (example in this case, a Supercomputer with say.. 100 CPU's), and would be cheaper to simply use those 100 CPU's instead of spending more cash on GPU's. It's all in the text."
    4chan made something called Tripper for CUDA. It runs tripcodes on CUDA – arguably the best way to get the tripcodes you want. A single-core CPU can do, I believe, 1-5 million trips a second; an i7 920 can do, I think, 22 million; a GTX 285 is capable of almost 2 billion, and if it's not faked I have seen numbers up to 15 billion. I know for a fact that this is the BEST use of the GPU in practice as a GPGPU. Without GPU, slow; with GPU, fast.

    Point being that 100 CPUs are outdone by a quad-SLI setup – if you have the ability to get 100 CPUs, just get 4 GPUs instead.
    Reply
  • ohim
    Is it me, or are they saying that it's possible to run CUDA on ATI/AMD as well? :))
    Reply