Plausible: Nvidia Working on x86 CPU

For the second time in less than a week, graphics chipmaker Nvidia is at the center of a rumor that could turn the tech sector upside down.

Last week, the California-based GPU maker was rumored to be getting pushed out of the console graphics market by none other than Intel. This week, the tables seem to have turned, with Nvidia reportedly working on an x86 CPU.

According to The Inquirer, the GPU powerhouse is trying to produce an x86 chip. Legal hurdles could stop anything concrete dead in its tracks, but that probably wouldn't keep Nvidia from producing the hardware and worrying about a financial settlement later. When Nvidia started collaborating with Stexar back in 2006, many predicted that some sort of CPU would be the result. It's been over two years, so the market may finally be seeing the fruit of Nvidia's labors.

"Word reached us a bit ago that Nvidia is definitely working on an x86 chip and the firm is heavily recruiting x86 engineers all over Silicon Valley," says The Inquirer.

While producing an x86 CPU would certainly throw a wrench in the works for Intel and AMD, the move seems off-message for Nvidia. For the past year or so, Nvidia has been pushing its GPGPU, or General Purpose Graphics Processing Unit, concept. A Nvidia-built CPU would cut completely against that grain.

While Nvidia may not be able to secure a license from Intel, there is another option: if Nvidia collaborated with a company that already holds an x86 license, such as VIA, we may yet see an Nvidia-branded CPU.

Comments from the forums
  • one-shot
    I remember reading this on the Inquirer a few days ago. It may be interesting to see Intel come down on them.
  • TheFace
    Competition is a great thing for the market. I hope it's true.
  • gwellin
    I could be wrong but isn't x86 32-bit? I would think if you're going to invest a crapload of money into developing a new chip, shouldn't you probably do a 64-bit one? Unless you think 32-bit is going to be around forever.