Nvidia CEO Says No x86 Chips

CNET reports that, when Nvidia CEO Jen-Hsun Huang was asked in a phone interview Thursday about the possibility of Nvidia developing its own x86 chip, the chief executive answered with a resolute "No."

Rumors that Nvidia had plans to enter the x86 CPU market resurfaced last week. With Nvidia not having the license to produce chipsets for the latest generation of Intel chips and Intel moving towards integrating graphics cores onto its CPUs, analysts speculated that Nvidia was likely to enter the x86 game.

"We believe Nvidia could enter the x86 CPU business," said analyst Doug Freedman of Broadpoint AmTech in an EETimes story. "Nvidia could become a supplier of x86 CPUs by necessity, to preserve both GPU and chipset revenue."

However, it seems Nvidia has other plans for the time being. When asked about the company's plans for an x86 CPU, Huang quashed the rumors and detailed Nvidia's plans for the future.

"Nvidia's strategy is very, very clear. I'm very straightforward about it. Right now, more than ever, we have to focus on visual and parallel computing." Huang went on to detail where his company sees its best opportunities for growth. "Our strategy is to proliferate the GPU into all kinds of platforms for growth," he said. "GPUs in servers for parallel computing, for supercomputing--and cloud computing with our GPU is a fabulous growth opportunity--and streaming video."

The CEO also referenced Tegra and the Zune HD, stating that the company is aiming to get its GPUs into the lowest-power platforms and to drive mobile computing.

Read the full story on CNET.

  • bfstev
    Not surprising. Obtaining the licensing for x86 would be a nightmare, especially after they spit in Intel's face with their cartoons.
  • megamanx00
    Well, I think they may be looking to develop a CPU of some kind, just not an x86 chip. After all, it appears that Nvidia has been hiring Transmeta engineers. The Transmeta chips were not true x86 chips, but rather used a Very Long Instruction Word (VLIW) architecture. While in Transmeta's case x86 instructions were translated to VLIW on the chip itself, S3 used the same technology to implement 2.0 shaders on their GPUs when they got back into the GPU game.

    Anyway, we could see NVIDIA develop a MIPS or other architecture CPU to pair with their Tegra and market it as a supercomputer or rendering machine. With so many movie studios needing to produce full 3D CGI movies, or movies with heavy CGI effects, Nvidia may be able to steal some space away from the traditional providers in this area.
  • eyemaster
    Hasn't the license for 32-bit x86 expired? I can't remember.
  • neiroatopelcc
    I see no reason why they shouldn't join in the x86 fun when they're back on track with their GPUs
  • ravewulf
    Awww :(

    If they did make one I think they could come up with something interesting
  • deathmustard
    I just want my 8-core, 12-thread i11 already.
  • roofus
    neiroatopelcc: I see no reason why they shouldn't join in the x86 fun when they're back on track with their GPUs
    In a nutshell, right there. Even talking about a CPU while they are late to the dance with their specialty product would be ridiculous. I take no issue with Nvidia entering the market as a CPU manufacturer, but right now I think they need to tend to their bread-and-butter products. They are no longer in the lead, and you need to secure the crown in one area before branching out into others, or you risk being associated with mediocrity. AMD/ATI is pumping out one good product after another, so there won't be another two-year stretch of re-branding to look forward to.
  • JofaMang
    Another Nvidia statement completely void of gaming support. Big surprise.
  • warezme
    What they should be focusing on is putting that DX11 Fermi stuff out to the gamer community... you know, the only group that has been bankrolling their ass all these years?
  • hillarymakesmecry
    I want them to produce x86 chips. I've encouraged everyone I know to only buy ATI stuff since we all learned how evil Intel is. I just bought an ATI laptop too.