Talking Heads: Motherboard Manager Edition, Q4'10, Part 1

The Future Of Nvidia

Question: One year from now, do you continue to see Nvidia active in designing chipsets, or will the company focus on its core business (discrete graphics solutions)?

  • Currently, Nvidia can only provide entry-level AMD chipsets, and it has always faced product shortages. Ultimately, we believe Nvidia's motherboard chipset product will become an added service item for most vendors--i.e. 3D applications.
  • Nvidia does not get as much profit from UMA solutions as from discrete VGA.
  • We really don't know…
  • It seems obvious Nvidia will not be active in the MCP business. But I suppose they will not and cannot count on discrete graphics alone; maybe they will drive toward applications relating to 3D solutions.

The current situation makes everyone a bit unsure about Nvidia's future. While Nvidia is nowhere near out of the fight, the "loss" of its chipset business gives the impression that revenue options outside of the graphics world are shrinking. The path Nvidia is on (at least on the desktop) seems to be limited to discrete graphics. This is reflected in what we are hearing from our sources in the motherboard business. Nvidia is keeping tight-lipped, and even its traditional motherboard partners are seemingly in the dark. The comment that best sums this up comes from one Nvidia technology partner: "I suppose they will not and cannot count on discrete graphics alone."

About half feel that Nvidia is going to have no choice but to refocus on its graphics business. Some feel Nvidia may win its lawsuit and get the all-clear to develop chipsets for Intel's Core ix-series CPUs, but a third of our participants think there is going to be a different end game. Oddly enough, someone even brought up the idea of a merger. Another cited Nvidia's focus on 3D entertainment, which our readers feel is more of a gimmick due to the price tag.

The situation with Nvidia and Core ix-based chipsets is basically written in stone until the 2009 chipset lawsuit gets a judgment later this year. For the moment, Nvidia's chipset business is forcibly idle, and its chipsets are absent from the roadmaps we've seen. Even so, we're unsure if an Nvidia chipset for Intel CPUs makes sense anyway. At one time, superior memory controller performance and a relatively badass audio subsystem were real reasons to consider nForce as an alternative. But as integration pushes more functionality into the CPU itself, any third-party chipset vendor's chances to differentiate diminish substantially.

Rewind the clock back a few years: the main reason Nvidia was even able to acquire a chipset license was AMD's CrossFire threat. It doesn't seem to be in Intel's interest to cede that ground now that it has its own graphics "solution," especially when Nvidia's CUDA could marginalize the performance delivered by a CPU. Now that we all know SLI is simply a licensing matter not dependent on specific hardware, Intel's board partners are able to cover both CrossFire and SLI on very enthusiast-friendly X58- and P55-based platforms.

Nvidia, meanwhile, proclaims it has shifted its focus to its Tegra processors, which is by no means a bluff. Nvidia's CEO recently reiterated this focus, as the company sees ARM as a huge growth opportunity. If you look at the financial statements, R&D has been given a big budget increase, despite a drop in the company's year-over-year sales. We have mixed feelings here, if only because the ARM market, though loaded with potential, is dominated by Qualcomm and Texas Instruments. Nvidia's Android-based tablet demos generated a lot of buzz, but it's still not clear a product in that vein can seriously compete with Apple's iPad.

The recent FTC settlement only adds more drama to the situation. The settlement effectively solidifies the PCI Express standard, as Intel now must provide bus support for another six years. However, this is as much a protection for AMD's discrete graphics business as it is for Nvidia's. The settlement in no way impacts the current lawsuit with Intel. Recently, the company issued the following statement regarding the FTC settlement: "Nvidia supports the FTC's action to address Intel's continuing global anticompetitive conduct. Any steps that lead to a more competitive environment for our industry are good for the consumer. We look forward to Intel's actions being examined further by the Delaware courts later this year, when our lawsuit against the company is heard." Obviously, Nvidia wants to keep the chipset business if it can.

In Q2, Nvidia issued a warning that lowered its earnings estimate by 16%. This came as a surprise to many, if only because Intel and AMD both issued strong earnings. This is just another indicator of how dependent Nvidia has become on sales of its high-end cards. As discretionary spending decreased, AMD reaped the benefits of its timely Radeon HD 5000-series cards. The fact that Nvidia is only now releasing DX11 Fermi-based cards for the mainstream and low-end market spaces means it has some catching up to do.

  • dannyboy3210
    I seem to have this nagging feeling that discrete graphics options will probably be around for another 10-15 years, at the least.
    If you factor in the fact that a CPU/GPU fusion will cost a bit more than a simple CPU, and you plan on doing any gaming at all, why not invest an extra $30 or so (over the cost of the CPU/GPU fusion, not just the CPU) and get something that will game like twice as well, and likely have support for more monitors to boot?

    Edit: Although after the slow release of Fermi, I bet everyone's wondering what exactly is in store for Nvidia in the near future; like this article says, there seems to be a lot of ambivalence on the subject.
  • sudeshc
    I would rather see improvements in chipsets than in the CPU and GPU; those already are doing a wow job. But we need chipsets with fewer and fewer limitations and bottlenecks.
  • ta152h
    I'm kind of confused why you guys are jumping on 64-bit code not being common. There's no point for most applications, unless you like taking more memory and running slower. 32-bit code is denser, and therefore improves cache hit rates, and helps other apps have higher cache hit rates.

    Unless you need more memory, or are adding numbers larger than about 2 billion, there's absolutely no point in it. 8-bit to 16-bit was huge, since adding past 128 is pretty common. 16-bit to 32-bit was huge, because segments were a pain in the neck, and 32-bit mode essentially removed that. Plus, adding past 32K isn't that uncommon. 64-bit mode adds some registers and things like that, but even with that, it is often slower than 32-bit code.

    SSE and SSE2 would be better comparisons. Four years after they were introduced, they had pretty good support.

    It's hard to imagine discrete graphics cards lasting indefinitely. They will more likely go the way of the math co-processor, but not in the near future. Low latency should make a big difference, but I would guess it might not happen unless Intel introduces a uniform instruction set, or basically adds it to the processor/GPU complex, for graphics cards, which would allow for greater compiler efficiency and stronger integration. I'm a little surprised they haven't attempted to, but that would leave NVIDIA out in the cold, and maybe there are non-technical reasons they haven't done that yet.
  • sohaib_96
    cant we get an integrated gpu as powerful as a discrete one??
  • Draven35
    CUDA was a fairly robust interface from the get-go. If you wanted to do any sort of scientific computational work, Nvidia's CUDA was the library to use. It set the standard. Unfortunately, as with many proprietary technologies in the PC industry, this has also limited CUDA's appeal beyond specialized scientific applications, where the software is so niche that it can demand a certain piece of hardware.

    A lot of scientific software vendors I have communicated with about this sort of thing have actually been hesitant to code for CUDA, because until the release of the Fermi cards, CUDA's floating-point support was single-precision only. They were *very* excited about the hardware releases at SIGGRAPH...
  • enzo matrix
    Odd how everyone ignored workstation graphics, even when asked about them in the last question.
  • K2N hater
    That will only replace discrete video cards once motherboards ship with dedicated RAM for video and the CPU allows a dedicated bus for that.

    Until then, the performance of processors with an integrated GPU will be pretty much the same as platforms with integrated graphics, as the bottleneck will still be RAM latency and bandwidth.
  • elbert
    The death of discrete will never occur, because the hybrids are limited like consoles. Even if the CPU makers could place large amounts of resources on the hybrid GPU, they would be stripped away by refreshes. And considering how many people thought motherboard-integrated graphics would kill discrete, the margin of error kind of kills the percentages.

    From what I have read, AMD's Llano hybrid GPU is about equal to a 5570. Llano by next year has no chance of killing sales of $50+ discrete solutions. I think the hybrids will have little effect on discrete solutions, and your $150+ is off. The only thing hybrid means is potentially more CPU performance when a discrete card is used. Another difference: unlike motherboard-integrated GPUs going to waste, the hybrids will use the integrated GPU for other tasks.
  • Onus
    sohaib_96: "cant we get an integrated gpu as powerful as a discrete one??"
    No. There are two reasons that come to my mind. The first is heat. It is hard to dissipate that much heat in such a small area. Look at how huge both graphics card and CPU coolers already are, even the stock ones.
    The second is defect rate in manufacturing. As the die gets bigger, the chances of a defect grow, and it's either a geometric or exponential growth. The yields would be so low as to make the "good" dies prohibitively expensive.
    If you scale either of those down enough to overcome these problems, you end up with something too weak to be useful.
  • Onus
    elbert: "...From what I have read, AMD's Llano hybrid GPU is about equal to a 5570. Llano by next year has no chance of killing sales of $50+ discrete solutions..."
    Although the reasoning around this is mostly sound, I'd say your price point is off. Make that $100+ discrete solutions. A typical home user will be quite satisfied with HD5570-level performance, even able to play many games using lowered settings and/or resolution. As economic realities cause people to choose to do more with less, they will realize that this level of performance will do quite nicely for them. A $50 discrete card doesn't add a whole lot, but $100 very definitely does, and might be the jump that becomes worth taking.