Nvidia CEO On Intel's GPU, AMD Partnership, And Raja Koduri

Nvidia held its 3Q 2017 earnings call today, and as expected, the company had a stellar quarter. Top-line numbers include a record $2.64 billion in revenue, led by 25% growth in gaming and a whopping 109% YoY growth in the data center. Gross margin weighed in at 59.7%, so it's clear the company is executing well.

But aside from the financial numbers, what everyone really wants to know is Nvidia's take on the recent announcements that will certainly have a profound and lasting impact on the GPU industry.

First came Intel and AMD's bombshell announcement that AMD's Radeon Graphics are worming their way into Intel's eighth-generation H-Series processors. As surprising as that is considering the acrimonious history between the two companies, it was just the start.

A day later, Raja Koduri, AMD's senior vice president and chief architect of the AMD Radeon Technologies Group (RTG), announced he was leaving the company. This comes on the heels of Koduri's extended leave of absence from the company shortly after the launch of the Vega graphics cards.

Twenty-four hours later, Intel announced that it had brought on Koduri to head its newly formed Core and Visual Computing Group business unit with the intention of developing high-end discrete graphics cards for a "broad range of computing segments." That's shocking because Nvidia and AMD have been the two primary discrete GPU producers for the last 20 years.

All of these announcements will have a tremendous impact on the wildly successful Nvidia, but the company avoided addressing the recent news in its opening statements on the financial call. At the end of the Q&A portion, however, in response to a question regarding Intel's renewed interest in developing a discrete GPU and its newfound partnership with AMD, Nvidia CEO Jensen Huang responded:

"Yeah, there's a lot of news out there....first of all, Raja leaving AMD is a great loss for AMD, and it's a recognition by Intel probably that the GPU is just incredibly important now. The modern GPU is not a graphics accelerator, we just left the letter "G" in there, but these processors are domain-specific parallel accelerators, and they are enormously complex, they are the most complex processors built by anybody on the planet today. And that's the reason why IBM uses our processors for the worlds largest supercomputers, [and] that's the reason why every single cloud, every major server around the world has adopted Nvidia GPUs."

Huang's statement aligns with our thoughts that Intel's return to the discrete graphics industry is centered more on capturing some of the burgeoning use cases for parallelized workloads, such as AI, than on gaming. Nvidia's push into the data center has been fruitful, as evidenced by the 109% YoY revenue growth, but it's really just the beginning of GPU penetration into several other segments, such as autonomous driving.
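
To put "domain-specific parallel accelerator" in concrete terms, here is a minimal CUDA sketch of our own (purely illustrative, not anything Nvidia presented on the call). It implements SAXPY (y = a*x + y), assigning one GPU thread to each array element; this data-parallel pattern is the same one that underpins the AI and compute workloads Huang is describing.

// saxpy.cu -- compile with: nvcc -o saxpy saxpy.cu
#include <cstdio>
#include <cuda_runtime.h>

// Each GPU thread computes a single element of y = a*x + y, so the
// whole array is processed in parallel rather than in a serial loop.
__global__ void saxpy(int n, float a, const float *x, float *y) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) y[i] = a * x[i] + y[i];
}

int main() {
    const int n = 1 << 20;  // one million elements
    float *x, *y;
    cudaMallocManaged(&x, n * sizeof(float));  // memory visible to CPU and GPU
    cudaMallocManaged(&y, n * sizeof(float));
    for (int i = 0; i < n; ++i) { x[i] = 1.0f; y[i] = 2.0f; }

    // Launch enough 256-thread blocks to cover all n elements.
    saxpy<<<(n + 255) / 256, 256>>>(n, 2.0f, x, y);
    cudaDeviceSynchronize();

    printf("y[0] = %f\n", y[0]);  // expect 4.0
    cudaFree(x);
    cudaFree(y);
    return 0;
}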

Huang dove further into the company's design process:

"The amount of software engineering that goes on top of it is significant as well. So, if you look at the way that we do things, we plan our roadmap about five years out. It takes about three years to build a new generation, and we build multiple GPUs at the same time, and on top of that, there are some 5,000 engineers working on system software and numerics libraries, and solvers, and compilers, and graph analytics, and cloud platforms, and virtualization stacks in order to make this computing architecture useful to all of the people we serve. So when you think about it from that perspective, it's just an enormous undertaking. Arguably the most significant undertaking of any processor in the world today. And that's why we are able to speed up applications by a factor of 100."

This statement highlights the complexity of developing a new GPU. Intel will face similarly significant challenges. 
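
As a taste of that software layer, note that most applications never write kernels like the one above; they call into Nvidia's tuned libraries instead. The sketch below (again our own hedged illustration, using the public cuBLAS API) performs a matrix multiply with a single library call, one small example of the numerics libraries Huang mentions.

// gemm.cu -- compile with: nvcc -o gemm gemm.cu -lcublas
#include <cstdio>
#include <cublas_v2.h>
#include <cuda_runtime.h>

int main() {
    const int n = 512;  // square n x n matrices
    const float alpha = 1.0f, beta = 0.0f;
    float *A, *B, *C;
    cudaMallocManaged(&A, n * n * sizeof(float));
    cudaMallocManaged(&B, n * n * sizeof(float));
    cudaMallocManaged(&C, n * n * sizeof(float));
    for (int i = 0; i < n * n; ++i) { A[i] = 1.0f; B[i] = 1.0f; C[i] = 0.0f; }

    // One call to a vendor-tuned routine: C = alpha*A*B + beta*C.
    // cuBLAS assumes column-major storage; with all-ones inputs it is moot.
    cublasHandle_t handle;
    cublasCreate(&handle);
    cublasSgemm(handle, CUBLAS_OP_N, CUBLAS_OP_N,
                n, n, n, &alpha, A, n, B, n, &beta, C, n);
    cudaDeviceSynchronize();

    printf("C[0] = %f\n", C[0]);  // expect 512.0 for all-ones inputs
    cublasDestroy(handle);
    cudaFree(A);
    cudaFree(B);
    cudaFree(C);
    return 0;
}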

Huang also addressed the new Intel H-Series processors that feature AMD's semi-custom Radeon Graphics chip:

"And lastly, with respect to the chip that they built together, I think it goes without saying, now that the energy efficiency of Pascal GeForce and the MaxQ design technology and all of the software we have created has really set a new design point for the industry, it is now possible to build a state of the art gaming notebook with the most leading edge GeForce processors, and we want to deliver gaming experiences many times that of a console in 4K and have that be in a laptop that is 18mm thin. The combination of Pascal and MaxQ has really raised the bar, and that's really the essence of it."

That was the last of Huang's statements on the matter, though some of his responses to other questions during the call are telling. Huang prominently repeated the statement that "We are a one architecture company." He said the company maintains a singular focus on one architecture so it can ensure broad compatibility with all aspects of the software ecosystem, not to mention support longevity, stating, "We support our software for as long as we shall live."

Earlier in the call, Huang also pressed the point that investing in five different architectures dilutes focus and makes it impossible to support them forever, which has long-term implications for customers. Huang drove the point further:

"If you have four or five different architectures to support, that you offer to your customers, and they have to pick the one that works the best, you are essentially are saying that you don't know which one is the best [.....] If there's five architectures, surely over time, 80% of them will be wrong. I think that our advantage is that we are singularly focused."

Huang didn't specifically name Intel in this statement, but Nvidia's focus on a single architecture stands in stark contrast to Intel's approach of offering five (coincidence?) different solutions for parallel workloads: CPUs, Xeon Phi, FPGAs, ASICs, and now GPUs.

As others have opined, Intel's announcement that it's building a discrete GPU is tantamount to an open declaration of war on Nvidia. It wouldn't make sense for Intel to telegraph its intentions to its rivals several years before a product comes to market, so the real question is just how far along Intel's GPU is in development. The company could already have a new architecture in hand, or the design may simply be a scaled-up iteration of its existing iGPU technology.

All these questions and more hang thick in the air, but it's anyone's guess how long we will have to wait for answers. If Intel is just beginning the effort, it could be years before a product makes its way to market.

Paul Alcorn
Managing Editor: News and Emerging Tech

Paul Alcorn is the Managing Editor: News and Emerging Tech for Tom's Hardware US. He also writes news and reviews on CPUs, storage, and enterprise hardware.

  • rush21hit
    "We support our software for as long as we shall live"
    There's your trigger, pre-Maxwell owner.
  • dark_knight
    Now if only Nvidia could start its own CPU division; then we'd have real competition in both CPUs and GPUs.
  • Joao Ribeiro
    They already make CPUs; forget not the Tegra, etc.
  • iam2thecrowe
    "Nvidia and AMD have been the only two discrete GPU producers for the last 20 years. "

    I'm probably showing my age, but I do remember companies such as 3DFX, S3, Matrox and even Intel that all produced cards less than 20 years ago. http://www.tomshardware.com/reviews/graphic-chips-review-april-98,64-9.html
  • arielmansur
    It's not bad to keep calling them GPUs; it's just that the G doesn't stand for Graphics anymore, more like General, depending on the payload.
  • gdmaclew
    iam2thecrowe said:
    "Nvidia and AMD have been the only two discrete GPU producers for the last 20 years. "

    I'm probably showing my age, but I do remember companies such as 3DFX, S3, Matrox and even Intel that all produced cards less than 20 years ago. http://www.tomshardware.com/reviews/graphic-chips-review-april-98,64-9.html

    You are right. I remember buying Matrox video cards for my company up until 10 years ago. They were quite competitive with nVidia and ATi at the time.
  • jleppard
    LOL, how can anyone forget Tegra? It's the worst ARM processor, and that's why no one uses it.
  • ledhead11
    I agree with his statement about how the "G" is misleading at this point. I also agree that these processors have evolved to unprecedented complexity compared to CPUs. Let's face it, Intel and AMD CPUs haven't truly advanced nearly as fast as graphics cards over the last 10 years; Moore's law in full effect.

    I think if Intel really manages to produce something with quality performance and cost, it should be a win/win for us, the consumers. NV's dominance is kind of scary (even though I love my Ti).
  • termathor
    "I'm probably showing my age, but i do remember companies such as 3DFX, S3, Matrox and even Intel that all produced cards less than 20 years ago. http://www.tomshardware.com/reviews/graphic-chips-review-april-98,64-9.html"

    3DFX was acquired by Nvidia early 2000s I think. Probably it was Nvidia's foundation in terms of IP....
    Matrox seems to still be in business, though.
    S3 ? Was it not a 3DFX product ....
  • shrapnel_indie
    Huang prominently repeated the statement that "We are a one architecture company."

    While that has nice sounding implications, it does mean a few other things:
    * We are almost a one-size-fits-all company. If our tech doesn't work for you, you should reconsider what you're doing.
    * We are in deep trouble if something else comes out and it can beat everything we have in the pipeline for the next 5 years, and what we have in the early stages for years beyond.


    More than one architecture can mean the following:
    * If one tech becomes obsolete overnight, we can weather the change easier.
    * If one tech doesn't fit your application, another one might.
    * We can have various techs more focused on a specific task instead of trying to be a jack-of-all-trades.


    {...} stating, "We support our software for as long as we shall live." {...} Huang also pressed the point that investing in five different architectures dilutes focus and makes it impossible to support them forever, which has long-term implications for customers.
    And as pointed out earlier.... this is a bit of a lie... unless you decide to redefine it as software only, with the underlying hardware totally excluded.... Otherwise, they'd best still be supporting their old Riva series, which we know they are not. (It's still good enough for basic desktop use in offices and such, where there are no heavy graphics or compute needs.)