The History of Nvidia GPUs: NV1 to Turing

About the author
Michael Justin Allen Sexton & Yannick Guerrini & Pierre Dandumont

Michael Justin Allen Sexton is a Contributing Writer for Tom's Hardware US. He covers hardware component news, specializing in CPUs and motherboards.

17 comments
Comments from the forums
  • kinggremlin
    So, based on the archived comments, this is the THIRD time this article has been posted since the release of Pascal. Why on earth would you keep recycling this article when nothing has been added to the "history?" Turing has not been released yet, we have no benchmarks. Unless this site has no intention of posting reviews of the Turing cards, there is zero reason to re-re-post this just to add an unreleased architecture on the last slide with no useful or new information.
  • bit_user
    Anyone interested in early GPUs (particularly Nvidia's exotic NV1 and its cancelled successor) would probably find this a worthwhile read:

    http://vintage3d.org/nv1.php#sthash.AWlq2ihY.dpbs

    He thoroughly explores their unique quadric rendering approach, including its downsides and how they tried to mitigate them.

    The author of that site has posted on here, in a previous article about vintage 3D cards. Maybe he'll show up, again.
  • Blytz
    I'd really love to see a graph of the processing power and memory bandwidth across the evolution of these cards (and throw in the ATI/Radeons as well) to see when we made leaps versus increments and how far it's all come.
  • bit_user
    Anonymous said:
    I'd really love to see a graph of the processing power and memory bandwidth across the evolution of these cards (and throw in the ATI/Radeons as well) to see when we made leaps versus increments and how far it's all come.

    This isn't exactly relevant to graphics performance, but still worth a look.

    https://www.karlrupp.net/2013/06/cpu-gpu-and-mic-hardware-characteristics-over-time/

    Note that the Y-axis of most plots is in log-scale. Also, even though it's from 2013, he updated it for KNL and Pascal. I just wish he'd update it with Volta.

    Edit: If you liked that, you might enjoy his 42 Years of Microprocessor Trend Data.
  • Unsal Ersoz
    Wow, I was a very hardcore gamer back then :)
    The TNT2 and Voodoo3 were competitors until 3dfx released its January 2000 drivers. I can remember it like yesterday. That long-forgotten miniGL port for OpenGL games like Half-Life, Quake, and others basically blew the TNT2 into the dust (I owned both cards). In 2000, Nvidia was very uncompetitive on the software side compared to 3dfx with the Voodoo3. I remember my GLQuake timelapse rendering at 10720 fps while the TNT2 was stuck at ~60ish :)
    Nice memories.
  • samopa
    I was in the 3dfx camp until I was forced to switch to the Nvidia camp. I owned a Voodoo1, Banshee, Voodoo2, Voodoo3 3000, and Voodoo5 5500 before finally switching to a GeForce 6800 Ultra.
    Such fond memories.
  • AgentLozen
    I remember posting a comment on this article the last time it came around.

    I mentioned last time that I liked the video game screenshots in the background. It helps put the hardware into perspective. For example, the TNT was meant to play games like Half Life. The TNT2 was built to take on Quake 3.

    My unique contribution this time is that I would like to see the date of each card's release on each slide. Some entries have this and others don't. I mentioned perspective in my last paragraph and the same applies here. A good analogy for why the date is important would be like taking a history class and hearing "In 1941, America entered World War 2. A short time later, America found itself in Vietnam." So...... was America in Vietnam in that same decade? In the 1950's? The 1960's? It would help my understanding to have a date.

    I did enjoy reading this article even if I've looked at it before.
  • Stephen_144
    The 3dfx was my second graphics card. My second computer, a Pentium 90, had a Diamond Viper Stealth card, which I later upgraded to a 3dfx.

    I recall loading Windows for Workgroups because I could get 800x600 resolution all the way up to 256 colors. That was up from 640x480 at 16 colors in regular Windows 3.0. I recall being amazed at all the images and spent the entire day downloading color icons (on my dial-up CompuServe 28k modem) and customizing all my windows to use them. Ahh, simpler times.
  • steve.d.moss
    Is this another biased article like the one telling us to buy RTX cards, without even waiting for concrete proof of performance, and at inflated prices? I really hope not.

    The history of Nvidia's rise to power is far more convoluted and full of lies, skullduggery, and deceit than people realise.

    Check out Adored TV on YouTube. He did a totally unbiased and crystal clear video on this topic a few months back.

    Definitely worth a watch; it might even open your eyes a little.
  • kinggremlin
    Anonymous said:
    Is this another biased article like the one telling us to buy RTX cards, without even waiting for concrete proof of performance, and at inflated prices? I really hope not.

    The history of Nvidia's rise to power is far more convoluted and full of lies, skullduggery, and deceit than people realise.

    Check out Adored TV on YouTube. He did a totally unbiased and crystal clear video on this topic a few months back.

    Definitely worth a watch; it might even open your eyes a little.



    Name a company that has made it to the level of dominance that Nvidia has without using similar tactics. You're not going to find one. At least Nvidia produces something we want to use.

    Adored TV is the tech equivalent to Alex Jones. You have to have an IQ below 50 to believe anything that comes from either source.
  • bit_user
    Anonymous said:
    The history of Nvidia's rise to power is far more convoluted and full of lies, skullduggery, and deceit than people realise.

    Now that would be an interesting story. Do a full retrospective of the missteps and misdeeds, as well as the triumphs and successes, in order to put everything in perspective.
  • bit_user
    Anonymous said:
    Name a company that has made it to the level of dominance that Nvidia has without using similar tactics. You're not going to find one. At least Nvidia produces something we want to use.

    I don't know, but I don't think that's grounds for whitewashing what they've done.

    In fact, your very willingness to look aside is actually a signal to others to do the same or worse.
  • kinggremlin
    Anonymous said:
    Anonymous said:
    Name a company that has made it to the level of dominance that Nvidia has without using similar tactics. You're not going to find one. At least Nvidia produces something we want to use.

    I don't know, but I don't think that's grounds for whitewashing what they've done.

    In fact, your very willingness to look aside is actually a signal to others to do the same or worse.


    The company sells video cards. Sorry, I can't find a reason to care if people hate them because they charge what they can to maximize profits. This isn't like Enron, which ruined thousands of people's retirement funds, or a company price-gouging on water and food during a crisis. There are many companies committing some real atrocities in this world. Nvidia's target market does not give them the leverage to fall into that category. The selective outrage by internet whiners is comical. If you agree that most companies do it, why are you only complaining about the one selling video cards?
  • bit_user
    Anonymous said:
    The company sells video cards.

    Even 10 years ago, you might've gotten away with such a characterization. Today, they power the majority of the world's biggest supercomputers, they're the most instrumental player in the AI revolution (which they helped kickstart), and they're a key player enabling robotics and self-driving cars.

    Anonymous said:
    Sorry, I can't find a reason to care if people hate them because they charge what they can to maximize profits. This isn't like Enron that ruined 1000's of people's retirement funds or a company price gouging on water and food during a crisis. There are many companies committing some real atrocities in this world. Nvidia's target market does not give them the leverage to fall into that category.

    If business ethics matters at all, then it matters everywhere. If you value a properly functioning marketplace and believe in the value of competition to drive innovation and efficiency, then you should care if/when someone breaks the rules. Not to mention all the jobs and investments at stake.

    That said, if there are lives lost due to the misdeeds of a company, that definitely raises the stakes beyond mere technological and economic consequences, and should be dealt with accordingly.

    Anonymous said:
    The selective outrage by internet whiners is comical.

    No student of human nature would say that. I think we're biased to care about what's familiar to us and what we interact with.

    People are also tribal, and brands both benefit and suffer from this. I guess my feeling about brand wars is similar to how I regard rivalries between fans of sports teams. If people are going to form tribes around something, wouldn't you rather it be something fairly benign and artificial, as opposed to the sort of rivalries that start wars?

    Anonymous said:
    If you agree that most companies do it, why are you only complaining about one selling video cards?

    I don't necessarily agree with that, but I'm also not on a warpath against Nvidia. I don't actually know much about some of the past controversies people have mentioned, which is one reason I'd be interested in reading some good journalism on the subject. Or talk about Intel, MS, Google, FB, or Amazon; even AMD probably has some skeletons in its closet. No cow should be sacred, even if it's the one pulling your wagon.

    Just because a company has a spotty history doesn't mean they should be shut down. I believe in perspective and proportionality, too.
  • pawel86ck
    The original Xbox's GPU also had two vertex shader units like the GeForce 4, so it can be considered a GeForce 3/4 hybrid.
  • philipthesaiyan
    Very interesting indeed.