The 30 Year History of AMD Graphics, In Pictures (Archive)

Status
Not open for further replies.

blackmagnum

Honorable
May 14, 2012
221
0
10,690
Thanks so much, ATI/AMD, for keeping Nvidia's prices in check. Without your competition, Nvidia might charge thousands for their top cards.
 

InvalidError

Titan
Moderator

Without the competition, if nobody can afford the hardware, next to no game will be developed for it and the small potential market gets even smaller, which makes it that much more difficult to justify the R&D expense for such a niche market. Nvidia's new GPUs would also still need to offer compelling value over their previous-gen hardware to generate repeat sales and keep themselves going.

We have had exactly that situation with Intel for the past five years: Intel has been Intel's only meaningful competition and with no compelling reason for people to upgrade, consumer-oriented sales have been going down along with related profits.
 

ledhead11

Reputable
Oct 10, 2014
585
0
5,160
I mostly, but not entirely, agree @INVALIDERROR. A single company would still need to provide adequate incentives for devs and customers, but I do believe that competition helps keep more reality checks in place.

As for Intel, I completely agree. After many recent upgrades on my newer rig, I had enough leftover parts to put my old rig back together. I figured enough time had passed that I could finally upgrade the CPU for a reasonable price. LOL. It's a Socket 1155 board, and the i7s still average $300-500, for a motherboard that's nearly 5 years old. That's the result of Intel's domination. It's still using the 2600K I got for it (which still totally rocks), but it would've been nice to go Ivy Bridge for PCIe 3.0.

Thank you Michael/Tom's for these GPU history articles of late. A lot of fun to read through for the old memories.
 

Verrin

Distinguished
Jan 11, 2009
97
3
18,635
Ah, the HD 5000 series. That was truly AMD's return to dominance. I had been using Nvidia GPUs between the HD 2000 and HD 4000 series, simply because ATI's cards weren't meeting my performance standards, even though they were competitively priced. But as soon as the HD 5850 came out, I was back on team red. I still have that GPU kicking around in my HTPC. That thing is a champ.
 

Michael_427

Commendable
Oct 7, 2016
3
0
1,510
Wow, I've owned, and probably still have in a drawer, almost every single one of those up until the HD 8000 series. That's a lot of video cards... and I've also owned nearly that many Nvidia cards. I'm old.
 

rush21hit

Honorable
Mar 5, 2012
580
0
11,160
What I've lost track of in this history is at what point Nvidia outgrew ATI as a company, and subsequently AMD, in terms of raw market share and resources. Can anyone enlighten me?

On topic, ATI certainly gave me great memories with their products.

I played DMC4 with a PowerColor X1650 XT. Even completed Bloody Palace with it. Until it died, entirely my fault for overclocking without logic. I was young and stupid.

COD MW2 and NFS World with a HIS HD 3650, until fan failure. Boxed it up and forgot I ever had it. Pretty sure the GPU itself was fine.

Tomb Raider 2013 with a HIS HD 4650 until I got another fan failure, then I got myself a cheap second-hand XFX HD 4670.

A HIS HD 7730 served me in Skyrim after I sold my HD 4670. Until another fan failure.

Fan failure is seemingly a common occurrence on HIS cards. I learned a while back that I'm not alone in this. Apparently, even slight dust buildup can shake the fan far too much until it loses balance and kills itself. That tells you a lot about HIS fan quality. Today, I avoid HIS products like the plague.

Though I use a 750 Ti as of now, I still miss those awesome, low-budget, no-power-pin cards from AMD. As you can see from my history of cards, I'm a fan of low-powered cards. Shame the RX 460 isn't as good as I expected.
 

beoza

Distinguished
Oct 23, 2009
329
0
18,860
I had one of the Radeon DDR 64MB cards; mine also sported Video In/Video Out. It was a pretty decent card, and I had it paired with a Socket A AMD Athlon T-bird 900MHz.
 

bit_user

Polypheme
Ambassador
Even with the increase in resources, the Radeon HD 7970's Tahiti GPU was more compact and able to attain higher clock rates than its predecessor.
Yes, because of the process node shrink. But it had 63% more transistors.
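For what it's worth, here's a rough check of that figure. This is just a sketch assuming the commonly cited transistor counts (~4.31 billion for Tahiti, ~2.64 billion for Cayman), which aren't stated in the article:

Code:
# Rough transistor-count comparison (counts are commonly cited figures, not from the article)
cayman_transistors = 2.64e9   # HD 6970 (Cayman), approx.
tahiti_transistors = 4.31e9   # HD 7970 (Tahiti), approx.
increase = (tahiti_transistors / cayman_transistors - 1) * 100
print(f"~{increase:.0f}% more transistors")   # ~63%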

You always see smaller die sizes and higher clock speeds, after a process node shrink. Then, as the process matures, yields increase and costs subside, allowing for die sizes to creep back up.

It consumed more power than its predecessor, though.
Well, according to Wikipedia, both it and the HD 6970 had a TDP of 250 W. I think the HD 6970 was their first single GPU to reach that point.

Does anyone know when the PCIe spec started allowing for 250 W per slot? Was it PCIe 2.0?
 

bit_user

Polypheme
Ambassador
The Radeon R9 Fury X outperformed Nvidia's GeForce GTX 980, but traded blows with its GeForce GTX 980 Ti.
Tip of the hat to Nvidia, on Maxwell.

Code:
Product     FP32 (GFLOPS)    Memory (GB/s)    Shaders    Texture Units    ROPs
GTX 980 Ti      5632             336            2816          176          96
R9 Fury X       8602             512            4096          256          64

I don't know how they did it, but the GTX 980 Ti beat the Fury X, in spite of the latter's ~52% advantage in compute and memory bandwidth, and 45% advantage in shaders and texture units. I'm sure there's more to Nvidia's coup than its increased ROP count.
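For anyone curious where those percentages come from, here's a quick back-of-envelope Python sketch using only the numbers from the table above (ratios only, nothing beyond the listed specs):

Code:
# Spec ratios from the table above: Fury X relative to GTX 980 Ti
gtx_980_ti = {"fp32_gflops": 5632, "mem_bw_gbs": 336, "shaders": 2816, "tmus": 176, "rops": 96}
r9_fury_x  = {"fp32_gflops": 8602, "mem_bw_gbs": 512, "shaders": 4096, "tmus": 256, "rops": 64}

for key in gtx_980_ti:
    advantage = (r9_fury_x[key] / gtx_980_ti[key] - 1) * 100
    print(f"{key}: {advantage:+.0f}%")
# fp32_gflops: +53%, mem_bw_gbs: +52%, shaders: +45%, tmus: +45%, rops: -33%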

I've bought ATI/AMD cards since 1999. But EVGA's 980 Ti FTW edition (factory overclocked, 275 W TDP) is my first Nvidia card. The darn thing has only two fans, and it's surprisingly quiet!

As long as Nvidia can continue its lead on efficiency, I think we won't see AMD catching them anytime soon. Even if Vega features HBM2, I think the best AMD can hope for is a Fury X situation, where it's about equal to the (expected) GTX 1080 Ti.
 
"You can expect higher-end solutions based on the same Polaris architecture in the months to come."

Interesting, I thought the RX 480 was the top-end Polaris part and that all that remained for release was Vega. Or is it expected that Vega is just a bigger version of Polaris?

@RUSH21HIT - I think Nvidia made its biggest jump in market share with the GeForce 8 series. The 8800 GTS/GT/GTX/Ultra were very popular, and it took ATI some time to fully catch up, while Nvidia rebadged the 8800s as 9800s and ultimately as the 200 series.
 

TJ Hooker

Titan
Ambassador

I kinda figured the author meant to say "GCN 4" rather than "Polaris" in that sentence.
 

bit_user

Polypheme
Ambassador
The only GFX card ever to die on me (granted, I'm really not a gamer) was the 9700 Pro. Its fan failed, and it was replaced under warranty. I don't remember the brand (could it have been an ATI-branded card?).
 


Newegg stopped carrying HIS products due to shady business practices. I believe the final straw was them failing to honor mail-in rebates.

I've had a few HIS cards, and it was always fan failure that led to replacement. The last HIS GPU in my family was in my brother's computer, a HIS X1650 Pro, and we replaced it when the computer shut down suddenly; upon opening the case, we found pieces of the fan all over the inside. The fan had disintegrated, and although it was under warranty, HIS refused to do anything about it. That's when I decided to never own another of their products, and Newegg banning them was icing on the cake.
BTW, Newegg made good on the MIRs when they banned them. I got a $20 check in the mail, from Newegg, for that last GPU.
 
G

Guest

Guest
Even on huge lithography nodes, early graphics cards were fanless or had tiny fans. It's only relatively recently that GPUs and CPUs got power hungry and needed insane cooling solutions. And now we're coming full circle. With 7nm, who knows if we'll even need massive cooling solutions. Maybe a small fan, maybe fanless. At least on mid-range hardware. My last two laptops have been fanless, and I wouldn't consider going back to a laptop with a fan. I can dream, but it would be great to have a silent desktop without a single moving part.
 

InvalidError

Titan
Moderator

They may have been fanless or tiny-fanned, but in terms of GPU processing power, they were also 1000X slower than current 80-150W GPUs. My 15W 60MHz Pentium was fanless too, but my 60W i5 is probably in the neighborhood of 500X faster.

Yes, today's chips use more power than chips from 12+ years ago, but they also deliver about 100X more performance per watt.
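As a quick sanity check on that perf-per-watt claim, here's a back-of-envelope sketch using only the figures quoted above (not measured data):

Code:
# Perf/W comparison from the quoted figures: 15W Pentium vs. 60W i5, ~500x faster overall
old_watts, new_watts = 15, 60
speedup = 500
perf_per_watt_gain = speedup / (new_watts / old_watts)
print(perf_per_watt_gain)   # 125.0 -> on the order of 100x more performance per watt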
 

alidan

Splendid
Aug 5, 2009
5,303
0
25,780


Make no bones about it, AMD's drivers suck, a lot, especially when paired with weaker CPUs, so that's part of what made the 980 Ti so much better: Nvidia made a DX11 ASIC and AMD made a GPGPU monster.

It's why, when you give AMD a DX12 or Vulkan game, the Fury can catch up to and in some games surpass the current Nvidia top-end cards.

Now, and this is my understanding, the 20nm process fell through, and that's what Fury was designed for, so moving it back to 28nm meant that a lot of its potential was just gone. This is also part of, if not the whole, reason AMD broke contract, paid a fine to the fab, and moved to GlobalFoundries.




7nm is only sort of 7nm; one part of the process may be 7nm, so they call the whole thing that... If I remember right, the current AMD and Nvidia GPU processes are closer to 30-40nm, while Intel's CPU process is closer to ~25nm. Even once they hit 7nm, they still have to refine the process and figure out how to get the rest of it down too.
 

joz

Distinguished
Jun 13, 2008
160
0
18,690
I still fondly remember my Diamond X800 GTO 512MB. What a fantastic card for its day.

Led me through many years of Halo CE.
 

bit_user

Polypheme
Ambassador
I fundamentally disagree. You're confusing CPUs and GPUs.

Now, let's start back in the day. I *think* 90's and early 00's GPUs were less power-hungry simply because shading was simple and they were largely memory-bottlenecked. If you went back in time and tried to build a GPU with all the capabilities of modern ones, then you'd find that even to target resolutions like 640x480, it would be massively power-hungry. The die would also be uneconomically large, which is why it took so long for them to reach the current level of sophistication.

Here's about the closest thing you'll find: http://www.cs.unc.edu/~pxfl/papers/PxFl-hwws97.pdf I have no idea how much power it used, but it sounds like each board had about 4-dozen chips and was functionally equivalent to one small GPU.

Concerning your fanless laptop, there's simply no comparison between its GPU and the 250 W monsters powering high-end gaming desktops. GPUs are a bit more efficient at low power, but GPU throughput scales pretty linearly with power dissipation.

With CPUs, it's a different story. There's only so much you can do to increase the throughput of a CPU core, so the efficiency savings of each smaller process node can't entirely be re-invested in performance, other than straight clock speed. And once you get into the realm of speeds at which high-end desktop CPUs run, it turns out that power increases are pretty nonlinear vs. clock speed. The result is low-power laptop CPU cores that still perform pretty well compared to their much hotter desktop siblings.

If you don't believe me, look at how GPU performance has continued to improve in the past 10 years, and compare it to improvements in CPU performance. Even performance per watt. You'll see that GPUs were able to effectively re-invest the savings from each new process node into more shader cores, which they could effectively use to deliver higher, real-world performance. Desktop CPUs pretty quickly plateaued at 4 cores, and those cores didn't get a whole lot faster (there's probably not even 3x between a Skylake core and an equivalent Core2 core).

You can have that today, but it comes at a price.

http://www.tomshardware.com/reviews/compulab-airtop-fanless-pc,4595.html

Not only is it expensive (a bit excessively so), but it only has a GTX 950, which they will presumably replace with a GTX 1050 if they're still making these.
 

InvalidError

Titan
Moderator

What 250W monsters? All current-gen high-end GPUs are now down to 150W-ish and with Pascal, nvidia is now using the exact same GPUs for mobile and desktop.

BTW, power scales linearly with frequency and with the square of voltage, so if you need to bump voltage by 10% to achieve a 10% overclock, you need roughly 33% more power.
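A minimal worked example of that rule of thumb (assuming the usual dynamic-power approximation P ∝ f·V²):

Code:
# Dynamic power scales roughly with frequency and the square of voltage: P ~ f * V^2
def power_scale(freq_ratio: float, volt_ratio: float) -> float:
    return freq_ratio * volt_ratio ** 2

print(power_scale(1.10, 1.10))   # ~1.33 -> about a third more power for +10% clock at +10% voltage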
 

superflykicks03

Distinguished
Sep 9, 2010
56
0
18,640
I have had many Radeons over the years, but the 9800 Pro holds the largest place in my heart. That card was such a beast at the time. I still have it sitting in a drawer. The one pictured in the article is kinda ghetto tho...never saw one that looked like that :)
 

cinergy

Distinguished
May 22, 2009
251
0
18,780
The Radeon 9800 Pro was the <mod edit> back then. No crapvidia hair-blower product could touch it.

Watch the language. - G
 

boethius70

Reputable
Feb 16, 2014
1
0
4,510
I remember paying a bloody fortune for the Mach64 back in '94 or '95 because I had this weird obsession with wanting to install NeXTStep x86 on my PC and it was one of the very few PC graphics cards that were supported natively by that OS. Had a lot more money than sense back in those days.
 

bit_user

Polypheme
Ambassador
That's just all kinds of wrong. Stock GTX 1080 is 180 W, but the plain x80 cards were never their top end (the stock GTX 980 was rated at 165 W). The new Titan X is a 250 W card. Expect GTX 1080 Ti and AMD's Vega to be, as well. Fury X is 275 W, actually.

Seriously, bro, this info is quite easy to find. No need to post misinformation.

Yes, at a much lower clock speed. If your laptop is plugged in, there's still a limit to how much heat you can dissipate in such a chassis, and then there are the limits of its power supply. For mobile, it makes sense to use lots of cores clocked really low. So, as long as the design scales down well, it's quite logical to reuse the same desktop GPU for mobile. BTW, it's actually not the first time they've done that, but it's maybe the first time they've publicly announced it.

From what I can tell, the actual clock speeds of their mobile GPUs are probably determined by the notebook manufacturer. But, when you take the same GPU and clock it lower, guess what happens to the performance?

So, my point stands - high-end GPUs will always run at the power or thermal limits of the platform.
 