G70 as GeForce 7800 GTX, June 2005
GeForce 7800 GTX.
* 256 MB of memory
* 256-bit Memory Interface Width
* 430 MHz Core Clock
* 1.2 GHz Memory Clock
* 10.32 Gigatexels/s Texture Fill Rate
* 38.4 GB/s Memory Bandwidth
G80 as GeForce 8800 GTX, November 2006
GeForce 8800 GTX.
Among gamers and industry analysts alike, the 8800 series is still considered one of the best GPU lines Nvidia ever released, and it remains in wide use today. Although the 8800 GTX is several years old now, it still performs well in today's games. Do you still have an 8800 GTX in use?
* 768 MB of memory
* 384-bit Memory Interface Width
* 575 MHz Core Clock
* 1,350 MHz Shader Clock
* 900 MHz Memory Clock
* 36.8 Gigatexels/s Texture Fill Rate
* 86.4 GB/s Memory Bandwidth
G92 as GeForce 8800 GT, October 2007
GeForce 8800 GT.
Essentially a 65 nm die shrink of the G80 GPU. Despite its lower power draw and smaller die, the G92 was slightly handicapped compared to the G80 by its 256-bit memory interface versus the G80's 384-bit one, so memory bandwidth and overall performance weren't as good (see the quick calculation after the spec list below). The G92 was used on more entry-level and mainstream cards.
* 112 Stream Processors
* 512-1,024 MB of GDDR3 memory
* 256-bit Memory Interface Width
* 600 MHz Graphics Clock
* 1,500 MHz Processor Clock
* 900 MHz Memory Clock
* 33.6 Gigatexels/s Texture Fill Rate
* 57.6 GB/s Memory Bandwidth
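To put the interface-width difference in numbers, here's a minimal back-of-the-envelope sketch in Python. It assumes both cards run GDDR3 at the 900 MHz base memory clock listed above, with the 2x factor accounting for GDDR3's double data rate; only the bus width differs:

```python
# Rough GDDR3 bandwidth estimate:
# base clock (MHz) x 2 (double data rate) x bus width (bits)
# / 8 (bits -> bytes) / 1000 (MB/s -> GB/s)
def bandwidth_gbs(mem_clock_mhz, bus_width_bits):
    return mem_clock_mhz * 2 * bus_width_bits / 8 / 1000

print(bandwidth_gbs(900, 384))  # G80 (8800 GTX): 86.4 GB/s
print(bandwidth_gbs(900, 256))  # G92 (8800 GT): 57.6 GB/s
```

Same memory clock, a third less bus width, a third less bandwidth.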
GT200 as GeForce GTX 280, June 2008
GeForce GTX 200-series.
Currently Nvidia's top offering. GPUs based on the GT200 rule the roost for Nvidia, at least until Fermi (GF100) arrives.
Cards based on the GT200 and its variants make up the GeForce 200 series, ranging from the GeForce 210 all the way up to the GeForce GTX 295, which sports two GPUs on one board.
* 240 CUDA Cores
* 1 GB of memory
* 512-bit Memory Interface Width
* 602 MHz Graphics Clock
* 1,296 MHz Processor Clock
* 1,107 MHz Memory Clock
* 48.2 Gigatexels/s Texture Fill Rate
* 141.7 GB/s Memory Bandwidth
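The GTX 280's figures check out with the same arithmetic. A quick sketch below; note that the 80-texture-unit count is our assumption based on GT200's published architecture, since it isn't in the spec list above:

```python
# GTX 280 sanity check against the specs listed above.
mem_clock_mhz = 1107   # base GDDR3 memory clock
bus_width_bits = 512
core_clock_mhz = 602
tmus = 80              # assumed GT200 texture-unit count (not listed above)

bandwidth_gbs = mem_clock_mhz * 2 * bus_width_bits / 8 / 1000  # ~141.7
fill_rate_gts = core_clock_mhz * tmus / 1000                   # ~48.2
print(f"{bandwidth_gbs:.1f} GB/s, {fill_rate_gts:.1f} Gigatexels/s")
```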
While we're missing a few GPU variants from Nvidia, these were the only photos we were given; as more trickle in, we'll update the article!