Rumor: Nvidia G300 to be big, hugely powerful [TR]

dattimr

Distinguished
Apr 5, 2008
665
0
18,980
http://www.techreport.com/discussions.x/16959

Last we heard, both AMD and Nvidia were prepping next-gen graphics processors for the fourth quarter of this year. The guys at Hardware-Infos now say Nvidia has taped out its upcoming G300 graphics chip, and they've posted some specifications for the product.

If the Google translation hasn't warped the post's meaning beyond recognition, Hardware-Infos says the G300 will have 512 stream processors, almost 2.5 teraFLOPS of number-crunching power, 1-2GB of memory, and 281.6GB/s of bandwidth. The memory will supposedly run at 1,100MHz, so assuming GDDR5, that would mean a 512-bit interface width.
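
For what it's worth, the bandwidth figure is internally consistent: GDDR5 moves four bits per pin per clock, so a quick back-of-envelope check in Python (all numbers taken from the rumor above) lines up with the quoted 281.6GB/s:

```python
# Sanity check of the rumored G300 memory bandwidth figure.
# GDDR5 transfers 4 bits per pin per clock (quad data rate).

memory_clock_hz = 1100e6   # 1,100MHz, per the Hardware-Infos rumor
transfers_per_clock = 4    # GDDR5 quad data rate
bus_width_bits = 512       # interface width implied by the rumor

bandwidth_bytes = memory_clock_hz * transfers_per_clock * bus_width_bits / 8
print(f"{bandwidth_bytes / 1e9:.1f} GB/s")  # -> 281.6 GB/s, matching the report
```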

As for the teraFLOPS figure, the site goes on to say the G300's stream processors are no longer SIMD elements but "MIMD-like" (multiple-instruction, multiple-data) units. That could imply more complex, more flexible hardware, although the report doesn't talk about gaming performance.
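
To make the SIMD/MIMD distinction concrete, here's a toy sketch of why divergent branches hurt SIMD hardware but not MIMD-like hardware. This is purely illustrative and says nothing about Nvidia's actual design:

```python
# Toy model of branch divergence under SIMD vs. MIMD-like execution.
# Purely illustrative; not a model of any real GPU.

def simd_passes(lane_takes_branch):
    """SIMD lanes share one instruction pointer: if lanes disagree on a
    branch, both paths run back to back with inactive lanes masked off."""
    return len(set(lane_takes_branch))  # 1 pass if all agree, 2 if they diverge

def mimd_passes(lane_takes_branch):
    """MIMD-like units each follow their own instruction stream, so
    divergence costs nothing extra."""
    return 1

lanes = [True, False, True, False]  # half the lanes branch one way, half the other
print(simd_passes(lanes))  # 2 -- both paths executed, lanes masked in turn
print(mimd_passes(lanes))  # 1 -- each unit runs only its own path
```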

With that said, a graphics processor with that much floating-point power and memory bandwidth probably isn't going to be small. And indeed, a previous post from the same source talks of a whopping 2.4 billion transistors, about a billion more than the GT200's 1.4 billion.

Original source: http://translate.google.com/translate?prev=hp&hl=en&js=n&u=http%3A%2F%2Fwww.hardware-infos.com%2Fnews.php%3Fnews%3D2954&sl=de&tl=en&history_state0=
 

JeanLuc

Distinguished
Oct 21, 2002
979
0
18,990
Q4 2009............ I hope not. I was hoping new cards would start to show their heads around the end of Q2 or the start of Q3 2009.
 

smithereen

Distinguished
Oct 4, 2008
1,088
0
19,310
Anyone remember that huge rant somebody posted about how nVidia was going to get raped this round? I'm curious now, since he seems to have gotten at least one point right: these are going to be motherf*cking huge. Huge! And really expensive. It reminds me of Phenom IIs vs. C2Qs and i7s, with the i7s playing the part of AMD's dual-card solutions. Competitive, but one side rakes in the dough and the other doesn't. Not a perfect analogy, but...
 

turboflame

Distinguished
Aug 6, 2006
1,046
0
19,290
Well, that was a die shrink from 90nm to 65nm and a transistor increase from 681 to 754 million transistors (G80 to G92). It wasn't much of a jump in performance either.

A billion extra transistors is quite a doozy; even if this thing is released at 40nm, it'll definitely use more power than the GT200. It will probably be larger as well.
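
For scale, here's a rough die-size estimate. The reference numbers are my assumptions, not from the article: GT200 is roughly 576mm² at 65nm with about 1.4 billion transistors, and I'm granting ideal area scaling from 65nm to 40nm, which real chips rarely achieve:

```python
# Back-of-envelope die-size estimate for a 2.4B-transistor GPU at 40nm.
# Assumed reference point (not from the article): GT200 at ~576 mm^2
# on 65nm with ~1.4B transistors; ideal (40/65)^2 area scaling.

gt200_area_mm2 = 576.0
gt200_transistors = 1.4e9
g300_transistors = 2.4e9

density_65nm = gt200_transistors / gt200_area_mm2  # ~2.4M transistors/mm^2
ideal_density_gain = (65 / 40) ** 2                # ~2.64x density at 40nm
density_40nm = density_65nm * ideal_density_gain

print(f"{g300_transistors / density_40nm:.0f} mm^2")  # ~374 mm^2, best case
```

Even that best case is a very large die, and real designs rarely hit ideal scaling, so the power and cost concerns look justified.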

This design is absolutely absurd, IMO. All ATI has to do is repeat what it did this generation: build a decent midrange GPU and slap two of them together to compete at the high end. It doesn't look like Nvidia could put two GT300s on the same board like it did with the GTX 295, at least not at 40nm.