Nvidia's 40nm GT212 to have 384 SPs, 96 TMUs, and 256-bit 7Gbps GDDR5

AuDioFreaK39

Distinguished
Jun 7, 2007
139
0
18,680



half-step between GT200b and GT300




According to a recent report from Hardware-Infos, the specifications of Nvidia's upcoming flagship 40nm GT212 have been revealed, sourced as usual from unnamed people close to Nvidia. To start matters off, GT212 is the successor to the 55nm GT200b, which is currently rolling into production. The new chip essentially retraces the two steps the 65nm G92 took: a die shrink to a smaller fabrication process and a reduction in memory interface width.


[Image: GT212 specification table]
 

AuDioFreaK39

Distinguished
Jun 7, 2007
139
0
18,680


The narrower bus reduces the complexity of the PCB, which means lower manufacturing costs for Nvidia and *hopefully* lower retail prices for consumers.

Besides, the bandwidth difference between 512-bit GDDR3 and 256-bit GDDR5 isn't that big, to be honest.
 

rawsteel

Distinguished
Oct 5, 2006
538
0
18,990
Wow, nVidia seems to be recovering pretty fast! I'm surprised, even. A Q2 target is great. I hope that drives ATI to release the RV870 in the same time frame so we have another battle again :D

I can't even imagine the performance we should get this summer if the rumors about Radeons with 4x GPUs are true. However, I hope they resolve some driver problems with multi-GPU cards and make their performance more consistent.

We live in great times!
 
If your information is moving twice as fast through half as many connections, it's essentially a tie for speed: 1000 MHz through a 512-bit bus is equal to 2000 MHz through a 256-bit bus. Then, not having the extra connections on your PCB reduces costs. Also, GDDR5 eliminates the trace-length requirements GDDR3 has, since GDDR5 equalizes the power and resistance on the wire leads where GDDR3 can't, and those wires otherwise have to be precisely laid out. Having twice as many connections also requires extra layers on the PCB, so it's a much cheaper solution, only offset somewhat by the higher cost of GDDR5, which by the time we see these cards should be closer to GDDR3 in pricing, and also more than twice as fast.
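A quick back-of-the-envelope check of that equivalence (a minimal sketch in Python; the clocks and widths are just the illustrative figures from the post above, not actual card specs):

```python
# Peak memory bandwidth in GB/s: effective clock (MHz) x bus width (bytes).
def bandwidth_gb_s(effective_mhz: float, bus_width_bits: int) -> float:
    return effective_mhz * 1e6 * (bus_width_bits / 8) / 1e9

print(bandwidth_gb_s(1000, 512))  # 64.0 GB/s on the wide, slow bus
print(bandwidth_gb_s(2000, 256))  # 64.0 GB/s on the narrow, fast bus -- a tie
```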
 
Also remember there's nowhere to go with GDDR3; it's at its speed limit. So they had to move to GDDR5, and with that, 512-bit is overkill, so no need to bother with it if you can do it cheaper and well enough with 256-bit.

Looking at the design, I'm thinking 24 clusters of 8 x 3 SPUs with 3 TMUs tied to them as well. ROP count will be interesting, but as alluded to, they don't have to be 16; they could be 32, it just limits module support.
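For what it's worth, here's a quick check of which cluster layouts actually multiply out to the rumored totals (a sketch only; the candidate cluster counts are my own guesses, not anything from the report):

```python
# Rumored GT212 totals from the article.
TOTAL_SPS, TOTAL_TMUS = 384, 96

# See which cluster counts divide both totals evenly.
for clusters in (12, 16, 24, 32):
    if TOTAL_SPS % clusters == 0 and TOTAL_TMUS % clusters == 0:
        print(f"{clusters} clusters -> {TOTAL_SPS // clusters} SPs, "
              f"{TOTAL_TMUS // clusters} TMUs per cluster")
# 12 clusters -> 32 SPs, 8 TMUs per cluster
# 16 clusters -> 24 SPs, 6 TMUs per cluster
# 24 clusters -> 16 SPs, 4 TMUs per cluster
# 32 clusters -> 12 SPs, 3 TMUs per cluster
```

Note that 24 clusters would work out to 16 SPs and 4 TMUs each; clusters of 8 x 3 SPUs would instead imply 16 clusters with 6 TMUs apiece.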
 

hannibal

Distinguished
OK. It seems that price vs. performance will be closer between ATI and Nvidia quite soon. Back to the "good ole times" when the crown was transferred between these two periodically :)
It's just that ATI is still only making midrange GPUs, but really good ones at the moment ;-)
It will be interesting to see if ATI can improve the scaling of their multi-GPU solution vs. Nvidia's big GPU... The previous round went to ATI, but the next one will also be interesting! The bigger question is, what about Nvidia's midrange solution?
 

doomsdaydave11

Distinguished
Oct 16, 2007
935
0
18,980
That's good. The step from 512-bit GDDR3 to 256-bit GDDR5 may be somewhat of a step to the side, but it's MUCH easier to produce, which means that NVIDIA will be able to compete much better.

As far as I know, NVIDIA's GT200 chips are super, super expensive to produce, and NVIDIA is not making nearly as much profit as they would like. Remember that even the GTX 260 was $450 when it came out 7 months ago, but AMD's fantastic HD48xx series took them by surprise, pretty much forcing them to lower their prices.

GDDR5 is awesome.
 

idisarmu

Distinguished
Mar 23, 2008
511
0
18,980
But WHY not 384-bit GDDR5?????

This seems suspiciously similar to the G80-to-G92 situation... wouldn't it have been better if Nvidia had skipped G92 and built a G80b @ 65nm?

I think this will be just like "back when" Nvidia moved from 384-bit GDDR3 to 256-bit GDDR3 with slightly higher clocks...

Has GDDR5 gotten THAT MUCH FASTER over this short period of time? 1250 MHz already, just a year after the 900 MHz GDDR5 HD4870 release?

By the way: that <300 mm^2 and <1800 million transistors sounds like bollocks.

How can they increase the number of shaders by ~60% while increasing the transistor count by only ~25%? The transistors saved by dropping the 512-bit interface can't amount to THAT much, can they?

Oh well, I just want to see a 40nm 8800 GTX with 192-bit or 224-bit GDDR5... I wonder what the power consumption would be on that.

 
Why not? Because 7 Gbps GDDR5 on a 256-bit bus is actually faster than the current memory (GDDR3 on a 512-bit bus). It may seem like a step backwards, but the difference between this and the G80->G92 transition is that there the memory type stayed the same (GDDR3 in both cases), handicapping G92's bandwidth, while here the bandwidth will actually increase despite the narrower bus.
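Plugging rough numbers into the usual formula (bandwidth = per-pin data rate x bus width / 8) makes the point concrete. A minimal sketch; the ~2.2 Gbps GDDR3 figure is roughly the GTX 280's, and GT212's final clocks were never confirmed:

```python
# Peak memory bandwidth in GB/s from per-pin data rate and bus width.
def bandwidth_gb_s(gbps_per_pin: float, bus_width_bits: int) -> float:
    return gbps_per_pin * bus_width_bits / 8

# GT200-class GDDR3: ~2.2 Gbps per pin on a 512-bit bus.
print(bandwidth_gb_s(2.2, 512))  # ~141 GB/s
# Rumored GT212: 7 Gbps GDDR5 on a 256-bit bus.
print(bandwidth_gb_s(7.0, 256))  # 224 GB/s -- a real increase, not a step back
```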
 

rangers

Distinguished
Nov 19, 2007
1,563
0
19,790
I was having a discussion with a guy; I said that Nvidia would go GDDR5, he said never. It feels good to be right. I'd like to say to him: how do you like them bananas?
 
Not doing GDDR5 would just be ignoring the future. Just like DX10.1: even though by itself it's not a deal breaker, without DX10.1 you can't have DX11, so either way you still have to incorporate its abilities for future use. I wouldn't listen to anyone that told me "never" on GDDR5, 'cause it'd make me wonder what else they believe heheh
 

rangers

Distinguished
Nov 19, 2007
1,563
0
19,790
I know DX10.1's not a deal breaker; it's that plus a long list of bad judgment by Nvidia. The biggest one for me is overcharging, closely followed by the trading-image-quality-for-FPS thing. I would have loved to go with the 8800 GT when it came out, but as I say, bad decisions by Nvidia (in the past) stopped me. I've got to say that if more people were like me, companies would have to think twice when deciding their retail prices, and we would have cheaper cards all round.
 
Rangers, I hope you're not referring to that Fudzilla thread you posted, 'cause that said GDDR5 for the G200b, not for the GT212, and it also said GDDR5 on a 512-bit bus.

I don't know who you're talking about, but depending on the context I can understand the "never": so many people were repeating those very same rumours before, and they won't happen.

A 256-bit GDDR5 refresh has pretty much been on the table since about the second week of the HD4K launch, but a mega monolithic G200 refresh was always seen as a "never" situation (especially when discussing what to do about an already expensive chip).

Anywho, a la Bond: "Never Say Never Again"
 

ilikegirls

Distinguished
Jan 26, 2009
702
0
19,010

Not only is it overkill, it would also be really expensive, so not as many people would buy it. So they make it cheaper and better all in one! That's the dream for any business!
 

daedalus685

Distinguished
Nov 11, 2008
1,558
1
19,810
Dead thread was dead. I have heard nothing on this since... I wouldn't count on seeing a GT212.

In the future, don't res an old thread on a specific topic unless you plan to add to it... Here I was thinking some new info on this card ever being made was at hand :D. I miss the talk about card naming! We need a GTX 289.9+ and an ATI 4995x2XT