G80 GTX too big?

minim3

Distinguished
Aug 8, 2006
Does everyone else get the idea that if they shrank the G80 to, say, a 65nm process it would be much better? It has 681 million transistors at 90nm, ffs!
That doesn't make much sense to me, not to mention the heat dissipation and power consumption.
So how far down the road until nVidia switches to 65nm?
 

djplanet

Distinguished
Aug 27, 2006
Hard to say. Nvidia may develop and release it with their next crop of DX10 cards (8900 series?). ATI will release R600 early next year on an 80nm process.

Both will switch eventually, I'm quite sure, just give it 6 months, maybe a year.
 

Rashind

Distinguished
Sep 17, 2006
Of course it would be better if they shrunk the die. It'd have lower power consumption and less heat. Unfortunately, they don't have the fab for it in place at the moment. I'm sure the other members of the 8000-series family will be produced with a 65nm process, and they may even switch the 8800 over, but for right now it's just not there.
 

michiganteddybear

Distinguished
Oct 4, 2006
Shrinking the die size won't really make a huge impact on the board size, unless the die shrink cuts the power consumption enough to make the voltage regulators a lot smaller.

It will allow a smaller heatsink, though.
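For a rough sense of scale on that power point, here's a back-of-envelope sketch using the standard CMOS dynamic-power approximation (P ≈ C·V²·f). Every number below is made up purely for illustration; none of them are real G80 or 65nm figures.

```python
def dynamic_power(c_eff_farads: float, voltage: float, freq_hz: float) -> float:
    """Classic CMOS dynamic-power approximation: P = C_eff * V^2 * f."""
    return c_eff_farads * voltage ** 2 * freq_hz

# Illustrative numbers only, not real G80 specs: a die shrink typically lowers
# the effective switched capacitance and lets the core run at a lower voltage.
p_90nm = dynamic_power(1.6e-7, 1.30, 575e6)   # ~155 W of switching power
p_65nm = dynamic_power(1.2e-7, 1.15, 575e6)   # ~91 W at the same clock
print(f"{p_90nm:.0f} W -> {p_65nm:.0f} W ({p_65nm / p_90nm:.0%} of original)")
```

If a shrink really did cut power by that kind of margin, the voltage regulators could shed phases and the board could get smaller, which is exactly the caveat above.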
 

sailer

Splendid
So how far down the road until nVidia switches to 65nm?

They'll switch about the time that turning on a computer causes brownouts or blackouts. AMD and Intel have both been struggling with the shift to 65nm. Until some real progress is made by the major CPU companies, I doubt the GPU companies will be in a big hurry to change. Never know for sure, though.

It may sound easy to switch to 65nm wires, but try to imagine how extremely thin that is, and how easy it is to break a wire that thin and end up with a dead processor. The smaller the wires get, the harder it is to make them smaller still. If you think 90nm is big, do a bit of research and look at how thick the wires used to be. Then you may wonder how we got to 90nm at all.
 

mark8987

Distinguished
Jul 6, 2006
that doesn't make much sense to me.

Practically everyone else is complaining that the 8800 cards are too expensive, and yet you want them to cost even more by using a pricier, more difficult process to manufacture them?
 

minim3

Distinguished
Aug 8, 2006
@mark8987: But wouldn't a smaller manufacturing process mean more GPUs from the same wafer?

@Sailer: I never said it was easy, but considering the money they're spending you can't expect an answer like "they'll switch when they switch." It seems to me that 681 million transistors and huge voltage regulators point to 90nm not being sufficient.
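Roughly, yes. Here's a crude sketch of the dies-per-wafer argument; the ~484 mm² die size and 300 mm wafer are assumptions on my part, and it ignores edge loss, scribe lines, and defects entirely.

```python
import math

WAFER_DIAMETER_MM = 300            # assuming standard 300 mm wafers
G80_AREA_90NM_MM2 = 484.0          # assumed die area for G80 at 90nm
AREA_SCALE = (65.0 / 90.0) ** 2    # ideal area scaling, ~0.52

def dies_per_wafer(die_area_mm2: float) -> int:
    """Very crude estimate: wafer area divided by die area (no edge loss)."""
    wafer_area = math.pi * (WAFER_DIAMETER_MM / 2) ** 2
    return int(wafer_area // die_area_mm2)

print(dies_per_wafer(G80_AREA_90NM_MM2))               # ~146 candidate dies at 90nm
print(dies_per_wafer(G80_AREA_90NM_MM2 * AREA_SCALE))  # ~279 at a perfect 65nm shrink
```

So in the ideal case a shrink roughly doubles the candidate dies per wafer, which is the whole attraction.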
 

sailer

Splendid
@Sailer: I never said it was easy, but considering the money they're spending you can't expect an answer like "they'll switch when they switch." It seems to me that 681 million transistors and huge voltage regulators point to 90nm not being sufficient.

What should I answer with? They are the companies doing the research, spending the money, and making the cost/benefit analysis, so they get to decide when it's worth making the switch to 65nm. They aren't going to ask my opinion or yours.

As far as the number of transistors and voltage regulators goes, that has little to do with wire size. Wire size will determine the amount of power consumed and the heat subsequently produced, but not the number of transistors and power regulators.
 

Mex

Distinguished
Feb 17, 2005
I believe nVidia planned to fab the G80 at 80nm originally, but they decided to hold off on the switch and build the G80 on the existing 90nm process instead - that way, they could get their DX10 card out before ATi did.

The other reason I'm thinking of is that the companies traditionally don't like to gamble their flagship GPUs on an untested fabrication process. When nVidia released the 7800 series, it was done on the 110nm process (to my knowledge). They experiment with the new process on their lower-end GPUs, perfect it (to a reasonable degree), then switch their high-end GPUs over to it to minimize risk - just imagine if a company released its flagship and the new process turned out to be riddled with issues?
 
Yes, but so what? Cost of the wafers is a relatively small cost overall.

It's not just the cost of the wafers, which is far from a "small cost", but the yield per wafer. The greater the yield, the better the margins and the lower the per-chip costs, which are very high for these chips. Also remember that it's the chips that nV sells, not the boards.

Of course, if they skipped to the 65nm process and in doing so increased the failure rate, they might negate the potential savings, just like increasing the clock rates usually negates any power and heat savings they could have gotten at the same speeds as the previous fab.
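To put the yield point in rough numbers, here's a sketch using the simple Poisson yield model (yield = e^(-defect density × die area)). The die areas, defect densities, and wafer cost below are invented for illustration; nothing here comes from nVidia or TSMC.

```python
import math

WAFER_AREA_MM2 = math.pi * (300 / 2) ** 2   # 300 mm wafer, edge loss ignored

def cost_per_good_die(die_area_mm2: float, defects_per_mm2: float,
                      wafer_cost: float) -> float:
    """Poisson yield model: yield = exp(-defect_density * die_area)."""
    gross_dies = WAFER_AREA_MM2 // die_area_mm2
    yield_fraction = math.exp(-defects_per_mm2 * die_area_mm2)
    return wafer_cost / (gross_dies * yield_fraction)

# Invented figures: a mature 90nm line with low defect density can still beat
# a 65nm line that yields poorly, even though the 65nm wafer holds more dies.
print(round(cost_per_good_die(484, 0.002, 5000)))  # 90nm: ~146 gross, ~38% yield, ~$90/good die
print(round(cost_per_good_die(252, 0.008, 5000)))  # 65nm: ~280 gross, ~13% yield, ~$134/good die
```

Flip the defect densities around and 65nm wins easily, which is presumably exactly the cost/benefit call TSMC and nVidia are making.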
 

darkstar782

Distinguished
Dec 24, 2005
Don't forget ATi and nVidia do not have their own fabs (well, AMD, who own ATi, do, but their best process is 90nm and they are already working at 100% capacity).

As such, nVidia and ATi are dependent on TSMC and similar foundries having fabs ready at that gate size. 65nm just isn't there in volume yet.
 
Don't forget ATi and nVidia do not have their own fabs (well, AMD, who own ATi, do, but their best process is 90nm and they are already working at 100% capacity).

Actually, AMD has had 65nm production capacity for a while, and they are now selling:
http://www.theinquirer.net/default.aspx?article=35677

It's not about to be making VPUs yet, though. And yeah, they aren't dropping their 90nm volumes as quickly as was expected (shortages), so it's unlikely they'll be doing much of their own fab work other than mid-to-low range sometime next year.

As such, nVidia and ATi are dependent on TSMC and similar foundries having fabs ready at that gate size. 65nm just isn't there in volume yet.

True, but TSMC have really been on 65nm for about a year, and have been making memory on it for longer than that. Volume just depends on what you're measuring, but the R600 refresh will be produced in volume next year, and the G80 refresh (or its refresh's refresh) soon after.
 

darkstar782

Distinguished
Dec 24, 2005
Don't forget ATi and nVidia do not have their own fabs (well, AMD, who own ATi, do, but their best process is 90nm and they are already working at 100% capacity).

Actually, AMD has had 65nm production capacity for a while, and they are now selling:
http://www.theinquirer.net/default.aspx?article=35677

It's not about to be making VPUs yet, though. And yeah, they aren't dropping their 90nm volumes as quickly as was expected (shortages), so it's unlikely they'll be doing much of their own fab work other than mid-to-low range sometime next year.

Interesting, I knew AMD had engineering samples etc., but I didn't know their 65nm process was ready for mass production yet, thanks :)

As such, nVidia and ATi are dependent on TSMC and similar foundries having fabs ready at that gate size. 65nm just isn't there in volume yet.

True, but TSMC have really been on 65nm for about a year, and have been making memory on it for longer than that. Volume just depends on what you're measuring, but the R600 refresh will be produced in volume next year, and the G80 refresh (or its refresh's refresh) soon after.

Well, either TSMC are charging nVidia less for 90nm than they would have for 65nm, or they didn't offer nVidia that as an option because either the process wasn't mature enough for a 680-million-transistor GPU or they didn't have the 65nm capacity nVidia wanted.

If the process were mature enough, they had the capacity, and TSMC were willing to run 65nm for a similar or lower cost to nVidia, I find it hard to believe that we wouldn't have a 65nm G80 :)

The main advantages of a smaller process are faster clocks and lower costs due to more functional dies per wafer. We can only assume that the cost to nVidia was higher for 65nm, or that the performance was lower (unlikely).

Unless TSMC's 65nm process is subject to a higher failure rate than their 90nm process, maybe? Either that, or TSMC want to milk cash from their newer 65nm process and build much higher profit margins for themselves into it.