Is today the day we get official r600 benchmarks?

Twisted_Sister

Distinguished
Jan 20, 2007
573
0
18,980
Someone posted that ATI is sending out samples today of the new cards... will we see real benchmarks soon?

Any idea for price announcements?

Also, is it possible to buy the OEM version and save some $$ and noise? (yes I know it's 12 inches).

Thanks!
 
Someone posted that ATI is sending out samples today of the new cards... will we see real benchmarks soon?

Well, likely ATi shipped their cards out today to Darren (or maybe Cleeve, if we're lucky), and then the review would happen after the NDA expires.

Any idea for price announcements?

Also, is it possible to buy the OEM version and save some $$ and noise? (yes I know it's 12 inches).

Things like that you will definitely have to wait for, especially if the GF8900 rumours are even close to correct.
I doubt you'll get your hands on a cheap OEM version early on without it coming in a case. Remember AMD's availability issues due to Dell; that would likely be the situation early on. But of course, the only way we'll know is once they start shipping/selling them.
 

Twisted_Sister

Distinguished
Jan 20, 2007
573
0
18,980
Things like that you will definitely have to wait for, especially if the GF8900 rumours are even close to correct.
I doubt you'll get your hands on a cheap OEM version early on without it coming in a case. Remember AMD's availability issues due to Dell; that would likely be the situation early on. But of course, the only way we'll know is once they start shipping/selling them.

Thanks... but what are the GF8900 rumors (couldn't find any threads on this)? Please enlighten!
 

cleeve

Illustrious
The buzz at the Inquirer is that the existing 8800s are all pre-crippled, and that they are only using 75% of the shaders on the GPU die.

The 8900 will use all of them... 25% more shaders.

Frankly, that seems pretty far-fetched to me. But who knows?
 

Whizzard9992

Distinguished
Jan 18, 2006
1,076
0
19,280
Is it possible that nVidia is taking the same 'redundancy' approach to G80 that IBM took with the Cell?

In other words, is it possible that they disable faulty shaders to increase the yield (8800), and are stockpiling chips that have few/none faulty shaders (8900)?

(Edit: It might make more sense now that the shaders are unified.)
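
To put some toy numbers on that yield idea (the cluster layout and defect rate below are completely made up, not Nvidia's actual figures), here's a quick simulation sketch:

```python
import random

# Toy binning model: a hypothetical 128-shader die built as 8 clusters
# of 16 shaders, where each cluster independently has some chance of
# containing a fabrication defect. All numbers are invented.
CLUSTERS = 8
DEFECT_RATE = 0.05   # assumed chance that any one cluster is faulty
TRIALS = 100_000

perfect = partial = dead = 0
for _ in range(TRIALS):
    faulty = sum(random.random() < DEFECT_RATE for _ in range(CLUSTERS))
    if faulty == 0:
        perfect += 1   # every shader works: stockpile for a full part (8900?)
    elif faulty <= 2:
        partial += 1   # disable the bad clusters: sell as a cut-down part (8800?)
    else:
        dead += 1      # too many defects: scrap the die

print(f"sellable with no redundancy:   {perfect / TRIALS:.1%}")
print(f"sellable with cluster salvage: {(perfect + partial) / TRIALS:.1%}")
```

With those made-up numbers, salvage turns roughly two thirds of the dies being sellable into nearly all of them, which is exactly why binning is so attractive.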
 

raven_87

Distinguished
Dec 29, 2005
1,756
0
19,780
Seems like a possibility, but I think information that critical would have been leaked by now, especially given the number of pissed-off customers you'd have from using crippled chips.
 

enewmen

Distinguished
Mar 6, 2005
2,249
5
19,815
Things like that you will definitely have to wait for, especially if the GF8900 rumours are even close to correct.
I doubt you'll get your hands on a cheap OEM version early on without it coming in a case. Remember AMD's availability issues due to Dell; that would likely be the situation early on. But of course, the only way we'll know is once they start shipping/selling them.

Thanks... but what are the GF8900 rumors (couldn't find any threads on this)? Please enlighten!

I also want to know SOMETHING about the 8900. What do the 8900 rumours have to do with "have to wait"?
Thanks.
 

TucsonPi

Distinguished
Jan 21, 2007
31
0
18,530
Seems like a possibility, but I think information that critical would have been leaked by now, especially given the number of pissed-off customers you'd have from using crippled chips.

In what way are they crippled if the customers got exactly what was advertised? It's not like they don't dominate the video-card landscape currently. It's not like they're using fewer shaders than are on the box.
 

cleeve

Illustrious
Is it possible that nVidia is taking the same 'redundancy' approach to G80 that IBM took with the Cell?

Anything's possible. I just find it hard to believe they'd donate 25% of their shader die space to unusable silicon. That doesn't seem like a profitable move.

But who knows other than Nvidia? I don't have their numbers. Maybe it makes perfect sense. It seems a bit strange to me though.
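
Maybe the wafer math works out. A back-of-envelope sketch (every figure below is invented, purely to illustrate the trade-off, not Nvidia's real economics):

```python
# Compare a die with exactly 96 shaders against a ~25% bigger
# 128-shader die whose defective clusters can be disabled so the die
# still sells as a 96-shader part. All figures are assumptions.
WAFER_COST = 5000.0   # assumed cost per processed wafer, in dollars
DIES_SMALL = 100      # assumed 96-shader dies per wafer
DIES_BIG = 80         # ~25% more area per die => fewer dies per wafer
YIELD_SMALL = 0.60    # assumed: with no spare shaders, one defect kills the die
YIELD_BIG = 0.90      # assumed: most defective dies are salvaged by disabling

cost_small = WAFER_COST / (DIES_SMALL * YIELD_SMALL)   # ~$83 per good chip
cost_big = WAFER_COST / (DIES_BIG * YIELD_BIG)         # ~$69 per good chip

print(f"96-shader die, no spares: ${cost_small:.2f} per sellable chip")
print(f"128-shader die, salvaged: ${cost_big:.2f} per sellable chip")
```

If the salvage yield is high enough, the "donated" silicon pays for itself; if it isn't, it's exactly the money pit it looks like.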
 

warezme

Distinguished
Dec 18, 2006
2,450
56
19,890
Is it possible that nVidia is taking the same 'redundancy' approach to G80 that IBM took with the Cell?

In other words, is it possible that they disable faulty shaders to increase the yield (8800), and are stockpiling chips that have few/none faulty shaders (8900)?

Hmm, if true, maybe they could be mod-enabled, like it's been done with so many other cards that shipped with redundant shaders disabled...
 

raven_87

Distinguished
Dec 29, 2005
1,756
0
19,780
Seems like a possibility, but I think information that critical would have been leaked by now, especially given the number of pissed-off customers you'd have from using crippled chips.

In what way are they crippled if the customers got exactly what was advertised? It's not like they don't dominate the video-card landscape currently. It's not like they're using fewer shaders than are on the box.

I'm saying pissed in the "potential" sense, as in what Cleeve said: shader space given over to unused silicon. It's the fact that it's on-die and not working that would peeve some folks off. I wasn't hinting about what was advertised; you're completely right on that point.

@ Warez

Doubt that: both companies have learned their lessons. I believe the last successful modding came out of the X1800 GTO --> XL, or small third-party cards. Other than that, I'm sure they're laser-cut, so unless you have some microscopic tools and a machine shop on hand, I don't see it happening.
 

mpjesse

Splendid
The buzz at the Inquirer is that the existing 8800s are all pre-crippled, and that they are only using 75% of the shaders on the GPU die.

The 8900 will use all of them... 25% more shaders.

Frankly, that seems pretty far-fetched to me. But who knows?

A program I downloaded a while back (the name escapes me) said that my card isn't using all of the available shaders. But it didn't tell me how many weren't being used. So there could be some truth to this.

(I own an 8800 GTX)
 

tamalero

Distinguished
Oct 25, 2006
1,133
138
19,470
I'm saying pissed in the "potential" sense, as in what Cleeve said: shader space given over to unused silicon. It's the fact that it's on-die and not working that would peeve some folks off. I wasn't hinting about what was advertised; you're completely right on that point.

So... you're saying you'd recommend Nvidia trash all those chips with a small percentage of defective shaders?
Nah, I'd rather go the AMD or Intel way: disable a part and sell it anyway; that way you still get MONEY :p
 

Heyyou27

Splendid
Jan 4, 2006
5,164
0
25,780
A program I downloaded a while back (the name escapes me) said that my card isn't using all of the available shaders. But it didn't tell me how many weren't being used. So there could be some truth to this.

(I own an 8800 GTX)

It's hard to imagine Nvidia disabling 25% of the shaders.
 

Yrusoad

Distinguished
Feb 6, 2007
77
0
18,630
We're not going to see any benchmarks till it's already out.
And I'm guessing $550-$600 for the premium version and $250-$300 for the "Pro" version.
 
I hear tell the 8900 is just an 80nm refresh... perhaps I'm... wrong?

Well, according to the other buzz out there, the GF8900 will be 90nm again, and nV is going to skip the 80nm half-node and jump straight to 65nm for the G90. That is the current buzz about that portion.

As for what I was referencing, Cleeve got it with the 25% hidden, which may or may not be the case (remember how many people talked about hidden GF7800 pipes to be unleashed [to 32] in reply to the X1800? It turned out to be nope, just more speed and memory). Like I said, we'll have to wait and see about that.

Also, a lot of the theoretical comparisons of architecture came from the 64-shader R600 numbers, but if the recent specs are correct and the R600 is also a 128-shader part, then that could change things. And is it still dual-issue? Is it still Vec4?
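
Those details matter enormously. Some rough peak-throughput arithmetic (ops per clock only, ignoring clock speeds, co-issue quirks, and real utilisation; the R600 figures are pure rumour):

```python
# Peak scalar operations per clock = units x vector width x issues per clock.
# A crude yardstick only: it ignores shader clocks (G80's ALUs run on a
# separate, faster clock) and how well real code fills Vec4 slots.
def peak_ops(units, width, issues):
    return units * width * issues

print("G80, 128 scalar ALUs:         ", peak_ops(128, 1, 1))  # 128
print("R600 if 64 Vec4, dual-issue:  ", peak_ops(64, 4, 2))   # 512
print("R600 if 128 scalar, single:   ", peak_ops(128, 1, 1))  # 128
print("R600 if 128 Vec4, dual-issue: ", peak_ops(128, 4, 2))  # 1024
```

That's an 8x spread between the extreme readings of the same rumour, which is why the dual-issue and Vec4 questions matter so much.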

There are many unanswered questions that at this point just start spinning out of control the closer we get to launch (like the hybrid G80 and the unified 24+8/8 R520).

I do find it funny, though, that no one outside of the IHVs knows JACK about the true inner workings of their hardware. Even Unwinder with his great RivaTuner tool didn't detect hidden components in the G80, so if they were there, how would anyone even know, even with an X-ray?

ATi could come out with a die map that showed 12 shaders, if that's all they needed to enable to get the job done, and no one would know what the chip really has, despite logic and such, because the numbers never translate exactly 1 to 1. Look at the GF7800 to GF7900: they reduced the transistor count without changing the level of functionality. Something had to give; what? No one seems to know, and nV's not telling. Even a year later it's not a known change, although I suspect it's legacy partial-precision stuff they knew they didn't need once the GF7800 was a success.

So speculation will run rampant, and we can tell a lot about what will be exposed, but likely there are secrets all over these things. Like the AA modes added in the last generations, not everything is exposed at launch.
 
Anything's possible. I just find it hard to believe they'd donate 25% of their shader die space to unusable silicon. That doesn't seem like a profitable move.

But who knows other than Nvidia? I don't have their numbers. Maybe it makes perfect sense. It seems a bit strange to me though.

I find it 'unlikely' too, but not 'impossible', and it would make sense in one scenario to me.

nV makes the first G80, finds it has more than enough power, and realizes they could easily beat the competition with 75% (GTX) and 50% (GTS) of the shaders enabled. This has a twofold benefit, especially for a long-term strategy: you can increase your yields (which helps for a hard launch especially), and you don't have to design a completely new replacement part, so you can push back the next new design (either skip 80nm and go to 65nm like they're doing, or simply slow the pace of launches). Both of these actually help increase profits, likely significantly. And if you think about it, if they've been building up a stock of 100% chips since just before the GF8800 launch, then you can slowly ramp up enough for a hard GF8900 launch as well.

If it's true, it's truly a brilliant move, and a demonstration of something we rarely see in this marketplace: restraint. Although it would be similar to the move to delay the NV47/48 and turn it into the GF7800, except this would be something you could produce and benefit from as well, instead of simply delaying the launch. I'd be surprised if it's true, but if it is, then nV has a winner in their G80 as a product family, as long as there are no feature issues when DX10 comes out, which is once again another wait-and-see scenario.

Anywhoo, hope that made sense as a possible explanation of something that might turn out to be total BS. :wink:
 

enewmen

Distinguished
Mar 6, 2005
2,249
5
19,815
If this theory is right, then the other 25% can be unlocked right after the R600 comes out and nVidia stays on top. This way Nvidia can also adapt quickly.
 
If this theory is right, then the other 25% can be unlocked right after the R600 comes out and nVidia stays on top. This way Nvidia can also adapt quickly.

That's assuming that ATi isn't hiding something, and/or that nV can make up the difference with a 25% increase in shaders (performance might still be largely memory- or CPU-bound).
Either way, if true, it will make for a very interesting launch and reply, which really should benefit us consumers. 8)
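
A quick Amdahl-style sanity check on that memory/CPU-bound point (the shader-limited fractions below are pure assumptions):

```python
# If only part of the frame time is shader-limited, speeding up the
# shaders helps only that part. 25% more shaders (at best 1.25x shader
# throughput) buys much less than 25% more FPS overall.
def overall_speedup(shader_fraction, shader_boost=1.25):
    return 1.0 / ((1.0 - shader_fraction) + shader_fraction / shader_boost)

for f in (1.0, 0.75, 0.50, 0.25):
    print(f"{f:.0%} shader-limited -> {overall_speedup(f):.2f}x overall")
```

So a fully shader-limited game would see the full 1.25x, but a game that's only 25% shader-limited would get barely 1.05x, which is why unlocking shaders alone might not be enough of a reply.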