And this is how Nvidia falls off the high horse ... GTsuX 280

SirCrono

Distinguished
Sep 9, 2006
463
3
18,785
The last time the Inq published real news was... what comes before never?

Don't trust such sensationalist sources; wait until real reviews come from reliable sources.
 

themyrmidon

Distinguished
Feb 27, 2008
187
0
18,680
I never put nV on a pedestal for G92 or GT200. I can admit G92 wasn't as good as claimed, but these are still early predictions from the Inq, probably the least reputable, most AMD-fanboy reporters out there. And you're just another AMD/ATI fan clinging to it.
 

spuddyt

Distinguished
Jul 21, 2007
2,114
0
19,780
I almost hope it is true... DAAMIT needs some money... (and I'm probably going to be a budget participant in this generation anyway).
 

babybudha

Distinguished
Jul 17, 2006
257
0
18,780
Honestly, I'm an ATI supporter (almost to a fault), but I pursue truth first. This article seems a little biased to me. They only talk about the negatives. I find it hard to believe that the card will be such a failure. As history shows, Nvidia cards tend to look weak on paper but perform well in real life. And vice versa with ATI lately.

Though I hope this is somewhat true, my gut tends to tell me otherwise.
 
Come on, Charlie Demerjian is obviously throwing a tantrum since he wasn't invited to the party. He's kicking, screaming and crying while spewing forth garbage. Wait for the FACTS about the new GPUs, not this twaddle.
 

justjc

Distinguished
Jun 5, 2006
235
0
18,680
Well, it is just a rumor from The Inq, so it's still too early to say how much of the article will prove to be fact. That said, I agree that ATi seems to have the better price/performance in the next generation.
 

ovaltineplease

Distinguished
May 9, 2008
1,198
0
19,280
If Nvidia's GT200 series GPUs have onboard PhysX, then this will more than likely dramatically improve FPS in current CPU-heavy titles like UT3, WiC and Crysis, as the load of processing physics calculations will be taken off the CPU.

I think in titles with less physics the difference will be significantly smaller, and if you have a very high-end CPU (a quad overclocked to around 4.0 GHz and the like) the difference will also be less.
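
(A rough way to see both claims: each frame is gated by whichever processor finishes last, so moving physics to the GPU pays off most when the CPU is the limiter. Here's a minimal sketch of that idea; every millisecond figure is a hypothetical illustration, not a benchmark.)

```python
# Toy frame-time model: each frame is gated by whichever processor
# finishes last. All millisecond costs are hypothetical illustrations.

def fps(cpu_ms, gpu_ms):
    """Frames per second when CPU and GPU work overlap each frame."""
    return 1000.0 / max(cpu_ms, gpu_ms)

PHYS_CPU = 8.0   # assumed physics cost per frame on the CPU (ms)
PHYS_GPU = 2.0   # the same work on the GPU, assumed much cheaper there
RENDER = 12.0    # assumed GPU rendering cost per frame (ms)

for label, game_cpu_ms in [("slow CPU", 14.0), ("fast CPU", 8.0)]:
    on_cpu = fps(game_cpu_ms + PHYS_CPU, RENDER)  # physics stays on the CPU
    on_gpu = fps(game_cpu_ms, RENDER + PHYS_GPU)  # physics offloaded to GPU
    print(f"{label}: {on_cpu:.0f} fps -> {on_gpu:.0f} fps with GPU physics")
```

(With these made-up numbers the slow-CPU rig jumps from ~45 to ~71 fps, while the fast-CPU rig only goes from ~62 to ~71: the offload helps most when the CPU is the bottleneck.)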


On an extremely high-end system the GT200 will still be better than any G92 by a notable amount, I'm sure, but on a lower-end system I am willing to bet a dollar that the G260 mid-range card will be dramatically better than current mid-range cards. Actually, if the G260 is reasonably affordable, then it will make a really excellent SLI setup!


On the other hand, the 4870 is more than likely going to be an excellent single-card solution for its cost, and I bet a CrossFire 4870 setup will be very impressive.
 

royalcrown

Distinguished
Yeah, but devs would have to code for PhysX or whatever, so just having it means jack s***. I have a feeling that ATI is gonna get stomped again no matter the die size. Nvidia has done it with the 6, 7, and 8 series... so the real 9 series... it wouldn't floor me if it kicks ass.
 

DarthPiggie

Distinguished
Apr 13, 2008
647
0
18,980
Nah, we both know that the X1950XTX kicked the crap out of the 7900GTX and the 7950GX2. And the X19xx series had better IQ. Nvidia lost hands down in the 7900s.
 

ovaltineplease

Distinguished
May 9, 2008
1,198
0
19,280



Admittedly true, but maybe next-gen titles will better support this; or, you never know, maybe Nvidia has worked out deals with Crytek/Massgate and the like to add the support with a patch to promote their new cards. It's totally possible, but let's wait until the cards are on the table, right? =)
 
The 7 series was, and still is, kicked to the curb by the 19xx series. Someone just needs to look. I'm not even going to bother, as I won't waste my time. The G280 is going to be a killer card, but bring your pocketbook and your watercooling.
 


And don't forget the X800 outperformed the GF6800; Nvidia relied on SM3.0 to make up the difference, and then on the price/performance of the 6800GT (not the weak performance of the Ultra) to sell cards. The HD series has only been successful when learning from that GF6800GT strategy.

As for the GTX280, aka the G200 (not GT200 like some people thought), it's too early to tell anything right now, but it's definitely going to be a huge die (thus expensive to make), plus a complex PCB and the added expense of the NVIO again; it's never going to be cheap.

I have no doubt it will outperform the HD48xx series, but they're definitely approaching the market from the two opposite ends of the equation. Relying on crippling this huge chip to make the GTX260 means you're still spending about the same amount to make that lower-end card to compete against a significantly cheaper product. And they can't rely on the current G92 to be the cheap competition if the yields are low; they'll need to hope the yields of the G92B shrink are better, and that the rumoured quick move to 55nm for the G200/GTX280 pans out to improve yields.
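
(To put rough numbers on why a huge die hurts: you get fewer candidate dies per wafer, and yield drops roughly exponentially with die area. A back-of-the-envelope sketch using a simple Poisson yield model; the die areas, wafer cost and defect density are hypothetical placeholders, not leaked specs.)

```python
import math

def good_dies_per_wafer(die_mm2, wafer_mm=300, defects_per_cm2=0.3):
    """Crude estimate: gross dies from raw wafer area (edge losses
    ignored), yield fraction from a simple Poisson defect model."""
    wafer_area = math.pi * (wafer_mm / 2) ** 2
    gross = wafer_area / die_mm2
    yield_frac = math.exp(-defects_per_cm2 * die_mm2 / 100.0)  # mm^2 -> cm^2
    return gross * yield_frac

WAFER_COST = 5000.0  # hypothetical dollars per processed 300 mm wafer

for name, area in [("huge high-end die", 575.0), ("mid-size die", 260.0)]:
    good = good_dies_per_wafer(area)
    print(f"{name} ({area:.0f} mm^2): "
          f"{good:.0f} good dies, ~${WAFER_COST / good:.0f} each")
```

(With these made-up inputs the big die costs several times more per good chip, which is why selling a crippled version of it as the mid-range part is so painful.)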
 
The 92b is Nvidia's true hope here. We all know that. They'll be lucky to make a 20th of the money from the G2xx series as from the 92b's. What'll be interesting is whether Nvidia can get the yields up, raise their clocks, and throw in a few tweaks to even compete with the 4xxx series.
 

hannibal

Distinguished


Hmmm... Well, it's in line with what we know so far. The G200 series is a big monster. Probably quite fast, and very, very expensive to produce. So no new information here. Are we getting shaky? Such a big GPU needs time to mature and needs a smaller production node to make it more affordable. But all in all, what this article says is that the 4800 is the more mainstream card and the G280 the more extreme one...

 

marvelous211

Distinguished
Aug 15, 2006
1,153
0
19,280
My power supply couldn't even handle a GTX 280 or whatever. Nvidia's next chip sounds more like a power-guzzling nuclear reactor than a video card.

I just hope AMD stays competitive. At the least AMD will have a price segment where they dominate, unless Nvidia releases a slower alternative or drops GTX 260 prices.

Why didn't Nvidia plan cards with 384-bit and 320-bit buses along with the 512-bit one? Nvidia knew AMD was going to stick to a 16-ROP config. It would have been much easier on the pockets and on power consumption to compete with the 4870 and still hold the performance crown.
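
(The bus-width trade-off is simple arithmetic: peak bandwidth is bus width in bytes times the effective memory data rate. A quick sketch; the data rates below are ballpark figures for this generation, not confirmed specs.)

```python
def bandwidth_gb_s(bus_bits, data_rate_gt_s):
    """Peak memory bandwidth: bytes per transfer times transfers/second."""
    return bus_bits / 8 * data_rate_gt_s

# Effective data rates are illustrative ballpark figures, not specs.
print(bandwidth_gb_s(512, 2.2))  # 512-bit GDDR3 @ 2.2 GT/s -> 140.8 GB/s
print(bandwidth_gb_s(384, 2.2))  # 384-bit GDDR3 @ 2.2 GT/s -> 105.6 GB/s
print(bandwidth_gb_s(256, 3.6))  # 256-bit GDDR5 @ 3.6 GT/s -> 115.2 GB/s
```

(Which also shows why a narrower bus with faster GDDR5 can land close to a wide GDDR3 bus at a fraction of the PCB cost.)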
 

marvelous211

Distinguished
Aug 15, 2006
1,153
0
19,280


Nvidia actually won because they sold more video cards, even though they were inferior. Thank god I skipped the entire 7 series. :p
 

marvelous211

Distinguished
Aug 15, 2006
1,153
0
19,280
If Nvidia released the G92 with GDDR5, it would easily compete with the 4850 and 4870 or whatever. AMD would be in a tight spot again and out of luck in all price segments.
 

Amiga500

Distinguished
Jul 3, 2007
631
0
18,980


Why do you assume that?

Is there any evidence that the G92 is currently bandwidth-starved?


You're only as fast as your slowest area. If the G92(a) is a well-balanced design (as it should be, since it has evolved over 18 months or so), then speeding up one particular part of the card (in this case the memory) will not result in a significant advance, as the bottleneck will just move to other parts.
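
(That argument is easy to make concrete: model the card as a set of stages and take the minimum throughput. Speeding up a stage that isn't the limiter changes nothing. Stage names and capacities below are hypothetical relative units, not measured figures.)

```python
# Toy bottleneck model: overall throughput is capped by the slowest stage.
# Capacities are hypothetical relative units, not measured figures.

def throughput(stages):
    return min(stages.values())

g92 = {"shader": 100, "texture": 105, "rop": 110, "memory_bw": 100}
print(throughput(g92))                 # 100: a balanced design

g92_gddr5 = dict(g92, memory_bw=160)   # only the memory gets faster
print(throughput(g92_gddr5))           # still 100: shaders now limit it
```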