New 9600GT Benchmarked

rallyimprezive

Distinguished
Jul 18, 2007
Wow, nVidia is SO lost with their naming right now.

The 9600GT has less performance than the 8800GT and 8800GTS.

I would assume that the 9 series would be better than the 8 series? Sheesh.

I don't see what market segment this applies to that isn't already accounted for...
 

PlasticSashimi

Distinguished
Jan 11, 2008
What a POS.
Give me a card that can do Crysis at 1920x1080, maxed settings, 4x FSAA, and 16x AF at above 40 fps solid (and I don't mean any 3-card, 1000-watt solution either).

Anything less, we already have access to...
 

dos1986

Distinguished
Mar 21, 2006
C'mon...

That's actually really good. I can only imagine what the 9800 is gonna be like.

If the 9600GT can get 15 fps in Crysis at full details at 1280x1024, and the 8800GTS 512MB gets 20 fps, then going by recent history the 9800GTX should crush it...

This is good news
 

HoldDaMayo

Distinguished
Aug 2, 2007


Why would you assume the low-mid range card would be better than previous generation top performers?

First digit = series; second digit = performance level.
Add in GTX, GTS, GT, GS to denote performance within each model of the series...

Is it really that hard to understand?
 

pauldh

Illustrious
Yeah, agreed. An 8600GT would very rarely beat a 7900GT; other times it would get crushed. I wouldn't have expected the 9600GT to beat the 8800GT.

The 7600GT did beat the 6800GT though and traded blows with the 6800U. And the 6600GT beat everything FX5xxx. But those days seem to be gone. The 9600GT may end up doing pretty well against the 8800GTS 320MB though.
 

Gravemind123

Distinguished
Aug 10, 2006


To be fair, the GeForce FX series was awful; it didn't take much to beat the whole lineup. The 8600GT and 9600GT aren't exciting compared to how good the 7600GT was; it at least was up to par with the last generation's high end. That used to be how the mid-range was, but not anymore. ATI's side was similar: the X1650XT is near the X850XT, and the lower-model X800s (GT, SE) and the X700 Pro are near the 9800 Pro and XT.
 

homerdog

Distinguished
Apr 16, 2007

Is the 9800GX2 DX10.1 compatible? If so that would mean that either:

A) It uses a new core (or cores), not G92 as most have speculated, or

B) G92 actually supports DX10.1

Of course there's always C) 9800GX2 doesn't support DX10.1 :)
 


You forgot:

C) who cares about DX10.1 anyway?
 

righteous

Distinguished
Oct 25, 2006


I totally agree.

I'm so sick of these marketing idiots with this "low, mid, high-end" pricing scheme crap, where unless you pay 500 dollars all you get is some brand name attached to a crippled POS that can barely play DVDs without stuttering.

Then you have all the gimmick cards with the fancy coolers that amount to nothing, for 50 bucks over what they already aren't worth.

Make a card that is a single chip that doesn't suck. And quit trying to prolong the current obsolete products by stacking them together in SLI/Crossfire like that's cutting edge or something. It's a cop-out for the lack of speedy development and competence.

People think they have some great platform with SLI/Crossfire, yet all they are doing is taking two or now three obsolete pieces of junk, at three times the price of what they were already overcharged for... to play Crysis at 40 fps? No.

Until someone develops a single chip that can handle Crysis, everything on the market is obsolete subpar crap.

9600 gt = so what? :lol:

 

pauldh

Illustrious

True, the FX series was terrible. But the 6600GT beat the very best card available from the generation before, the Radeon 9800XT. So it did what the 7600GT couldn't even do (beat the Radeon X850XT PE the same way). And yeah, the 8600GT and even the GTS didn't come close to beating the 7900GTX, let alone the X1950XTX.

 

pauldh

Illustrious
Until someone develops a single chip that can handle crysis, everything on the market is obsolete subpar crap.
Did you guys cry like this when Far Cry and Oblivion came out? I am glad Crytek pushed the limits of current hardware. So this one game makes an 8800GTX obsolete?
 

Annisman

Distinguished
May 5, 2007
Good point, pauldh. Crysis doesn't change my 8800GTX from what it is: an excellent card that rarely any game can challenge. Why does everybody seem to think the GPUs out right now aren't up to par? And I agree, Crossfire and SLI (especially) are a waste of money, unless you've got it to spend.
 

homerdog

Distinguished
Apr 16, 2007

Hey, I already used C. That makes a good point D though. Actually that should be point A :lol:
 

rallyimprezive

Distinguished
Jul 18, 2007



Yes it is. I am exceptionally stupid.
 

rallyimprezive

Distinguished
Jul 18, 2007


That's what I was hoping to see with this one.

I just don't see any reason to release a "new" card with no increase over current offerings.
 

IndigoMoss

Distinguished
Nov 30, 2007
Oblivion was pretty damn nice on the 7900 and X1900 series cards. I'd compare it more with when Doom 3 was released. That was a monster of a game when it came out. The 6800 was the only card even capable of running it decently when it first came out; luckily, a few weeks later the 6600 series arrived.

All I know is they sure don't make mid-range like they used to. The 6600 series was awesome, same with the 7600 series, but the 8600 series is donkey poo; at least the 9600 series has performance increases, unlike its older brother the 8600 series.
 

Dahak

Distinguished
Mar 26, 2006
Actually it was the 8800GS, not the 8800GT. The GT kicks the GS in the butt, at least the older ones. But if the 9600 works that well, I guess it's kind of like when the 7600GT came out: it actually clobbered the 6800GT, which had four more pipelines and a 256-bit memory interface. I believe we'll see a marked improvement with the 9800GT.

Dahak

M2N32-SLI DELUXE WE
X2 5600+ STOCK (2.8GHZ)
2X1GIG DDR2 800 IN DC MODE
TOUGHPOWER 850WATT PSU
EVGA 8800GT SUPERCLOCKED
SMILIDON RAIDMAX GAMING CASE
ACER 22IN WS LCD 1680X1050
250GIG HD/320GIG HD
G5 GAMING MOUSE
LOGITECH Z-5500 5.1 SURROUND SYSTEM
500 WATTS CONTINUOUS, 1000 PEAK
WIN XP MCE SP2
3DMARK05 15,686
3DMARK06 10,588
 

neisonator

Distinguished
Oct 27, 2007
I think Nvidia has really stuffed up their naming scheme. The 8800 GT and GTS should have been the 8900 GT and GTS; that would make things much easier.
 

yipsl

Distinguished
Jul 8, 2006


Their naming convention makes sense. What they did this time around is come out with the 9xxx series to replace the underperforming low end first. I'm sure they'll come out with a 9800GT and a 9800GS. What amuses me is that, with the numbering convention, they're going to come out with their own 9800 to beat ATI, but not by all that much.

Can't wait to see a comparison between the two dual GPU cards! Not that I'll be able to afford either of them, I just like to read the reviews.



Yes, and just like with AMD vs. Intel today, we saw Nvidia have higher sales back then. That always made me wonder. I will be getting a 9600 for the sole PC we have with a finicky MSI KN9 405 chipset board that doesn't like ATI cards due to "chipset limitations". Right now, it has a 7600GS. That's the only motherboard I ever bought with a caveat that it couldn't use the other company's cards.
 

kpo6969

Distinguished
May 6, 2007
What Nvidia did with the 8600 series is, I believe, the cause of the numbering-scheme confusion.
Everyone expected the 8600GT to be the "new" 7600GT, and it wasn't.
The 8800GT is what the 8600GT should have been, which leads to what we have now. Maybe; just a thought.
 

pauldh

Illustrious

On the X19xx it was nice (but not maxed); on the GF7, not at all. My 7800GT was downright pitiful at Oblivion and quickly got replaced by an X1800XT. First, no AA/HDR at once on any GF7s. Second, they tanked in the outdoor foliage. My point being, nothing out could come close to maxing out Oblivion with AA/AF when the game was released. The X1900XTX was the best single Oblivion card, but it couldn't max Oblivion. X19xx owners had to tweak their settings to play. [H]ardocp somehow decided that the X19xx cards needed grass turned off completely (a joke to me), but they kept other settings high that I reduced with an X1950XT.

Have a look at the GF7's in Oblivion:

Anandtech was running high, not max, details and no FSAA of course, and a single 7900GTX averaged 29 fps with a low of 19 fps at 12x10.
http://www.anandtech.com/video/showdoc.aspx?i=2746&p=4

In Firingsquad's foliage test at max details, the 7900GTX averages 24 fps at 12x10, and of course no AA.
http://www.firingsquad.com/hardware/oblivion_high-end_performance/page5.asp


Averages of 24 fps at 12x10 with no FSAA kind of explain my point. Nowadays FS still uses Oblivion but only tests maxed out with 4xAA/16xAF. That eliminates the GF7s from those tests, but as you can see, current high-end cards are now able to max out Oblivion. Still, even the 640MB 8800GTS and HD2900XT drop below a 30 fps average when they crank the resolution.
http://www.firingsquad.com/hardware/nvidia_geforce_8800_gt_performance/page7.asp

But yeah, I don't see anything doing this well in Crysis a year from its release.