
New 9600GT Benchmarked

January 16, 2008 6:49:58 PM

http://www.eggxpert.com/forums/thread/246907.aspx

Not bad, major improvement on the 8600gt...

How much/when will they be available?


January 16, 2008 7:33:36 PM

Wow nVidia is SO lost with their naming right now.

The 9600GT has less performance than the 8800GT and 8800GTS.

I would assume that the 9 series would be better than the 8 series? Sheesh.

I don't see what market segment this applies to that isn't already accounted for...
January 16, 2008 8:03:01 PM

What a POS.
Give me a card that can do Crysis at 1920x1080, maxed settings, 4x FSAA, and 16x AF at above 40 fps solid (and I don't mean any 3-card, 1000-watt solution either).

Anything less, we already have access to...
January 16, 2008 9:46:53 PM

Cmon.......

That's actually really good. I can only imagine what the 9800 is gonna be like

If the 9600GT can get 15 fps in Crysis at full details at 1280x1024, and the 8800GTS 512MB gets 20 fps, then going by recent history the 9800GTX should crush it...

This is good news
January 16, 2008 10:23:01 PM

^Looks like we might finally be able to play Crysis at Ultra ;) 
January 16, 2008 10:41:08 PM

rallyimprezive said:
Wow nVidia is SO lost with their naming right now.

The 9600GT has less performance than the 8800GT and 8800GTS.

I would assume that the 9 series would be better than the 8 series? Sheesh.

I don't see what market segment this applies to that isn't already accounted for...


Why would you assume the low-to-mid-range card would be better than the previous generation's top performers?

First digit = series, second digit = performance level.
Add in GTX, GTS, GT, GS to denote performance within each model of the series...

Is it really that hard to understand?
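
Just to spell that out, here's a rough little sketch of how the scheme reads (the function and field names are purely my own illustration, nothing official from Nvidia):

```python
import re

def parse_geforce_name(name: str):
    """Break a GeForce model name into the pieces described above:
    first digit = series (generation), second digit = performance level,
    suffix (GTX/GTS/GT/GS) = ranking within that model."""
    m = re.fullmatch(r"(\d)(\d)00\s*(GTX|GTS|GT|GS)", name.strip(), re.IGNORECASE)
    if m is None:
        return None
    return {
        "series": int(m.group(1)),
        "performance_level": int(m.group(2)),
        "suffix": m.group(3).upper(),
    }

# A 9600GT is series 9 but only level 6, so nobody should expect it to beat
# an 8800GT (series 8, level 8) just because the first digit is higher.
print(parse_geforce_name("9600GT"))   # {'series': 9, 'performance_level': 6, 'suffix': 'GT'}
print(parse_geforce_name("8800GTS"))  # {'series': 8, 'performance_level': 8, 'suffix': 'GTS'}
```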
January 16, 2008 11:07:16 PM

Yeah, agreed. An 8600GT would very rarely beat a 7900GT; other times it would get crushed. I wouldn't have expected the 9600GT to beat the 8800GT.

The 7600GT did beat the 6800GT though and traded blows with the 6800U. And the 6600GT beat everything in the FX5xxx line. But those days seem to be gone. The 9600GT may end up doing pretty well against the 8800GTS 320MB though.
January 16, 2008 11:30:02 PM

I think the DX10.0 models will have the 8xxx numbers and the DX10.1 models will have the 9xxx numbers.
January 16, 2008 11:43:28 PM

pauldh said:
Yeah, agreed. An 8600GT would very rarely beat a 7900GT; other times it would get crushed. I wouldn't have expected the 9600GT to beat the 8800GT.

The 7600GT did beat the 6800GT though and traded blows with the 6800U. And the 6600GT beat everything in the FX5xxx line. But those days seem to be gone. The 9600GT may end up doing pretty well against the 8800GTS 320MB though.


To be fair, the GeForce FX series was awful; it didn't take much to beat the whole lineup. The 8600GT and 9600GT aren't exciting compared to how good the 7600GT was; it at least was up to par with the previous generation's high end. That used to be how the mid-range was, but not anymore. The ATI side was similar: the X1650XT is near the X850XT, and the lower-model X800s (GT, SE) and the X700 Pro are near the 9800 Pro and XT.
January 17, 2008 12:03:27 AM

cah027 said:
I think the DX10.0 models will have the 8xxx numbers and the DX10.1 models will have the 9xxx numbers.

Is the 9800GX2 DX10.1 compatible? If so, that would mean either:

A) It uses a new core(s), not g92 as most have speculated, or

B) g92 actually supports DX10.1

Of course there's always C) 9800GX2 doesn't support DX10.1 :) 
January 17, 2008 12:17:32 AM

homerdog said:
Is the 9800GX2 DX10.1 compatible? If so, that would mean either:

A) It uses a new core(s), not g92 as most have speculated, or

B) g92 actually supports DX10.1

Of course there's always C) 9800GX2 doesn't support DX10.1 :) 


You forgot:

C) who cares about DX10.1 anyway?
January 17, 2008 12:42:29 AM

PlasticSashimi said:
What a POS.
Give me a card that can do Crysis at 1920x1080, maxed settings, 4x FSAA, and 16x AF at above 40 fps solid (and I don't mean any 3-card, 1000-watt solution either).

Anything less, we already have access to...


I totally agree.

I'm so sick of these marketing idiots with this "low, mid, high-end" pricing scheme crap, where unless you pay 500 dollars all you get is some brand name attached to a crippled POS that can barely play DVDs without stuttering.

Then you have all the gimmick cards with the fancy coolers that amount to nothing for 50 bucks over what they already aren't worth.

Make a card that is a single chip that doesn't suck. And quit trying to prolong the current obsolete products by stacking them together in SLI/Crossfire like that's cutting edge or something. It's a cop-out for the lack of speedy development and competence.

People think they have some great platform with SLI/Crossfire, yet all they are doing is taking two or now three obsolete pieces of junk at three times the price of what they were already overcharged for.....to play Crysis at 40 fps? No.

Until someone develops a single chip that can handle Crysis, everything on the market is obsolete subpar crap.

9600GT = so what? :lol:

January 17, 2008 1:14:15 AM

Gravemind123 said:
To be fair, the GeForce FX series was awful; it didn't take much to beat the whole lineup. The 8600GT and 9600GT aren't exciting compared to how good the 7600GT was; it at least was up to par with the previous generation's high end. That used to be how the mid-range was, but not anymore. The ATI side was similar: the X1650XT is near the X850XT, and the lower-model X800s (GT, SE) and the X700 Pro are near the 9800 Pro and XT.

True, the FX series was terrible. But the 6600GT beat the very best card available from the generation before, the Radeon 9800XT. So it did what the 7600GT couldn't even do (beat the Radeon X850XT PE the same way). And yeah, the 8600GT and even the GTS didn't come close to beating the 7900GTX, let alone the X1950XTX.

January 17, 2008 1:17:45 AM

Quote:
Until someone develops a single chip that can handle Crysis, everything on the market is obsolete subpar crap.

Did you guys cry like this when Far Cry and Oblivion came out? I am glad Crytek pushed the limits of current hardware. So this one game makes an 8800GTX obsolete?
January 17, 2008 2:22:01 AM

Good point pauldh. Crysis doesn't change my 8800GTX from what it is.... an excellent card that hardly any game can challenge. Why does everybody seem to think the GPUs out right now aren't up to par? And I agree, Crossfire and SLI (especially) are a waste of money, unless you've got it to spend.
January 17, 2008 2:28:22 AM

skittle said:
You forgot:

C) who cares about DX10.1 anyway?

Hey, I already used C. That makes a good point D though. Actually that should be point A :lol: 
January 17, 2008 3:02:35 AM

HoldDaMayo said:
Why would you assume the low-to-mid-range card would be better than the previous generation's top performers?

First digit = series, second digit = performance level.
Add in GTX, GTS, GT, GS to denote performance within each model of the series...

Is it really that hard to understand?



Yes it is. I am exceptionally stupid.
January 17, 2008 3:03:53 AM

pauldh said:


The 7600GT did beat the 6800GT though and traded blows with the 6800U. And the 6600GT beat everything in the FX5xxx line. But those days seem to be gone. The 9600GT may end up doing pretty well against the 8800GTS 320MB though.


That's what I was hoping to see with this one.

I just don't see any reason to release a "new" card with no increase over current offerings.
January 17, 2008 3:20:49 AM

Oblivion was pretty damn nice on the 7900 and X1900 series cards. I'd compare it more with when Doom 3 was released. That was a monster of a game when it came out. The 6800 was the only card even capable of running it decently when it first came out; luckily, a few weeks later the 6600 series came out.

All I know is they sure don't make mid-range like they used to. The 6600 series was awesome, same thing with the 7600 series, but the 8600 series is donkey poo; at least the 9600 series has performance increases, unlike its older brother the 8600 series.
January 17, 2008 3:47:53 AM

Actually it was the 8800GS, not the 8800GT. The GT kicks the GS in the butt, at least the older ones. But if the 9600 works that well, I guess it's kind of like when the 7600GT came out; it actually clobbered the 6800GT, which had four more pipelines and a 256-bit memory interface. I believe we'll see a marked improvement with the 9800GT.

Dahak

M2N32-SLI DELUXE WE
X2 5600+ STOCK (2.8GHZ)
2X1GIG DDR2 800 IN DC MODE
TOUGHPOWER 850WATT PSU
EVGA 8800GT SUPERCLOCKED
SMILIDON RAIDMAX GAMING CASE
ACER 22IN WS LCD 1680X1050
250GIG HD/320GIG HD
G5 GAMING MOUSE
LOGITECH Z-5500 5.1 SURROUND SYSTEM
500 WATTS CONTINUOUS, 1000 PEAK
WIN XP MCE SP2
3DMARK05 15,686
3DMARK06 10,588
January 17, 2008 6:23:45 AM

I think Nvidia has really stuffed up their naming scheme. I think the 8800 GT and GTS should have been the 8900 GT and GTS; that would make things much easier.
January 17, 2008 7:01:25 AM

rallyimprezive said:
Wow nVidia is SO lost with their naming right now.

The 9600GT has less performance than the 8800GT and 8800GTS.

I would assume that the 9 series would be better than the 8 series? Sheesh.

I don't see what market segment this applies to that isn't already accounted for...


Their naming convention makes sense. What they did this time around is come out with the 9xxx series to replace the underperforming low end first. I'm sure they'll come out with a 9800GT and a 9800GS. What amuses me is that with the numbering convention, they're going to come out with their own 9800 to beat ATI, but not by all that much.

Can't wait to see a comparison between the two dual-GPU cards! Not that I'll be able to afford either of them; I just like to read the reviews.

Gravemind123 said:
To be fair, the GeForce FX series was awful; it didn't take much to beat the whole lineup. The 8600GT and 9600GT aren't exciting compared to how good the 7600GT was; it at least was up to par with the previous generation's high end. That used to be how the mid-range was, but not anymore. The ATI side was similar: the X1650XT is near the X850XT, and the lower-model X800s (GT, SE) and the X700 Pro are near the 9800 Pro and XT.


Yes, and just like with AMD vs. Intel today, we saw Nvidia have higher sales back then. That always made me wonder. I will be getting a 9600 for the sole PC we have with a finicky MSI KN9 405 chipset board that doesn't like ATI cards due to "chipset limitations". Right now, it has a 7600GS. That's the only motherboard I ever bought with a caveat that it couldn't use the other company's cards.
January 17, 2008 7:17:19 AM

What Nvidia did with the 8600 series is, I believe, the cause of the numbering scheme confusion.
Everyone expected the 8600GT to be the "new" 7600GT, and it wasn't.
The 8800GT is what the 8600GT should have been, which leads to what we have now? Maybe, just a thought.
January 17, 2008 11:26:52 AM

IndigoMoss said:
Oblivion was pretty damn nice on the 7900 and X1900 series cards. I'd compare it more with when Doom 3 was released. That was a monster of a game when it came out. The 6800 was the only card even capable of running it decently when it first came out; luckily, a few weeks later the 6600 series came out.

All I know is they sure don't make mid-range like they used to. The 6600 series was awesome, same thing with the 7600 series, but the 8600 series is donkey poo; at least the 9600 series has performance increases, unlike its older brother the 8600 series.

On X19xx it was nice (but not maxed); on GF7, not at all. My 7800GT was downright pitiful at Oblivion and quickly got replaced by an X1800XT. First, no aa/HDR at once on any GF7's. Second, they tanked in the outdoor foliage. My point being, nothing out could come close to maxing out Oblivion with aa/af when the game was released. The X1900XTX was the best single Oblivion card, but it couldn't max Oblivion. X19xx owners had to tweak their settings to play. [H]ardocp somehow decided that the X19xx cards needed grass turned off completely (a joke to me) but they kept other settings high that I reduced with an X1950XT.

Have a look at the GF7's in Oblivion:

Anandtech was running high not MAX details and no fsaa of course and a single 7900GTX averaged 29 fps with a low of 19 fps at 12x10.
http://www.anandtech.com/video/showdoc.aspx?i=2746&p=4

Firingsquad's Foliage test at max details, the 7900GTX averages 24 fps at 12x10 and of course no aa.
http://www.firingsquad.com/hardware/oblivion_high-end_p...


Averages of 24 fps at 12x10 and no fsaa kinda explains my point. Nowadays FS still uses Oblivion but only tests maxed out with 4xaa/16xaf. That eliminates the GF7's from those tests, but as you can see, current high end cards are now able to max out Oblivion. But still, even the 640MB 8800GTS and HD2900XT drop below 30 average when they crank the resolution.
http://www.firingsquad.com/hardware/nvidia_geforce_8800...

But yeah, I don't see in a years time anything doing this well in Crysis.
January 17, 2008 11:53:30 AM

pauldh said:
Quote:
Until someone develops a single chip that can handle Crysis, everything on the market is obsolete subpar crap.

Did you guys cry like this when Far Cry and Oblivion came out? I am glad Crytek pushed the limits of current hardware. So this one game makes an 8800GTX obsolete?


Uh, it makes ALL 8-series cards obsolete including the Ultras, even in SLI mode, BUT only at the uber-highest settings and resolutions. 1920x1080 is still quite playable and enjoyable; anything higher... forget it.

Not everyone games at the highest settings or resolutions though. I have seen people play Crysis on a Dell M1330 notebook with the 8400GS at low resolution, and it was cruising along just fine otherwise.
January 17, 2008 4:48:35 PM

pauldh said:
On X19xx it was nice (but not maxed); on GF7, not at all. My 7800GT was downright pitiful at Oblivion and quickly got replaced by an X1800XT. First, no aa/HDR at once on any GF7's. Second, they tanked in the outdoor foliage. My point being, nothing out could come close to maxing out Oblivion with aa/af when the game was released. The X1900XTX was the best single Oblivion card, but it couldn't max Oblivion. X19xx owners had to tweak their settings to play. [H]ardocp somehow decided that the X19xx cards needed grass turned off completely (a joke to me) but they kept other settings high that I reduced with an X1950XT.

Have a look at the GF7's in Oblivion:

Anandtech was running high not MAX details and no fsaa of course and a single 7900GTX averaged 29 fps with a low of 19 fps at 12x10.
http://www.anandtech.com/video/showdoc.aspx?i=2746&p=4

Firingsquad's Foliage test at max details, the 7900GTX averages 24 fps at 12x10 and of course no aa.
http://www.firingsquad.com/hardware/oblivion_high-end_p...


Averages of 24 fps at 12x10 and no fsaa kinda explains my point. Nowadays FS still uses Oblivion but only tests maxed out with 4xaa/16xaf. That eliminates the GF7's from those tests, but as you can see, current high end cards are now able to max out Oblivion. But still, even the 640MB 8800GTS and HD2900XT drop below 30 average when they crank the resolution.
http://www.firingsquad.com/hardware/nvidia_geforce_8800...

But yeah, I don't see in a years time anything doing this well in Crysis.


You're right about the 7 series and Oblivion. My old 7900GT OC'd to 700/850 had a minimum frame rate outside of like 17 fps at full detail with HDR (and yeah, no AA) @ 1400x900; on average it was like 21-24 fps. Shockingly, that's only about 6 fps quicker than my OC'd 7600GT was @ 1152x864. That was running at something like 620MHz too. A wide scene with simulated foliage seemed to really upset the 7 series. Stalker didn't respond to overclocking that well either and had low mins with 3/4 grass and shadows, all else at max. That's when I longed for an X1900 lol.
January 17, 2008 5:47:33 PM

The 7800/7900 was good when it came out, but in more recent games it proves to be rather shader deficient.
January 17, 2008 8:26:31 PM

kpo6969 said:
What Nvidia did with the 8600 series is, I believe, the cause of the numbering scheme confusion.
Everyone expected the 8600GT to be the "new" 7600GT, and it wasn't.
The 8800GT is what the 8600GT should have been, which leads to what we have now? Maybe, just a thought.

I found this to be rather amusing:

http://www.gpureview.com/msi-puts-ddr2-in-8600-gts-call...
January 17, 2008 8:54:55 PM

I agree that the naming schemes seem geared at one thing: confusing consumers. Somebody who doesn't visit hardware sites and read the trade magazines would have no clue what is going on. Even those of us who do... It's a #@$&ing mess!
I for one was completely underwhelmed by the 8600 GT and GTS. I agree that the 9600 sounds like more of a midrange card.
January 17, 2008 9:10:42 PM

For a midrange card... doesn't look bad to me. I'd probably buy one if the price is mid range...
January 17, 2008 9:31:37 PM

.........I love how some of you are complaining about the performance even though the card is mid-range. For a friggin' mid-range card that is AFFORDABLE, that is a pretty good card. It's at least on par with the 3870. The 9800 cards will obviously be much better.
January 18, 2008 12:54:47 AM

HoldDaMayo said:
Why would you assume the low-to-mid-range card would be better than the previous generation's top performers?

First digit = series, second digit = performance level.
Add in GTX, GTS, GT, GS to denote performance within each model of the series...

Is it really that hard to understand?


Doesn't really stand true for the 8800GTS 320 and the 8800GT now though... the fact that there are like 4 different versions of the 8800GTS is crazy:

8800GTS 320MB
8800GTS 640MB
8800GTS 640MB (112 shaders)
8800GTS 512MB
January 18, 2008 6:41:32 AM


What is this trend by reviewers to test in Vista only? Is MS sponsoring them to do so, or what?
No, but seriously, why, when by far the biggest section of the gaming community is still running XP? I still run XP, so this review is meaningless to me, and if the performance in those graphs stacks up to what you guys running Vista are getting, then it's gonna be a cold day in hell before I get it.
Crysis, no AA no AF at 1280x1024: 14 fps on a 3870 and 17.12 on an 8800GT!! Christ, I can get that in XP running an X1650XT.
No wonder people are doing their nut every time a card comes out that doesn't give better performance.
When I first read this thread I was going to ask what this obsession with "maxing out" new games is. Oblivion was mentioned, and yes it was and is a monster of a game, but if you could run every game released at max, the graphics boys would all be out of a job. As has been said before on this forum, some people need to learn how to "move the sliders to the left". But that was before I took a look at the review; initially I thought the review was false based on the FPS, but I saw it was tested in Vista, and WOW, as I said, if that's what you guys running Vista really get..... I'm speechless.
Mactronix
January 18, 2008 1:25:09 PM

mactronix said:
Crysis, no AA no AF at 1280x1024: 14 fps on a 3870 and 17.12 on an 8800GT!! Christ, I can get that in XP running an X1650XT.

It is possible to run Crysis in DX9 mode under Vista, which yields almost identical performance to XP. That's what I do.
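
For anyone who wants to try the same thing: if I remember right, the game takes a renderer switch on the command line, so a tiny launcher like this works (the install path and the -dx9 switch here are from memory, so treat them as assumptions and adjust for your setup):

```python
import subprocess

# Assumed default install path; change this to wherever your copy lives.
CRYSIS_EXE = r"C:\Program Files\Electronic Arts\Crytek\Crysis\Bin32\Crysis.exe"

# Launch the 32-bit executable and force the DX9 renderer under Vista,
# which should give roughly the same performance as running the game on XP.
subprocess.run([CRYSIS_EXE, "-dx9"])
```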