The AMD Squeeze: Nvidia Intros GeForce 8800 for $300

February 12, 2007 4:23:20 PM

With nothing coming out of the newly merged AMD and ATI, Nvidia pushes out another product to take the sweet spot in the discrete graphics market.
February 12, 2007 4:57:22 PM

The sweetest spot, aye, for us price vs. performance whizzes XD
February 12, 2007 5:06:03 PM

Quote:
The card could not handle the hardest resolutions and is weak at resolutions above 1600x1200 with the image quality settings turned up. That being said, we would not expect those with massive CRTs or 30" displays to be purchasing this card.


I don't understand the second sentence.

I have a display that does 1920x1080, or 2,073,600 pixels,
while 1600x1200 is 1,920,000.

So would this card not be recommended for my display? Would it be better to spend an extra $100 instead?


Or are we only talking about bragging rights, and the cheaper card will be good enough for the next few years?
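
(For anyone sanity-checking the pixel counts above, here is a rough Python scratchpad; the 2560x1600 entry is just the usual 30" panel resolution, added for comparison.)

Code:
# Pixel counts for the resolutions discussed in this thread.
resolutions = {
    "1600x1200": (1600, 1200),
    "1920x1080": (1920, 1080),
    "2560x1600 (30in panel)": (2560, 1600),
}
for name, (w, h) in resolutions.items():
    print(name, "->", format(w * h, ","), "pixels")
# 1600x1200 -> 1,920,000 pixels
# 1920x1080 -> 2,073,600 pixels (only about 8% more than 1600x1200)
# 2560x1600 -> 4,096,000 pixels (more than double 1600x1200; this is the range where the article says the 320MB card struggles)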
February 12, 2007 5:58:37 PM

For the price, this is the best card, hands down. He's just saying this 8800GTS suffers from having less memory when the eye candy gets turned on. The 30" displays referred to are probably REALLY hi-def, around 2560x1600. I think this card would be great for your monitor, unless you want to pay $400 for a regular GTS.

Gentlemen, I believe the X1950XT killer is here :twisted: .
February 12, 2007 6:21:41 PM

After seeing these results, am I the only one who's dying to see what a 384-512MB 8800 GTX (at around $450) could do?
February 12, 2007 6:38:53 PM

Pretty certain there won't be any 512MB cards this generation; they have gone with funny RAM sizes, and they can't just switch back.
February 12, 2007 6:52:48 PM

Hey, great article, Tom's! I was interested to note that your benchmarks disagree with two others on Oblivion. Two other sites whose reviews I read both indicated that Oblivion was one game that didn't mind the reduced memory.

By the way, does anybody have an idea what the best drivers for this card would be?
February 12, 2007 7:25:05 PM

And seeing as how Nvidia's support under Vista blows at the moment... pushing out more DX10 parts is really helping them.
February 12, 2007 7:27:36 PM

Prolfe - the best drivers are Nvidia-based drivers....

*nods*


When it comes to BFG - stick with their drivers.
February 13, 2007 2:01:20 AM

Quote:
With nothing coming out of the newly merged AMD and ATI, Nvidia pushes out another product to take the sweet spot in the discrete graphics market.


I would like to know the Oblivion test settings, because saying 'we maximize the settings' doesn't reveal much. With or without HDR?
Most other sites I've seen haven't used HDR, and of course that would have an impact.

The Tech Report shows a few more weaknesses of the memory in Oblivion (without HDR) and GRAW, where the positions change, allowing the X1950XTX to pull ahead:

http://techreport.com/reviews/2007q1/geforce-8800gts-32...

In HL2, at the highest resolution (2560x1600), the 256MB X1900XT pulls ahead:
http://techreport.com/reviews/2007q1/geforce-8800gts-32...

and even in FEAR where the X1900XT-256 pulls ahead;

http://techreport.com/reviews/2007q1/geforce-8800gts-32...

Interesting is the back-and-forth in FEAR, where at the highest resolution the X1900XT-256 has a higher average, but the GTS has a better 'low average', meaning maybe less dramatic drops.
February 13, 2007 2:21:38 AM

Quote:
Forgive my ignorance, but why does the memory affect how it plays at higher res, since the X1950 Ultimate only has 256MB? Can someone explain what it is about this GTS that causes it to fail but the X1950 not to?

I was going to say that too. It doesn't make sense that the GTS has more memory and yet they said it didn't have enough to complete the tests, while the X1950 Pro only has 256MB and it can complete the test.

EDIT: Also did anyone notice that the x1950 pro scored exactly the same in both FEAR tests? Did they really test this card or pull results out of the air?
February 13, 2007 3:43:16 AM

Quote:

EDIT: Also did anyone notice that the x1950 pro scored exactly the same in both FEAR tests? Did they really test this card or pull results out of the air?


That's why it's the "Ultimate" edition. It gets those FPS results no matter what. They're like Hurley's freaky Numbers... they'll haunt the owner of that card till the day he.... buys a new card... erm...
February 13, 2007 4:42:43 AM

Quote:

The 8600 Ultra is supposed to be a 512MB card and the 8600GT a 256MB card


Yeah, but the GF8600 series has a normal/commonplace bit width of 128-bit or 256-bit, so 256MB and 512MB isn't a big deal; the GF8800 series has a 320-bit or 384-bit bus, so a 512MB model isn't possible without either messing with the memory chip allotment or changing the way the memory interacts with the core, both of which defeat the purpose of going 320/384-bit in the first place.
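
To put numbers on the 'funny RAM sizes' point, here is a rough sketch, assuming the usual one 32-bit memory chip per channel; the 32MB and 64MB chip densities are illustrative of common GDDR3 parts of the era, not confirmed board specs.

Code:
# Why a 320-bit or 384-bit bus leads to 320/640MB or 384/768MB cards.
# Assumption: one 32-bit-wide memory chip per channel; chip densities
# of 32MB and 64MB are illustrative, not confirmed specs.
def possible_sizes_mb(bus_width_bits, densities_mb=(32, 64)):
    chips = bus_width_bits // 32      # number of memory channels/chips
    return [chips * d for d in densities_mb]

for bus in (128, 256, 320, 384):
    print(f"{bus}-bit bus -> {possible_sizes_mb(bus)} MB")
# 128-bit -> [128, 256] MB   (GF8600-class sizes fall out naturally)
# 256-bit -> [256, 512] MB   (512MB is natural only on a 256-bit bus)
# 320-bit -> [320, 640] MB   (hence the two GTS models)
# 384-bit -> [384, 768] MB   (no clean way to hit 512MB on the GTX)

So getting 512MB onto a 320-bit or 384-bit card would mean mixing chip densities or narrowing the bus, which is the trade-off described above.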
February 13, 2007 10:56:10 AM

What I'd like to know is whether the power requirements for the 320MB version of the GTS are more reasonable than those of its two larger brothers. I don't think it's right that every GPU purchase these days should require a simultaneous PSU upgrade.
February 13, 2007 11:22:03 AM

If you don't play at insane resolutions, the 320MB version is the best card for the price out there; still, Nvidia has to make good drivers for that one and all the others :lol: . It makes no sense that the X1900XT with 256MB performs better than the 8800 320MB if we think the 8800's performance is cut by memory size at high resolutions! Is 256MB better than 320MB, with a wider bus? Haha.
February 13, 2007 4:14:15 PM

Well, the power requirement could be less, but not by much (think 5-10W at best under load). It also depends a lot on whether you get the overclocked versions or not. The Tech Report's numbers show pretty much parity with the 640MB model, with the OC'ed 320 XXX consuming significantly more power (21W);

http://techreport.com/reviews/2007q1/geforce-8800gts-32...

The GTX consumes a little more due to more memory, more functional core parts, and more wire traces to the memory, but the GTS is not going to save much just by reducing the memory chips to smaller sizes; heck, you'd get more savings going to GDDR4.

A quality PSU from the past year should be just as fine with this as it would've been with a single GF7800GTX or X1950XT. So considering the performance boost, that's pretty good, and why shouldn't you have to get a new PSU if you need it? No one forces anyone to buy a high-end card; power consumption is a concern for the mid-range, not for these cards, which even when crippled use about as much energy as their full-power brothers.
February 13, 2007 11:53:15 PM

I guess you're right. I seem to have fished an 8800 GTS from EVGA with 640MB. Struggling with drivers; hope the core dumps help. Nice thing I have this setup on Ultra SCSI 3, 15,000 RPM, 4ms avg seek. Wrote the dump before the blue screen. ;)  8O
February 14, 2007 12:12:29 AM

It's been awhile since we've seen a competitive product from either part of AMD.
February 14, 2007 5:33:47 PM

Quote:
and why shouldn't you have to get a new PSU if you need it?


I can think of a couple of reasons: -

1) Because I spent an inordinate amount of time less than 2 years ago specifying a PC with a view to doing SLI at some point. So I ended up with an SLI certified motherboard, an SLI certified 600W PSU and an SLI certified GF6800GT. Even then I balked at the need for an expensive 600Watt power supply and I don't feel it was unreasonable to expect some upgradeability with it.

20 months on and with the 6800 dead I am flabbergasted to discover that to do SLI with any current DX10 card I'll need 750+ Watts. In fact the majority of PSUs on the SLI certified list are 1000Watt, there's even an 1100Watt PSU listed!

2) Because there’s an environmental catastrophe bearing down on all of us, fast! But nVidia and ATI, intent as they are on their little speed war, seem blind to this impending disaster. They really need to rein in the power requirements; constantly ramping them up with every generation of card they produce is highly irresponsible. They have as much of a responsibility as every other industry to clean up their act and should not be creating the need for this type of power requirement.

Then there's the landfill somewhere, gradually filling up with the previous generation of PSUs that, whilst serviceable, can no longer drive the latest monster graphics cards.


Well I for one won’t be beefing up my PSU and when I do need a new one I'd like to think I'll be going in the other direction.
February 14, 2007 5:51:10 PM

Quote:
and why shouldn't you have to get a new PSU if you need it?


I can think of a couple of reasons: -

1) Because I spent an inordinate amount of time less than 2 years ago specifying a PC with a view to doing SLI at some point. So I ended up with an SLI certified motherboard, an SLI certified 600W PSU and an SLI certified GF6800GT. Even then I balked at the need for an expensive 600Watt power supply and I don't feel it was unreasonable to expect some upgradeability with it.

20 months on and with the 6800 dead I am flabbergasted to discover that to do SLI with any current DX10 card I'll need 750+ Watts. In fact the majority of PSUs on the SLI certified list are 1000Watt, there's even an 1100Watt PSU listed!

2) Because there’s an environmental catastrophe bearing down on all of us, fast! But nVidia and ATI, intent as they are on their little speed war, seem blind to this impending disaster. They really need to rein in the power requirements; constantly ramping them up with every generation of card they produce is highly irresponsible. They have as much of a responsibility as every other industry to clean up their act and should not be creating the need for this type of power requirement.

Then there's the landfill somewhere, gradually filling up with the previous generation of PSUs that, whilst serviceable, can no longer drive the latest monster graphics cards.


Well I for one won’t be beefing up my PSU and when I do need a new one I'd like to think I'll be going in the other direction.

And who, precisely, is holding the gun to your head and forcing you to go SLI? You don't have to buy the latest and greatest. ATI and Nvidia are trying to produce cards with the speed that the consumer demands. The consumer always wants faster. Unfortunately, they're limited by this little thing called physics. More speed = more power.

So perhaps the responsibility for all that power use, and for the psu landfill should lie with the consumer. After all, if we didn't buy them, they wouldn't keep making them.

In any case, you should think about taking SOME of the responsibility for your actions, you can't always just blame it on the big companies.

Besides, you could do a single GTS just fine and just about quadruple the performance of your 6800GT.
February 14, 2007 9:47:34 PM

Quote:

1) Because I spent an inordinate amount of time less than 2 years ago specifying a PC with a view to doing SLI at some point. So I ended up with an SLI certified motherboard, an SLI certified 600W PSU and an SLI certified GF6800GT. Even then I balked at the need for an expensive 600Watt power supply and I don't feel it was unreasonable to expect some upgradeability with it.


That's not a reason, that's a hope or assumption. And as for SLi: if you have an SLi PSU then you already have enough juice for one card, and if you're SLi'ing a GTS-320 instead of buying a single GTX then you have other issues. If you wanna play with the Big Boys you gotta pay the price of upgrading ALL THE TIME. Like was mentioned, no one is forcing you to upgrade, and no game outpaces any current solution that fits within that power envelope (heck, nothing is unplayable on an X1950P, let alone any old high-end SLi).

Quote:
20 months on and with the 6800 dead I am flabbergasted to discover that to do SLI with any current DX10 card I'll need 750+ Watts. In fact the majority of PSUs on the SLI certified list are 1000Watt, there's even an 1100Watt PSU listed!


Flabbergasted!?! HOW could you be?
We've been talking about self-contained nuclear generators in rigs since the FX5900/R9800Pro-256DDRII era. Always expect that part of the demand for Mo' Powah will be the need for.... you guessed it, more power.

Quote:
2) Because there’s an environmental catastrophe bearing down on all of us, fast!


Puh-lease, that's just Ridiculous!!

You cannot put the desire for SLi and then concern for the environment in the same FAQin post!
Get a damn integrated solution if you care more about the environment, not SLi !! :roll:

Quote:
But nVidia and ATI, intent as they are on their little speed war, seem blind to this impending disaster. They really need to rein in the power requirements; constantly ramping them up with every generation of card they produce is highly irresponsible. They have as much of a responsibility as every other industry to clean up their act and should not be creating the need for this type of power requirement.


What you're saying makes no sense; there is a market for that, but it's not the enthusiast or gamer market. And if you wanna be anal about it, the watts per operation go down from high end to high end, but you can't appreciate it because you wonder why teraflop PCs consume more energy than a Vic20 or Apple ][, not appreciating that they consume relatively less for their functionality.

Quote:
Then there's the landfill somewhere, gradually filling up with the previous generation of PSUs that, whilst serviceable, can no longer drive the latest monster graphics cards.


Yeah I know, every time you replace a PSU an Angel Loses their wings. :roll:
PSUs are far from the biggest issue. :?

Quote:
Well I for one won’t be beefing up my PSU and when I do need a new one I'd like to think I'll be going in the other direction.


Then get a laptop or integrated, just stop asking why your M1A1 tank isn't as efficient as a Prius !! :evil: 
February 17, 2007 5:56:44 AM

Does anybody know if/when AMD/ATI are planning on adding CUDA-like capabilities to their cards, apart from that CPU-GPU merging of theirs :) 
February 17, 2007 6:01:46 AM

One other thing, if anyone knows anything about it: can the computation abilities of CUDA-capable cards be partitioned somehow, like half of a card for physics and half for graphics or similar? Or should one just go for an extra card when the time comes? I know this is an anti high-end dual/quad SLI point of view, but don't flame me for that. Thanx.
February 17, 2007 2:19:05 PM

Can anyone tell me: if I'm using a 17" or 19" monitor (at 1024x768 or less than 1600x1200), will the 8800 GTS with 320MB be good enough to play games? Or will the 8800 GTS 640MB / 8800 GTX get better results?
February 17, 2007 3:11:47 PM

I didn't mean any specific games, just the average performance. BTW, for Command & Conquer 3: Tiberium Wars, will the 8800 GTS 320MB play well with 4X antialiasing enabled? (At 1280x1024 or 1024x768.)

(sorry for my bad english :p )
February 17, 2007 3:17:31 PM

Hmm... so how about comparing the 8800 GTS 320MB to the 7950GTX in FEAR, Quake 4, 2142, and others? Does the 8800 GTS 320MB still get the better result?

I had read many comparisons at 1600x1280 and above, but I can't find any information for resolutions below that.. :( 
February 18, 2007 7:56:56 AM

FS's review shows 1280x1024 (with 4XAA when it can handle it);

http://www.firingsquad.com/hardware/evga_e-geforce_8800...
http://www.firingsquad.com/hardware/nvidia_geforce_8800...

You may have difficulty finding the GF7950GTX as it's not a normal retail part (it's mainly the GeForce Go 7950GTX), but the 7950GT and the 7900GTX are in those reviews, so check the specs of the one someone is selling as a GF7950GTX and figure out where you stand.
February 19, 2007 3:02:35 AM

Hmm... I see... I will take a look at those two articles.. thanks :) 
March 15, 2007 7:37:19 PM

My question is this:

If you were to SLI two 320s together, would that eliminate any problems that come from not using a 640MB card?
March 27, 2007 12:22:21 PM

Well, I have to admit I am a little confused by this report, since one part of it is either untrue or doesn't apply due to my hardware.

I went back to play Oblivion (originally played with an AMD 3500 Athlon and GF6800GS) once I had my new card - An EVGA GF8800GTS. I ramped up everything to max, full AA, full grass, shadows, 1600x1200, you name it; and it ran like a freaking DREAM at about 110 fps.
No crashes, no blips, no heat!

What's the story??? (And NO, I am not LUCKY lol)

AMD 4800+ Dual Core
2 Gb DDRII 800Mhz
2 x 200 GB Raptors
EVGA GF8800GTS
M2N32 Deluxe "maim"board
Sound Blaster Audigy SE
March 27, 2007 3:53:16 PM

Quote:

If you were to SLI two 320's together, would that eliminate any problems that are coming from not using a 640m card?


No, SLi does not add memory; it simply copies the information into both of the smaller buffers, so the areas where 640MB would be an advantage are still there. The only question becomes whether the now more powerful core combination is more of an influence than the memory buffer.
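
A tiny sketch of that point, assuming the typical SLi behaviour where each card keeps its own full copy of textures and frame data rather than pooling memory:

Code:
# SLi mirrors the working set onto both cards, so the usable buffer is
# that of a single card; the second card adds core power, not VRAM.
def effective_vram_mb(per_card_mb, num_cards, mirrored=True):
    # mirrored=True models typical SLi behaviour: data is duplicated,
    # not pooled across the cards.
    return per_card_mb if mirrored else per_card_mb * num_cards

print(effective_vram_mb(320, 2))   # two GTS-320s -> still a 320MB buffer
print(effective_vram_mb(640, 1))   # one GTS-640  -> 640MB of headroom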
March 27, 2007 4:00:12 PM

Thank you.

Since the time I posted, I purchased an 8800 GTS w/640MB. I'll get another one in about 4 months.
March 27, 2007 4:01:08 PM

Quote:

I went back to play Oblivion (originally played with an AMD 3500 Athlon and GF6800GS) once I had my new card - An EVGA GF8800GTS. I ramped up everything to max, full AA, full grass, shadows, 1600x1200, you name it; and it ran like a freaking DREAM at about 110 fps.


Not possible. Maybe you hit areas where you had 110fps, but there is no way that was anywhere near your average when you were outside, especially in the heavy foliage areas.
March 28, 2007 4:47:17 PM

Quote:
Well, I have to admit I am a little confused by this report, since one part of it is either untrue or doesn't apply due to my hardware.

I went back to play Oblivion (originally played with an AMD 3500 Athlon and GF6800GS) once I had my new card - An EVGA GF8800GTS. I ramped up everything to max, full AA, full grass, shadows, 1600x1200, you name it; and it ran like a freaking DREAM at about 110 fps in the sewers.
No crashes, no blips, no heat!

What's the story??? (And NO, I am not LUCKY lol)

AMD 4800+ Dual Core
2 Gb DDRII 800Mhz
2 x 200 GB Raptors
EVGA GF8800GTS
M2N32 Deluxe "maim"board
Sound Blaster Audigy SE


corrected

EDIT: Just noticed he is taking the piss. For one, he has an AMD 4800 which I assume is Socket 939 (as he upgraded from a 3500), yet he's running DDR2, and he also has 2 x 200GB Raptors, which don't exist.

Actually, we are all wrong. My HDDs are Caviars, and it's a 4600 dual-core AMD, so sorry for the misleading hardware quotes, but it still remains that I am getting 100+ fps outdoors with all foliage on.

TBH I don't care what dickheads like some of you think. I can think of better ways of challenging posts than calling me a liar or saying what I read on the FPS meter is wrong. I know what I saw, so basically grow up, you silly little kids.

The facts have been reported. I love seeing people make TWATS of themselves by being mean, it makes my day :D 
March 28, 2007 5:44:15 PM

Post a Fraps run then.

Based on everyone else's runs, even RobX2's SLi'd and non-SLi'd GTXs, I doubt he'd believe consistently above 110fps in foliage.

You haven't reported facts, just your opinion.
Ours is that your opinion lacks supporting evidence/facts and goes contrary to everyone else's reports so far. So either your 'everything to max, full AA' differs from others', or you're posting high points not averages, or your test areas aren't that stressful.

Now a fact would be [H]'s run with everything cranked and AA less than max, getting nowhere near 110+fps, let alone averaging that, and with a Superclocked GTS;
http://enthusiast.hardocp.com/article.html?art=MTI5Mywz...

So TBH, considering your other errors, I doubt your figures, based on your inability to actually provide solid information on something you should know (the CPU model and HDD names are right there in your system and can be checked). So your perception of 110fps or of proper settings is in question more than ever.
March 28, 2007 6:23:55 PM

Quote:
It's been awhile since we've seen a competitive product from either part of AMD.

So much truth contained in such a short post.
March 28, 2007 10:30:39 PM

Quote:
Please post some screenshots.

I know someone who, with an overclocked 4800 and one GTX at 16x10 res, got around 50 FPS outdoors. Considering that is a lower res than yours, I really doubt your claims.

I reckon you have no clue what you are talking about.


No, probably not, since I am a trained PC World technician and we all know how bad they are.

Still, like I said, it really doesn't matter what you think; I'm having a great game, thanks!

If you want screenshots, then go buy the card and find out. I posted here just to tell you what I've seen; to be ridiculed like this just isn't worth it.

Maybe in the Great Forest I will start to see it, but Darkstar Tower is enough to lower my framerate to 60.

Are you running Francesco's or Oscuro's mod?