The AMD Squeeze: Nvidia Intros GeForce 8800 for $300

bgerber

Distinguished
Feb 10, 2006
With nothing coming out of the newly merged AMD and ATI, Nvidia pushes out another product to take the sweet spot in the discrete graphics market.
 

predaking

Distinguished
Jan 28, 2007
The card could not handle the hardest resolutions and is weak at resolutions above 1600x1200 with the image quality settings turned up. That being said, we would not expect those with massive CRTs or 30" displays to be purchasing this card.

I don't understand the second sentence.

I have a display that does 1920x1080, or 2,073,600 pixels,
while 1600x1200 is 1,920,000.

So would this card not be recommended for my display? Would it be better to spend an extra $100 instead?


Or are we only talking about bragging rights, and the cheaper card will be good enough for the next few years?
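For reference, here are the raw pixel counts for the resolutions in question (just simple arithmetic):

```python
# Raw pixel counts for the resolutions being discussed.
for w, h in [(1600, 1200), (1920, 1080), (1920, 1200), (2560, 1600)]:
    print(f"{w}x{h} = {w * h:,} pixels")
# 1600x1200 = 1,920,000
# 1920x1080 = 2,073,600  (~8% more than 1600x1200)
# 1920x1200 = 2,304,000
# 2560x1600 = 4,096,000  (the 30" panels: more than double)
```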
 

djplanet

Distinguished
Aug 27, 2006
For the price, this is the best card, hands down. He's just saying this 8800 GTS suffers from having less memory when the eye candy gets turned on. The 30" displays referred to are probably REALLY hi-def, around 2560x1600. I think this card would be great for your monitor, unless you want to pay $400 for a regular GTS.

Gentlemen, I believe the X1950XT killer is here :twisted: .
 

Sandwich

Distinguished
Feb 12, 2007
After seeing these results, am I the only one who's dying to see what a 384-512MB 8800 GTX (at around $450) could do?
 

NoNeeD

Distinguished
Jan 17, 2007
Pretty certain there won't be any 512MB cards this generation; they have gone with funny RAM sizes, and they can't just switch back.
 

prolfe

Distinguished
Jan 9, 2005
Hey, great article, Tom's! I was interested to note that your benchmarks disagree with two others on Oblivion. Two other sites I read reviews at both indicated that Oblivion was one game that didn't mind the reduced memory.

By the way, does anybody have an idea what the best drivers for this card would be?
 
With nothing coming out of the newly merged AMD and ATI, Nvidia pushes out another product to take the sweet spot in the discrete graphics market.

I would like to know the Oblivion test settings, because saying 'we maximize the settings' doesn't reveal much. With or without HDR?
Most other sites I've seen haven't used HDR, and of course that would have an impact.

The Tech Report shows a few more weaknesses of the memory in Oblivion (without HDR) and in GRAW, where the positions change and the X1950XTX pulls ahead:

http://techreport.com/reviews/2007q1/geforce-8800gts-320mb/index.x?pg=7

In HL2, at the highest resolution (2560x1600), the 256MB X1900XT pulls ahead:
http://techreport.com/reviews/2007q1/geforce-8800gts-320mb/index.x?pg=6

And even in FEAR the X1900XT-256 pulls ahead:

http://techreport.com/reviews/2007q1/geforce-8800gts-320mb/index.x?pg=5

Interesting back and forth in FEAR: at the highest resolution the X1900XT-256 has a higher average, but the GTS has a better 'low avg', meaning perhaps less dramatic drops.
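Rough illustration of why the 'low avg' can matter more than the plain average (made-up FPS samples, not FEAR's actual method):

```python
# Two made-up FPS traces: the spiky one has the higher average,
# but its worst frames are far lower, which is what you feel as drops.
smooth = [55, 52, 50, 53, 51, 54]
spiky = [70, 68, 25, 72, 22, 69]

def avg(xs):
    return sum(xs) / len(xs)

def low_avg(xs, fraction=0.25):
    # average of the worst slice of frames (illustrative definition)
    worst = sorted(xs)[: max(1, int(len(xs) * fraction))]
    return avg(worst)

for name, trace in (("smooth", smooth), ("spiky", spiky)):
    print(name, round(avg(trace), 1), round(low_avg(trace), 1))
# smooth 52.5 50.0
# spiky  54.3 22.0
```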
 

randomizer

Champion
Moderator
Forgive my ignorance, but why does the memory affect how it plays at higher res, since the X1950 Ultimate only has 256MB? Can someone explain what it is about this GTS that causes it to fail but the X1950 not to?
I was going to say that too. It doesn't make sense that the GTS has more memory and yet they said it didn't have enough to complete the tests, while the X1950 Pro only has 256MB and it can complete the test.

EDIT: Also, did anyone notice that the X1950 Pro scored exactly the same in both FEAR tests? Did they really test this card or pull the results out of the air?
 

Sandwich

Distinguished
Feb 12, 2007
EDIT: Also, did anyone notice that the X1950 Pro scored exactly the same in both FEAR tests? Did they really test this card or pull the results out of the air?

That's why it's the "Ultimate" edition. It gets those FPS results no matter what. They're like Hurley's freaky Numbers... they'll haunt the owner of that card till the day he.... buys a new card... erm...
 
The 8600 Ultra is supposed to be a 512MB card and the 8600 GT a 256MB one.

Yeah, but the GF8600 series has a normal/commonplace bus width of 128-bit or 256-bit, so 256MB and 512MB are no big deal, while the GF8800 series has a 320-bit or 384-bit bus, so a 512MB model isn't possible without either messing with the memory chip allotment or changing the way the memory interacts with the core, both of which defeat the purpose of the 320-bit/384-bit bus.
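Quick sketch of the chip math behind the 'funny' sizes (the 32MB/64MB per-chip densities are assumed common GDDR3 parts of the era, just for illustration):

```python
# Total VRAM = (bus width / 32 bits per chip) * per-chip density.
# A 320-bit or 384-bit bus means 10 or 12 chips, so the natural sizes
# are 320/640MB and 384/768MB; an even 512MB doesn't fall out of either.

def possible_vram_mb(bus_width_bits, chip_densities_mb=(32, 64)):
    chips = bus_width_bits // 32          # one 32-bit channel per chip
    return [chips * d for d in chip_densities_mb]

print(possible_vram_mb(128))   # 8600-class 128-bit: [128, 256]
print(possible_vram_mb(256))   # 256-bit: [256, 512]
print(possible_vram_mb(320))   # 8800 GTS: [320, 640]
print(possible_vram_mb(384))   # 8800 GTX: [384, 768]
```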
 

Ciderdude

Distinguished
Feb 13, 2007
What I'd like to know is whether the power requirements for the 320MB version of the GTS are more reasonable than those of its two larger brothers. I don't think it is right that every GPU purchase these days should require a simultaneous PSU size increase.
 

mayouuu

Distinguished
Jan 24, 2006
If you don't play at insane resolutions, the 320MB version is the best card for the price out there. Still, Nvidia has to make good drivers for that one and all the others :lol: . It makes no sense that the X1900XT with 256MB performs better than the 8800 320MB if we think the 8800's performance is cut by memory size at high resolutions! Is 256MB better than 320MB with a wider bus? Haha.
 
Well, the power requirement could be less, but not by much (think 5-10W at best under load). It also depends a lot on whether you get the overclocked versions or not. The Tech Report's numbers show pretty much parity with the 640MB model, with the OC'ed 320 XXX consuming significantly more power (21W):

http://techreport.com/reviews/2007q1/geforce-8800gts-320mb/index.x?pg=9

The GTX consumes a little more due to more memory, more functional core parts, and more wire traces to the memory, but the GTS is not going to save much just by using smaller memory chips; heck, you'd get more savings going to GDDR4.

A quality PSU from the past year should be just as fine with this as it would've been with a single GF7800GTX or X1950XT. So considering the performance boost, that's pretty good. And why shouldn't you have to get a new PSU if you need it? No one forces anyone to buy a high-end card; power consumption is a concern for the mid-range, not for these cards, which even when cut down use about as much energy as their full-power brothers.
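Back-of-the-envelope sketch of why trimming the memory saves so little (the per-chip wattages are assumed ballpark figures, not measurements):

```python
# Both GTS models hang ten GDDR3 chips off the 320-bit bus; only the
# per-chip density differs. Per-chip draw below is assumed, not measured.
chips = 320 // 32                 # ten 32-bit chips either way

watts_256mbit_chip = 1.5          # assumed for the 320MB card's chips
watts_512mbit_chip = 2.0          # assumed for the 640MB card's chips

saving = chips * (watts_512mbit_chip - watts_256mbit_chip)
print(f"Estimated saving: ~{saving:.0f} W under load")   # ~5 W, in the 5-10W range above
```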
 

Pappaous

Distinguished
Feb 14, 2007
I guess you're right. I seem to have fished out an 8800 GTS from EVGA with 640MB. Struggling with drivers; hope the core dumps help. Nice thing I have this setup on Ultra SCSI 3, 15,000 RPM, 4ms avg seek. Wrote the dump before the blue screen. ;) 8O
 

Ciderdude

Distinguished
Feb 13, 2007
and why shouldn't you have to get a new PSU if you need it?

I can think of a couple of reasons: -

1) Because I spent an inordinate amount of time less than 2 years ago specifying a PC with a view to doing SLI at some point. So I ended up with an SLI certified motherboard, an SLI certified 600W PSU and an SLI certified GF6800GT. Even then I balked at the need for an expensive 600Watt power supply and I don't feel it was unreasonable to expect some upgradeability with it.

20 months on and with the 6800 dead I am flabbergasted to discover that to do SLI with any current DX10 card I'll need 750+ Watts. In fact the majority of PSUs on the SLI certified list are 1000Watt, there's even an 1100Watt PSU listed!

2) Because there's an environmental catastrophe bearing down on all of us, fast! But nVidia and ATI, intent as they are on their little speed war, seem blind to this impending disaster. They really need to rein in the power requirements; constantly ramping them up with every generation of card they produce is highly irresponsible. They have as much of a responsibility as every other industry to clean up their act and should not be creating the need for this type of power requirement.

Then there's the landfill somewhere, gradually filling up with the previous generation of PSUs that, whilst serviceable, can no longer drive the latest monster graphics cards.


Well I for one won’t be beefing up my PSU and when I do need a new one I'd like to think I'll be going in the other direction.
 

TucsonPi

Distinguished
Jan 21, 2007
and why shouldn't you have to get a new PSU if you need it?

I can think of a couple of reasons: -

1) Because I spent an inordinate amount of time less than 2 years ago specifying a PC with a view to doing SLI at some point... 2) Because there's an environmental catastrophe bearing down on all of us, fast!

And who, precisely, is holding the gun to your head and forcing you to go SLI? You don't have to buy the latest and greatest. ATI and Nvidia are trying to produce cards with the speed that the consumer demands. The consumer always wants faster. Unfortunately, they're limited by this little thing called physics. More speed = more power.

So perhaps the responsibility for all that power use, and for the PSU landfill, should lie with the consumer. After all, if we didn't buy them, they wouldn't keep making them.

In any case, you should think about taking SOME of the responsibility for your actions; you can't always just blame it on the big companies.

Besides, you could do a single GTS just fine and just about quadruple the performance of your 6800GT.
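Rough sketch of the physics behind "more speed = more power" (the clock/voltage numbers are made up purely for illustration):

```python
# Dynamic CMOS power scales roughly as P ~ C * V^2 * f, and higher
# clocks usually need higher voltage, so power grows faster than
# linearly with speed. Illustrative numbers only, not real GPU specs.

def relative_dynamic_power(freq_scale, voltage_scale):
    return voltage_scale ** 2 * freq_scale

# A hypothetical 20% clock bump that needs a 10% voltage bump:
print(relative_dynamic_power(1.2, 1.1))   # ~1.45x the power for 1.2x the speed
```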
 
1) Because I spent an inordinate amount of time less than 2 years ago specifying a PC with a view to doing SLI at some point. So I ended up with an SLI certified motherboard, an SLI certified 600W PSU and an SLI certified GF6800GT. Even then I balked at the need for an expensive 600Watt power supply and I don't feel it was unreasonable to expect some upgradeability with it.

That's not a reason, that's a hope or an assumption. And as for SLi, if you have an SLi PSU then you already have enough juice for one card. If you're SLi'ing a GTS-320 instead of buying a single GTX then you have other issues. If you wanna play with the Big Boys, you gotta pay the price of upgrading ALL THE TIME. Like was mentioned, no one is forcing you to upgrade, and no game outpaces any current solution that fits within that power envelope (heck, nothing is unplayable on an X1950P, let alone any old high-end SLi).

20 months on and with the 6800 dead I am flabbergasted to discover that to do SLI with any current DX10 card I'll need 750+ Watts. In fact the majority of PSUs on the SLI certified list are 1000Watt, there's even an 1100Watt PSU listed!

Flabbergasted!?! HOW could you be?
We've been talking about self-contained nuclear generators in rigs since the FX5900/R9800Pro-256DDRII era. Always expect that part of the demand for Mo' Powah will be the need for.... you guessed it, more power.

2) Because there’s an environmental catastrophe bearing down on all of us, fast!

Puh-lease, that's just Ridiculous!!


You cannot put the desire for SLi and then concern for the environment in the same FAQin post!
Get a damn integrated solution if you care more about the environment, not SLi !! :roll:

But nVidia and ATI, intent as they are on their little speed war, seem blind to this impending disaster. They really need to rein in the power requirements; constantly ramping them up with every generation of card they produce is highly irresponsible. They have as much of a responsibility as every other industry to clean up their act and should not be creating the need for this type of power requirement.

What you're saying makes no sense; there is a market for that, and it's not the enthusiast or gamer market. And if you wanna be anal about it, the watts per operation go down from one high end to the next, but you can't appreciate it, because you wonder why the teraflop PCs consume more energy than a Vic20 or Apple ][, not appreciating that they consume relatively less for their functionality.

Then there's the landfill somewhere, gradually filling up with the previous generation of PSUs that, whilst serviceable, can no longer drive the latest monster graphics cards.

Yeah I know, every time you replace a PSU an Angel Loses their wings. :roll:
PSUs are far from the biggest issue. :?

Well I for one won’t be beefing up my PSU and when I do need a new one I'd like to think I'll be going in the other direction.

Then get a laptop or integrated graphics; just stop asking why your M1A1 tank isn't as efficient as a Prius!! :evil:
 

virtualban

Distinguished
Feb 16, 2007
One other thing, if anyone knows anything about it: can the computation abilities of CUDA-capable cards be partitioned somehow, like half of a card for physics and half for graphics or similar? Or should one just go for additional cards as time goes by? I know this is an anti-high-end dual/quad SLI point of view, but don't flame me for that. Thanx.