8800GTX or not?

gilamran

Distinguished
Jan 7, 2007
106
0
18,680
Hi,
I've been thinking about upgrading my video card...
The top card today is the 8800GTX (money is not an issue). I looked at
how it performs on http://www.tomshardware.com/ and even at 2560x1600,
4x AA, 4x AF, Doom 3 runs at 60fps!!!
It's working very well... too well...

BUT who uses that resolution??? Most LCD monitors don't even support
resolutions that high...
At 1024x768, 4x AA, 4x AF, Doom 3 runs at 210fps.


So why should anyone buy this card? No one needs 210fps! Your brain
can't perceive more than 30fps...


Thanks
Gil
 

Dahak

Distinguished
Mar 26, 2006
1,267
0
19,290
Because then you can say you have the current best video card for gaming in the world. And if money is no issue, then this purchase should be just perfect. Not to mention you will future-proof your computer for any games that come out over the next 2-3 years. That is, provided you have high-end hardware such as the CPU and PSU to go along with it. Good luck.

Dahak

AMD X2-4400+@2.533 S-939
EVGA NF4 SLI MB
2X EVGA 7800GT IN SLI
2X1GIG DDR IN DC MODE
WD300GIG HD
EXTREME 19IN.MONITOR 1280X1024
ACE 520WATT PSU
COOLERMASTER MINI R120
 

djgandy

Distinguished
Jul 14, 2006
661
0
18,980
Hi,
I've been thinking about upgrading my video card...
The top card today is the 8800GTX (money is not an issue). I looked at
how it performs on http://www.tomshardware.com/ and even at 2560x1600,
4x AA, 4x AF, Doom 3 runs at 60fps!!!
It's working very well... too well...

BUT who uses that resolution??? Most LCD monitors don't even support
resolutions that high...
At 1024x768, 4x AA, 4x AF, Doom 3 runs at 210fps.


So why should anyone buy this card? No one needs 210fps! Your brain
can't perceive more than 30fps...


Thanks
Gil

Oh god not one of these! :lol:

Are you honestly telling me you can't tell the difference between a game at 30fps and a game at 60fps and a game at 120fps?

Also read this :)
http://www.100fps.com/how_many_frames_can_humans_see.htm
 

Bloated

Distinguished
Nov 5, 2006
89
0
18,630
30fps is the minimum, but it depends on the level of detail and the speed of the scene, as well as the environment.

Movies project at 24fps and they look like shit in general; now throw in HD content and 24fps is in pretty bad shape by any measure...

When I compare video cards I treat 30fps as the minimum playable setting in general... that is, if the graph tracks the FPS over time like HardOCP's generally does, then I try to assess the amount of time the video card spends below 30fps... anything higher is gravy, and the higher the better.
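
Something like this, for example, as a quick Python sketch (the frame times are made up just to illustrate the idea, not data from any review):

frame_times_ms = [16, 17, 15, 42, 18, 36, 16, 55, 17, 16, 19, 48]  # hypothetical per-frame render times in milliseconds

threshold_ms = 1000 / 30  # a frame slower than ~33.3 ms means the game dipped below 30fps at that moment

slow_time = sum(t for t in frame_times_ms if t > threshold_ms)
total_time = sum(frame_times_ms)

print("average fps:", round(1000 * len(frame_times_ms) / total_time, 1))
print("share of time below 30fps:", round(100 * slow_time / total_time, 1), "%")

With those made-up numbers the average comes out around 38fps, yet more than half the run is spent under 30fps, which is exactly why the average alone doesn't tell you much.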

In the case of the 8800 GTX, I wouldn't buy one yet for a number of reasons, but very few of those are because it isn't performing fast enough.

The 8800s currently have poor driver support and are overpriced; with refreshes coming in a month or so from ATI, and likely one from Nvidia, I don't see much value in the 8800 GTX at the moment.

That said, today's benchmarks show the 8800 GTX rendering 210fps in a few applications, but Oblivion still brings it to its knees, as will any and all future applications... So if you plan on keeping your hardware for any length of time, factoring in concerns about longevity is valid: time isn't travelling backwards, and what runs uber fast today is tomorrow's dog... when it becomes a dog is the concern.
 

tenstorey

Distinguished
Jun 9, 2006
16
0
18,510
Simple answer: evolution. Read the system requirements on the back of your PC games. Do they stay the same through the years?

I rest my case.
 

scrag_meister

Distinguished
Jan 10, 2007
21
0
18,510
LOL, Some people.

Dell 2407, Dell 3007, Westinghouse 37: you need to read up more on the resolutions that are available.

Anti-aliasing, anisotropic filtering.

If you're telling me you can't tell the difference between Doom 3 at 1024x768 with no AA or AF, and Oblivion running at 8x AA, 16x AF on an 8800 GTX and a Dell 3007, then you probably need an optician more than a new graphics card.

One tip though: if you're gonna go 8800 GTX, make sure you have the CPU to supply it, cos your DX266 isn't going to cut the mustard. C2D or a high-end AMD.

Whatever you do, have fun.
 

djgandy

Distinguished
Jul 14, 2006
661
0
18,980
LOL, Some people.

Dell 2407, Dell 3007, Westinghouse 37: you need to read up more on the resolutions that are available.

Anti-aliasing, anisotropic filtering.

If you're telling me you can't tell the difference between Doom 3 at 1024x768 with no AA or AF, and Oblivion running at 8x AA, 16x AF on an 8800 GTX and a Dell 3007, then you probably need an optician more than a new graphics card.

One tip though: if you're gonna go 8800 GTX, make sure you have the CPU to supply it, cos your DX266 isn't going to cut the mustard. C2D or a high-end AMD.

Whatever you do, have fun.

If you have that screen for playing games then money is no object anyway.
 

slavadon

Distinguished
Jan 6, 2007
55
0
18,630
That said, today's benchmarks show the 8800 GTX rendering 210fps in a few applications, but Oblivion still brings it to its knees, as will any and all future applications... So if you plan on keeping your hardware for any length of time, factoring in concerns about longevity is valid: time isn't travelling backwards, and what runs uber fast today is tomorrow's dog... when it becomes a dog is the concern.

You're making the assumption that benchmarks of Oblivion can be directly related to DX10 games. When you make an assumption, do you know what happens? You make an ass out of u and mption. Mostly you're just making an ass out of yourself, though. From everything I've read, performance in Oblivion at high res has no correlation to performance in future DX10 games, because DX10 uses entirely different software. And that's ignoring the fact that Oblivion at any reasonable resolution runs excellently on the 8800 GTX.

If you're going to spend 400 dollars on an X1950 XTX or whatever, spend the extra 150 and get DX10 support and a card that outperforms all the DX9 cards anyway. Your card will probably last a year longer because of it. After the price has dropped you could grab a second one, if you have the right mobo, cooling, and a sufficient CPU, which could further delay your need to upgrade to the newest ridiculously overpriced GPU in the future.
 

mjam

Distinguished
Sep 10, 2006
73
0
18,630
I have a Dell 30" LCD monitor (3007FPW) coming early next week; its NATIVE resolution is 2560x1600. It requires dual-link DVI (not to be confused with dual DVI). My current video card (X850XT) isn't dual-link. Unfortunately, it is AGP, and I don't believe there are any AGP dual-link cards out there. This is going to drive a complete system upgrade... I was going to wait until the R600 series came out, but I may have to jump the gun and opt for an 8800 GTS. I will consider the GTX only if the deal is right.

I'm not a gamer but I do some CAD and graphics work, plus I like "screen real estate". May have overdone it this time...
 
A DX10 card outperforms all DX9 ones: that's bull. When system requirements are high enough that a single 8800 isn't powerful enough, adding a second one in SLI won't help:
- you will have a hard time finding an identical card
- a DX10 card isn't faster than a DX9 one; what DX10 brings is easier shader programming, and thus more intricate effects, not more performance on existing effects (you get identical motion blur in DX5 and in DX9, with different code, yet performance on the scene is pretty much the same on the same hardware). That used to be true of water effects between DX8.0 and DX8.1, but not of advanced programmable shaders.

Right now, the 1950XTX leads in Oblivion while the 8800 GT scores well, but no app has yet made use of DX10.

A good comparison would be to check OpenGL performance: at least the language itself doesn't change much, and it would allow the 1950XTX's and the 8800GTX's unified shaders to really be compared to one another.
 

speedemon

Distinguished
Mar 29, 2006
200
0
18,680
Right now, the 1950XTX leads in Oblivion while the 8800 GT scores well, but no app has yet made use of DX10.

A good comparison would be to check OpenGL performance: at least the language itself doesn't change much, and it would allow the 1950XTX's and the 8800GTX's unified shaders to really be compared to one another.

8O what the **** are u smokin???
 

barlag

Distinguished
Feb 27, 2006
26
0
18,530
So why should anyone buy this card? No one needs 210fps! Your brain
can't perceive more than 30fps...
Thanks
Gil

This isn't true; the human brain can perceive up to 100 FPS. On TV you see 25FPS PAL and 30FPS NTSC, but that looks smooth only because of motion blur. It's achieved by exposing the film to light for 1/25 of a second for each frame, so if someone runs through the shot they come out blurred on the frames. On the PC there is no motion blur, and when you turn quickly in a first-person shooter, 30FPS looks quite jerky. I used to play Quake 2 and 3 a lot in the old days, and it was quite obvious that 80FPS and above was the only way to go :)

You can test this theory by setting your refresh rate to 65Hz (this doesn't apply to LCDs) and looking just above the monitor; you should see the flicker quite clearly. Put it at 80Hz and some people can still see it; put it at 100Hz and it should be steady.

That's why the 8800GTX is a very good idea if you can afford one. I think prices are going to drop drastically within the next 4 months, and ATI/AMD should have a counterpart out soon, which will probably be as good as a GTX.
Gabor
 

slavadon

Distinguished
Jan 6, 2007
55
0
18,630
Right now, the 1950XTX leads in Oblivion while the 8800 GT scores well, but no app has yet made use of DX10.

A good comparison would be to check OpenGL performance: at least the language itself doesn't change much, and it would allow the 1950XTX's and the 8800GTX's unified shaders to really be compared to one another.

You sound like you probably know more about this than me, but all the benchmarks I've seen say the 8800 easily outperforms a single X1950 XTX. The only thing that ever seems to outperform a single 8800 GTX is two X1950 XTXs in a CF setup. If you have some that state otherwise, I'd really like to see them.
 
Well, I checked Tom's VGA charts; Oblivion, 1280x1024, no AA, 8x AF: the 1950XTX leads, followed by the 1900XTX, with the 8800GTX right behind.

The 8800 seems to lead either at low resolutions with no AF, or when the resolution reaches 1600x1200.

I'd say the performance difference between the two is kind of incidental.
 

purplerat

Distinguished
Jul 18, 2006
1,519
0
19,810
A DX10 card outperforms all DX9 ones: that's bull. When system requirements are high enough that a single 8800 isn't powerful enough, adding a second one in SLI won't help:
- you will have a hard time finding an identical card
- a DX10 card isn't faster than a DX9 one; what DX10 brings is easier shader programming, and thus more intricate effects, not more performance on existing effects (you get identical motion blur in DX5 and in DX9, with different code, yet performance on the scene is pretty much the same on the same hardware). That used to be true of water effects between DX8.0 and DX8.1, but not of advanced programmable shaders.

Right now, the 1950XTX leads in Oblivion while the 8800 GT scores well, but no app has yet made use of DX10.
Completely wrong! DX10 cards ARE faster than DX9 cards. Both the 8800GTS and 8800GTX (there is no 8800GT, as you claim) are faster than any DX9 card, and by quite a bit. Like the other guy said, the Oblivion marks are Crossfire (two cards, in case you didn't know) against a single 8800, and even then the two 1950XTXs only beat the 8800GTX by 3 FPS. Also look a little deeper at 1600x1200 with no AA and 8x AF: both 8800s completely blow away any other single card and all Crossfire setups.

No, 100FPS isn't noticeable, but higher framerates do make a difference. While the human eye can only see ~30FPS, it's still a good idea to have your games running far above that if you want optimal gameplay. A game can easily average 40-50 FPS and still occasionally fall below 30FPS and cause choppy gameplay. Oblivion used to do this to me all the time with my 7900GT SLI setup. With my 8800 it very rarely happens, even at higher settings than the 7900s. No, I can't see the 60FPS the game usually runs at, but what I also don't see is any lag or choppiness when my frame rate drops by 20FPS because there are 8 characters battling outdoors.
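To put rough, made-up numbers on it: if 95 frames take 20ms each (50fps territory) and 5 frames take 50ms (20fps territory), the average still works out to about 46fps, but those five 50ms frames are exactly the hitches you feel.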
 

John_C

Distinguished
Dec 30, 2006
150
0
18,680
You can test this theory by setting your refresh rate to 65Hz (this doesn't apply to LCDs) and looking just above the monitor; you should see the flicker quite clearly. Put it at 80Hz and some people can still see it; put it at 100Hz and it should be steady.

Not that it is relevant to this discussion, but the higher refresh rates on CRTs are more about preventing eye strain due to the lighting levels fading after each phosphor sweep.

I think you touched on an important point though. Aren't most of the high resolutions being run on large LCD panels? And aren't those LCD panels being run at a refresh of 60 Hz? And if you are only refreshing the monitor 60 times per second, wouldn't a frame rate above that be wasted?
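(Rough arithmetic: a 60 Hz panel only shows a new image every 1/60 s, about 16.7 ms, so with vsync on you never see more than 60 distinct frames per second; anything rendered faster than that is either dropped or just waits for the next refresh.)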

That said, games continue to push the envelope with more and more detail, requiring higher and higher polygon counts. Buying a high-end card helps ensure it will support the higher-end games of the future. And with the improved instancing in DX10, I think we will see total polygon counts increase dramatically over the next few years, and these new cards will really shine.
 
Refresh rate is indeed less relevant with LCDs, since the panel doesn't need to be constantly redrawn; right now it's mostly used as a practical way to pack the data together while still maintaining compatibility with progressive material (films).

Personally, I'm sensitive to CRT refresh rates under 85 Hz, so an LCD screen is useful for me.

@purplerat: as I said, forget what I said about the X1950XTX leading in Oblivion. I'd like to compare the two cards on a clock-for-clock, shading-unit-per-shading-unit basis: the 8800 with its 128 unified shaders at 1.35 GHz vs. the X1950XTX with its 48 shaders clocked at 650 MHz; the X1950XTX looks quite puny there.
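Back-of-the-envelope, ignoring that the two architectures do different amounts of work per shader per clock: 128 x 1.35 GHz is roughly 173 billion shader-clocks per second for the 8800, versus 48 x 0.65 GHz, roughly 31 billion, for the X1950XTX, about a 5.5x gap on that crude metric.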
 

Shadowsniperx

Distinguished
Jan 17, 2007
2
0
18,510
Well, I checked Tom's VGA charts; Oblivion, 1280x1024, no AA, 8x AF: the 1950XTX leads, followed by the 1900XTX, with the 8800GTX right behind.

The 8800 seems to lead either at low resolutions with no AF, or when the resolution reaches 1600x1200.

I'd say the performance difference between the two is kind of incidental.


Yes, you are quite correct. And to the guy who said those are Crossfire setups: no, they are not. The CF at the end of the name means the card is the Crossfire edition. The 1950XTX and 1950XTX CF give the same results; the CF card can be coupled with another CF card, or with a 1950XTX, to create a Crossfire configuration.

1950XTXs do still outperform the 8800GTS in 99% of games to date, and will for some time on DX9-based platforms.

On the other hand, the 8800GTX, which has a higher processing spec, runs comparatively well against its faster competitor the 1950XTX on DX9 platforms, but at what price *boggles*

When it comes to DX10, though, naturally the 8800GTX will rule, as the 1950XTX does not support DX10-based games, so the two really can't be compared there.

We shall have to wait for the release of the new high-end ATi chip that supports DX10 to make a comparison. All in all, if you're going to wait, I believe you will once again find ATi's DX10 chipset outperforming the Nvidia 8800 GTX, along with being released with GDDR4 as standard.

Shadowsniper