ATI 3870 Crysis Benchmarks

November 12, 2007 9:10:43 AM

http://iax-tech.com/video/3870/38701.htm
Very interesting.
And guys, don't forget that ATI cards always get A LOT BETTER after 1 or 2 driver updates.
November 12, 2007 9:19:05 AM

Pretty sure by the time we're at Catalyst 7.14/7.15, the 2900 will be at its maximum potential. Since the 3000 series is a tweaked and die-shrunk R600, it shouldn't be far off optimum by then either.

Hopefully when Crysis comes out, the first thorough benchies will be done with the new Catalysts (7.11) and the new 3850s & 3870s.

...roll on next week! :) 
November 12, 2007 9:58:47 AM

looks like the GT still owns it though..
Ryan adds
November 12, 2007 10:20:04 AM

Wow. The new ATI HD3870 rocks. This is good news for X38 Crossfire boards.

nVidia skips some frames to get a faster result; that's why it loses some quality for performance.
November 12, 2007 10:31:37 AM

lorismarcos said:
http://iax-tech.com/video/3870/38701.htm
Very interesting.
And guys, don't forget that ATI cards always get A LOT BETTER after 1 or 2 driver updates.

The 3DMark06 numbers are impressive, but the Crysis demo shows that 3DMark isn't everything. This was my first thought until I noticed the massive difference in the min FPS. It just can't be overlooked: Min FPS 0.40 at frame 109 for the 8800GT versus Min FPS 9.13 at frame 1936 for the HD3870. I think this is what the conclusion was referring to about the 8800GT not being smooth. That 0.40 FPS min will be a death sentence in multiplayer.
November 12, 2007 10:40:53 AM

True, there is lag on nVidia because it has to think about which frames to skip next. lol.

Put more seriously, that's the reason it skips: because it lags somewhere.
November 12, 2007 10:45:59 AM

Well, maybe with a driver update the card will improve in performance. Alas, only time will tell.

Dahak

M2N32-SLI DELUXE WE
X2 5600+ STOCK (2.8GHZ)
2X1GIG DDR2 800 IN DC MODE
TOUGHPOWER 850WATT PSU
EVGA 7950 GX2 550/1400
SMILIDON RAIDMAX GAMING CASE
ACER 22IN WS LCD 1680X1050
250GIG HD/320GIG HD
G5 GAMING MOUSE
LOGITECH Z-5500 5.1 SURROUND SYSTEM
500WATS CONTINUOUS,1000 PEAK
WIN XP MCE SP2
November 12, 2007 10:54:25 AM

elbert said:
The 3DMark06 numbers are impressive, but the Crysis demo shows that 3DMark isn't everything. This was my first thought until I noticed the massive difference in the min FPS. It just can't be overlooked: Min FPS 0.40 at frame 109 for the 8800GT versus Min FPS 9.13 at frame 1936 for the HD3870. I think this is what the conclusion was referring to about the 8800GT not being smooth. That 0.40 FPS min will be a death sentence in multiplayer.


I would have thought that when playing multiplayer most people will be turning down the shader settings at the very least... at 1280x1024 with low graphics settings, over 100 fps average will be easily achievable on both cards with no major worries about min FPS. Crysis does SCALE very well; with just about any card you have to sacrifice image quality to get performance at the moment. With multiplayer being a competitive game, image quality will have to take the back seat for a while until hardware catches up with the game.

Neither card is any use for running Crysis in multiplayer mode with graphics maxed... The graphics card for doing that will be released in the run-up to Christmas next year, I should think :D  (that's pure guesswork, not based on any rumour at all.)
November 12, 2007 11:09:01 AM

Do they mention the NV driver version? Is the Crysis optimization IQ issue fixed or not? If not, what if it's turned off? They'd be pretty equal at that point.
November 12, 2007 11:24:41 AM

I wonder how high the clocks can be overclocked. I thought I heard that the 2900XT could be pushed pretty well with proper cooling. I'm wondering if the smaller die will result in higher overclocks.
November 12, 2007 11:39:36 AM

Check out VR-Zone's review of the 2900XT. They modded the air cooler and got it up to 880MHz core (850 with standard cooling). They added more volts and chilled it down to about -60 Celsius and got 1GHz out of it.
November 12, 2007 12:14:09 PM

dtq said:
I would have thought that when playing multiplayer most people will be turning down the shader settings at the very least... at 1280x1024 with low graphics settings, over 100 fps average will be easily achievable on both cards with no major worries about min FPS. Crysis does SCALE very well; with just about any card you have to sacrifice image quality to get performance at the moment. With multiplayer being a competitive game, image quality will have to take the back seat for a while until hardware catches up with the game.

Neither card is any use for running Crysis in multiplayer mode with graphics maxed... The graphics card for doing that will be released in the run-up to Christmas next year, I should think :D  (that's pure guesswork, not based on any rumour at all.)

Using a little math: even at 100 FPS average you may see a low of about 2.4 FPS, which is 0.4 times 6. The 8800GT averages 16.77 FPS on this test, and 16.77 times 6 is about 100.6 FPS. I think Nvidia will have some driver work to do here, and it will most likely come at a cost to both max and average FPS.
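
Here's the back-of-the-envelope version in Python, just to show my working (the big assumption, which may well not hold, is that min FPS scales linearly with average FPS when you lower the settings):

# Back-of-envelope estimate: assume min FPS scales by the same factor
# as average FPS when settings are lowered. That is an assumption about
# the workload, not something the review measured.
avg_high_settings = 16.77   # 8800GT average FPS in the linked review
min_high_settings = 0.40    # 8800GT minimum FPS in the linked review

target_avg = 100.0                          # hoped-for multiplayer average
scale = target_avg / avg_high_settings      # roughly 6x
estimated_min = min_high_settings * scale   # roughly 2.4 FPS if scaling were linear

print(f"scale factor: {scale:.2f}, estimated min FPS: {estimated_min:.2f}")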
November 12, 2007 2:39:24 PM

San Pedro said:
I wonder how high the clocks can be overclocked. I thought I heard that the 2900XT could be pushed pretty well with proper cooling. I'm wondering if the smaller die will result in higher overclocks.

I'm hoping when you say "proper cooling" you mean the stock cooler, because on the 2900s it really is awesome (and amazingly loud).
November 12, 2007 9:45:29 PM

http://www.digitimes.com/mobos/a20071112PD204.html

For fun, let's say this is right. Well, the clocks are lower on the one tested in the OP's review. And the prices would be very nice for sure. Should be fun to see once the NDA breaks, and made more interesting by the near non-existent supply of 8800GTs.
November 12, 2007 11:41:33 PM

pogsnet said:
Wow. The new ATI HD3870 rocks. This is good news for X38 Crossfire boards.

nVidia skips some frames to get a faster result; that's why it loses some quality for performance.

ATI always has better picture quality.
November 13, 2007 12:11:03 AM

Good step for ATI to lower power consumption, imo. I bet they have lost a lot of sales to people with mid-range PSUs. Finally breaking that habit after years.
November 13, 2007 12:15:04 AM

man... 15 fps.. 20 fps.. 25... 30... how long before we hear 80 - 100 FPS at max settings... think a 9800GTX in SLI will have a chance?
November 13, 2007 12:29:07 AM

aznstriker92 said:
ATI always has better picture quality.


thank you, i'm not the only one who thinks so (and i'm an nvidia fan).
November 13, 2007 12:42:46 AM

aznstriker92 said:
ATI always has better picture quality.
In the past this was true, but the Geforce 8 series has slightly higher quality anisotropic filtering than the HD 2900 series. It's really about identical at this point in time.
November 13, 2007 1:16:24 AM

Going from an X1950XTX to an 8800GTS in the past month, I have to say ATI has better color quality, and overall picture quality, by a noticeable amount. Even in the past, going from a 9800 Pro to a 6800GT, ATI had better quality. Of course I won't talk about speed, but the 2xxx and 3xxx close the gap quite a bit. I have yet to see those in operation, so I can't comment on their color and picture quality.
November 13, 2007 1:43:49 AM

Heyyou27 said:
In the past this was true, but the Geforce 8 series has slightly higher quality anisotropic filtering than the HD 2900 series. It's really about identical at this point in time.

What he said. Since the 8800 series, nVidia has caught up with ATI on quality, but there is a catch: nVidia likes to tamper with quality to gain more fps. Several recent examples: HD decoding and Crysis.
November 13, 2007 1:54:40 AM

elbert said:
Using a little math: even at 100 FPS average you may see a low of about 2.4 FPS, which is 0.4 times 6. The 8800GT averages 16.77 FPS on this test, and 16.77 times 6 is about 100.6 FPS. I think Nvidia will have some driver work to do here, and it will most likely come at a cost to both max and average FPS.


You must have failed math... :non: 

FPS doesn't scale linearly with quality settings.
November 13, 2007 1:56:19 AM

Heyyou27 said:
In the past this was true, but the Geforce 8 series has slightly higher quality anisotropic filtering than the HD 2900 series. It's really about identical at this point in time.

What I mean is that although ATI loses in framerates, it has better picture quality and smoother framerates.
Of course I'm not saying that the 8800 is bad, though.
November 13, 2007 2:16:29 AM

I don't know about smoother framerates. Both seem to have their hitches in different games. And I agree that NV has been tampering with IQ for FPS, but so has ATI. I have no problem with optimizations if they would bring them to the table and allow them to be turned off. But like the texture shimmering on the GF7s (which caused FiringSquad to not run default driver settings, because the default optimizations caused shimmering), these optimizations can affect IQ.

Overall, while I have noticed my X1800XT and X1950XT had better IQ than the 7800GT, I have not noticed (apart from the blurry text bug) the 8800GTS being below the Radeons. I would have to compare them side by side, I guess, to tell. AF / HQAF is great on both IMO.
November 13, 2007 3:08:09 AM

The only ATI card I have ever used was the old Radeon 8500 Pro. I've since owned a GeForce 256, GeForce FX, and GeForce 7600GS. And there is no doubt in my mind that the depth and richness of colours on the ATI were beyond any nvidia card I have owned. The only reason, and I repeat myself, the ONLY reason I buy Nvidia is because of better driver support.
November 13, 2007 3:28:45 AM

I had a 9800 Pro 512MB card and now I have a BFG 7950 GT/OC 512MB with the latest drivers.
I think nvidia has come a long way on the road to better picture quality!
As far as FPS goes, if you can sustain 30+ FPS (at the worst) you're doing fine.
There's no one that can tell the difference between 30fps and 60fps, let alone 100fps!
These are just bragging numbers!
November 13, 2007 4:35:49 AM

Quote
"There's no one that can tell the difference between 30fps and 60fps, let alone 100fps!
These are just bragging numbers!" - Referring to Crysis?

If you're happy to run a game at 30 FPS (because you can't tell the difference) then you'd have to be happy to have it dip down to 10 fps in intensive scenes... that's basically a slide show.

When somebody says they want a game to run at 100 fps, they know it's just as smooth at 40 fps
(there is a difference between 30 and 40 fps).

It's just that if they can have 100 fps average then they can still have 40 fps in intensive scenes (hopefully!).

*Random quote*
"Oh I can run crysis at 30fps it runs so well!!"

I'm sure that would become a slide show when you fire a rocket at a group of 5 or so enemies.

High frame rates are frames in reserve, not for bragging.

November 13, 2007 4:50:28 AM

I run Lost Planet @ 1440 x 900 (default on my LCD panel) with everything at mid and 4x anti-aliasing, and it dips to 25.4 fps in the most intensive battles... I don't really see too much degradation in gameplay, though, since I have more CPU power than GPU power.
I still manage to control my character quite well!

Aside from that, do you know what the FPS is on a TV screen?

Well, it's 30 fps, that's it! All your DVDs are running at 30 fps at 800 x 600 (that's at a 60Hz scan).
HDTV at 30 fps at 1280 x 768 (720p) is the next level up.
HDTV at 60 fps at 1600 x 1200 or 1080p (if you have a 120Hz scan).
Some newer screens have 1920x1080 but are still running the 1080p format.
But still they only get 60 fps!
November 13, 2007 5:31:18 AM

25 - 30 fps is fine for watching video...

A first person shooter being rendered in real time at an average of 30 fps is somewhat undesirable.

30 fps is the average... that means sometimes it's above 30 fps and sometimes it's below. Once a first person shooter goes BELOW 30 fps the game will have SOME noticeable slowdown.

E.g. animation won't look as smooth, and mouse movements become less responsive.

Hardcore gamers are more concerned with the minimum frame rate.

That's what dictates how playable a game is on a hardware setup.



November 13, 2007 8:24:52 AM

johnnyq1233 said:
I run Lost Planet @ 1440 x 900 (default on my LCD panel) with everything at mid and 4x anti-aliasing, and it dips to 25.4 fps in the most intensive battles... I don't really see too much degradation in gameplay, though, since I have more CPU power than GPU power.
I still manage to control my character quite well!

Aside from that, do you know what the FPS is on a TV screen?

Well, it's 30 fps, that's it! All your DVDs are running at 30 fps at 800 x 600 (that's at a 60Hz scan).
HDTV at 30 fps at 1280 x 768 (720p) is the next level up.
HDTV at 60 fps at 1600 x 1200 or 1080p (if you have a 120Hz scan).
Some newer screens have 1920x1080 but are still running the 1080p format.
But still they only get 60 fps!


Regardless of the Hz rating of the screen, most movies are shot at 24 frames per second. And regardless of how many frames per second the monitor can display, the frame rate played is determined by the video standard used, not by the monitor's refresh rate. This is the reason that, even without different edits, UK, US and theater versions of the same movie may be different lengths: it's shot at 24 fps and played in theatres at that speed, but home playback speed is determined by video standards, and PAL and NTSC have different set frame rates that add up to different running times for movies!

HOWEVER, the big differences are the way in which the frames shown are generated and the nature of the frames generated. This is where so many people get so mixed up.

The way a normal movie camera works is to effectively take 24 photos per second onto film. This is the primary reason why 24 fps is fine for movies but not for games. Because it's 24 "photos" being taken, there is an exposure time for each "photo", and it's this exposure time that leads to a slight amount of motion blur on each frame! This is what allows 24 frames per second to be smooth for movies but not for games!

Games don't have motion blur in each frame; they have x number of solid frames rendered per second. It really doesn't take much of an eye to see the difference between 30 and 60 frames per second in a game. The human brain needs far more frames per second to make up for the lack of motion blur in each frame and still perceive smooth movement in games than it does for movies, where the motion blur caused by the photo exposure time "fills in" the smoothness of the motion and tricks the mind into thinking you are seeing something smooth. If you pause a DVD in an action scene you will notice the image is never razor sharp; if you pause a game there is no blur at all!

You might be able to control your character well enough at 25 fps, but watch the animation of things moving and see how much smoother animations are at higher frame rates.

The human eye IS easily able to notice differences in frame rates above 24 - 30 fps if the frames don't employ motion blurring to "fill in the gaps". When a movie's shot at 24 frames per second, the length of the exposure effectively means you are compositing maybe 4 frames into one, and THAT is what causes the smoothness! Not the fact that the human brain is unable to tell the difference past so many frames per second!
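
To put a rough number on the running-time point (just an illustration in Python; the 120-minute film is made up, and I'm using the usual PAL speed-up versus NTSC 2:3 pulldown):

# Rough illustration: a film shot at 24 fps runs faster on PAL (25 fps),
# so the same cut ends up shorter; NTSC uses 2:3 pulldown and keeps
# roughly the original running time. The example film length is made up.
film_minutes = 120.0          # hypothetical theatrical running time
film_fps = 24.0

pal_minutes = film_minutes * film_fps / 25.0       # about 115.2 min (~4% shorter)
ntsc_minutes = film_minutes * film_fps / 23.976    # about 120.1 min (nearly unchanged)

print(f"theatrical: {film_minutes:.1f} min")
print(f"PAL:        {pal_minutes:.1f} min")
print(f"NTSC:       {ntsc_minutes:.1f} min")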
November 13, 2007 10:06:30 AM

The motion blur is not generated by the game on screens, but by the lack of responsiveness on the part of current LCD panels.
A screen with a 2 ms response time effectively recreates the exposure effect, where the screen is now the "film" and each frame is "developed" on the screen, erased, and the next frame developed on the screen, and so on. The older CRT screens did better for showing a system's performance, be it good or bad, than the new LCD panels because there was no response time issue.
CRTs were by far the fastest screens around! It's just that they were so big and cumbersome that you really couldn't have a large screen (bigger than 19") on your desktop without having a mammoth desk with a lot of support to hold the weight of the screen.
As for movie fps, you are correct in stating that they are shot at 24 fps.
But I never meant to include the film industry in my previous statement, just the delivery system. I probably should have used the PS or Xbox as an example... my bad.
Anyways, good discussion, hope we hear more from others... later!
November 13, 2007 10:32:02 AM

There'll never be a 7.14 driver version. An 8.2, yes.
November 13, 2007 11:17:39 AM

God, I hate decisions. If the benchies and pricing are correct, I really have no idea what I am going to go for.

Meh.
November 13, 2007 11:31:24 AM

johnnyq1233 said:
The motion blur is not generated by the game on screens, but by the lack of responsiveness on the part of current LCD panels.
A screen with a 2 ms response time effectively recreates the exposure effect, where the screen is now the "film" and each frame is "developed" on the screen, erased, and the next frame developed on the screen, and so on. The older CRT screens did better for showing a system's performance, be it good or bad, than the new LCD panels because there was no response time issue.
CRTs were by far the fastest screens around! It's just that they were so big and cumbersome that you really couldn't have a large screen (bigger than 19") on your desktop without having a mammoth desk with a lot of support to hold the weight of the screen.
As for movie fps, you are correct in stating that they are shot at 24 fps.
But I never meant to include the film industry in my previous statement, just the delivery system. I probably should have used the PS or Xbox as an example... my bad.
Anyways, good discussion, hope we hear more from others... later!


I wasn't talking about the motion blur seen on game screens, I don't game on LCDs :D  I was talking about why frame rates need to be higher in games than in movies for "smooth" animation. Movies get away with slow frame rates because of the presence of motion blur in the frame itself, effectively compositing several frames at once into a single frame.

I don't get along with LCDs at all for gaming, but I thought that LCDs work by only changing the pixels that change rather than redrawing the whole screen completely. I don't see how an LCD monitor can mimic the exposure time of film, as it doesn't composite 4 frames into 1 to provide the necessary smoothness. When playing video games I find I notice the "jumping" of the position of things like arms and legs as they move from one point to another from frame to frame; I don't get this with DVDs, because on a DVD things don't "jump" so badly on every frame, the motion blur fills in the movement from point A to point B. Graphics cards render the scene as a series of pictures, with a hard, clear image of an object at point A in one picture and at point B in the next, with no transitional movement. On a DVD or other "video" format the object is "stretched" by the exposure time between points A and B, and that's what gives video its smooth appearance at lower frame rates.

I don't find PS or Xbox animation inherently smoother than PC graphics...
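
If it helps, here's a toy sketch of the compositing idea in Python (made-up numbers and a pretend 1D "screen"; not how any real renderer or camera works):

import numpy as np

# Toy model: render lots of sharp sub-frames of an object crossing a
# tiny 1D "screen", then either pick every 4th one (game-style, no blur)
# or average each group of 4 (film-style "exposure" blur).
def render_subframe(t, size=48):
    frame = np.zeros(size)
    frame[int(t * size) % size] = 1.0   # a 1-pixel object at position t
    return frame

subframes = [render_subframe(t) for t in np.linspace(0, 1, 96, endpoint=False)]

game_frames = subframes[::4]                                                   # sharp
film_frames = [np.mean(subframes[i:i + 4], axis=0) for i in range(0, 96, 4)]   # blurred

lit = np.nonzero(film_frames[1])[0]
print("game frame lit pixels:", np.nonzero(game_frames[1])[0])        # one pixel, full brightness
print("film frame lit pixels:", lit, "values:", film_frames[1][lit])  # smeared across two pixels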


November 13, 2007 11:35:33 AM

elbert said:
Using a little math: even at 100 FPS average you may see a low of about 2.4 FPS, which is 0.4 times 6. The 8800GT averages 16.77 FPS on this test, and 16.77 times 6 is about 100.6 FPS. I think Nvidia will have some driver work to do here, and it will most likely come at a cost to both max and average FPS.



Unfortunately your maths isn't particularly well applied in this scenario; here's the reality of the situation at 1280x1024 with all low graphics:

Frames, Time (ms), Min, Max, Avg
6921, 60000, 97, 165, 115.350

A min frame rate of 97 from the 8800GTX should be plenty good enough for multiplayer? I seriously doubt the GT falls down to a min frame rate of 2.4 fps :)  In the time I've spent playing Crysis so far on an 8800GTX I have experienced no such weird frame rate drops...
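
For reference, the average column in that output is just frames divided by elapsed time (trivial Python check using the numbers above):

# Sanity check of the benchmark's average using the run above.
frames = 6921
time_ms = 60000

avg_fps = frames / (time_ms / 1000.0)
print(f"average FPS: {avg_fps:.3f}")   # 115.350, matching the Avg column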
November 13, 2007 2:01:17 PM

I personally believe that after a driver update it will knock off the 8800GT. Competition is always good for us (the consumers).
We all remember how bad the 2900XT was at first, but after a lot of driver updates it kept improving. Let's hope that the same thing will happen here.
November 13, 2007 2:08:30 PM

The 3870 is supposed to peak at 105 watts! That's around a third of the power my 2900XT uses, with the same or better performance. For the price, I can buy two for what I paid for this OC'd 2900XT and still be using 100 watts less under full load. That's simply huge. This is a no-brainer for me. I'm going 3870 ASAP and it will pay for itself in the reduction in my light bill. Plus I get all the goodies I expected to have in this 2900XT... mainly the badass UVD!

I couldn't care less if it's a few FPS slower than an 8800GT. These drivers are still fairly unoptimized and they keep maturing and improving results, so it's not even a gamble to go ATI at this point.
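
Rough back-of-the-envelope on the light bill part (every number here is just an assumption: the 100 W saving, my gaming hours, and the electricity rate):

# Hypothetical numbers only: the wattage saving, daily gaming hours and
# electricity price are assumptions, not measured figures.
watts_saved = 100          # assumed full-load difference vs the 2900XT
hours_per_day = 4          # assumed gaming time
price_per_kwh = 0.10       # assumed $/kWh

kwh_per_year = watts_saved / 1000.0 * hours_per_day * 365
savings_per_year = kwh_per_year * price_per_kwh
print(f"about {kwh_per_year:.0f} kWh/year saved, roughly ${savings_per_year:.2f}/year off the bill")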
November 13, 2007 2:30:03 PM

@wingless I agree with your idea.

Nice, video competition is back. May I ask about the ATI game incompatibility issue? I've heard of some bad ATI reputation with certain games; are those problems common or just rare cases?
November 13, 2007 2:44:38 PM

aznstriker92 said:
What I mean is that although ATI loses in framerates, it has better picture quality and smoother framerates.
Of course I'm not saying that the 8800 is bad, though.
Those two statements contradict each other. You cannot have a lower framerate and a smoother framerate. The higher the framerate, the smoother a game or application will appear and feel. Now, if you mean that ATI cards generally have a higher minimum framerate, that was also true in the past but has not been seen with the Geforce 8 series vs. the HD 2900 series.
November 13, 2007 3:01:44 PM

JAYDEEJOHN said:
There'll never be a 7.14 driver version. An 8.2, yes.


eh? why not? ...if not how does their numbering scheme go then??


"Nice video competition is back. May I ask about ATI game incompatibility issue? I heard some bad ATI reputation on some games, are they common or just rare cases?"

There were some compatibility issues with the HD2000 series when they were released, but as of Catalyst 7.10 - as far as I know - the only compatibility problems concern how well Crossfire works. Sometimes it works quite well, sometimes a bit poorly, sometimes not at all. My own pet theory, though I think I read it somewhere as well, is that previously all new drivers for the HD2000 cards were focused on improving compatibility and stability rather than performance, but Catalyst 7.10 was the first solid attempt to improve performance. So a lot is expected of Catalyst 7.11, due out very soon with the new Radeons.
November 13, 2007 3:52:18 PM

spoonboy said:
So a lot is expected of Catalyst 7.11, due out very soon with the new Radeons.


So have there been rumors that the 38x0's will be shipping with the catalyst 7.11, or just that they'll be here in the next few weeks?
November 13, 2007 3:59:30 PM

Lower frame rates with smoother graphics rendering, that is what should be said. Well, it was judged by the expert and it is possible.

http://iax-tech.com/video/3870/38704.htm

Frame rate is not about quality. You may reduce quality to increase frame rates. One reason why nVidia is always on top is that it reduces quality, even without your knowledge, just to get the necessary FPS, in contrast to ATI, which produces a good quality picture throughout.
November 13, 2007 4:24:04 PM

descalada said:
So have there been rumors that the 38x0's will be shipping with the catalyst 7.11, or just that they'll be here in the next few weeks?


No-one really knows. Apparently the suffix '.11' stands for the eleventh month, so that'll be this month. I'd say ATI want their new cards judged in the best possible light, so the new drivers will be out at least by the time the cards are. Hopefully before then, because Crysis full-game benchmarks are looming. Plus I'd like something interesting to read at work this week. :)

ATI seem to have been quiet regarding their upcoming launches; there's a lot of leaked this-and-that flying around but no big statements from them. Perhaps it's all supposed to be done as a surprise.

I should think sometime this week we'll see the new cards and drivers out.
November 13, 2007 4:31:27 PM

spoonboy said:
eh? why not? ...if not how does their numbering scheme go then??


"Nice video competition is back. May I ask about ATI game incompatibility issue? I heard some bad ATI reputation on some games, are they common or just rare cases?"

There were some compatibility issues with the HD2000 series when they were released, but as of Catalyst 7.10 - as far as I know - the only compatibility problems concern how well Crossfire works. Sometimes it works quite well, sometimes a bit poorly, sometimes not at all. My own pet theory, though I think I read it somewhere as well, is that previously all new drivers for the HD2000 cards were focused on improving compatibility and stability rather than performance, but Catalyst 7.10 was the first solid attempt to improve performance. So a lot is expected of Catalyst 7.11, due out very soon with the new Radeons.


I think it's year first, then month: 8.2 would be year 2008, month February (2nd month of the year).
November 13, 2007 8:36:20 PM

dtq said:
Unfortunately your maths isn't particularly well applied in this scenario; here's the reality of the situation at 1280x1024 with all low graphics:

Frames, Time (ms), Min, Max, Avg
6921, 60000, 97, 165, 115.350

A min frame rate of 97 from the 8800GTX should be plenty good enough for multiplayer? I seriously doubt the GT falls down to a min frame rate of 2.4 fps :)  In the time I've spent playing Crysis so far on an 8800GTX I have experienced no such weird frame rate drops...

Nice, but your information doesn't particularly fit the scenario of the 8800GT either. The GTX has a wider memory bus and more memory. Now, where does the 8800GT suffer where the GTX really shows its muscle? Are the drivers the same for the 8800GT and the 8800GTX? If so, should they be, for optimal performance?

2.4 FPS would be the mathematical worst case running at an average of 100 FPS. Most likely it would not go that low, but with the dip seen at the 16.77 FPS average it could easily drop below 10 FPS at, say, a 50 FPS average. That is not ideal and will need optimizing for the 8800GT.
November 13, 2007 9:18:46 PM

spoonboy said:
eh? why not? ...if not how does their numbering scheme go then??


"Nice video competition is back. May I ask about ATI game incompatibility issue? I heard some bad ATI reputation on some games, are they common or just rare cases?"

There were some compatibility issues with the HD2000 series when they were released, but as of Catalyst 7.10 - as far as I know - the only compatibility problems concern how well Crossfire works. Sometimes it works quite well, sometimes a bit poorly, sometimes not at all. My own pet theory, though I think I read it somewhere as well, is that previously all new drivers for the HD2000 cards were focused on improving compatibility and stability rather than performance, but Catalyst 7.10 was the first solid attempt to improve performance. So a lot is expected of Catalyst 7.11, due out very soon with the new Radeons.

7.12 will be the December driver, then it will go to 8.1 in January. The first number is the last digit of the year, i.e. 2007, and the second is the month number, i.e. January is month one and October is month 10.
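
In other words (a quick illustrative Python sketch; the "2000 + digit" mapping is just how it works out for this era of drivers, not anything official from ATI):

# Illustrative mapping of Catalyst version strings to release year/month,
# following the convention described above (7 -> 2007, 8 -> 2008, ...).
def catalyst_release(version: str):
    major, minor = version.split(".")
    year = 2000 + int(major)    # assumption: works for the 200x-era drivers discussed here
    month = int(minor)
    return year, month

for v in ("7.10", "7.11", "7.12", "8.1", "8.2"):
    print(v, "->", catalyst_release(v))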
November 13, 2007 9:20:11 PM

elbert said:
Nice, but your information doesn't particularly fit the scenario of the 8800GT either. The GTX has a wider memory bus and more memory. Now, where does the 8800GT suffer where the GTX really shows its muscle? Are the drivers the same for the 8800GT and the 8800GTX? If so, should they be, for optimal performance?

2.4 FPS would be the mathematical worst case running at an average of 100 FPS. Most likely it would not go that low, but with the dip seen at the 16.77 FPS average it could easily drop below 10 FPS at, say, a 50 FPS average. That is not ideal and will need optimizing for the 8800GT.


I'm using the current beta drivers, which are used for both the GTX and the GT. I haven't seen any other reviews picking up on this minimum FPS issue. The GTX is known to benefit most over the GT at high resolutions, and 1280x1024 doesn't come under that heading.

2.4 FPS is not the worst mathematically possible figure, and you don't have the information to hand to work out mathematically the minimum fps obtainable by turning down the graphics settings. Just because an average figure goes up by "x" amount doesn't mean that the min FPS follows the same curve! Until you've played with the graphics settings in Crysis it's hard to appreciate just how much difference those choices make. Crysis can kill graphics cards like no other game out there, BUT with settings turned down it runs as well as any other game, and LOW min FPS hasn't been reported for the 8800GT in general. I think what you are seeing here is more of a benchmark blip than a major issue with the 8800GT, especially as it's related specifically to beta drivers with some known issues; there are plenty of Crysis benchmarks being run on the 8800GT that AREN'T showing this bizarre min FPS issue!
November 13, 2007 10:10:18 PM

Ironnads said:
looks like the GT still owns it though..
Ryan adds


I have to agree with this. If you look closely at the charts, the HD 3870 does better when in a Crossfire configuration, but when it's single card against single card, the 8800GT does better most of the time. Of course, since the 8800GT is hard to get right now, at least we know that the HD 3870 is very competitive, provided it's available in sufficient numbers.