VR-Zone has a benchmark for Crysis running on an 8800GTX SLI setup, QX6700, 4GB RAM, and a physics card, and it only really seems playable at 1024x768. I think this is a crock, because what did Nvidia use to run this game in their demos, which obviously had higher resolutions than this? And the lead designer said at CES that Crysis was running at 1280x768 on one 8800GTX. It may just be, though, that with Nvidia's current drivers, SLI is a no-go in Crysis. Just thought you guys would like to know.
Linkage? Strange that VR-Zone would post something as questionable as that. I honestly don't think Crysis is at the stage where benchmarks can be run; driver issues are sure to be present, and last I heard they were still running through the code to optimize it for multithreading.
I dunno, sounds about right, and it's still too early to judge. Once Nvidia releases some new drivers for the 8800, and probably a set of tweaked ones at the launch of Crysis, you might see better performance.
I noticed a few times during the video that there was a slowdown in framerates. I thought then, ouch, if that thing stutters in a video demo, I might be hurting.
But why should it be a surprise? Far Cry (Crytek) pushed the gaming envelope in 2004, and even now, almost 3 years later, I can play Far Cry at max resolution on an X1900XTX and still drop down to 35 fps in some areas.
The CryEngine 2 is looking to make a name for itself, just like the Doom 3 engine and HL2. The game is written for the next 2-3 years. If the 8800GTX got 100fps on it all the way through, it wouldn't be much of a cutting-edge game.
Well, the thing is that the developers (Cevat Yerli, Jack Mamais) claim much higher performance with the system in DX10 and a single 8800GTX. That's all they were running at CES, and those videos had no slowdowns until explosions and such. All the slowdowns seem to come from the Crytek engine having problems with the physics engine. Remember, it's pre-alpha, so these things may be fixed. I'm thinking, though, that this is showing that SLI has its problems in DX10 right now.
I think I remember hearing that DX10 SLI uses a different driver than a single card. Also, Jack said that those CES demos were running at 60+ FPS.
Well, two things are of concern. Crytek said they are going with the GPU physics of HavokFX using their own build of the engine, not PPU-assisted physics.
Yet they turned physics on in the benchies and added a PhysX card, which may do nothing at all. That, to me, would mean they are using the other GPU for physics, which would explain performance lower than what you'd expect from full SLI (it also increases the load on the remaining card by adding physics effects).
"We will not support AGEIA PhysX since we are capable of creating our own very strong and competitive physics engine." - Crytek
That's all that they were running at CES and those videos had no slow downs until explosions and such.
Then you weren't watching the same demo I was. The one with the walk/talkthrough had a bunch of points where there was noticeable slowdown, like when he walked back into the building from outside. Also, from the look of it, it looked like 1280x720, not much higher than that. Looking at the 1280x1024 results, that seems about right.
What are you talking about? I read a majority of the threads here, and I don't ever see people gloating about how their 8800 can run Crysis or cutting-edge games and your card can't.
Looking downhill? Everything is looking downhill when you are on top of the performance mountain with an 8800GTX. Not sure what triggered your rant, but calm yourself a bit. Saying "It's just as I predicted," when you probably didn't predict it anywhere other than in your own hindsight bias, is starting to sound quite BaronMatrix-esque.
Whether the benchmarks are legitimate or not is questionable, but one thing to consider is that we have all seen the current benchmarks of games run on XP vs. Vista and the performance hit that came along with running Vista. That low FPS could easily be attributed to numerous problems, including:
1) the game isn't even finished yet; it's still what, 11 months away!
2) official ForceWare DX10 drivers haven't even been released yet
3) the game hasn't been optimized that well to run with Vista. I mean, as far as we know, they probably started developing this game with XP in mind, and now all of a sudden a whole new OS is to be used.
There are many other factors that could be included. Regardless, no one here should take any of those numbers seriously, since officially supported drivers aren't even out and the game hasn't even been released for beta testing.
Agreed. Thanks for expanding on my first reply. Though in a new thread it appears Vista gaming is (more or less) on par with XP now for the games that GameSpot benchmarked. Who knows which games will work well and which won't, but it's good to see optimized FPS running on the final released version of Vista.
A rather questionable benchmark indeed. For a start, they shouldn't need any physics card in the rig, because *Crysis uses its own physics engine*, which runs on the CPU: http://www.bit-tech.net/news/2006/05/11/crysis_playtest
You can find more on the net about Crysis using an in-house physics engine. This alone may be an indication that this whole benchmark thing is dubious.
It should also be noted that graphically, Far Cry was a very ambitious game, and it took some time before there were graphics cards that could run it smoothly at high resolutions. I wouldn't expect any less from Crysis based on the previous title. Let's face it: outdoor scenes are way harder to render than indoor scenes. Look at Oblivion. Nonetheless, the game is 11 months away, Vista was just released, and the drivers are still being refined. What I'd like to see is some DX9 benchies.