Crysis: Demo vs Full Version Performance

stemnin

Distinguished
Dec 28, 2006
1,450
0
19,280
OK, so I wasn't nuts. I haven't run any benchmarks, but I could have sworn the demo was much smoother. With everything on High except shadows and shaders on Medium, the demo ran well; the final version runs really badly, so I dropped everything to Medium and I'm still getting less than the demo gave me.

PS: Under the chart on the first page it says they perform the same, but it's a 4-5 fps difference at everything except the highest resolution. Is that really the same?

This isn't the first time I've played a demo that was way smoother than the final game.
 

cafuddled

Distinguished
Mar 13, 2006
906
1
18,985
Strange, I am using Vista x64 Ultimate and I get more performance in the full version than I did in the demo... Go figure. I even ran a benchmark on the GPU settings and I was getting 27 FPS, which is 5 FPS more than I got in the demo. But this could be down to the 169.09 Nvidia drivers rather than the full game. I can test this once I get home, as I still have the demo installed.
 

cruiseoveride

Distinguished
Sep 16, 2006
847
0
18,980
Crysis machine =

1x quad-socket Supermicro X7QC3 board, Price = $1,200
http://www.supermicro.com/products/motherboard/Xeon7000/7300/X7QC3.cfm

4x Xeon X7350 quads (16 cores total), Price = 4x $3,000 = $12,000
http://www.intel.com/products/processor/xeon7000/specifications.htm?iid=products_xeon7000+tab_specs

16x Crucial 4GB DDR2 sticks (64GB RAM), Price = 16x $600 = $9,600
http://www.crucial.com/store/partspecs.aspx?IMODULE=CT51272AF53E

2x Quadro FX 5600, Price = 2x $3,000 = $6,000
http://www.nvidia.com/object/quadro_fx_5600_4600.html

Total = 1200 + 12000 + 9600 + 6000 = $28,800.00
+ a case and PSU(s); round it off at $30,000.00
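
If anyone wants to sanity-check that tally, here it is in a few lines of Python (prices are the list prices above; the case/PSU padding is just my round-up, not a quote):

    parts = {
        "Supermicro X7QC3 board":   1200,
        "4x Xeon X7350":        4 * 3000,
        "16x Crucial 4GB DDR2": 16 * 600,
        "2x Quadro FX 5600":     2 * 3000,
    }
    subtotal = sum(parts.values())
    print(f"Parts subtotal: ${subtotal:,}")              # $28,800
    print(f"With case + PSU(s): ~${subtotal + 1200:,}")  # ~$30,000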

And if that doesn't totally blow Crysis away, then EA must have written the entire thing in Java or something.
 

Wobbly

Distinguished
Nov 16, 2007
5
0
18,510
The missing destruction that the article alluded to when all settings are at Low can be restored by changing Physics to Medium.

With these settings and no AA, I get 50-55 fps... AMD Athlon 64 X2 3800+ Brisbane (auto-overclocked 8% using the ASUS BIOS OC utility), X1900 GT, 2GB of DDR2-800, XP 32-bit.

AA makes it absolutely unplayable :fou:
 

blueeyesm

Distinguished
Feb 24, 2006
188
0
18,680
cruiseoveride,

Why did you list Quadros? They are optimized for CAD work more than for 3D gaming.

Also, would Crysis actually utilize 16 cores? It may run lovely with 4GB of RAM, but at 64GB you could host the entire OS plus a variety of apps in a 40GB RAM-drive image and still have plenty of memory doing nothing.
 

stemnin

Distinguished
Dec 28, 2006
1,450
0
19,280


Don't you know?

http://www.newegg.com/Product/Product.aspx?Item=N82E16814133189&Tpk=quadro%2bfx5500

800+ FPS in BF2 and CRYSIS LMAO. "Tech Level High"
 

blueeyesm

Distinguished
Feb 24, 2006
188
0
18,680
stemnin:

Ohhhhh... 800?? Well, there ya go.

Money well spent, then.


Agreed, cafuddled. In fact, I had it running on a 1GHz Pentium M tablet with 512MB RAM for a while.
 

caamsa

Distinguished
Apr 25, 2006
1,830
0
19,810
OK, how about some benchmarks of the full game with single-, dual- and quad-core CPUs... :D

Don't forget the latest video cards... and some old ones as well.
 

belgolas

Distinguished
Jun 16, 2007
4
0
18,510
You would find that Crysis is not optimized for quads yet: another guy disabled 3 of his cores and got the same FPS as with all 4. Plus, SLI isn't supported for Crysis yet.
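
If you want to try that core-disabling test yourself, you can pin the game to a single core and compare FPS against an unrestricted run. A rough sketch using Python's psutil (the process name is a guess; check Task Manager for the real one) — right-clicking the process in Task Manager and using Set Affinity does the same thing:

    import psutil

    GAME_EXE = "Crysis.exe"  # hypothetical name; verify in Task Manager

    for proc in psutil.process_iter(["name"]):
        if proc.info["name"] == GAME_EXE:
            proc.cpu_affinity([0])  # restrict the game to core 0 only
            print(f"Pinned PID {proc.pid} to core 0; now re-run the benchmark")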
 

pauldh

Illustrious
I'm glad you tested this. I'm curious: did you re-test the demo with the new drivers like the full game, or are those the old results with the old drivers? Drivers alone could result in plus or minus a couple of fps.
 

V8VENOM

Distinguished
Dec 31, 2007
914
14
18,995
So let me get this straight (as if I didn't know)... 10.1 fps at 1600x1200 4xAA DX10 on a reasonably good PC. So even if hardware were available to double the "test PC" performance (on all fronts), we'd still only be at 20.2 fps at a miserable 1600x1200 4xAA.

So basically don't even think about running at 1920x1080, 1920x1200, or 2560x1600 -- has anyone tested the release version at these resolutions? What about 8xAA or 16xAA -- don't even bother? I know my MP beta was a complete dog at these resolutions (as in single-digit fps).

Also, I consider 40 fps the minimum for any twitch game, so to get to 40 fps with details maxed we need 4x the current processing power of that test rig. Even on an incredibly optimistic front, with Intel's new CPUs and the hopefully-soon-to-be-released 9800 GTX (with a ton of overclocking tossed at it), we may see a 2x improvement, which is still only 20.2 fps.
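
Here's the back-of-the-envelope math, assuming (big assumption) that hardware performance roughly doubles every couple of years:

    import math

    baseline_fps = 10.1   # 1600x1200, 4xAA, DX10, from the article
    target_fps = 40.0     # my minimum for a twitch shooter

    speedup = target_fps / baseline_fps
    print(f"Required speedup: {speedup:.1f}x")   # ~4.0x

    years_per_doubling = 2  # optimistic guess, not a law
    print(f"~{math.log2(speedup) * years_per_doubling:.0f} years away")  # ~4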

I suppose I should be glad my ATI 2900 XT 1GB CrossFire cards run this just as badly as an 8800 GTX.

So I'm guessing the hardware to actually play this game at higher resolutions (1920+) at 40 fps is at least 4 years away? It's pretty unlikely I will be playing Crysis 4 years or so from now.

Has anyone done tests with throwing more processors at it? At what point does one hit diminishing returns from more processors?

Anyway... if nothing else, it makes a good benchmarking tool for several years to come.

 

inthere

Distinguished
Jul 28, 2006
132
0
18,680
A couple of reviewers on this link http://www.newegg.com/Product/Product.aspx?Item=N82E16814133189&Tpk=quadro%2bfx5500 are saying the Nvidia Quadro 5500 is running Crysis at 800 fps at 1680x1250 res; I think they mean 1680x1050, though. If that's true, I don't see 2560x1600 being much of a problem.

There are rumors the Quadro FX 5600 will be an option for Mac Pros soon; it's unconfirmed, but it seems pretty likely. Also, Crysis is making use of extra cores, so 8 cores on a Mac Pro will be pretty nice.

It just sucks that ATI is not competing at the high end anymore. If gamers find out Quadros run Crysis super fast, Nvidia gets to charge $2,500 per graphics card to at least 10 times their present Quadro user base. If ATI had released something that smoked the 8800 GTX in time for Christmas, we might have had a GeForce 9800 as powerful as the Quadro for about $600-700.

What I'm saying is, if ATI had something faster than the 8800, that Quadro FX 5600 that was just released in October could easily have been the GeForce 9800 GTX.
 

emp

Distinguished
Dec 15, 2004
2,593
0
20,780
Quadros are made for reliability, precision, and quickly rendering multiple views; GeForces are made for super-fast single-app rendering (precision here is not a must). People really need to stop thinking of Quadros as super-powerful cards just because of their price tag...
 

pip_seeker

Distinguished
Feb 1, 2006
437
0
18,780
This game is sooo much fun. Right now it's eerily similar to Far Cry, but with more options/tactics for killing the bad gize.

I really love the "cloak"; some of the stuff the bad gize say when they come looking for you is hilarious.

As far as replayability goes, the jury is still out on that for me. I'm at a point in the game where I "suspect" my previous actions have influenced the game to act differently.

It's either that or it's a bug. I don't suspect a bug at this point because I'm still blasting the commy b@st@rds. :lol:

Additionally, I am playing this on a 7600 GS with all fresh drivers, so don't think for one minute this game can't be played on lower-end hardware. Just realize you will have to sacrifice some resolution and other higher-end eye candy to do it. Even so, it's still a very fun game. I have had some hiccups, but mostly where the game saves itself at checkpoints.

Spec of my system:
Win XP Pro SP2 + all patches
AMD Athlon 64 X2 3800+ [Socket 939]
1GB RAM
Nvidia GeForce 7600 GS [+ new drivers]
Resolution : 1024x768
Settings: [let game decide / default]

I have plans to rebuild, but I've decided to go with the GeForce 8800 GT, and it makes no sense to build until I can get the card I want, so I'll wait for now.

It's really a fun game; I absolutely love it so far. If you liked Far Cry, you will like it too.

If you didn't like Far Cry, then skip it. This game will be talked about for some time to come; I really think it's a sign of good things to come for all of us gamers, and more specifically for single-player games.

Rock on Crytek! :bounce:
 

kcantrel

Distinguished
Oct 31, 2007
4
0
18,510
Maybe it is just me, but I don't think it was really utilizing all the cores. If it were, all the lines would have been at the top. What I see is that when the yellow line is high, all the others are low; eventually they all get busy, but each one only at about 25-30%. It would have been nice if an overall CPU utilization line had also been shown. So it looks like a dual core, one core for the game and one for the background stuff, is all that is needed.
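
To put a number on it: the overall utilization line they left out is just the average of the per-core lines, and four cores at ~25-30% each is only about one busy core's worth of work (the sample values below are eyeballed from the chart, not measured):

    per_core = [30, 25, 28, 27]                # eyeballed per-core percentages
    overall = sum(per_core) / len(per_core)
    busy_cores = overall / 100 * len(per_core)
    print(f"Overall CPU: {overall:.0f}%")              # ~28%
    print(f"Equivalent busy cores: {busy_cores:.1f}")  # ~1.1 of 4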