
Post your DX10 Lost Planet Benchs

Last response: in Graphics & Displays
May 16, 2007 4:34:34 AM

I just finished downloading; it took forever and wouldn't go over 60K. The servers must be flooded.

The graphics are a step above DX9, that's for sure.

E6400 @ 3.6GHz, 8800GTX x 2 SLI, 158.42 drivers from guru3d.com

It scored 58FPS average in the Snow test and 55FPS in the Cave at 1920x1080, 32-bit (A2RGB10).

Here are my bench results at these (default) settings:
FPS = OFF
AntiAlias = 4X
HDR = Medium
Texture Filter Anisotropic = 16X
Texture Resolution = High
Model Quality = High
Shadow Quality = Medium
Shadow Resolution = Default
Motion Blur Quality = High
Effect Resolution = High
Effect Quality = High
Effect Volume = High
Lighting Quality = High
Display Resolution = 1920x1080
Frequency = 60Hz
Full Screen = ON
Vertical Sync = OFF
Aspect Correction = OFF
Concurrent Operations = 2
Concurrent Rendering = ON
MultiCPU = OFF (?, don't know what this meant)

It was almost dizzying and seemed to surge, like it was struggling to load those massive textures, even though the FPS stayed high.
May 16, 2007 5:11:09 AM

8O
Looks pretty low for an 8800GTX x 2 SLI setup; looks like it will be unplayable for 8800GTS users at that resolution :( 
May 16, 2007 5:17:42 AM

I thought there was an issue with G80 SLI and Vista? Well, I'm sure waiting for refreshes.
May 16, 2007 10:53:58 AM

Quote:

MultiCPU = OFF (?, don't know what this meant)


It's not MultiCPU, it's Multi-GPU. Turn it ON since you have an SLI config; you should get a lot more FPS.
May 16, 2007 11:39:35 AM

Could you run the same config in DX9 mode and give us the benches?
May 16, 2007 12:19:50 PM

Quote:
Could you run the same config in DX9 mode and give us the benches?


I was going to do that with DX9 in XP, but when I started the download for both of them at the same time the speed dropped to 24K 8O. I would probably still be downloading.

I'll get those tonight; maybe the servers will be faster or some mirrors will pop up.

Just doing this out of curiosity. I'm still an XP SP2 user; Vista Ultimate is just a testbed for Crysis as far as I'm concerned.
May 16, 2007 12:48:38 PM

You say you ran it at 1920x1080, 32-bit (A2RGB10). Did you use the config hack? That's not one of the defined resolutions.

Also, did you use one of the other SLI profiles for this game, as was required in the past for Vegas and the like?

Your overclock on the 6400 seems low for water cooling. Not pushing you to change it, but I'm running mine at 2.7 on air and I know it will easily go over 3.0 on air, no problem; just saying you might eke out some more FPS.

I'm just curious, as I don't even have an 8800 or 2900 for DX10, but is there a DX10 control panel as there was in DX9, and if so, how is it launched?
May 16, 2007 12:56:14 PM

snow 45 cave 56

e6600 @3.45 (sorry, said 4.5 earlier)
8800 gts (640) stock settings
2 gig ram
vista
dx 10

default settings on game test 1280, most options on high or medium

The game settings menu is buggy for me; when I move the mouse over it, the menu goes haywire and it's almost impossible to make any changes.

Also, my keyboard won't work in game so I can't actually play it.

Looks good though
May 16, 2007 1:03:27 PM

Quote:
You say you ran it at 1920x1080, 32-bit (A2RGB10). Did you use the config hack? That's not one of the defined resolutions.

Also, did you use one of the other SLI profiles for this game, as was required in the past for Vegas and the like?

Your overclock on the 6400 seems low for water cooling. Not pushing you to change it, but I'm running mine at 2.7 on air and I know it will easily go over 3.0 on air, no problem; just saying you might eke out some more FPS.

I'm just curious, as I don't even have an 8800 or 2900 for DX10, but is there a DX10 control panel as there was in DX9, and if so, how is it launched?


Not sure what you mean by config hack. This is the resolution the game auto-configured itself to: 1920x1080. It's my desktop resolution. This may be a fix in the 158.42 drivers.

For SLI I just enabled SLI in the Nvidia control panel; again, it's part of the .42 drivers.

I may try 3.7 or higher on the C2D tonight and try enabling the Multi-GPU option to see if it helps.

Also, I don't have RivaTuner installed in Vista. With it I can set my GTX fans to 100% and my cards to 649 GPU / 2052 memory, which would likely give me a 10FPS boost all around. The cards are running stock in Vista.
May 16, 2007 1:23:48 PM

Quote:
snow 45 cave 56

e6600 @4.5
8800 gts (640) stock settings
2 gig ram
vista
dx 10

default settings on game test 1280, most options on high or medium

The game settings menu is crazy for me, when I move mouse over it the menu goes crazy and it is almost impossible to make any changes.

Also, my keyboard won't work in game so I can't actually play it.

Looks good though



E6600 at 4.5? How did you manage that? If it's true, that's pretty impressive, but I doubt it's stable.
May 16, 2007 1:29:35 PM

SLI is currently not supported under DirectX 10, which would explain why your framerate wasn't phenomenal.
May 16, 2007 1:49:25 PM

Quote:
snow 45 cave 56

e6600 @4.5
8800 gts (640) stock settings
2 gig ram
vista
dx 10

default settings on game test 1280, most options on high or medium

The game settings menu is crazy for me, when I move mouse over it the menu goes crazy and it is almost impossible to make any changes.

Also, my keyboard won't work in game so I can't actually play it.

Looks good though



E6600 at 4.5? How did you manage that? If it's true, that's pretty impressive, but I doubt it's stable.

Yeah I saw that too and was wondering the same thing.
May 16, 2007 1:54:41 PM

Quote:
snow 45 cave 56

e6600 @4.5
8800 gts (640) stock settings
2 gig ram
vista
dx 10

default settings on game test 1280, most options on high or medium

The game settings menu is crazy for me, when I move mouse over it the menu goes crazy and it is almost impossible to make any changes.

Also, my keyboard won't work in game so I can't actually play it.

Looks good though



E6600 at 4.5? How did you manage that? If it's true, that's pretty impressive, but I doubt it's stable.

Yeah I saw that too and was wondering the same thing.

He also doesn't give the resolution he ran at, so it's not really helpful.
May 16, 2007 2:12:46 PM

My bad. That would be 3.45 :wink:
I'm a little dyslexic with numbers.

Resolution is 1280 x 740.
May 16, 2007 2:14:43 PM

I don't have the big V myself, but I'll download the demo when I go home, for all us XP fans out there. Granted, I'm mainly just waiting for Vista's first service pack. I'll let you guys know how it goes.

I'm also sporting an E6600 at 3.5 on a Tuniq Tower and I have an 8800GTX, so I should be able to run it well on XP at the very least. Granted, I'm not a fan of console-to-PC ports, but I'll check it out nonetheless.
May 16, 2007 2:18:07 PM

Quote:
snow 45 cave 56

e6600 @4.5
8800 gts (640) stock settings
2 gig ram
vista
dx 10

default settings on game test 1280, most options on high or medium

Good show for an 8800GTS 640MB. :D 
Still far better than the HD2900XT's 16FPS. :roll:
I know I'm going to get flamed by ATI fanboys! :?
May 16, 2007 2:33:16 PM

1600 x 1000 snow 30 cave 44
1280 x 720 snow 45 cave 56
1280 x 720 (oc 614/893) snow 56 cave 61
1280 x 960 (oc 617 x 921) snow 45 cave 61


e6600 @ 3.45
2 gig ram
vista dx 10
gts 8800 (640) stock

The 1600 x 1000 was the highest option in the menu for me.
May 16, 2007 2:39:24 PM

Quote:
1600 x 1000 snow 30 cave 44
1280 x 720 snow 45 cave 56

e6600 @ 3.45
2 gig ram
vista dx 10
gts 8800 (640) stock


I don't know about you, but I thought it was a pretty heavy-duty bench/demo. The cards take a beating in the snow areas. There is just TONS of snow falling and blowing around; you can hardly see the creatures coming at you at times.

I think if I were to play the game I would have to tone down all the falling and swirling snow, and I'm not sure about the motion blur yet either. It's kind of nauseating. Those two things alone would likely push FPS up, I would think.
May 16, 2007 2:44:58 PM

Quote:
1600 x 1000 snow 30 cave 44
1280 x 720 snow 45 cave 56

e6600 @ 3.45
2 gig ram
vista dx 10
gts 8800 (640) stock


I don't know about you, but I thought it was a pretty heavy-duty bench/demo. The cards take a beating in the snow areas. There is just TONS of snow falling and blowing around; you can hardly see the creatures coming at you at times.

I think if I were to play the game I would have to tone down all the falling and swirling snow, and I'm not sure about the motion blur yet either. It's kind of nauseating. Those two things alone would likely push FPS up, I would think.

I think I'd have to play the game a bit before I could decide; just watching the demo, I had no problem. My keyboard won't work in-game or in the menu. Did you have any problem like that?

I've got an old Microsoft cordless keyboard. I'll try a basic PS/2 keyboard later.
May 16, 2007 3:17:21 PM

I think all the people with DX10 got fucked! They can't even run Lost Planet with all details high and some filters at a decent resolution above 40FPS, so what are you going to do when Crysis or Alan Wake comes out?! Did you see any screenshots from Crysis? They look ten times better than Lost Planet, so how are you going to run that?
May 16, 2007 3:17:41 PM

The game won't launch for me. AHHHH, and I had to go to work. You guys aren't helping my anticipation for this! I'm getting DX10 errors; not sure why. I'll figure it out later.
May 16, 2007 3:31:13 PM

I'm speechless after seeing all of your benchmarks; even with a high-end computer this game is so demanding, and I think Crysis will be even more demanding. Is it the video cards that are lacking now, or the CPU? DX10 games are crazy; maybe I should wait for the 2nd-gen DX10 cards now!!
May 16, 2007 4:13:54 PM

Quote:
You say you ran it at 1920x1080, 32-bit (A2RGB10). Did you use the config hack? That's not one of the defined resolutions.

Also, did you use one of the other SLI profiles for this game, as was required in the past for Vegas and the like?

Your overclock on the 6400 seems low for water cooling. Not pushing you to change it, but I'm running mine at 2.7 on air and I know it will easily go over 3.0 on air, no problem; just saying you might eke out some more FPS.

I'm just curious, as I don't even have an 8800 or 2900 for DX10, but is there a DX10 control panel as there was in DX9, and if so, how is it launched?


Not sure what you mean by config hack. This is the resolution the game auto-configured itself to: 1920x1080. It's my desktop resolution. This may be a fix in the 158.42 drivers.

For SLI I just enabled SLI in the Nvidia control panel; again, it's part of the .42 drivers.

I may try 3.7 or higher on the C2D tonight and try enabling the Multi-GPU option to see if it helps.

Also, I don't have RivaTuner installed in Vista. With it I can set my GTX fans to 100% and my cards to 649 GPU / 2052 memory, which would likely give me a 10FPS boost all around. The cards are running stock in Vista.
I'm using the latest official Nvidia drivers for the 7x series, 158.18 I think. The game's max resolution for me is only just above 1000 even though my desktop resolution is set to 1680 x 1050, so I'm not sure if yours is actually set to that or not. Take a screenshot, paste it into Paint, and check the resolution.

Anyway, the hack is:

"i found a way how to add custom res. if u on vista go to C:\Users\YourName\AppData\Local\capcom\lostplanetTrial then look for config and put any res u want and save read only if u want to. DO NOT GO TO OPTION MODE IF U PUT THE CUSTOM RES." - taken from the Lost Planet thread on guru3d; not sure if I should be linking to other forums or not, so I'll not post the link.

I'm asking you these questions because I'm thinking of getting a GTX within the next 2 months, and I'd like to know if these DX10 games will actually be playable.

As far as I know, setting SLI in the Nvidia control panel just sets generic SLI, but in the past, when a profile wasn't available for a game (FEAR when it came out, and Rainbow Six Vegas), some people used SLI profiles from other games to improve performance. I can't check it now; I was just wondering if you've tried anything like that? Also, this being a DX10 game, it may not be possible to use DX9 SLI profiles with it. Bleeding edge, eh? lol
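For anyone scripting that config tweak instead of editing by hand, here's a minimal Python sketch. The path comes from the guru3d quote above, but the file name (`config.ini`) and the `Width`/`Height` key names are my assumptions; check your own config file before trusting it:

```python
import re
from pathlib import Path

# Hypothetical file name and key names -- verify against your own config;
# only the directory comes from the guru3d post quoted above.
CONFIG = Path.home() / "AppData" / "Local" / "capcom" / "lostplanetTrial" / "config.ini"

def set_resolution(text: str, width: int, height: int) -> str:
    """Rewrite Width/Height entries in key=value style config text."""
    text = re.sub(r"(?m)^(Width\s*=\s*)\d+", rf"\g<1>{width}", text)
    text = re.sub(r"(?m)^(Height\s*=\s*)\d+", rf"\g<1>{height}", text)
    return text

if CONFIG.exists():
    CONFIG.write_text(set_resolution(CONFIG.read_text(), 1920, 1200))
    CONFIG.chmod(0o444)  # mark read-only so the game can't overwrite it
```

As the quoted tip warns, don't open the in-game options screen after forcing a custom resolution this way.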
May 17, 2007 12:56:16 AM

Quote:
I'm using the latest official Nvidia drivers for the 7x series, 158.18 I think. The game's max resolution for me is only just above 1000 even though my desktop resolution is set to 1680 x 1050, so I'm not sure if yours is actually set to that or not. Take a screenshot, paste it into Paint, and check the resolution.

Anyway, the hack is:

"i found a way how to add custom res. if u on vista go to C:\Users\YourName\AppData\Local\capcom\lostplanetTrial then look for config and put any res u want and save read only if u want to. DO NOT GO TO OPTION MODE IF U PUT THE CUSTOM RES." - taken from the Lost Planet thread on guru3d; not sure if I should be linking to other forums or not, so I'll not post the link.

I'm asking you these questions because I'm thinking of getting a GTX within the next 2 months, and I'd like to know if these DX10 games will actually be playable.

As far as I know, setting SLI in the Nvidia control panel just sets generic SLI, but in the past, when a profile wasn't available for a game (FEAR when it came out, and Rainbow Six Vegas), some people used SLI profiles from other games to improve performance. I can't check it now; I was just wondering if you've tried anything like that? Also, this being a DX10 game, it may not be possible to use DX9 SLI profiles with it. Bleeding edge, eh? lol


I will have to take a screenshot later, as I am back on XP currently. Again, the game in both XP DX9 and Vista Ultimate DX10 shows my resolution as 1920x1080, so I still don't understand the limitation problem. Here are the screenshots from XP:

I have since rerun my tests with Multi-GPU enabled, and this is my final score for Vista DX10, with the driver updated to 158.43 as recommended by the Capcom site: Snow 65, Cave 57 average FPS. Not a huge improvement, but better.

I have finished the XP tests, also with Multi-GPU enabled, on driver 158.19: Snow 71 and Cave 62 average FPS. Performance is noticeably better.

I have to apologize to someone here; I may have been snotty about the visual difference between DX10 and DX9. There really isn't a huge difference. Vista is still a resource hog, and the game is much smoother and more fluid on XP. The score differences between XP and Vista don't indicate just how much more fluid the game is in XP.
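To put those XP-vs-DX10 scores in perspective, they can be expressed as a percentage gap. A quick Python sketch using the averages from the post above:

```python
def pct_gain(xp_fps, vista_fps):
    """Percent FPS advantage of the XP/DX9 run over the Vista/DX10 run."""
    return round((xp_fps - vista_fps) / vista_fps * 100, 1)

# Average FPS reported above, both runs with Multi-GPU enabled.
print("Snow:", pct_gain(71, 65), "%")  # XP 71 vs Vista 65
print("Cave:", pct_gain(62, 57), "%")  # XP 62 vs Vista 57
```

So the raw scores put XP roughly 9% ahead in both scenes, which understates the smoothness difference the poster describes.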
May 17, 2007 1:25:40 AM

I wasn't all that impressed either; it felt like an Xbox game and certainly not something I would spend money on. COJ was far more fun, and I wouldn't waste money on that either. The washed-out overexposure was the final nail in the coffin for me. When is that Crysis demo coming out already?


Vista 64, 1600x1200, 45-ish FPS in the snow with the recommended settings.
May 17, 2007 1:27:02 AM

Quote:
I'm speechless after seeing all of your benchmarks; even with a high-end computer this game is so demanding, and I think Crysis will be even more demanding. Is it the video cards that are lacking now, or the CPU? DX10 games are crazy; maybe I should wait for the 2nd-gen DX10 cards now!!


I don't know what engine Lost Planet runs on, but I haven't really seen a good PC game from Capcom; even in-game there are A and B buttons from the Xbox, so I doubt the game is really optimized for PC/DX10 compared to what Crysis will be. I can guarantee that Crysis will require less than Lost Planet and play a lot faster with better quality.
May 17, 2007 2:41:37 AM

Using all defaults except 1280x960 resolution under DX10/Vista32, I got some weird results -
With my CPU OC'd to 3660, 8800GTS-320 @ 580 core/ 1700 memory:
snow 42, cave 31

With my CPU and GPU at stock speed (2667 CPU, 513/1580 GPU):
snow 61, cave 53

WTF??? Every other game I have (especially Oblivion) goes up in response to overclocks, but this one went down?! In both stock and overclocked configurations, the CPU usage was fairly low, under 50% the whole time.

Incidentally, even with the semi-respectable framerates, the game didn't "feel" smooth at all when overclocked. It "felt" about like Oblivion did on my old P4 with a 9800 Pro, even though the numbers said it was getting 4 times the FPS of that old setup.
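One way to rule out run-to-run variance before blaming the overclock for results like these is to repeat each scene a few times and compare means and spreads. A minimal Python sketch; the FPS values below are placeholders, not real measurements:

```python
from statistics import mean, stdev

def summarize(runs):
    """Summarize repeated benchmark passes (FPS values) for one scene."""
    return {"mean": round(mean(runs), 1),
            "stdev": round(stdev(runs), 1) if len(runs) > 1 else 0.0}

# Example numbers only -- substitute your own Snow/Cave results.
snow = [42, 44, 41]
cave = [31, 33, 30]
print("snow", summarize(snow))
print("cave", summarize(cave))
```

If the standard deviation across passes is larger than the stock-vs-overclocked gap, the "overclock made it slower" result may just be noise.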
May 17, 2007 3:10:21 AM

Quote:
I'm using the latest official Nvidia drivers for the 7x series, 158.18 I think. The game's max resolution for me is only just above 1000 even though my desktop resolution is set to 1680 x 1050, so I'm not sure if yours is actually set to that or not. Take a screenshot, paste it into Paint, and check the resolution.

Anyway, the hack is:

"i found a way how to add custom res. if u on vista go to C:\Users\YourName\AppData\Local\capcom\lostplanetTrial then look for config and put any res u want and save read only if u want to. DO NOT GO TO OPTION MODE IF U PUT THE CUSTOM RES." - taken from the Lost Planet thread on guru3d; not sure if I should be linking to other forums or not, so I'll not post the link.

I'm asking you these questions because I'm thinking of getting a GTX within the next 2 months, and I'd like to know if these DX10 games will actually be playable.

As far as I know, setting SLI in the Nvidia control panel just sets generic SLI, but in the past, when a profile wasn't available for a game (FEAR when it came out, and Rainbow Six Vegas), some people used SLI profiles from other games to improve performance. I can't check it now; I was just wondering if you've tried anything like that? Also, this being a DX10 game, it may not be possible to use DX9 SLI profiles with it. Bleeding edge, eh? lol


I will have to take a screenshot later, as I am back on XP currently. Again, the game in both XP DX9 and Vista Ultimate DX10 shows my resolution as 1920x1080, so I still don't understand the limitation problem. Here are the screenshots from XP:

I have since rerun my tests with Multi-GPU enabled, and this is my final score for Vista DX10, with the driver updated to 158.43 as recommended by the Capcom site: Snow 65, Cave 57 average FPS. Not a huge improvement, but better.

I have finished the XP tests, also with Multi-GPU enabled, on driver 158.19: Snow 71 and Cave 62 average FPS. Performance is noticeably better.

I have to apologize to someone here; I may have been snotty about the visual difference between DX10 and DX9. There really isn't a huge difference. Vista is still a resource hog, and the game is much smoother and more fluid on XP. The score differences between XP and Vista don't indicate just how much more fluid the game is in XP.
Very nice, many thanks for the honest impressions, because you've basically confirmed what we've all said all along: a smoother game is better than all the nice stuff, and it seems that while static DX10 screens look great, in motion the difference from DX9 is negligible, for the moment at least (we can't confirm this game's optimisation level, but judging by Capcom's previous PC conversion titles, hmph...).

I just figured it out: the res you're playing at is the max determined by your screen in HD mode. I hope future Vista games won't be like that and will hopefully allow the rest of us to set our own native resolutions.
May 17, 2007 3:29:21 AM

Here's a tip from TweakGuides:
Quote:
If you're experiencing a hard lock crash/freeze during the game, try setting Concurrent Rendering to Off. This can fix the problem and shouldn't noticeably reduce FPS.


This is a great site. They're going to have a tweak guide for the demo soon. Here's the link: http://www.tweakguides.com/ It may help a little for some of you.
May 17, 2007 3:51:17 AM

E6300 @ 1.86GHz and an 8800GTS Superclocked @ 1280x1024 with a 19" LCD

Snow 51fps Cave 37fps

Much room to overclock .. 8)

158.22 Nvidia Drivers
May 17, 2007 4:44:02 AM

Does anyone here know how to get this demo to install on a system running Windows XP 64 Pro?
May 17, 2007 4:50:27 AM

Quote:
Does anyone here know how to get this demo to install on a system running Windows XP 64 Pro?

You can't use the DX10 version on XP; it's Vista-only. The DX9 version should work.
May 17, 2007 1:01:49 PM

That is the problem, though. I downloaded the DirectX 9 version and it won't install; it says it's not compatible with the operating system.
May 17, 2007 1:09:48 PM

Quote:
That is the problem, though. I downloaded the DirectX 9 version and it won't install; it says it's not compatible with the operating system.


I installed the DX9 version last night to compare with Vista DX10. I received a "DX9 file not found/missing" type of error when I tried to run it. I did a Google search on the error text and found that an updated version of DX9 has to be downloaded from Microsoft and installed before I could run the game/demo/bench.

I don't know if this is related, since the XP DX9 version installed fine; it was only when I tried to run it that the error appeared.
May 18, 2007 11:10:17 PM

Release Notes:

Quote:
* Please note this driver only supports the GeForce 8800 series GPUs
* Improved 3D performance for single GPU and NVIDIA SLI
* Add support for NVIDIA SLI on DirectX 10
* Numerous game and application compatibility fixes (see below for some of the top issues)
* Updated NVIDIA Control Panel with improved user interface (Please see the Release Notes for more details).
* Numerous game and application compatibility fixes.
* This driver supports the following 3D features:
o Single GPU support
+ DirectX 9 support for GeForce 6/7/8 series GPUs
+ DirectX 10 support for GeForce 8 series GPUs
+ OpenGL support for GeForce 6/7/8 series GPUs
o NVIDIA SLI support
+ DirectX 9 support for 6/7/8 series GPUs
+ DirectX 10 support for 8800 series GPUs
+ OpenGL support for 6/7/8 series GPUs
* DirectX 10 NVIDIA SLI support for GeForce 8 series GPUs will be available in a future driver
* If you would like to be notified of upcoming drivers for Windows Vista, please subscribe to the newsletter


Is it just me, or does that make no sense?

When you were benchmarking Lost Planet, how sure are you it was truly using both your cards? Did you test with only one card, or with SLI disabled, to see if there was any difference?
May 19, 2007 3:59:29 AM

I let the Lost Planet Performance Test play for about 15 minutes.
Default settings




Max Settings (I guess)




I played with the AntiAlias C16XQ setting (I guess that's Coverage Sample 16xQ) in the LP demo game mode, and it locked up with boxes of LP graphics thumbnails (no artifacts) after defeating the first bug "boss". I had to shut down and enter the BIOS to get my settings back.

This game reminds me of the movie Starship Troopers.
Anonymous
May 19, 2007 4:31:04 AM

I got 7 to 8 FPS with an 8800GTS, using drivers 158.18. The strange thing is my CPU usage was just 30 to 40%.
Anonymous
May 19, 2007 4:32:44 AM

That's the Multi-GPU option, not CPU. You should enable it; you have two 8800GTX cards.
Anonymous
May 19, 2007 4:37:29 AM

Didn't see any ATI 2900XT benches, lol. What happened?
May 19, 2007 2:22:36 PM

Here's my test (E6700 stock, 8800GTX, 4GB, Vista x64 Ultimate)

Snow: 36, Cave: 49
...1600x1200, 4x AA, 16x AF, Medium HDR, Shadows Low. Everything else Maxed out. Concurrency set to 2.

It seems that AA is the big killer on the Snow level. If I disable AA, I get Snow: 50, Cave: 51.

Tried turning off shadows, but surprisingly only got a 2 FPS increase. Turning HDR from Medium to Low doesn't do much either.


Oddly, I noticed that during the Cave test I could hear my CPU fan speed up. So I looked at the performance graph in Task Manager after the test (I had left it running in the background). It appears that even though the game's Concurrency setting was at 2, my utilization during the Snow test was 50% but increased to 100% during the Cave test. So the concurrency setting must only be used for certain effects in certain levels.
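If you log timestamped CPU-usage samples while the test runs (from Task Manager, a logger, whatever), you can average them per phase instead of eyeballing the graph. A small Python sketch; the sample data is made up to mirror the 50%/100% observation above:

```python
def phase_utilization(samples, phases):
    """Average CPU-utilization samples per named benchmark phase.

    samples: list of (timestamp_sec, cpu_percent) pairs
    phases:  list of (name, start_sec, end_sec) tuples
    """
    out = {}
    for name, start, end in phases:
        vals = [u for t, u in samples if start <= t < end]
        out[name] = round(sum(vals) / len(vals), 1) if vals else None
    return out

# Toy samples: ~50% during Snow, ~100% during Cave.
samples = [(0, 48), (10, 52), (20, 50), (30, 98), (40, 100), (50, 99)]
phases = [("snow", 0, 30), ("cave", 30, 60)]
print(phase_utilization(samples, phases))  # -> {'snow': 50.0, 'cave': 99.0}
```

A per-phase average like this makes it obvious when only one part of a test is CPU-bound.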
Anonymous
May 19, 2007 2:44:47 PM

The weird thing is this demo is not using the CPU to the fullest. Then again, it's just a demo. But if the game is going to be like this when it comes out, it's bye-bye G80 and R600; we need R700 and G90.
May 19, 2007 3:31:29 PM

Quote:
The weird thing is this demo is not using the CPU to the fullest.
It is for the Cave part of the performance test (see my post just before yours).
May 21, 2007 12:48:30 PM

Quote:
Release Notes:

* Please note this driver only supports the GeForce 8800 series GPUs
* Improved 3D performance for single GPU and NVIDIA SLI
* Add support for NVIDIA SLI on DirectX 10
* Numerous game and application compatibility fixes (see below for some of the top issues)
* Updated NVIDIA Control Panel with improved user interface (Please see the Release Notes for more details).
* Numerous game and application compatibility fixes.
* This driver supports the following 3D features:
o Single GPU support
+ DirectX 9 support for GeForce 6/7/8 series GPUs
+ DirectX 10 support for GeForce 8 series GPUs
+ OpenGL support for GeForce 6/7/8 series GPUs
o NVIDIA SLI support
+ DirectX 9 support for 6/7/8 series GPUs
+ DirectX 10 support for 8800 series GPUs
+ OpenGL support for 6/7/8 series GPUs
* DirectX 10 NVIDIA SLI support for GeForce 8 series GPUs will be available in a future driver
* If you would like to be notified of upcoming drivers for Windows Vista, please subscribe to the newsletter


Is it just me, or does that make no sense?

When you were benchmarking Lost Planet, how sure are you it was truly using both your cards? Did you test with only one card, or with SLI disabled, to see if there was any difference?

None of their Vista drivers have ever made sense to me either, and I did not test with just one card. There is a possibility it was only running one card, although the scores I've gotten are higher than most people's single-card GTX scores, but lower than my XP scores by about 4-5FPS.

If you read up a ways, I posted some screenshots of XP with the SLI indicators running. I have those same SLI indicators running under Vista. This leads me to believe SLI is being recognized by the game and that it is making some use of it.
May 21, 2007 12:55:02 PM

Quote:
Didn't see any ATI 2900XT benches, lol. What happened?


ATI folks are saying it's an Nvidia bench, which it is, so it's "unfair" to them. I think they are refusing to test it. Or maybe it's because every time the 2900XT goes under load it goes all Chernobyl on them with heat and noise. :wink:
Anonymous
May 21, 2007 4:00:39 PM

Yeah, I mean, I agree Lost Planet is optimized for Nvidia, but the ATI 2900XT should at least run it. Optimized means Nvidia cards will run faster in that game, lol, not that ATI cards can't even RUN it as per the press release. lol, another lie from AMD.
June 1, 2007 1:51:13 AM

I have all the default quality settings and 1920x1200 fullscreen.

Caves: 69 :D 

Snow: 50 :?:

Why does the test loop so that I can't quit it? I have to do Ctrl+Alt+Del to get out of it.

BTW, mine too was 1920x1080 at first, and then I reset it to 1920x1200 in the options and it worked fine. And when it was set to 1920x1080 it truly was 1080, not 1200; 1200 was a MAJOR image quality difference.
June 1, 2007 5:15:48 AM

Quote:
Why does the test loop so that I can't quit it? I have to do Ctrl+Alt+Del to get out of it.

I have long since deleted that crappy demo so I can't test this, but I remember there was a simple but completely non-obvious way to stop the test mode: right-clicking, maybe?
June 1, 2007 11:23:21 AM

Quote:
Why does the test loop so that I can't quit it? I have to do Ctrl+Alt+Del to get out of it.
Just right-click the mouse.