8600GT benchmark with lost planet DX10

Last response in Graphics & Displays
May 16, 2007 2:33:46 AM

I ran the performance test in the demo with the same config as the Legit Reviews article here: http://www.legitreviews.com/article/505/3/ to compare against the GTX, and my result is: Snow 32 FPS, Cave 26 FPS, running at 1280x720. Compared to the HD2900XT it's almost the same, lawl. I will also run it with my GT overclocked to the same clock speeds as the GTS to check the difference. I played in-game and it's not really playable at those FPS. I tried without the 4x filter and it's much better; I was able to play without big lag. It's still not really smooth, and graphically not that impressive, but the physics and movement are really insane. I don't think an 8600GT or GTS will be okay for playing DX10 with good graphics.
May 16, 2007 2:38:19 AM

OOOPS!! I was looking at the CoJ, sorry; I edited. Anyway, good to see the 86 doing well. Any more?
May 16, 2007 11:20:22 AM

Quote:
OOOPS!! I was looking at the CoJ, sorry. I edited . Anyways good to see the 86 doing well. Any more?


I clocked my 8600GT XXX to a 690 MHz core and 835 MHz memory. I can go up to a 720 MHz core and 860 MHz memory, but I'll do that later.

I ran the DX9 benchmark and the DX10 one in-game at 1280x720 with these settings:

FPS = OFF
AntiAlias = OFF
HDR = Medium
Texture Filter Anisotropic = 4X
Texture Resolution = High
Model Quality = High
Shadow Quality = Low
Shadow Resolution = Default
Motion Blur Quality = Low
Effect Resolution = High
Effect Quality = High
Effect Volume = Low
Lighting Quality = Medium
Display Resolution = 1280x720
Frequency = 60Hz
Full Screen = ON
Vertical Sync = OFF
Aspect Correction = OFF
Concurrent Operations = 2
Concurrent Rendering = ON
MultiCPU = OFF

DX9: 32 FPS SNOW , 26 FPS CAVE
DX10: 31 FPS SNOW , 26 FPS CAVE

Conclusion: about the same FPS in DX9 and DX10. I also don't see much visual difference between DX9 and DX10; I guess you need higher filter settings to notice one. But it still beats the HD2900XT in DX10, for now :-)
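To put "about the same" in numbers, the percentage change between the DX9 and DX10 runs above works out like this (plain arithmetic on the figures quoted, nothing else assumed):

```python
# Percent change going from the DX9 result to the DX10 result,
# using the Snow/Cave FPS figures quoted above.
def pct_change(dx9_fps, dx10_fps):
    return 100.0 * (dx10_fps - dx9_fps) / dx9_fps

print(f"Snow: {pct_change(32, 31):+.1f}%")  # -3.1%
print(f"Cave: {pct_change(26, 26):+.1f}%")  # +0.0%
```

A one-frame drop at ~32 FPS is within run-to-run noise for this kind of benchmark, which supports the "about the same" reading.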
May 16, 2007 11:23:50 AM

If you guys don't already know about this... here are some official statements that were made.

Quote:
AMD's Comments on Lost Planet:

Before you begin testing, there are a few points I want to convey about “Lost Planet”. “Lost Planet” is an Nvidia-sponsored title, and one that Nvidia has had a chance to look at and optimize their drivers for. The developer has not made us aware of this new benchmark, and as such the ATI Radeon driver team has not had the opportunity to explore how the benchmark uses our hardware and optimize in a similar fashion. Over the next little while AMD will be looking at this, but in the meantime, please note that whatever performance you see will not be reflective of what gamers will experience in the final build of the game.

NVIDIA's Comments on Lost Planet:

This week, Capcom will be releasing two versions of its Lost Planet: Extreme Condition demo for the PC. Both versions will contain the same content (i.e., no differences in game play), but one version will be DirectX 9 and the other DX 10. The latter version will be the world’s first downloadable DX 10 game demo. The demo also contains a system performance test. The test is an in-engine run-through of two levels of the game, complete with enemies, explosions and 3D meshes. The performance test shows your current FPS, average FPS, relevant system info (CPU and speed, graphics card, resolution and DX version) and, after a complete pass through of both levels, an average FPS for both the “Snow” and “Cave” sections. We think that this tester will be a useful tool for your testing purposes, as well as for your community.

Let's take a look at the image quality and see how ATI and NVIDIA do on this new DirectX 10 game demo.

May 16, 2007 11:27:49 AM

That's why I said "for now" :-) But still, they had lots of time to make their drivers and prepare for DX10. Why is it performing so badly now?
May 16, 2007 11:31:59 AM

OK, this isn't a true game, but it's looking good. Actually, DX10 SHOULD run better than DX9 if optimised correctly. I wonder what the 88s look like in DX9 vs DX10?
May 16, 2007 11:33:09 AM

This bashing will continue, but it's partly AMD's fault that they were so late to the DX10 market. I'm sure once they launch the 2600 (and sell a few 2900s) the developers will start testing with both cards.
May 16, 2007 11:52:37 AM

No, not really; there's no game yet that takes full advantage of DX10. When Crysis comes out later this year, it will stress the current DX10 cards. Hopefully ATI and Nvidia will have better versions of their DX10 cards by then.
May 16, 2007 12:29:33 PM

Quote:
I ran the performance test in the demo with the same config in the game as the legit review here: http://www.legitreviews.com/article/505/3/ to compare to the gtx and my result is : Snow: 32FPS , Cave 26FPS. Running at 1280x720. Comparing it to the HD2900XT its almost the same lawl


lawl low resolution with barely playable framerates FTW lawl

I wish I could view a DX10 slideshow on my 1950PRO lawl
:roll:
May 16, 2007 12:34:59 PM

Quote:
No not really there's no game yet to take a full advantage of the DX10. You see when Crysis comes out later this year, it will stress out the current DX10 cards rights now. Hopefully ATI and Nvidia will have a better version of thier DX10 card.


I don't know if you have actually "seen" the demo/bench. It uses a great deal of DX10 in it. The cave part looks almost entirely DX10.

I would think the G80 series would be the card to have if you are basing your DX10 thoughts on Crysis, seeing as how the developers used the G80 to "develop" the game itself.

I fear ATI will have it rough when it is released as well. This may also be true for a great majority of DX10 first release titles since most developers only had G80 hardware available to develop from.

Its going to hurt ATI greatly for delaying way beyond just fans wanting their hardware yesterday.
May 16, 2007 6:17:35 PM

Quote:
lawl low resolution with barely playable framerates FTW lawl

I wish I could view a DX10 slideshow on my 1950PRO lawl

:roll:

Whoa, buddy, calm down there. I can see you get really sensitive when people bash ATI. The thing is, he's not doing any bashing. He is simply stating that an 8600 ($150) can beat an HD2900XT ($400), and he has evidence for his claim, so please, calm down.

Quote:


I don't know if you have actually "seen" the demo/bench. It uses a great deal of DX10 in it. The cave part looks almost entirely DX10


Umm, do you know that thus far everything done in DX10 can also be done in DX9 (visuals-wise)?

So you can't really say it looks "almost entirely DX10" when DX10 looks no different from 9.
May 16, 2007 6:23:56 PM

So is this game going to be like the OpenGL titles (Doom 3, Quake 4) were for the previous series of nV cards? If so, it's not a very good test bed for comparing nV with ATI.
May 16, 2007 7:18:29 PM

Wow, that is so cool. I am actually really happy, since that would mean the GF8600GTS would do better. The GF8600GTS is the type of card I can afford, and it should do pretty well. I'm happy.
Also, I'm so bored I'm going to download the DX9 demo. What program can I use to measure FPS?
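(FRAPS was the usual overlay tool for this at the time. The arithmetic behind any FPS counter is just frames divided by elapsed time; here is a minimal sketch in Python, using hypothetical, evenly spaced frame timestamps for illustration:)

```python
# Minimal sketch of what an FPS counter computes: average FPS is the
# number of frame intervals divided by the elapsed time between the
# first and last frame. The timestamps below are hypothetical
# (a steady ~33.3 ms per frame).
def average_fps(frame_times):
    """frame_times: per-frame timestamps in seconds, ascending."""
    if len(frame_times) < 2:
        raise ValueError("need at least two timestamps")
    elapsed = frame_times[-1] - frame_times[0]
    return (len(frame_times) - 1) / elapsed

timestamps = [i * 0.0333 for i in range(61)]  # 60 frame intervals
print(round(average_fps(timestamps), 1))  # ~30 FPS
```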
May 16, 2007 7:53:00 PM

Quote:
I ran the performance test in the demo with the same config in the game as the legit review here: http://www.legitreviews.com/article/505/3/ to compare to the gtx and my result is : Snow: 32FPS , Cave 26FPS. Running at 1280x720. Comparing it to the HD2900XT its almost the same lawl. [...]


I'd ask the same, but for the 8600 vs 2900XT in Call of Juarez...
if the 8600 can even run it, obviously :p
May 16, 2007 8:07:45 PM

Oh well, I know the 8600 won't perform better in Call of Juarez; even the 8800s have problems with the filter. My point was just to show how the 8600 performs in the first DX10 demo. Yes, it's a low resolution, but when I removed the 4x filter I was at 40 FPS and it was very playable in-game.
May 16, 2007 9:01:28 PM

Quote:
I don't know if you have actually "seen" the demo/bench. It uses a great deal of DX10 in it. The cave part looks almost entirely DX10

Umm, do you know that thus far everything that is done in DX10 can be done in DX9 (visuals-wise)?

So you couldn't really say that it looks "almost entirely DX10" when DX10 looks no different than 9.

Umm, no. The cave scene uses procedural geometry creation for the cave rock walls (DX10), smart particles for the waterfalls and insect flocking with physics (DX10), and hyper-realistic shading and displacement (DX10).

So really, if you had seen or run the demo, as I have, then you would agree it is mostly DX10.
May 17, 2007 1:21:39 AM

Just because ATI had the time to make good drivers doesn't mean they did. ATI's drivers are still pretty rough, so I wouldn't judge the 2900 yet; at least wait for the 8.38 drivers. nVidia has a seven-month lead on drivers and sponsored this game, so they should do better.
May 19, 2007 3:14:44 AM

I have an eVGA 8600GT (stock clock speed) and I just downloaded the trial. The only problem was that I had to download two .dll files, no biggie. I ran some benchmarks too:

I left the settings at the defaults, which is 1280x720 resolution like the first post. I got around the same FPS:

22 FPS in the snow

30 FPS inside, or something.

Intel Core 2 Duo E6300 @ 1.86 GHz, 1 GB PC5300 RAM, 320 GB HDD @ 7200 RPM, eVGA 256MB 8600GT 560/1400, Windows Vista Home Premium.

I don't know what you guys are talking about; the game looks wonderful even with the detail on low, and it's very playable at 22 FPS. I wasn't really paying attention, but it doesn't really lag. I wish I had an 8800GTS so I'd get around 40-60, but I guess I'm good with this card; I got it for $119.00.
May 23, 2007 1:58:34 AM

My EVGA 8600GTS works fine with this game. (Nvidia driver 7.15.0011.5818).

Sure, some of the settings have to be turned down, but the game still looks very good. This game doesn't seem to suffer much visually at lower settings. Maybe I'm not as demanding as some... :wink:

I downloaded the latest DX10 from Microsoft even though Vista already has it; a newer version of DX10 seemed to be needed.

At default settings....
Snow 31 FPS
Cave 24 FPS

Pentium D Dual Core 2.8 GHz
2 Gig RAM
May 23, 2007 2:58:55 AM

Quote:
I did the Dx9 benchmark and Dx10 in game at 1280x720 [...]

DX9: 32 FPS SNOW , 26 FPS CAVE
DX10: 31 FPS SNOW , 26 FPS CAVE


Uh oh. I'm running two 7900GTXs in SLI in DX9 and getting about 35 FPS in the snow and about 40 in the caves.

Is that bad? :?
May 23, 2007 3:26:43 AM

SLI is not supported by the game yet.
May 23, 2007 3:31:15 AM

Oh, I didn't know. My bad. :x

I don't think the game even stated what resolution it was running at. I don't remember if it did or not, but I would have put it at 1600x1200 with all the eye candy turned up. :D
December 15, 2007 4:34:58 PM

Hi,

I've just ordered a new PC, and was wondering if I will be able to play DX10 games with it.

The spec is...

Intel Core 2 QUAD Q6600
4GB DDR2 667MHz RAM
500GB HDD
NVidia 8600GT
Creative XFi Platinum

Any info would be great. Also, is DX10 going to be much better than DX9? I've seen things saying both yes and no.

Cheers

Simon
December 15, 2007 9:09:46 PM

DX10 will most likely run slower than DX9 on most titles (though in Lost Planet it runs at about the same speed, last time I checked). Your PC, however, won't be able to play DX10 games well because of the weak graphics card. This game in particular is very poorly optimized and requires tremendous power to run at playable speeds with high settings, so be prepared to lower settings to around medium or less to keep it playable.

Also, try not to revive old topics like this next time :) 
December 16, 2007 4:36:03 PM

Love how people talk about things they have absolutely no clue about. My 8600GT, OC'd to a 751 core and 950 (1900) memory, can play this demo in DX10 with almost all settings on high:
25-30 snow
35+ cave
Those frame rates are extremely playable unless you have mad optical senses.
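For what it's worth, "playable" comes down to frame time. Converting the FPS figures quoted above to milliseconds per frame is plain arithmetic, with no assumptions beyond the numbers in this thread:

```python
# Each frame at N FPS takes 1000/N milliseconds. Below ~30 FPS you are
# past 33 ms per frame, which is where many people start to notice stutter.
def frame_time_ms(fps):
    return 1000.0 / fps

for label, fps in [("snow, OC'd 8600GT", 25), ("cave, OC'd 8600GT", 35)]:
    print(f"{label}: {fps} FPS = {frame_time_ms(fps):.1f} ms/frame")
```

So 25 FPS means 40 ms per frame, while 35 FPS is under 29 ms; whether that counts as "extremely playable" is exactly what the thread is arguing about.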
December 17, 2007 12:18:56 AM

Well, at what resolution? If you're playing at 800x600 or 1024x768, it doesn't count. I'm running at 1280x1024 with an 8800GTS and I don't get those frames with everything on max.
December 17, 2007 1:53:07 AM

FYI, I play at 1152x864 and I think games are plenty playable at that res. But I am also using a monitor with a 14 in. diagonal viewable area!