8600GT benchmark with Lost Planet DX10

I ran the performance test in the demo with the same config as the Legit Reviews article here: http://www.legitreviews.com/article/505/3/ to compare against the GTX, and my results are: Snow 32 FPS, Cave 26 FPS, running at 1280x720. Compared to the HD2900XT it's almost the same, lawl. I'll also run it with my GT overclocked to the same clock speeds as the GTS to check the difference. I played in-game and it's not really playable at those frame rates; I tried without the 4x filter and it's much better, and I was able to play without big lag. Still not really smooth, and graphically not that great, but the physics and movement are really insane. I don't think an 8600 GT/GTS will be enough to play DX10 with good graphics.
  1. OOOPS!! I was looking at CoJ, sorry; I edited it. Anyways, good to see the 86 doing well. Any more?
  2. Quote:
    OOOPS!! I was looking at CoJ, sorry; I edited it. Anyways, good to see the 86 doing well. Any more?


    I clocked my 8600GT XXX to 690 MHz core and 835 MHz memory; I can go up to 720 MHz core and 860 MHz memory, but I'll do that later.

    I did the DX9 benchmark and the DX10 in-game test at 1280x720 with these settings:

    FPS = OFF
    AntiAlias = OFF
    HDR = Medium
    Texture Filter Anisotropic = 4X
    Texture Resolution = High
    Model Quality = High
    Shadow Quality = Low
    Shadow Resolution = Default
    Motion Blur Quality = Low
    Effect Resolution = High
    Effect Quality = High
    Effect Volume = Low
    Lighting Quality = Medium
    Display Resolution = 1280x720
    Frequency = 60Hz
    Full Screen = ON
    Vertical Sync = OFF
    Aspect Correction = OFF
    Concurrent Operations = 2
    Concurrent Rendering = ON
    MultiCPU = OFF

    DX9: 32 FPS SNOW , 26 FPS CAVE
    DX10: 31 FPS SNOW , 26 FPS CAVE

    Conclusion: about the same FPS in DX9 and DX10. I also don't see much visual difference between DX9 and DX10; I guess you need higher filtering to notice one. But it still beats the HD2900XT in DX10, for now :-)
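    For reference, here's a rough sketch of what those overclocks work out to in percentage terms. The stock clocks assumed below (540 MHz core / 700 MHz memory for a reference 8600GT) are my assumption; the XXX edition ships factory-overclocked, so your own baseline may differ:

    ```python
    # Percent overclock of the quoted 8600GT XXX clocks versus assumed
    # reference 8600GT clocks (540 MHz core / 700 MHz memory -- check
    # GPU-Z or your card's BIOS for the actual stock values).

    def percent_gain(stock_mhz, oc_mhz):
        """Return the overclock as a percentage above stock."""
        return (oc_mhz - stock_mhz) / stock_mhz * 100

    core_gain = percent_gain(540, 690)   # about 27.8% over reference core
    mem_gain = percent_gain(700, 835)    # about 19.3% over reference memory
    print(f"core +{core_gain:.1f}%, memory +{mem_gain:.1f}%")
    ```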
  3. If you guys don't already know about this... here are some official statements that were made.

    Quote:
    AMD's Comments on Lost Planet:

    Before you begin testing, there are a few points I want to convey about “Lost Planet”. “Lost Planet” is an Nvidia-sponsored title, and one that Nvidia has had a chance to look at and optimize their drivers for. The developer has not made us aware of this new benchmark, and as such the ATI Radeon driver team has not had the opportunity to explore how the benchmark uses our hardware and optimize in a similar fashion. Over the next little while AMD will be looking at this, but in the meantime, please note that whatever performance you see will not be reflective of what gamers will experience in the final build of the game.

    NVIDIA's Comments on Lost Planet:

    This week, Capcom will be releasing two versions of its Lost Planet: Extreme Condition demo for the PC. Both versions will contain the same content (i.e., no differences in game play), but one version will be DirectX 9 and the other DX 10. The latter version will be the world’s first downloadable DX 10 game demo. The demo also contains a system performance test. The test is an in-engine run-through of two levels of the game, complete with enemies, explosions and 3D meshes. The performance test shows your current FPS, average FPS, relevant system info (CPU and speed, graphics card, resolution and DX version) and, after a complete pass through of both levels, an average FPS for both the “Snow” and “Cave” sections. We think that this tester will be a useful tool for your testing purposes, as well as for your community.

    Let's take a look at the image quality and see how ATI and NVIDIA do in this new DirectX 10 game demo.
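    As a side note on how the "average FPS" figure in a performance test like this is typically computed: it's total frames divided by total elapsed time, not the mean of the instantaneous FPS readings. A minimal sketch (the frame times below are synthetic, chosen to land near the numbers reported in this thread):

    ```python
    # Average FPS over a benchmark section = frames rendered / seconds elapsed.
    # Frame times are given in milliseconds.

    def average_fps(frame_times_ms):
        """Return frames-per-second averaged over the whole section."""
        total_seconds = sum(frame_times_ms) / 1000.0
        return len(frame_times_ms) / total_seconds

    # Synthetic example: 640 frames at 31.25 ms each is exactly 32 FPS.
    snow = [31.25] * 640
    print(f"Snow: {average_fps(snow):.0f} FPS")
    ```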

  4. That's why I said "for now" :-) But still, they had lots of time to get their drivers ready for DX10, so why is it performing so badly now?
  5. OK, this isn't a true game, but it's looking good. Actually, DX10 SHOULD run better than DX9 if optimized correctly. I wonder what the 88s look like in DX9 vs. DX10?
  6. This bashing will continue, but it's partly AMD's fault that they were so late to the DX10 market. I'm sure once they launch the 2600 (and sell a few 2900s) developers will start testing with both brands of cards.
  7. No, not really; there's no game yet that takes full advantage of DX10. You'll see when Crysis comes out later this year; it will stress even the current DX10 cards. Hopefully ATI and Nvidia will have better versions of their DX10 cards by then.
  8. Quote:
    I ran the performance test in the demo with the same config as the Legit Reviews article here: http://www.legitreviews.com/article/505/3/ to compare against the GTX, and my results are: Snow 32 FPS, Cave 26 FPS, running at 1280x720. Compared to the HD2900XT it's almost the same, lawl


    lawl, low resolution with barely playable framerates, FTW lawl

    I wish I could view a DX10 slideshow on my X1950 Pro lawl

    lawl lawl lawl

    :roll:
  9. Quote:
    No, not really; there's no game yet that takes full advantage of DX10. You'll see when Crysis comes out later this year; it will stress even the current DX10 cards. Hopefully ATI and Nvidia will have better versions of their DX10 cards by then.


    I don't know if you have actually "seen" the demo/bench. It uses a great deal of DX10; the cave part looks almost entirely DX10.

    I would think the G80 series would be the card to have if you're basing your DX10 expectations on Crysis, seeing as the developers used G80 hardware to develop the game itself.

    I fear ATI will have it rough when it's released as well. This may also be true for a great majority of first-release DX10 titles, since most developers only had G80 hardware to develop on.

    ATI's delay is going to hurt them well beyond just fans wanting their hardware yesterday.
  10. Quote:
    I ran the performance test in the demo with the same config as the Legit Reviews article here: http://www.legitreviews.com/article/505/3/ to compare against the GTX, and my results are: Snow 32 FPS, Cave 26 FPS, running at 1280x720. Compared to the HD2900XT it's almost the same, lawl


    lawl, low resolution with barely playable framerates, FTW lawl

    I wish I could view a DX10 slideshow on my X1950 Pro lawl

    lawl lawl lawl

    :roll:


    Whoa, buddy, calm down there. I can see you get really sensitive when people bash ATI. The thing is, he's not doing any bashing; he's simply stating that an 8600 ($150) can beat an HD2900XT ($400). He has evidence for his claim as well, so please, calm down.

    Quote:


    I don't know if you have actually "seen" the demo/bench. It uses a great deal of DX10 in it. The cave part looks almost entirely DX10


    Umm, do you know that, so far, everything done in DX10 can also be done in DX9 (visuals-wise)?

    So you can't really say it looks "almost entirely DX10" when DX10 looks no different from DX9.
  11. So is this game going to be for this generation of nV cards what the OpenGL titles (Doom 3, Quake 4) were for the previous one? If that's the case, it's not a very good test bed for comparing nV with ATI.
  12. Wow, that is so cool. I'm actually really happy, and that would mean the 8600GTS would do even better. The 8600GTS is the type of card I could afford, and it should run pretty well; I'm happy.
    Also, I'm so bored I'm going to download the DX9 demo. What program can I use to measure FPS?
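    On the FPS question: an overlay tool such as Fraps is the usual answer. Tools like that essentially count the frames presented over a fixed window. A minimal sketch of that counting approach, where `render_frame` is just a placeholder for a real present/swap call:

    ```python
    # Count frames "presented" during a fixed window and report frames/sec.
    # render_frame is a stand-in for whatever actually draws one frame.

    import time

    def measure_fps(render_frame, window_s=1.0):
        """Call render_frame repeatedly for window_s seconds; return frames/sec."""
        frames = 0
        start = time.perf_counter()
        while time.perf_counter() - start < window_s:
            render_frame()
            frames += 1
        return frames / window_s

    # Dummy "frame" that takes roughly 10 ms; the measured rate depends on
    # the OS's sleep granularity, so expect something near (but below) 100 FPS.
    fps = measure_fps(lambda: time.sleep(0.010), window_s=0.5)
    print(f"~{fps:.0f} FPS")
    ```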
  13. Quote:
    I ran the performance test in the demo with the same config as the Legit Reviews article here: http://www.legitreviews.com/article/505/3/ to compare against the GTX, and my results are: Snow 32 FPS, Cave 26 FPS, running at 1280x720. Compared to the HD2900XT it's almost the same, lawl. I'll also run it with my GT overclocked to the same clock speeds as the GTS to check the difference. I played in-game and it's not really playable at those frame rates; I tried without the 4x filter and it's much better, and I was able to play without big lag. Still not really smooth, and graphically not that great, but the physics and movement are really insane. I don't think an 8600 GT/GTS will be enough to play DX10 with good graphics.


    I'd ask the same, but for the 8600 vs. 2900XT in Call of Juarez...
    if the 8600 can even run it, obviously :P
  14. Oh well, I know the 8600 won't perform better in Call of Juarez; even the 8800 has problems with the filter there. My point was just to show how the 8600 performs in the first DX10 demo. Yes, it's a low resolution, but with the 4x filter removed I was at 40 FPS and it was very playable in-game.
  15. Quote:



    I don't know if you have actually "seen" the demo/bench. It uses a great deal of DX10 in it. The cave part looks almost entirely DX10


    Umm, do you know that, so far, everything done in DX10 can also be done in DX9 (visuals-wise)?

    So you can't really say it looks "almost entirely DX10" when DX10 looks no different from DX9.

    Umm, no. The cave scene uses procedural geometry creation for the cave rock walls (DX10), smart particles for the waterfalls, insect flocking with physics (DX10), and hyper-realistic shading and displacement (DX10).

    So really, if you had seen or run the demo as I have, you would agree it is mostly DX10.
  16. Just because ATI had the time to make good drivers doesn't mean they did. ATI's drivers are still pretty bad, so I wouldn't judge the 2900 yet; at least wait for the 8.38 drivers. nVidia has a seven-month lead on drivers, and this is an Nvidia-sponsored title, so they should do better.
  17. I have an EVGA 8600GT (stock clock speeds) and I just downloaded the trial. The only problem was that I had to download two .dll files, no biggie. I ran some benchmarks too:

    I left the settings at default, which is 1280x720 resolution like the first post, and I got around the same FPS:

    22 FPS in the snow

    30 FPS inside (the cave), or something like that.

    Intel Core 2 Duo E6300 @ 1.86 GHz, 1 GB PC2-5300 RAM, 320 GB HDD @ 7200 RPM, EVGA 256 MB 8600GT 560/1400, Windows Vista Home Premium.

    I don't know what you guys are talking about; the game looks wonderful even with the detail on low, and it's very playable at 22 FPS. I wasn't really paying close attention, but it doesn't really lag. I wish I had an 8800GTS so I'd get around 40-60, but I guess I'm good with this card; I got it for $119.00.
  18. My EVGA 8600GTS works fine with this game. (Nvidia driver 7.15.0011.5818).

    Sure, some of the settings have to be turned down, but the game still looks very good; it doesn't seem to suffer much visually at lower settings. Maybe I'm not as demanding as some... :wink:

    I downloaded the latest DirectX runtime from Microsoft even though Vista already includes DX10; a newer version seemed to be needed.

    At default settings....
    Snow 31 FPS
    Cave 24 FPS

    Pentium D Dual Core 2.8 GHz
    2 Gig RAM
  19. Quote:
    OOOPS!! I was looking at CoJ, sorry; I edited it. Anyways, good to see the 86 doing well. Any more?


    I clocked my 8600GT XXX to 690 MHz core and 835 MHz memory; I can go up to 720 MHz core and 860 MHz memory, but I'll do that later.

    I did the DX9 benchmark and the DX10 in-game test at 1280x720 with these settings:

    FPS = OFF
    AntiAlias = OFF
    HDR = Medium
    Texture Filter Anisotropic = 4X
    Texture Resolution = High
    Model Quality = High
    Shadow Quality = Low
    Shadow Resolution = Default
    Motion Blur Quality = Low
    Effect Resolution = High
    Effect Quality = High
    Effect Volume = Low
    Lighting Quality = Medium
    Display Resolution = 1280x720
    Frequency = 60Hz
    Full Screen = ON
    Vertical Sync = OFF
    Aspect Correction = OFF
    Concurrent Operations = 2
    Concurrent Rendering = ON
    MultiCPU = OFF

    DX9: 32 FPS SNOW , 26 FPS CAVE
    DX10: 31 FPS SNOW , 26 FPS CAVE

    Conclusion: about the same FPS in DX9 and DX10. I also don't see much visual difference between DX9 and DX10; I guess you need higher filtering to notice one. But it still beats the HD2900XT in DX10, for now :-)


    Uh oh. I'm running two 7900GTXs in SLI in DX9 and getting about 35 FPS in the snow and about 40 in the caves.

    Is that bad? :?
  20. SLI is not supported by the game yet.
  21. Oh, I didn't know. My bad. :x

    I don't think the game even stated what resolution it was running at, either. I don't remember if there was a setting or not, but I would have put it at 1600x1200 with all the eye candy turned up. :D
  22. Hi,

    I've just ordered a new PC, and was wondering if I will be able to play DX10 games with it.

    The spec is...

    Intel Core 2 QUAD Q6600
    4GB DDR2 667 MHz RAM
    500GB HDD
    NVidia 8600GT
    Creative XFi Platinum

    Any info would be great. Also, is DX10 going to be much better than DX9? I've seen things saying both yes and no.

    Cheers

    Simon
    DX10 will most likely run slower than DX9 on most titles (in Lost Planet, however, it runs at about the same speed, last time I checked). Your PC won't play DX10 games well, though, because of the weak graphics card. This game in particular is very poorly optimized and requires tremendous power to run at playable speeds with high settings, so be prepared to lower the settings to around medium or less to keep it playable.

    Also, try not to revive old topics like this next time :)
  24. Love how people talk about things they have absolutely no clue about. My 8600GT, OC'd to 751 core and 950 (1900) memory, can play this demo in DX10 with almost all settings on high:
    25-30 snow
    35+ cave
    Those frame rates are extremely playable unless you have mad optical senses.
  25. Well, at what resolution? If you're playing at 800x600 or 1024x768 it doesn't count. I'm running at 1280x1024 with an 8800GTS and I don't get those frames with everything on max.
  26. FYI, I play at 1152x864 and I think games are plenty playable at that res. But I'm also using a monitor with a 14 in. diagonal viewable area.