
Can the 8800 GTS play DX10 games at max settings?

May 19, 2007 3:31:26 PM

Hey, I have an 8800 GTS 320 MB at 1440x900, and I was wondering if I can play the DX10 games coming out soon, like Crysis and Lost Planet, at max quality. Thanks.
May 19, 2007 3:59:31 PM

Well, from what we have seen, no. Look at the thread that talks about the Lost Planet benches; most GF 8800 GTSs are playing with most settings on high but not fully MAXED out. As for Crysis, no one knows yet, sadly :( 
We will have to wait and see. Maybe it's only the drivers, or maybe it is actually the hardware.
May 19, 2007 4:05:20 PM

The only currently playable DirectX 10 title is Lost Planet, and it actually runs better under DirectX 9 at the same graphics quality settings. We won't really know about Crysis until it comes out.
Anonymous
May 19, 2007 5:01:10 PM

I tried that Lost Planet game and got only 7 to 8 fps from my EVGA 8800 GTS 320 Superclocked. The weird thing was my CPU usage was only 30 to 35%.
May 19, 2007 5:23:49 PM

What resolution did you have it on?
May 19, 2007 5:28:52 PM

Yes you can :D I know you can :lol: I swear you can :wink: I have a GTS and it rocks, it's got unleashed potential hehehe
Anonymous
May 19, 2007 5:35:37 PM

1280x1024. That game made my 8800 GTS look like a piece of garbage, but I used the 158.18 drivers.
May 19, 2007 6:47:45 PM

Lost Planet performance test:

Not maxed out but most eye candy turned on:

1600 x 1000: snow 30, cave 44
1280 x 720: snow 45, cave 56
1280 x 720 (OC 614/893): snow 56, cave 61
1280 x 960 (OC 617/921): snow 45, cave 61


E6600 @ 3.45 GHz
2 GB RAM
Vista DX10
8800 GTS (640 MB)
May 19, 2007 7:19:44 PM

Makes you wonder what the 8600 (and below) series cards are good for :roll:
May 19, 2007 7:20:49 PM

BFG 8800GTS 320 OC2 (580/1700)
AMD 4200+ (939) @ 2.9
AsRock 939Dual-Sata2 (the most amazing hardware I ever saw)
1.5 GB DDR400 Single-Channel (yeah, I know, will change this week)
Asus VW192 19" 1440x900

No AA - AF Trilinear - Everything HIGH except Shadows MEDIUM

XP DX 9.0c
Snow : 55
Cave : 29

Vista Ultimate 32-bit DX 10
Snow : 50
Cave : 26

Seriously, the 8800 GTS 320 is astonishing for the money, and value must always be considered when thinking about the 320.

And there is a difference for me in the DX10 version. I find it much more immersive, and generally smoother. I will take the fps hit every time.

Hope it helps...
Anonymous
May 19, 2007 8:09:37 PM

There is something wrong with my PC; I should try that again and figure out why I got such low frame rates. I have a Pentium D 3.00 GHz, 3 GB dual-channel 667 MHz memory, an EVGA 8800 GTS 320 Superclocked, and Windows Vista Ultimate.
May 19, 2007 8:26:13 PM

Erm, well.

Make sure you have the latest DX10 redistributable through Windows Update.

Try re-installing the video driver, or try the latest beta for the 8800 GTS / Vista Ultimate 32-bit: http://www.nvidia.com/object/winvista_x86_158.43.html

Hope it helps you.
Anonymous
May 19, 2007 9:03:00 PM

Yep, 1.5 GB per channel. I have the same sticks, Kingston 667 MHz 2x1GB and Kingston 2x512MB, and it's an Intel G965WH motherboard. I just added the 2x512MB sticks two days ago. I ran the Lost Planet test with 2 GB dual channel. I think I should run it again with 3 GB, but I don't think it would help.
Anonymous
May 19, 2007 9:15:14 PM

I mean it's still a Pentium D, lol, not a Celeron D, for the frame rates I got. CPU usage, as I said, was just 30 to 35% during the game; that's strange.
May 19, 2007 10:16:51 PM

Quote:
Isn't there only a 90-day window for that? I don't think Crysis will be here by then.
Yeah, it's only 90 days; I missed my chance to step up, but I wasn't about to pay another $250 for an overclocked 8800 GTX.
Anonymous
May 19, 2007 10:19:58 PM

Yeah, the negative thing about Step-Up is that the prices on the EVGA website are just too high.
May 19, 2007 10:45:13 PM

Quote:
I tried that Lost Planet game and got only 7 to 8 fps from my EVGA 8800 GTS 320 Superclocked. The weird thing was my CPU usage was only 30 to 35%.


Something is seriously screwed up with your rig then.
May 20, 2007 8:15:26 AM

What I haven't seen is results from a GF 8800 Ultra; that would be nice :) 
May 20, 2007 9:21:29 AM

Quote:
Makes you wonder what the 8600 (and below) series cards are good for :roll:


The 8600 cards are a cut above the 7950s. Peace :twisted:
May 20, 2007 11:33:16 AM

If a $300+ 2007 GPU can't play a 2007 release game acceptably, then...

Well, I guess that would just suck.

If what everyone is saying (and writing) is true, Crysis had better be the Half-Life of the 21st century.
May 20, 2007 11:36:48 AM

From what I've seen in practical game tests, the 8600 cards hardly perform better than the 7600s; the massive reduction in stream processors compared with the 8800 cards cripples them, although I suppose they'd be useful in an HTPC because of the new hardware video decoder :) 
May 20, 2007 12:37:06 PM

Quote:
What I haven't seen is results from a GF 8800 Ultra; that would be nice :) 


You're so dumb.. :roll:
May 20, 2007 1:03:51 PM

Quote:
What I haven't seen is results from a GF 8800 Ultra; that would be nice :) 


You're so dumb.. :roll:
why
May 20, 2007 1:09:45 PM

In a word, no.

We've seen some unoptimised Crysis benches where 2 x 8800 GTX in SLI were at 16 fps (allegedly).

Note the word unoptimised.

However, whilst the final game will have much better frame rates, and will play on more lower-spec cards in both DX10 and DX9, there are still effects in DX10 that take even the most powerful DX10 cards to the limit. E.g. in Alan Wake, the tornadoes allegedly had to be left out because it took a quad core and 2 x 8800 GTX in SLI to render them; it was in one of the tech videos. For Crysis, the developers have said more features will be added in the future, hinting that not everything can be activated at release.

At the moment, if you're buying a card, buy the most powerful you can afford, but be aware that you might not get maxed-out settings.

If you can wait, wait for the next generation of DX10.1 shader model 5 cards, which should be out later this year; a die shrink as well as a redesign may give more fps.
Anonymous
May 20, 2007 1:38:59 PM

Yeah, I haven't seen any 2900 XT results. Oh wait, it has driver issues, lol. Yeah, right.
May 20, 2007 4:38:46 PM

Quote:
In Alan Wake, the tornadoes allegedly had to be left out because it took a quad core and 2 x 8800 GTX in SLI to render them; it was in one of the tech videos.


You are wrong about that. In those videos you speak of, as I remember them, it was stated that it was being powered by a quad core and two 7900 GTXs in SLI running under DX9 when it rendered those tornadoes.