F.E.A.R. performance

ChipDeath

Splendid
May 16, 2002
4,307
0
22,790
Just wondering if anyone's done much benching with FEAR?

Does seem a very demanding game, but I wonder if my performance isn't a little on the low side all the same.

I'll check later when I get home, but from memory (using the built-in 'Test settings' option):

Min: 28FPS
Max: 88FPS
Avg: 49FPS
96% over 40FPS.
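
For reference, figures like those can be reproduced from per-frame render times, which is presumably what the built-in test is doing under the hood. A minimal sketch in Python (the frame times below are made-up placeholder data, not anything FEAR actually logs):

    # Sketch: min/avg/max FPS and "% over 40 FPS" from per-frame render times.
    # The frame times are hypothetical sample data, not a real FEAR log.
    frame_times_ms = [11.4, 20.3, 35.7, 18.2, 25.0, 14.8]  # one entry per frame

    fps_per_frame = [1000.0 / t for t in frame_times_ms]

    min_fps = min(fps_per_frame)
    max_fps = max(fps_per_frame)
    avg_fps = len(frame_times_ms) / (sum(frame_times_ms) / 1000.0)  # frames / total seconds
    pct_over_40 = 100.0 * sum(1 for f in fps_per_frame if f >= 40) / len(fps_per_frame)

    print(f"Min: {min_fps:.0f} FPS  Max: {max_fps:.0f} FPS  Avg: {avg_fps:.0f} FPS")
    print(f"{pct_over_40:.0f}% of frames at or above 40 FPS")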

Rig (Basic overclock. Plenty more to come):
A64 3200+ @ ~2.3GHz
2x512MB Crucial Ballistix PC4000 @ 230MHz 1T 2.5-5-5-8
Asus V9999 6800GT 128MB (@ stock... for now)

Settings:
1152x864
Texture detail medium (wishing I'd got a 256MB+ card now... :frown: )
Volumetric lights off
Lighting medium (or minimum... can't remember)
Shadows medium
Soft shadows off
Everything else at maximum, I think.

The FPS gets annoyingly low if I increase any of them, but I've not exhaustively tweaked yet. This is on a very fresh Windows install, with 77.77 WHQL drivers (at default settings).

---
 (\_/)
|~~~~~|======
|_____| This was bunny. He was tasty.
/\/\/\/\
 

Coyote

Distinguished
Oct 1, 2003
1,007
0
19,280
I'm hearing that AA really hammers the fps, and you don't say what it's set at.

There's a big thread on bench results with F.E.A.R over at Anandtech if you're interested.

Mobile XP 2600+ (11X215)
Abit NF7-S v 2.0
Maxtor 60GB ATA 133 7200RPM
1 gig Corsair XMS PC3200
eVGA 6800GT
Enermax Noisetaker 420 watts
Win2K sp4
 

ChipDeath

Splendid
May 16, 2002
4,307
0
22,790
Now that's weird... I just tried OCing the GT to ~390/770 (stock is 350/700) and it ran with no artifacts... but performance went *down*: I got something like a 35 FPS average, with a minimum of 18 FPS.

I tried various levels of OC (including the 'detect optimal frequencies' ones of ~367/734) and they all make performance drop.

Do the GF6 series cards throttle back if overheating or something? I hadn't heard of that.

I've got the A64 @ 250x10 (2.5GHz) now, which has got my average up to 50 FPS with settings otherwise the same as in the post above.

Don't suppose you have a link to the thread at Anandtech, do you? I had a quick look and found loads of FEAR-related discussions, but no obvious dedicated benchie thread.

 

noncom

Distinguished
Oct 12, 2005
6
0
18,510
"I'm hearing that AA really hammers the fps,..."

I found going to 2x AA only reduces the framerate by a couple of points. Right now, my 5900 (non-Ultra) is doing just under 40 FPS "average" with the video settings on "minimum," except for the AA, and the resolution at 1024x768.

AA is pretty important to me, but I can't tell AF from Adam. Otherwise, the "minimum" video card settings look pretty darned good to me. The only problem is I couldn't see well at all while in the "ghost world." (The flashback things were ok, but not the weird fiery(?) place where the flying things attack you.)

BTW, I have a 2.5GHz Pentium CPU and a 333 FSB, and I notice no significant difference between the "Computer" settings on "minimum" and "maximum."

<P ID="edit"><FONT SIZE=-1><EM>Edited by noncom on 10/22/05 07:36 AM.</EM></FONT></P>
 

coylter

Distinguished
Sep 12, 2003
1,322
0
19,280
Isn't the base clock for the 6800GT 380/1000?

 

ChipDeath

Splendid
May 16, 2002
4,307
0
22,790
I could accept it being crap for OCing, and giving me horrible artifacts everywhere, but just the performance dropping? Very odd. Seems like it's throttling itself or something, but I'd not heard of these cards doing that...

 

ChipDeath

Splendid
May 16, 2002
4,307
0
22,790
"Isn't the base clock for the 6800GT 380/1000?"
I believe it's 350/1000, so you're half-right.

Mine has the memory specs of a vanilla 6800, it seems. But it was cheap (comparatively) and I needed a card in a hurry because I blew my 9800 Pro up. I really hated the idea of splurging on another 9800 Pro or similar, so I spent 50% or so more on this (which was a good 30% less than all the 'proper' 256MB 6800GT cards at the time, IIRC).

 

coylter

Distinguished
Sep 12, 2003
1,322
0
19,280
"This game is poorly coded. Even high end cards like the GTX are having trouble with it."
Shut up, you know nothing.

Edited by coylter on 10/23/05 03:38 PM.
 

KCjoker

Distinguished
Jun 10, 2002
273
0
18,780
LOL, I know nothing? Well, explain why high-end cards are having trouble running the game at high resolution. If you have a good explanation, I'm willing to listen.
 

Clob

Distinguished
Sep 7, 2003
1,317
0
19,280
Maybe because this game brings new material to the table. They are not having problems with high resolutions; just enabling AA is the problem. That is because of the "soft shadows". Try doing some research.

<font color=red>"Battling Gimps and Dimbulbs HERE at THGC"</font color=red>

"<font color=blue> Wusy</font color=blue> <-Professional sheep banger"
 
G

Guest

Guest
I used to have this card. In this forum, in the video card OC section, look for a how-to that explains how to flash your card's BIOS to "get" a 5950 Ultra. The only difference will be 128MB vs 256MB of memory.

You will get a good performance boost...

Asus P4P800DX, P4C 2.6GHz @ 3.25GHz, 2x512MB OCZ PC4000 3-4-4-8, MSI 6800 Ultra stock, 2x30GB RAID 0
 

dhlucke

Polypheme
This game worries me a little bit. It is strangling video cards that are not even really out yet, let alone the ones we have in our machines.

I don't want to get to a point where I can't play the newer games with the eye candy on, simply because my monitor's resolution is so high.

I have yet to try it out, but I have a feeling that at 1920x1200 with soft shadows on I'll be getting a frame per second on my X800 XL.

Long live Dhanity and the minions scouring the depths of Wingdingium!

XxxxX
(='.'=)
(")_(") Bow down before King Bunny
 

ChipDeath

Splendid
May 16, 2002
4,307
0
22,790
"I have yet to try it out, but I have a feeling that at 1920x1200 with soft shadows on I'll be getting a frame per second on my X800 XL."
I would have to suggest forgetting the soft shadows. It really kills my 6800: I go from an average of ~50 FPS (with a low of 28) down to a 29 FPS average with a low of 17 just by enabling that (at 1152x864), a hit of roughly 40%. I imagine it'd be a similar problem with your card, especially at that considerably higher resolution.

Depends on your preferences though. I was telling a friend how all the eye candy makes it suffer, so I turned everything on but left the res alone. I thought it was horrible and jerky, but he just said 'Seems fine to me...'. But then he has a GameCube, so perhaps he just needs educating.

 

KCjoker

Distinguished
Jun 10, 2002
273
0
18,780
"Maybe becasuse this game brings new material to the table. They are not having problems with high resolutions, just enableing AA is the problem. That is because of the "soft shadows". Try doing some research."

If it's because of new material, I don't see it. On my friend's PC with a 7800GT, it sure doesn't look that way. I mean, yeah, it looks good, but nothing that would make me think it's any different from the COD2 demo or Q4. Those two look just as good and there's no problem with his rig running them.
 

Clob

Distinguished
Sep 7, 2003
1,317
0
19,280
It is called soft shadows. It's in the text that you quoted me on. Try doing some research, like I said.

BTW:
this is easy to do... Click FAQ on the left
<font color=red>"Battling Gimps and Dimbulbs HERE at THGC"</font color=red>

"<font color=blue> Wusy</font color=blue> <-Professional sheep banger"
<P ID="edit"><FONT SIZE=-1><EM>Edited by Clob on 10/24/05 08:05 PM.</EM></FONT></P>
 

KCjoker

Distinguished
Jun 10, 2002
273
0
18,780
I know about the soft shadows...my point is that the performance hit isn't worth the "effects" since Q4 and COD2 look just as good. Thanks for the FAQ help, wasn't sure how to do it on this site.
 

Clob

Distinguished
Sep 7, 2003
1,317
0
19,280
Soft shadows can be turned off... I don't see the problem...

<font color=red>"Battling Gimps and Dimbulbs HERE at THGC"</font color=red>

"<font color=blue> Wusy</font color=blue> <-Professional sheep banger"
 

addiarmadar

Distinguished
May 26, 2003
2,558
0
20,780
"I could accept it being crap for OCing, and giving me horrible artifacts everywhere, but just the performance dropping? Very odd. Seems like it's throttling itself or something, but I'd not heard of these cards doing that..."

These Nvidia cards will throttle down if they're overheating, normally controlled by the driver. However, OCing can also produce the inverse effect: the card may be running at those speeds but making mistakes at them, and correcting those mistakes drops the FPS. This could be caused either by heat or by not enough juice. Nvidia chips are more prone to these OC side effects than ATI's. My Radeons always give me the snow before I see anything else.
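
A rough way to tell the two cases apart without any special tools: run the built-in test several times back to back at stock, then again at the overclock. If the card is heat-throttling, the later runs usually come out slower than the first; if it's simply making mistakes at the higher clocks, the runs stay low but roughly flat. A minimal sketch of that comparison in Python (the FPS numbers are made-up placeholders you'd swap for your own results):

    # Compare back-to-back benchmark runs at stock vs overclocked settings.
    # All numbers are hypothetical placeholders, not real measurements.
    stock_runs = [49, 50, 49, 50]   # average FPS, runs 1-4 at stock clocks
    oc_runs    = [41, 37, 34, 33]   # average FPS, runs 1-4 overclocked

    def first_to_last_change(runs):
        # A strongly negative change across runs hints at thermal throttling.
        return runs[-1] - runs[0]

    for label, runs in (("stock", stock_runs), ("overclocked", oc_runs)):
        avg = sum(runs) / len(runs)
        print(f"{label}: avg {avg:.1f} FPS, first-to-last change {first_to_last_change(runs):+d} FPS")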

Only an overclocker can make a computer into a convection oven.