
Crysis Benchmark: 8800gts g92 = 36fps, 8800gts g92 x2 sli = 35 fps???

Tags:
  • Graphics Cards
  • Benchmark
  • SLI
  • FPS
  • Crysis
  • Graphics
April 7, 2008 9:06:36 PM

At 1920x1200, all settings on medium, DX10 and 64-bit, Crysis Benchmark: 8800GTS G92 = 36 fps, 8800GTS G92 x2 SLI = 35 fps??? WTF! My system is below. Shouldn't SLI give better performance??? Or what's the point? I'm not sure what is wrong. I have all the most current patches, from 1.1 to 1.2 to 1.2.1. I also have Vista SP1. In the NVIDIA control panel I enabled SLI, but I don't see any benefits except in 3DMark. Could this be caused by a PSU problem? I have an 850W Antec made for SLI. Also, when I overclock, my fps seem to go down. Could someone please help me? PLEASE, I'm at my wits' end with this build; I'm about to throw it out the window.


April 7, 2008 9:27:17 PM

Upgrade your drivers, maybe? 169.21 is the latest WHQL driver.
April 7, 2008 9:31:50 PM

I have the latest gpu drivers. Also, I have tried both cards separately and they both work. Please, does anyone have any ideas?
April 7, 2008 9:44:21 PM

Nothing?? I'm sad! :( 
April 7, 2008 9:52:56 PM

Do you see performance gains in any other games at all? And how much of an improvement in 3DMark are you actually getting?
April 7, 2008 9:56:20 PM

What performance gain do you get in other games? If it still behaves like a single card, check whether SLI is properly enabled in the BIOS, or whether your board has a PCB switch between the two PCI-e slots that needs flipping to enable SLI.
April 7, 2008 10:02:28 PM

Very basic question, but sometimes we miss the obvious: if you run GPU-Z, does it report "NVidia SLI" enabled?
April 7, 2008 10:16:52 PM

Ok, GPU-Z shows that SLI is enabled. In other games I get about 0-5 fps better than with no SLI. In 3DMark I get 11250 with one GPU and 12400 with two GPUs in SLI. Any ideas? Thanks for the help so far. I don't understand. Maybe SLI just sucks for newer games and/or cards.
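For reference, the scaling those 3DMark scores imply can be worked out in a few lines. Python is used here purely as a calculator; the scores are the ones quoted above, and the 60-80% figure is a rough rule of thumb, not a measurement:

```python
# Rough SLI scaling check using the 3DMark scores reported above.

def scaling_gain(single_score: float, sli_score: float) -> float:
    """Percentage improvement of the SLI score over the single-card score."""
    return (sli_score - single_score) / single_score * 100.0

single = 11250  # one 8800GTS G92
sli = 12400     # two in SLI

print(f"SLI gain: {scaling_gain(single, sli):.1f}%")  # SLI gain: 10.2%
# A well-scaling SLI rig in a GPU-bound test is often quoted at 60-80%;
# ~10% usually points at a bad driver profile or a CPU/settings limit.
```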
April 7, 2008 10:20:13 PM

When you say "latest drivers" do you mean the 169.25 or the 174.74?
April 7, 2008 10:24:43 PM

While I think of it, have you tried running the 32-bit version? I came across this thread, and even though it's old, it might be worth a try.
April 7, 2008 10:28:21 PM

Hi Ryanthesav.

Sounds like you may need to tweak the Nvidia control panel settings a little.

Under the NVIDIA control panel, go to Manage 3D Settings and then Program Settings. Find Crysis under "select a program to override." Find "SLI performance mode." Try changing that setting to "Force Alternate Frame Rendering 1" or "Force Alternate Frame Rendering 2." I've found that AFR mode 2 works best for me. Also, at that resolution you will want to make sure anti-aliasing is set to "application-controlled" and shut it off in the game settings. You will also want to make sure your anisotropic filtering is set no higher than 4x or 8x.
April 7, 2008 10:54:56 PM

SLI is overrated. You bought into marketing hype.
April 7, 2008 11:28:37 PM

jpmeaney, thank you, thank you, thank you! I gained about 10 to 15 fps on average by changing the SLI performance mode. I wish there were more people like you on this forum. Thank you again.
April 7, 2008 11:38:28 PM

roadrunner197069 said:
SLI is overrated. You bought into marketing hype.


Sssh, if it wasn't for morons like the OP with more money than sense, nVidia would not be able to create the excellent single-GPU solutions at cheap prices.

I say let them buy SLI, one card, two cards, THREE cards, <lightning flash and thunder clap> A HAHAHA!
April 7, 2008 11:39:35 PM

^ Does it look better? Or play better?

@doomsdaydave11: Every game has its magical FPS number, and after that number is met the rest is waste. Your eye can't tell the difference.
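As a quick illustration of that "magical FPS number" idea: the eye responds to frame *time*, and each extra chunk of FPS buys fewer milliseconds the higher you go. A small sketch (the frame rates are arbitrary examples):

```python
# Convert frame rates to per-frame render times to show diminishing returns.

def frame_time_ms(fps: float) -> float:
    """Milliseconds spent on each frame at a given frame rate."""
    return 1000.0 / fps

for fps in (20, 30, 55, 60, 120):
    print(f"{fps:>3} FPS -> {frame_time_ms(fps):5.1f} ms/frame")
# Going 20 -> 30 FPS saves ~16.7 ms per frame; 60 -> 120 saves only ~8.3 ms,
# which is why gains past a certain point stop being visible.
```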
April 8, 2008 12:04:36 AM

It looks the same, just with better fps. In parts where I used to get 35 fps I now get about 55 fps, and in parts where I would get 20 fps I now get about 30 fps. I don't think many people know about this tweak; at least I didn't. Bf2gameplaya, this second card was from my tax refund; I'm not rich, take it easy, bro.
April 8, 2008 12:15:31 AM

I've had really good SLI experiences (7600GT and 8800GTS 640MB). Overhyped? Most definitely. Worth it? Up for debate.
April 8, 2008 12:32:33 AM

I think you're all wrong--I think he is bottlenecked by his CPU.

April 8, 2008 12:52:44 AM

I think in many ways you're right, hesskia. I'm waiting for the Phenoms to be overclockable to 3GHz, and then I'll go quad.
April 8, 2008 7:56:55 AM

Glad you got it sorted, ryan. I don't think SLI is a waste; you can achieve good results with it. Now that you're getting higher FPS, you can add a little extra eye candy for your gaming pleasure ;) 

If only I had a larger monitor to make it worth my while getting another 8800GTS 512.
April 8, 2008 8:27:40 AM

doomsdaydave11 said:
On the GPU charts @ Tom's Hardware... they really need to add Crysis. It's the game that everyone benchmarks by.

For single cards, yeah, but for SLI/CF it isn't worth benching; Crysis has horrendous scaling.
April 8, 2008 10:01:49 AM

SLI/CF is da bomb; it's all this quad core that you can barely use that's overrated. In my opinion, SLI or Crossfire is much more productive than a quad-core CPU.
April 8, 2008 11:01:08 AM

hesskia said:
I think you're all wrong--I think he is bottlenecked by his CPU.

Absolutely not, not even close!
Other than 3DMark, you wouldn't be able to tell the difference in CPU with the games I play.
SLI does give a lot more, but as it's generally in not massively noticeable areas, it can go somewhat unseen.
With CPUs where they are now, we aren't even close to being bottlenecked by a low-end dual core, let alone a 6000+. Yes, I know, it's not as good as Intel CPUs, but you're never going to notice the difference in game. I get up to 350 FPS in CoD4 with everything maxed @ 1280x1024; hardly CPU limited, am I? :D 
April 8, 2008 1:05:10 PM

1280x1024... makes me wonder why I didn't take the LG 1900 Fantasy series... :) 
April 8, 2008 2:45:33 PM

roadrunner197069 said:
SLI is overrated. You bought into marketing hype.

Overrated? Here we go again. :sarcastic: 

SLI is the way to play Crysis. Have a look at my screenie. Is 29 fps for SLI vs 16 fps for a single 8800GT overrated?
http://img101.imageshack.us/img101/287/crysis8800gt1vs2...


April 8, 2008 6:24:34 PM

"you wouldn't be able to tell the difference in CPU with the games I play" This is true if you are not playing Crysis, which from what I've heard uses all four cores if given them. I heard it uses the CPU for physics calcs and lets the GPU worry about other stuff.
April 8, 2008 6:35:58 PM

"Overrated? Here we go again. :sarcastic: 

SLI is the way to play crysis. Have a look at my screenie. Is 29 fps for SLI vs 16 fps for a single 8800GT overrated?
http://img101.imageshack.us/img101 [...] vs2ee2.jpg "

It's the same for Lost Planet, BioShock and many other power-hungry games. I don't see why everyone hates SLI. I think it's because of the memory that it seems like such a waste: two cards and still only 512MB of memory. In some ways I see why people are upset.
April 8, 2008 9:20:57 PM

ryanthesav said:
I don't see why everyone hates sli.


Jealousy is a cruel mistress. If they can't afford it, then they have to hate. :lol: 
April 8, 2008 9:26:00 PM

pauldh said:
Overrated? Here we go again. :sarcastic: 

SLI is the way to play crysis. Have a look at my screenie. Is 29 fps for SLI vs 16 fps for a single 8800GT overrated?
http://img101.imageshack.us/img101/287/crysis8800gt1vs2...


That image looks like you are running at a low resolution with zero anti-aliasing. So you are excited about getting 29FPS with two video cards in low resolution with low image quality?

I wish I had such low standards. So does nvidia.
April 8, 2008 9:31:30 PM

bf2gameplaya said:
That image looks like you are running at a low resolution with zero anti-aliasing. So you are excited about getting 29FPS with two video cards in low resolution with low image quality?

I wish I had such low standards. So does nvidia.

[:turpit:2] What a good judge of settings. It is 1680x1050, all high details (max in-game under Windows XP), with 2xAA and 16xAF. You were so close :lol: 

Plus it's one of the most GPU demanding levels of Crysis. I invite you to try those settings on any single GPU in the Paradise Lost level.
April 8, 2008 10:38:49 PM

Some people rave about getting a quad core, then trash multi-GPU setups. There are maybe 10 apps people actually use that see from a quad the kind of scaling a multi-GPU setup delivers, while there are lots more than ten games that scale on the GPUs. I'm thinking it's more than jealousy here; it's a lack of understanding. People wail on about getting a quad, which can come close to the price of another GPU, mainly because of the excellent overclocking on even the cheap CPUs. But people will swear by them, and yet, if you really look at what kind of true performance you get, it's minimal compared to an SLI/CF setup. Why this is so accepted is beyond me, other than fanboyism. I'm saying this as it applies to gaming, so keep that in mind. There are a lot more games that benefit from SLI/CF than from a quad core.
April 8, 2008 11:10:07 PM

Mousemonkey said:
Jealousy is a cruel mistress. If they can't afford, then they have to hate. :lol: 

I agree... don't knock it till you actually have it!!!
April 8, 2008 11:40:41 PM

bf2gameplaya said:
That image looks like you are running at a low resolution with zero anti-aliasing. So you are excited about getting 29FPS with two video cards in low resolution with low image quality?

I wish I had such low standards. So does nvidia.

Since when did "low" have shadows?
April 9, 2008 12:01:16 AM

I have almost exactly the same setup, and I have the exact same problem, except I'm at 1920x1080: I see little to no difference with SLI on or off in Crysis. I'll have to check out the fix of playing with the SLI performance modes. I did update to the beta 174.74 drivers, and that gave me a bit of a boost, but it's still not where I'd like it to be. As for whether it's worth it: in most games, yes. I used to have a single 8800GTS 640, and that was starting to struggle at 1920x1080, not to mention it was running crazy hot.

bf2gameplaya: learn to click on the image to see the unscaled version of it. I suspect you looked at a scaled version of the screenshot and failed to notice that it was at about half its original size.
April 9, 2008 12:20:18 AM

E36_Jeff, there are four modes that you can choose under the SLI performance mode setting: Force Alternate Frame Rendering 1, Force Alternate Frame Rendering 2, split frame rendering and single GPU; some work better than others depending on the game. The funny thing that NVIDIA doesn't tell you, for some reason, is that you have to change this setting to one of the first three from a default called single GPU?? WTF!? Why would you ever want an SLI setup while only using one single GPU? This changed everything for me and will for you too. Have fun and happy gaming.
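To make those rendering modes concrete, here is a toy sketch of the frame/screen split each one implies. This is purely conceptual, not how the driver is actually implemented:

```python
# Toy illustration of the SLI modes discussed above: AFR hands whole frames
# to the GPUs in turn, SFR splits each frame between them.

def afr_assignment(num_frames: int, num_gpus: int = 2):
    """Alternate Frame Rendering: frame i goes to GPU (i mod num_gpus)."""
    return [frame % num_gpus for frame in range(num_frames)]

def sfr_assignment(num_gpus: int = 2):
    """Split Frame Rendering: each GPU renders one slice of every frame."""
    return [f"GPU{g} renders slice {g + 1} of {num_gpus}" for g in range(num_gpus)]

print(afr_assignment(6))  # [0, 1, 0, 1, 0, 1]
print(sfr_assignment())
```

The "single GPU" default effectively skips both assignments, which is why the second card sat idle until the profile was changed.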
April 9, 2008 1:37:09 AM

The 169 drivers had a bug reporting the SLI mode in the game profiles; they just said single GPU even if it was AFR, etc. But the 171.23 betas (and newer, I believe) do report the correct SLI mode used.

With the 169.xx drivers these games actually do use single card mode:
Games with Single GPU SLI Specific Profile:
Sims 2 – all versions
Hitman Contracts
Thief 3
Clear Sailing
Fire Truck
City of Hero’s
FIFA Soccer - All Versions
NASCAR SimRacing
Grand Theft Auto: San Andreas
Boiling point – Road to Hell
Dungeon Lords
Tiger Woods – all versions
NHL – all versions
Neuro
Heroes of might & magic 5
Blitzkrieg II
Draken: Order of the Flame
Cossacks II
Heroes of the Annihilated Empires
Medievil : Total War
Arx Fatalis
Juiced
Priston Tale
Maelstrom
Instinct
Myth War
Voyage Century Online
HuangE Online
Infernal
Dragon Throne & Expansion
Chinese Paladin 4

But all the others (vast majority) have SLI enabled despite saying single card.
April 9, 2008 4:21:40 PM

ryanthesav said:
E36_Jeff, there are four modes that you can choose under the SLI performance mode setting: Force Alternate Frame Rendering 1, Force Alternate Frame Rendering 2, split frame rendering and single GPU; some work better than others depending on the game. The funny thing that NVIDIA doesn't tell you, for some reason, is that you have to change this setting to one of the first three from a default called single GPU?? WTF!? Why would you ever want an SLI setup while only using one single GPU? This changed everything for me and will for you too. Have fun and happy gaming.

Whilst I'm not 100% sure on this I think the single GPU setting just means that the two cards get 'seen' as one single card and the rendering mode is left to whatever the application is coded for.
April 9, 2008 6:45:52 PM

I'm still so happy about this. I just got 13000 in 3DMark! When I get the 9850 I hope to get close to 15000.
April 9, 2008 6:54:08 PM

ryanthesav said:
I'm still so happy about this. I just got 13000 in 3DMark! When I get the 9850 I hope to get close to 15000.

[:mousemonkey:2] I reckon you will get high 16Ks to low 17Ks, as 15K can be had with a 6850 @ 3.4GHz and a pair of non-overclocked 8800GTs.
April 9, 2008 7:31:03 PM

ryanthesav said:
I'm still so happy about this. I just got 13000 in 3DMark! When I get the 9850 I hope to get close to 15000.

That's pretty impressive; my system only hits 13k and I should beat you! ;) 
That's under XP Pro as well...
April 9, 2008 7:57:51 PM

LukeBird said:
That's pretty impressive; my system only hits 13k and I should beat you! ;) 
That's under XP Pro as well...

Try giving that CPU of yours a tweak if you want to see higher scores in 3d '06.
April 9, 2008 7:58:20 PM

My proc is about the same performance as yours, OP (according to TH); it's a C2D E6600, and I was CPU bottlenecked until I overclocked it. I too have a pair of 8800GTs, and the main proc makes a huge difference. After a 25% CPU OC, the second 8800GT now gets a lot more action. My 3DMark06 scores went from the low 11s to the low 14s, with non-OC'd 8800GTs. Whoever says you're not CPU bottlenecked is guessing, and getting it wrong. If you can OC your AMD 6000+ a little more....
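The bottleneck argument being made here can be expressed as a simple min() model: the frame rate you actually see is capped by whichever side, CPU or GPUs, finishes its share of each frame last. Every number below, including the 0.8 SLI efficiency factor, is a made-up illustration:

```python
# Simple bottleneck model: visible FPS = min(CPU limit, combined GPU limit).

def effective_fps(cpu_fps: float, gpu_fps_per_card: float, cards: int,
                  sli_efficiency: float = 0.8) -> float:
    """Cap the GPUs' combined rate (under an assumed scaling factor) by the CPU."""
    gpu_fps = gpu_fps_per_card * (1 + (cards - 1) * sli_efficiency)
    return min(cpu_fps, gpu_fps)

# One card: GPU-bound at 36 FPS. Add a second card behind a 40 FPS CPU
# limit and you only see 40 FPS, not the ~65 the two GPUs could deliver.
print(effective_fps(cpu_fps=40.0, gpu_fps_per_card=36.0, cards=1))  # 36.0
print(effective_fps(cpu_fps=40.0, gpu_fps_per_card=36.0, cards=2))  # 40.0
```

Under this model, raising the CPU limit (overclocking) lifts the cap, which matches the 3DMark jump described above.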
April 9, 2008 8:05:18 PM

I agree with ononewl on the CPU bottleneck. You can also try testing your CPU at a lower speed to see how much more it bottlenecks you.
April 10, 2008 9:27:53 AM

Mousemonkey said:
Try giving that CPU of yours a tweak if you want to see higher scores in 3d '06.

Yeah, I've thought about it; I just don't think it's worth the effort! :D 
ononewl, I agree with you to an extent (it was me that said you wouldn't be CPU bottlenecked with a 6000+), but you're only going to be bottlenecked in 3DMark. I am in no way really CPU limited in game... ;) 
April 14, 2008 10:28:38 PM

The SLI memory issue is all a matter of perspective, I would say. Correct me if I'm wrong, but each card uses its own memory, right? So although you *ONLY* have 512MB of video RAM available, you're still using 1GB, 512MB per GPU.

If I have that totally wrong, feel free to correct me.
April 21, 2008 11:45:54 PM

GPUs are already extremely highly parallelized. How can someone reasonably think games can't scale well to multiple GPUs?

The SLI memory issue is somewhat related to memory bandwidth. Take alternate frame rendering, for example: the same graphics data still has to travel across the bus to the GPUs. Each GPU essentially has more time (double) to process the data and send it back, but the data transfer doesn't get any easier. It is entirely possible that the system simply can't pump that data across the necessary buses fast enough to keep up with the GPUs.

Also, the memory between the two GPUs isn't shared, so tasks essentially have to be virtually independent. This leads to a lot of repetitive data being stored on both GPUs, so it is certainly different from having double the memory on one GPU. A single GPU with double the memory and clock speeds would obviously be preferable, but certainly much more difficult (and maybe impossible) to build.
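That mirrored-memory point can be sketched in a couple of lines (a simplification; real allocation is messier, since not every buffer is duplicated):

```python
# Mirrored memory in SLI: textures and geometry are duplicated on each
# card, so the usable pool stays at one card's size even though the
# installed total doubles.

def sli_memory_mb(per_card_mb: int, cards: int):
    """Return (installed total, usable pool) assuming full mirroring."""
    return per_card_mb * cards, per_card_mb

total, usable = sli_memory_mb(512, 2)
print(f"Installed: {total} MB, usable for assets: {usable} MB")  # 1024 / 512
```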

I think SLI is great. It's not only nice for the bleeding-edge buyers who want the only thing better than the fastest card available (two of them), but it's also nice for every consumer in general. Graphics rendering, and many other general computations, are highly parallel and can scale very nicely across multiple GPUs. There are lots of games that play better on two 9600GTs in SLI than on a single 8800 Ultra. And guess which setup is cheaper, both in market price and production cost?

That's not to say there aren't issues. Drivers still obviously need attention, and more flexibility would be nice, similar to CrossFireX. Also, I don't really like the narrow motherboard choices.