In the first level of Crysis things are a bit choppy even when my FPS is 60+. With the settings below, during the GPU benchmark I get a minimum FPS of 33 and an average FPS of 65. I've run Orthos & Prime95 stress tests for 24 hours, so it's a very stable system. My CPU benchmark is almost identical to my GPU benchmark, around 50+ FPS. My monitor has very fast response times; I've never had ghosting problems in the two years I've had it. Can someone rank the in-game graphics settings from most to least taxing?
I dunno, you shouldn't be having this problem with your setup. Does it seem like the type of choppiness that might be associated with a lack of RAM? (sudden pauses, then a sudden return to normal frame rate)
haha, is that a crysis screenshot you've got for a background too??
Damn, I don't know how, but one of my PCI-E power cables has a short in it. What I do when strange behavior happens is wiggle all my power & data cables, and when I wiggled one of my PCI-E cables the video started to wig out a little with strange colors. Ahh, time to RMA a PCI-E cable; rare, but it does happen. Hope I didn't hurt my 8800GTX.
Now Crysis runs butter smooth using one of my other PCI-E cables. Thanks everyone for your replies.
I get choppy FPS in the videos too, but that's strange, because your 3DMark score is higher than mine and you still play on higher settings than I do with about the same FPS.
With Vsync turned off I see 95 FPS with an average of 70+ FPS, but I get image tearing, which is a bummer. I wonder when Nvidia will start building triple buffering instead of double buffering into their graphics drivers? I was able to enable it on my 7800GTX 256MB card; without triple buffering my FPS in most new games was capped at 39 FPS, but with DXtweaker & NvTray I was able to jump past that cap to over 50 FPS.
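For anyone wondering why the cap exists: here's a rough sketch of the usual explanation. The function names and the simplified model are my own, just for illustration; it assumes a fixed render time per frame and ignores driver queueing. With Vsync and double buffering, a finished frame has to wait for the next refresh, so the displayed rate snaps down to an integer divisor of the refresh rate (60, 30, 20... at 60 Hz). With triple buffering the GPU keeps rendering into a spare buffer, so the display rate is just the render rate capped at the refresh rate.

```python
import math

def double_buffer_fps(refresh_hz, render_fps):
    # Under Vsync + double buffering, each frame waits for the next
    # refresh boundary, so the effective rate is refresh / N where N is
    # the number of refresh intervals a frame takes to render.
    frame_time = 1.0 / render_fps
    refresh_interval = 1.0 / refresh_hz
    intervals = math.ceil(frame_time / refresh_interval - 1e-9)
    return refresh_hz / intervals

def triple_buffer_fps(refresh_hz, render_fps):
    # With a third buffer the GPU never stalls waiting on the swap,
    # so the displayed rate is the render rate, capped at refresh.
    return min(render_fps, refresh_hz)

# A card rendering at 50 FPS on a 60 Hz monitor:
print(double_buffer_fps(60, 50))  # snaps down to 30.0
print(triple_buffer_fps(60, 50))  # stays at 50
```

In this simplified model, any render rate just under the refresh rate gets halved by double buffering, which matches the feel of a game dropping from ~60 to 30 the moment a scene gets heavy.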
That tweak doesn't work on 8800-series cards; I wonder if Nvidia caught wind of people using it. That tweaker let me put off my upgrade for a whole year, which would hurt Nvidia's sales of all their cards if everyone were using it. Bet that's why Nvidia doesn't bother doing it in their drivers.