Core 2 Duo E6600 2.4GHz @ 3.1GHz
2GB Kingston HyperX RAM
Western Digital Raptor
OCZ 600W GameXStream PSU
BFG 8800GTX 768MB OC Edition
Okay, I bought my current setup last January and have had no problems, aside from my first Asus Striker Extreme being a bit faulty: it would decide on its own whether the system would POST and boot. That's been fixed.
Recently, in the past week, I've overclocked my E6600 because I've been getting bad-ish framerates in Crysis. I know it's a very demanding game. A friend of mine currently sports an 8800GTS 6XXMB with a slower processor and a not-as-good motherboard (I forget his model, but he told me =P), and he gets better FPS than I do at higher Crysis settings. The only advantage in his system is 4GB of RAM; big deal. I have to run the game at 1024x768 with everything on Very High except shadows/shaders, which pretty much have to be on Medium, and no AA, and I average 18 FPS. He runs it at 1280x1024 with everything on Very High and 4x AA, and it runs a lot better.
Now, this is where my curiosity began, and I did some research. I currently have the Crysis beta drivers NVidia has on their website; I put these on after trying the newest ForceWare drivers, so there aren't any driver issues. I recently did the latest BIOS update on my motherboard as well, version 1305. My card is set at x16 PCI-E in the PCI-E slot... the BIOS isn't an issue...
I have Everest Ultimate Edition and stumbled across something VERY bizarre.
My clock speeds are reading lower than stock. Now I'm pissed. No wonder my Vista Experience Index reads 5.6 for my 8800GTX when a Google search shows everyone else's reads 5.9. And if anyone knows... 0.3 is actually a big difference in the Vista index =/.
So, my journey led me to call BFG Technology, which is my video card manufacturer. The representative was stumped and could only think it might be my power supply: their 8800GTX OC Edition requires 32A total between the two rails (cables) that connect to the card. I forgot the model of my PSU while on the phone, and we both agreed that was most likely the problem, that the card wasn't getting enough juice. After I got off the phone I found my model is the OCZ 600W GameXStream PSU, which has 18A per 12V rail. Two 12V rails connect to the card, giving me a total of 36A, 4A more than required. Perfect...
I was thinking maybe I could force the stock clock speeds with an overclocking utility; if the PSU really were the problem, the system would obviously become unstable from the components being underpowered, which would surely narrow it down. I tried downloading RivaTuner 2.5, only to have issues with disabling driver signing (64-bit Vista...). I downloaded Xpertool, only to find the "Performance" tab missing from the taskbar icon. Weird. Anyway, I said heck with it, I'll try nTune. Downloaded nTune and did the following: Performance ---> Adjust GPU Settings
...The option highlighted read "Factory Shipped Clock Frequencies". I clicked the "Custom clock frequencies" option and began to move the sliders. They aren't moving! For some reason they're stuck where they are. I can adjust the fan setting manually, but not the clock settings... Maybe it's another 64-bit Vista problem.
Well, I'm honestly really confused; has anyone else had similar problems? I could use all the help I can get. Running the card at stock speeds isn't too much to ask, especially when it originally cost $800... a month's worth of rent.
I have the same card and mobo as you do, and mine scores 5.9 in the Vista Experience Index like you said. Two of my friends bought the same card after I did because I liked it so much; both get 5.9. I think you're going in the right direction: the PSU may be having an issue. Do you have another PSU to swap in and check against, or a friend's system you could try your video card in to see what the results are?
Second, 18A + 18A does not always equal 36A in PSU maths. Your PSU has 4x 12V rails at 18A each (do not confuse rails with leads), but you only have 580W combined across all the 12V, 5V, and 3.3V rails. If all of that went to 12V, that would be 580/12 ≈ 48A; if you're drawing the maximum from your 5V and 3.3V rails, you'll have approx 36A left on 12V (coincidentally).
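To make that arithmetic concrete, here's a rough sketch in Python. The 18A-per-rail and 580W combined figures come from this thread (check the label on your own unit); the 130W drawn on the 5V/3.3V side is just a hypothetical figure for illustration.

```python
# Back-of-the-envelope PSU maths. Figures below are from the thread,
# except other_w, which is a hypothetical 5V/3.3V load.
rail_rating_a = 18   # each 12V rail's individual limit, in amps
num_rails = 4
combined_w = 580     # combined 12V/5V/3.3V limit printed on the label

# Naive per-rail sum -- NOT what the PSU can actually deliver:
naive_a = rail_rating_a * num_rails            # 72 A

# Upper bound on 12V current if nothing is drawn from 5V/3.3V:
max_12v_a = combined_w / 12                    # ~48 A

# With, say, 130 W already drawn on the 5V/3.3V side:
other_w = 130
avail_12v_a = (combined_w - other_w) / 12      # 37.5 A

print(naive_a, round(max_12v_a, 1), avail_12v_a)
```

The point is that the advertised per-rail ratings overlap: the combined-wattage limit, not the sum of the rail ratings, is what bounds how much 12V current the card can actually draw.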
Third, what scores are you getting in something that actually tests the card, i.e. 3DMark etc.? You should get approx 10,000 in 3DMark06 and 14,000 in 3DMark05 (ballpark anyway). I know my old 7900GTO used to lower its clock speeds when not under load; not sure if my 8800GTX does or not.
It could be a 64-bit issue; you do have 64-bit drivers?
What temps are you running at under load, CPU and GPU? This could be really important.
November 1, 2007 6:15:54 PM
f8 when booting 'Disable driver signature verification'
f8 when booting 'Disable driver signature verification'
what does that do?
November 1, 2007 6:27:29 PM
I tryed downloading RivaTuner 2.5, but only to have issues with disabling driver signing. (64bit Vista...). !
It lets programs like RivaTuner or CoreTemp run on your Vista machine. If you don't do this first, you'll get endless error messages about something not starting up right (at least with CoreTemp) that won't stop until you Ctrl+Alt+Del. It won't fix his video card, but maybe getting into Riva will?
Riva and CoreTemp might be key to fixing this. I've not used Vista, so I had no idea what he was talking about regarding Vista not letting him install; I assume this is down to Vista's security features.
13th Monkey, WRX, nice! I'm currently downloading 3DMark06 and I'll post the scores in an hour and a half or so; we'll see what we get and can continue from there.
kpo6969, that's what I was beginning to think: that the card is adjusting performance levels based on how much power it has available. You'd honestly think Vista would at least warn me with something like "insufficient power for your GPU -adjusting performance levels-"...
We'll get this figured out. I've done so many Google searches and haven't found anyone with my identical problem. First time for everything, I guess. I hope this thread ends up helping others who run into this problem as well.
Wow. 3DMark06 Basic Edition on defaults gave me a score of 5433. Could be these Crysis beta drivers, though... but still. I'll get the ForceWare drivers and test once more. Here are the temperatures under load.
Damn I was hoping it was temp related. PSU should be ok, and you should have enough power.
How are you loading the CPU cores? Can you use TAT (the Thermal Analysis Tool from Intel)?
Late here now (UK), will think over tomorrow.
CPU-Z would be good. It could be as simple as your GPU being underclocked and needing to be set back to standard settings, but then why won't it? Some kind of safe mode? I'm rambling (post-pub); perhaps someone can take an idea and run with it.
How about GPU-Z, the GPU version of CPU-Z (obviously)? It's new, but it might tell you something.
Get Riva up and running and I can compare domain speeds against my GTX tomorrow if you want.
Can you get the breakdown of CPU, SM3, and SM2 scores (or whatever 06 breaks it into)?
I've just had a daft thought... doesn't Vista assign an amount of RAM to match the VRAM? (It was a fault reported on AnandTech that MS was working on.) Hence you'd really only be running with about 1.25GB of RAM for general use. May not be the answer, but a consideration.
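A quick sanity check of that squeeze; the figures match the system in this thread, but how much Vista actually reserves depends on the driver and patch level, so treat this as illustrative only:

```python
# Rough illustration of the RAM/VRAM overlap described above.
ram_mb = 2 * 1024    # 2GB of system RAM
vram_mb = 768        # 8800GTX frame buffer

# If the OS reserves address space matching the VRAM, roughly this
# much system RAM is left for general use:
usable_mb = ram_mb - vram_mb
print(usable_mb, usable_mb / 1024)  # 1280 MB, i.e. 1.25 GB
```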
Does it perform like this with non-beta drivers? And are you using 64-bit drivers?