Closed

Desperate for Help

Tags:
  • Nvidia
  • Graphics
Last response: in Graphics & Displays
October 23, 2011 7:37:50 PM

First time poster, long time visitor.

Hey guys. I've searched everywhere on the internet, asked everyone I know, and tried everything I can think of, but I can't get this to work. My issue is GTX 560 Ti SLI. In the NVIDIA Control Panel, under "Set SLI and PhysX Configuration," the SLI Configuration menu gives me two options: "Maximize 3D Performance" and "Disable SLI." Every single time I select the radio button for "Maximize 3D Performance," my output crashes. Either the monitor loses signal from the PC and I have to force a shutdown, or the driver crashes and the resolution drops to 800x600, or the screen goes black for a minute and, when it recovers, both cards go unrecognized in Windows, Speccy, and Sandra, and the Windows Device Manager lists them as not functioning, driver missing, and several other malfunctions.

Here is the report from the Device Manager on one of the cards after a crash:

Problem Event Name: APPCRASH
Application Name: nvcplUI.exe
Application Version: 3.8.812.0
Application Timestamp: 4e390858
Fault Module Name: NVCPL.DLL
Fault Module Version: 8.17.12.8026
Fault Module Timestamp: 4e3912ac
Exception Code: c0000005
Exception Offset: 000000000031bea7
OS Version: 6.1.7601.2.1.0.256.1
Locale ID: 1033
Additional Information 1: ae71
Additional Information 2: ae71205663998323bddbf609ff81eeef
Additional Information 3: 4214
Additional Information 4: 4214f84d53e74ea0622c7110f47bfc42
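For anyone reading along, the report above is just "Key: value" lines, and the interesting field is the exception code. A minimal sketch of pulling it apart (the trimmed report text below is copied from the post; the access-violation meaning of 0xC0000005 is the standard NTSTATUS interpretation):

```python
# Sketch: parse a Windows APPCRASH report like the one above into a dict.
# REPORT is a trimmed copy of the fields from the post.

REPORT = """\
Problem Event Name: APPCRASH
Application Name: nvcplUI.exe
Fault Module Name: NVCPL.DLL
Exception Code: c0000005
"""

def parse_report(text):
    """Split 'Key: value' lines into a dictionary."""
    fields = {}
    for line in text.splitlines():
        if ":" in line:
            key, _, value = line.partition(":")
            fields[key.strip()] = value.strip()
    return fields

fields = parse_report(REPORT)

# 0xC0000005 is STATUS_ACCESS_VIOLATION: the control panel applet touched
# memory it did not own while toggling SLI - a driver/software crash, not
# by itself proof of bad hardware.
is_access_violation = int(fields["Exception Code"], 16) == 0xC0000005
```

In other words, the crash is happening inside NVIDIA's own control panel code, which is consistent with a driver or configuration problem rather than a definitive hardware fault.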

I don't know if that will help anyone, but I want to give you guys the most information I can. I am a software tester by trade, so I understand the concept of Trouble Reporting. Here is my system:

ASUS P8Z68-V PRO Motherboard
Corsair 750W Gold-Certified PSU
i5 2500k (usually run at 4.0GHz; I tried to enable SLI at 3.3GHz, 3.7GHz, and everything in between up to 5.25GHz... I've never gone higher than 5.3GHz)
8GB DDR3 PC12800 G-Skill Sniper 1.25 Volt RAM 2X4GB 9-9-9-24 (Running XMP at 1600, but tried at 1333 too)
Mushkin Chronos 120GB SSD on SATA Gen III 6.0Gb/sec. Running OS. (Currently 19.9GB Used, 91.7GB Free)
OCZ Agility III 60GB SSD on SATA Gen III 6.0Gb/sec (55.8GB free; drive is currently empty and unused)
1TB 7200RPM HDD SATA Gen III 6.0Gb/sec (903GB Free of 931) -I just use this for storage
1TB USB 3.0 Seagate External HDD
Generic Optical Drive on SATA Gen II 3.0Gb/sec
12 in 1 Card Reader (not used)
CoolerMaster HAF 912 Case
ASE Tech 510LC 120MM Radiator and Fan
Killer 2100 Network Card
PNY GTX 560 Ti OC

A Few Benchmarks:

-With that setup, for WEI I had a 7.7 for CPU and 7.9 for everything else. (Note: CPU stock is 7.5, no matter how high I overclock it won't go past 7.7. And the PNY GTX 560 Ti OC was a 7.8/7.8 until I overclocked to 900MHz or so, up from 850.) I know WEI doesn't matter but since I'm going in depth I might as well go all out.
-The PassMark Performance Test 7.0 from passmark.com is as follows-
CPU: 10109.1 points
2D Graphics: 661.4 points
3D Graphics: 2697.4 points
Memory: 3295.1 points
Disk: 3823.1 points
Overall System PassMark Rating: 3859.7
-3DMARK05: 37356
-According to SI Sandra, my system is rank 920 in their samples, and in the 94th percentile

Now, I know you guys don't need all that information, but there it is anyway. I guess it shows I really care about this rig.

Three days ago I purchased a GeForce GTX 560 Ti. I learned from researching your website that I could SLI the two cards because they have the same RAM, processor cores, etc. I did not need to use the power adapters; my PSU actually has four of the 6-pin PCI power plugs, and two of them even have the extension for 8-pin. So I set everything up, but then I wasn't sure about the external hookup. Do I just plug in the master card? Plug in both? I searched the internet for SLI with only ONE monitor and couldn't find anything! I still can't believe it. It would make sense that both cards need to be plugged in, but my monitor only has one DVI-D and one VGA input. I'm a third-degree NOOB at SLI, so please bear with me. Since I couldn't find an answer, and since both cards came with DVI-to-VGA adapters, I used one of them on the second card so that both cards were hooked to the monitor. Nope, that didn't work, and it's obvious why.

After 2 hours and 849 curse words, I got to the point where I was messing with the Z68 chipset graphics, but I'm so confused! In the BIOS they have the iGPU option, PCI/iGPU, PCI/PCIE, and PCIE/PCI for boot, plus other options, and I tried every configuration possible. I messed with that Lucid Logix program, but since I have one monitor and I know the iGPU shouldn't be the primary for SLI, I turned it off. I have a Dell 1909W flatscreen, but it's a few years old and doesn't have HDMI, so I went out last night and acquired a 24-inch LCD with HDMI and DVI-D input. I haven't taken it out of the box yet because I'm afraid I'll punch a hole in the screen. I'm so frustrated I want to take the new card back to the store and make the guy eat it. I wouldn't do that, though; it's not his fault. I know the standard answer would be "Why didn't you just get a GTX 580?" Well, I should have.

I originally had an HD 6770, but I sold that and upgraded to the 560 Ti. Guys, I'm at the end of my rope. The P8Z68-V is a fairly common motherboard now, so hopefully someone will be nice enough to tell me how to get this working. Yes, I tried different drivers: I enabled, re-enabled, and uninstalled; took both cards out, completely uninstalled all things NVIDIA, and did a fresh installation; tried the 280 driver, the 285 driver, and the driver that came with the card... I give up. This is my last hope.

Well I'm about to hit 1000 words on this question. Sorry it's so long. I'm just desperate for help and wanted to give as much information as possible. To whoever helps me....I thank you with my whole heart and wish you the best.

Sincerely,
~Christian


October 23, 2011 7:56:14 PM

Dude,

1) SLI Cards power one monitor. They work together to boost frame rates but you do not plug both cards into one monitor.

2) You absolutely must provide power to both cards. Make sure that your PSU is plugged into all the power ports available on both cards.
October 23, 2011 8:02:35 PM

Ditto.

Plug the monitor into the primary output of the primary card. Make sure you have your SLI bridge installed (I didn't see a mention of it, and they don't always ship with mobos). Make sure all power is connected, and that your power supply can deliver enough amperage (separate from the wattage rating) to both cards. I believe it should be enough, but double-check.
October 23, 2011 8:06:32 PM

It seems to me that if you have the option to disable SLI then SLI must be working on your system. Check your device manager to make sure that both your cards are present. If they are then that is all good. If your SLI was not working then the NVIDIA panel should say Enable SLI. The fact that it is saying Disable SLI means it is actually working. You will not notice any difference during normal use. It is only when you are playing intense 3D games etc that your SLI is going to come into its own.

Plug the top card into your monitor. Make sure SLI says Disable, not Enable. Make sure your cards are powered up by having the right PSU plugs in the right power inputs on both cards. Stop selecting Maximize 3D Performance; that is not the "go" button for SLI. As above, make sure your SLI bridge is installed.

Enjoy your gaming.

Best solution

October 23, 2011 8:35:00 PM

Let's relax and go back to basics. We need to eliminate as many possible causes for the crashes as we can, so when we test something, we know it's on a good foundation. It will be a longish path, but remember you've done a lot of stuff we weren't there to see, so let's clear the slate.

First a couple of status questions:

1) Please provide the brand and model numbers of your 2 video cards.
2) Please download GPUz from this site:

http://forums.tweaktown.com/gigabyte/30530-latest-overc...

Look at each video card, and tell us if you find any architectural differences between the two cards (e.g., memory size, GPU, frequencies).

Next:
1) Disable or preferably uninstall any Windows utilities that can OC either the mobo or the vid cards.
2) Download a fresh copy of the latest WHQL drivers for your video card, but DO NOT install them.
3) Uninstall all nVidia graphics software using Control Panel.
4) Power down the PC.
5) Remove one of your video cards (we'll call it "card_b"), and leave the other installed in the primary PCIe graphics card slot as identified by your mobo manual.
6) Clear CMOS (Use mobo feature or unplug from wall, remove battery, press case "power on" a few times, get a cup of coffee, reinstall battery, plug into wall.)
7) Power up and get into BIOS. Reload (optimized) defaults, making ONLY the changes - if any - that are necessary for your PC to operate correctly.
8) Boot through into Windows.
9) Install the WHQL driver you downloaded earlier.

If I missed anything to get your system back to stone-cold stock settings, please take the additional steps to do so.

At this point you could power down and try SLI (install your second card in the #2 graphics card slot as identified by the mobo manual, install the SLI bridge, and power up). However, I might run four more tests first:

1) Run Furmark Benchmark on card_a in slot 1.
2) Power down, replace card_a with card_b, run Furmark Benchmark.
3) Repeat steps #1 and #2 placing the cards in slot#2 instead.
4) Note and compare results. Anything you can't explain?

If all checks out, try SLI again.

Let us know what happens.
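The four-run swap test above is easy to lose track of, so here is a small sketch that writes out the full card/slot checklist and flags any run whose score is suspiciously low. The card and slot names are placeholders matching the steps, not real device identifiers, and the 10% tolerance is an arbitrary illustrative threshold:

```python
# Enumerate the card/slot test matrix from the steps above and flag outliers.

from itertools import product

CARDS = ["card_a", "card_b"]
SLOTS = ["slot_1", "slot_2"]

def test_plan():
    """Every card in every slot: four single-card FurMark runs."""
    return [f"Run FurMark with {card} in {slot}"
            for card, slot in product(CARDS, SLOTS)]

def outliers(scores, tolerance=0.10):
    """Return runs scoring more than `tolerance` below the best run.
    These are the results you 'can't explain' and should investigate."""
    best = max(scores.values())
    return sorted(run for run, s in scores.items()
                  if s < best * (1 - tolerance))
```

If one card underperforms in both slots, suspect the card; if one slot underperforms with both cards, suspect the slot or the mobo configuration.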





October 23, 2011 8:35:26 PM

Thank you both for the fast reply! Okay, I apologize for forgetting to tell you about the setup. Yes, I installed the SLI bridge, and I confirmed that the two separate 6-pin power inputs on each card are plugged in, for a total of four 6-pin PCI power connectors installed. Both cards display in the Device Manager, and "Disable SLI" is already selected; I should have mentioned that.

When I go to the NVIDIA website, it recognizes both cards, but they can't be in SLI mode: my 3D benchmarks are the same as or less than when I just had the one card. I only get 20 to 30 FPS in DirectX 11, and Windows thinks I have DirectX 10 even though Windows 7 comes with DX11 by default. It's always been like that for me, though; I'm not sure why. In the NVIDIA control panel, when SLI is enabled you can see and confirm it: the diagram of your setup says SLI and shows a graphical representation in the control panel (it shows up in a Google image search). What I should have said before is that the system is in "Disable SLI" by default, and the errors/driver crashes/headaches begin when I try to enable it. Is that the only way to enable SLI?

Do I have something wrong in the BIOS? The options are 1) iGPU, 2) PCI/iGPU, 3) PCI/PCIE, 4) PCIE/PCI, plus an option to enable or disable iGPU render standby. Even when I have my system on PCIE/PCI, which is what I believe is correct, it still makes me select an iGPU memory amount; the choices range from 32 to 512, and I go with the default of 64. When I try to enable SLI and the system recovers, I look in the Device Manager: sometimes one of the cards has error code 43 and the message I cut and pasted earlier, sometimes it's both.

The amperage looks good. I have many components, but the CPU is not overclocked, the external HDD is unplugged, and no USB devices are connected except keyboard and mouse, so the main concern is the PSU. The mobo has 3 PCIe lanes, 7 total PCI options. I have the Killer 2100 network card plugged into the small PCIe x1 slot at the "top" of the mobo, the first card right underneath, and then the second card in the second PCIe x16 slot. They are both running at 8x/8x, which is the way it should be. It says one of the GPUs is handling the PhysX, but they both refuse to collaborate in SLI. It has to be something stupid I overlooked, like a BIOS setting. I would literally pay money if either of you can get me working.

Again, thank you for the fast response; I greatly appreciate your help. I'm sorry you had to wait for this information. I did my best to make it so you wouldn't have to ask questions. I have one DVI-D cable running from the port on the master card to the DVI-D input on the monitor. I can try it with my new monitor if you want, using only the mini-HDMI to HDMI cable that came with the card. I'll take pictures, screenshots, or anything else you need to help you help me.

Please know that I sincerely appreciate the input from you guys. I built my rig back in September, and I have been coming to this website for about 6 months. I'm glad I finally made an account. In turn, I will help others with things I know the answer to. Thank you again.
October 23, 2011 8:40:14 PM

First, please use paragraphs to separate your thoughts lol. That will help.

Second, as your system stands right now, there are too many possibilities for problems - so I'd repeat my advice to go back to the basics I enumerated.
October 23, 2011 8:57:44 PM

TwoBoxer: I received your post while I was typing my last message. Here is the answer to question one:

1) PNY Brand. Nvidia GeForce GTX 560 Ti, XLR8, VCGGTX560TXPB-OC, 850 MHz Core Clock, 384 Processor Cores, 1700 MHz Processor Clock, 4104 MHz Memory data rate, effective. 1024 MB GDDR5, SLI/DX11/PHYSX/3D Ready.

2) NVIDIA brand, GeForce GTX 560 Ti 1GB GDDR5, SLI/DX11/PHYSX/3D Ready. I can't find the exact model number, but the box says the part number is 900-11040-2550-000. The real NVIDIA brand card, which is the one I just bought, is the same model and everything; the PNY is some B.S. "OVERCLOCKED OUT 'DA BOX" marketing scheme and runs at the frequency posted above. The NVIDIA has the same number of processor cores and the same RAM; it just comes out of the box at 810 MHz instead of 850. I was worried this would be a problem, but the websites say it will just adjust to the slower clock as long as the main parts are the same. I can mirror the NVIDIA card to match the PNY in 10 seconds in the control panel.
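Putting the two cards' listed specs side by side makes the compatibility question concrete. The figures below are copied from the post; the "runs at the slower card's clock" behavior is what the poster read on SLI guides, treated here as an assumption rather than documented fact:

```python
# Compare the two cards' listed specs for SLI compatibility.
# Numbers are taken from the post; placeholder dicts, not queried hardware.

PNY_OC = {"gpu": "GTX 560 Ti", "cores": 384, "vram_mb": 1024, "core_mhz": 850}
NVIDIA = {"gpu": "GTX 560 Ti", "cores": 384, "vram_mb": 1024, "core_mhz": 810}

def sli_compatible(a, b):
    """Same GPU, core count, and memory size; clock speed may differ."""
    keys = ("gpu", "cores", "vram_mb")
    return all(a[k] == b[k] for k in keys)

def expected_common_clock(a, b):
    """Assumption from the post: the pair settles at the slower core clock."""
    return min(a["core_mhz"], b["core_mhz"])
```

By these criteria the two cards match, and the 40 MHz factory-overclock gap should not by itself prevent SLI from enabling.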

Normally I use Newegg and I disdain Best Buy, but I just got paid, and for the first time in two months I didn't immediately need to pay bills, so I was browsing their poor selection when I thought to do the SLI. The 560 Ti is the highest model they sell, and they only carry the PNY and the NVIDIA. I was going to get the exact same card I already had, but since I was temporarily "ballin'," I paid the extra 30 dollars for the NVIDIA brand, knowing the clock speeds could be aligned and so on.

Both cards have two DVI and 1 mini HDMI out. The only difference is the NVIDIA Brand DVI's are black and the PNY's are white.

My daughter is 4 months old today and my wife needs my help, so I need to get away for a little while to take care of a few things...but I will be back in a few hours and I will execute your instructions you have and post all results in detail. Thank you.

To anyone else that replies, I will try exactly what you say and will get back to you with detailed results. I just need to help my wife right now so I must go. The only thing that will stop me from getting back on tonight is a power outage, a family emergency, or my untimely death. Thank you everyone!!! You guys are great!


October 23, 2011 9:09:59 PM

NP, mate. RL is RL.

At some point we may want to match clocks, but I wouldn't do that until we've shown they won't SLI as-is on a clean system.

And thanks for the paragraphs lol.
October 23, 2011 9:42:50 PM

http://forums.nvidia.com/index.php?showtopic=91585

According to a user on that site, error 43 indicates GPU failure at a hardware level. Have you checked that each card works on its own? Twoboxer has provided steps that will let us narrow the problem down; once you have completed those, report back.
October 24, 2011 5:01:39 AM

Man....Furmark is TOASTY!!!

When I was booting up to start troubleshooting, I tried enabling SLI again out of pure spite. The screen went black, then dropped to 800x600 resolution. I opened Device Manager and the top card was down. I right-clicked, and this time, for some reason, it let me disable and re-enable the card instead of the entire driver being wiped out of existence. After I enabled it, the resolution went back to my monitor's native 1400x900. I opened the NVIDIA control panel, heard little angels singing, and a ray of sunshine showed "SLI ENABLED" with the green bar between the GPUs!

I ran FurMark, did a few other tests, and downloaded GPU-Z; I can still smell the new cards from the heat. I found the maximum operating temperature is 100C... I hit that right at 99% complete. I put the fans on 70% and was back down in the 40 to 50C range within a minute or two. I checked, and SLI was still enabled. I will cut and paste the result:

FurMark 1.9.x (and higher) Score
OpenGL benchmark and graphics card stability test
Details for score ID = 221122

» Back To FurMark Scores List
» Get the latest version of FurMark

Score: 5512 points (91 FPS)

Submitted by RelentlessFury on October 24 2011, 5:19 am

Bench duration: 60 seconds
Resolution: 1280 x 720
MSAA samples: 0
Window mode: fullscreen

Primary renderer: GeForce GTX 560 Ti
Device ID: 0x10de - 0x1200
GPU clock: 900 MHz
Memory clock: 2100 MHz
Shader clock: 1800 MHz
Graphics drivers: 8.17.12.8026 (8-3-2011) - GL:nvoglv64
GPU temperatures (start/end): 52°C / 100°C
Number of GPUs: 2

CPU: Intel(R) Core(TM) i5-2500K CPU @ 3.30GHz
CPU speed: 3300 MHz
Operating system: Windows 7 64-bit build 7601 [Service Pack 1]

My CPU was running at 4.82 GHz, but I assume the program just lists the default frequency for the particular CPU in use. Is that a good score? It's higher than all the ones in the recent history (what looks like the past 48 hours, maybe a little less than 100 submissions) with the exception of 5 or 6 dudes, but that's probably because I ran at 720. I haven't installed my new monitor yet so that's why I didn't do 1080. You guys would know. I'm sure I could tweak the settings and get higher and turn off functions and stuff, but you know much better than me.
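One quick sanity check on the report above: the point score divided by the bench duration lands almost exactly on the reported FPS, which suggests the score is roughly the total frames rendered. That relationship is inferred from the numbers in this post, not from FurMark documentation:

```python
# Sanity-check the FurMark result above, assuming score ~= frames rendered.
# SCORE_POINTS and BENCH_SECONDS are copied from the pasted report.

SCORE_POINTS = 5512
BENCH_SECONDS = 60

avg_fps = SCORE_POINTS // BENCH_SECONDS  # integer average, as FurMark reports
```

That works out to the 91 FPS shown in the report, so the numbers are at least internally consistent.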

With that score, is it worth having the SLI? I don't know what else to get for my computer at the moment. I will buy the i7 2700k when it comes out whenever that is...but I request your opinion on the matter.

I want to thank you guys for helping me. I especially appreciate the link you gave me for GPU-Z! Normally I just google it and get it from the source, but that link has hundreds of tools to play with!!! Thanks!

I saved the profile and if I have trouble again I will follow the same procedure. I look forward to hearing your opinions and advice. I'll be turning 28 on Tuesday and with the SLI working I should have a good birthday. Now all I need is for Blizzard to stop playing with my emotions and release Diablo III. When I first started playing Diablo 1, my HDD was 10GB....and that was considered good. haha good old times. Goodnight gentlemen and ladies, and thank you.
October 24, 2011 5:02:26 AM

Best answer selected by RelentlessFury.
October 24, 2011 5:03:17 AM

Wamphryi, I wish I could pick two best answers... I'm sorry
October 24, 2011 5:20:27 AM

No worries Dude :-)
October 24, 2011 11:15:00 AM

This topic has been closed by Mousemonkey