EVGA (NVIDIA) nForce 680i SLI - RAID 0 (2 sets) problems. Anybody?

As in my title, I have an NVIDIA nForce 680i SLI board.

I have 2 WD Raptor 10k rpm 150 GB drives set up in RAID 0 (enabled RAID in the BIOS for the 2 SATA
channels they're on, then built the array in the "F10" RAID setup area). No problems.

The two drives now appear as a single 300 GB volume, which the system calls "NVIDIA STRIPE 279.47G".

My OS (Win 7 x64) and all my programs reside on it. No problems with anything.
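As a side note on that 279.47G figure: it's just the two Raptors' combined capacity expressed in binary gigabytes (GiB) instead of the decimal gigabytes on the drive label. A quick sanity check in Python, assuming the Raptor 150's actual capacity of 150,039,945,216 bytes (a figure from WD's spec sheet, not something reported by this board):

```python
# RAID 0 capacity check: why "300 GB" of Raptors shows up as 279.47G.
# Assumes each WD Raptor 150 holds 150,039,945,216 bytes (per WD's specs).
BYTES_PER_DRIVE = 150_039_945_216
GIB = 2**30  # binary gigabyte, which the BIOS/POST screen uses

total_bytes = 2 * BYTES_PER_DRIVE        # RAID 0 sums both members
total_gib = total_bytes / GIB
print(f"NVIDIA STRIPE {total_gib:.2f}G")  # -> NVIDIA STRIPE 279.47G
```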

Now, I installed 2 Seagate 7200.12 320 GB drives and set them up as a new (2nd)
RAID 0 array. All set up as before with no problems at all. (but see below)

The BIOS recognizes the 2 Raptors and 2 Seagates. Like I keep repeating, no problems at all.

Boot-up screen reports 2 healthy RAID 0 arrays.

I booted into Acronis Disk Director 11 with its bootable CD and initialized and formatted the 2
new Seagate drives, which it recognized and handled with no problems at all.

HOWEVER, when I boot into Win 7, there are no drives except my original WD Raptor array!

NOWHERE is my Seagate array to be found (Device Manager, the "Computer" hard drive listing, Disk Management,
the command prompt, nor any benchmark: AIDA64 Extreme, HD Tach, HD Tune, CrystalDiskInfo, CrystalDiskMark,
or any other program) at all.

Turning the computer completely off and letting it sit for a little while (per the advice of a poster
on another forum) and back on, still no joy.

I see here in the Tom's Hardware forums that the 680i will indeed utilize 2 RAID 0 arrays at once,
which rules out my initial theory that the board simply didn't support 2 arrays at the same time.

Does anyone have ANY idea at all what my problem may be?

Per one other post I saw, I may have to Ghost my present Win 7 and program setup, then try re-installing
Windows from scratch, after which it will supposedly detect and use the 2 arrays at once.

I seriously hate to have to do that since I have a TON of programs to re-install if I do (it takes about
2-3 days of work to get it all back).

Of course, I can't just try that and, if it works, Ghost my current setup back, since restoring the image
would probably bring back the same problem I have now.

I do have the most current NVIDIA drivers installed, Win 7 updates, etc.

Thanks in advance for any help,

BTW, here are the results I get with benchmarking, if it helps to figure anything out
(this is the one and only array the OS now sees, the WD Raptors)

Results are pretty similar, except Burst, which is probably due to some differing algorithm.

1) HD Tach v3 (Quick bench only, since Long Bench almost always gives very similar results)
Burst / Average read / Access time
229 MB/s / 145 MB/s / 8.3ms

2) HD Tune Pro v5
Burst / Average read / Access time / Minimum / Maximum
120 MB/s / 135 MB/s / 8.3ms / 102 MB/s / 168 MB/s

- Windows 7 64-bit Ultimate SP1

- EVGA (nVidia) 680i SLI (122-CK-NF68)

- Intel Core 2 Duo E6600 Conroe 2.4 GHz, OC to 3.2 GHz (extremely stable)

- 4 GB OCZ Raptor DDR2 PC2-6400 1066MHz 5-5-5-15 2T (will not OC at all)

- EVGA nVidia GeForce 8800 GTS 640 MB (OC Core/Shader/Memory to 650/1505/950. Can't OC much above this.
Oddly, if you start to get too carried away, the 1st thing you'll notice is ripping and "artifacting" in the Sidebar.
The whole Desktop starts ripping next if you keep it up, then a complete BSOD.)

- HP CDDVDW TS-H653R CD/DVD burner

- Creative SB Audigy 24-bit sound card (so-so, but better than onboard sound)

- Antec TruePower Trio 650 watt PS

- Zalman CNPS9700LED CPU Cooler (the huge one. Says "Minimized weight" but seems like it weighs 10 pounds.
Almost appears to take up about 1/3 of the inside of your case. However, it maintains (never above)
about 50-55C at 100% burn-in. This particular CPU has a Tjmax of 80C, but I've never gotten within
25-30C of that)

- Generic thermal grease from Radio Shack (Yep)

- Antec Nine Hundred Black Steel case (I love this baby. Has a huge 200mm fan in the top of it and 3x
120mm fans included)

  1. [SOLVED]

    Title of original post: EVGA (NVIDIA) nForce 680i SLI - RAID 0 (2 sets) problems. Anybody?

    Original problem posted:
    Have 1 RAID 0 array set up using 2 WD Raptors 10k rpm 150 GB drives. Is my C: drive and
    contains OS and all programs. Works fine.

    Tried to set up a 2nd RAID 0 striped array using 2 Seagate Barracuda 7200.12 320 GB drives.
    Major problems. Simply wouldn't work. Detected fine in BIOS but not in Windows 7.

    Thought it was the motherboard's problem (EVGA nVidia 680i SLI). The board is fine and that
    wasn't the problem.

    Got it fixed.

    The very short and sweet is that I went into Device Manager> Storage Controllers and right-clicked NVIDIA
    nForce RAID Controller and then Uninstall.

    Re-booted and the "Installing New Device Drivers" box came up and installed approx. 9 separate
    drivers, of which 6 said "Restart Required".

    Re-booted and all is wonderful.

    1st RAID 0 array (2x WD Raptor 10k rpm 150 GB drives, which is my C: drive and contains OS and all programs)
    and 2nd RAID 0 array (2x Seagate 7200.12 320 GB drives) both present and accounted for and working like a charm.

    That's how it finally worked but here's what I went through to get it that way:

    As everyone who owns an nVidia nForce 680i SLI board knows, the SATA connectors are labeled wrong
    in the manual.

    I went through a lot of painstaking and lengthy testing when I first got it 3 or 4 years ago to find out
    which was which, but I lost my notes and had to break out the manual (which I luckily still have).

    I had marked in there which connector was which, but my current testing showed that that wasn't right either!

    To start fresh and find out the information I needed, I simply unplugged all cables from all connectors on the
    board and moved a single CD/DVD drive cable from connector to connector, checking what POST (Power-On Self-Test;
    the white letters on a black background when you first turn your computer on, if anyone doesn't know) reported
    it was actually connected to versus what the manual said it was connected to.

    It took about three hours to check all the connectors, since I had to remove and then replace my RAM (which has
    large heat sinks on the modules) and my video card (which is about as big as a paperback book, but thicker
    and longer) every single time.

    The 2 connectors that are kind of what you could call "parallel to the board" (versus the 4 connectors facing
    straight toward you as you're looking into the case at your board) are VERY hard to get at and plug
    cables into.

    I imagine there are a few folks out there who got tired of fighting them and took the board out of the case,
    plugged in the 2 cables and replaced the board, that's how hard they are to get to.

    So, here are the REAL connector numbers and addresses:
    The 2 that are so hard to get at are #1 (A0) and #2 (A1), which in the manual are labeled #5 and #6.

    The other 4, from top to bottom as you're looking into your upright case:
    #3 (B0)
    #5 (B1)
    #6 (C0)
    #4 (C1)

    In the manual: (completely wrong)
    #1 (A0)
    #2 (A1)
    #3 (B0)
    #4 (C1)

    It was and is my strong belief that when setting up 2 drives in a RAID 0 array (or 4 drives in 2 RAID 0
    arrays, etc.), the addresses (versus the actual numbers of the connectors) are extremely important.

    For example, if you have 2 drives to set up a single array, they have to be plugged into adjoining addresses,
    not adjoining numbers. i.e. One drive plugged into (A0) and the other plugged into (A1), one plugged into (C0) and
    one in (C1), and so on.
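    The pairing rule above can be sketched as a quick lookup. The mapping below is the one I worked out for my own
    board (your revision may differ, so verify with the POST method described earlier); a valid RAID 0 pair just
    needs both connectors on the same channel letter:

```python
# Connector number -> channel address, as determined on my 680i board.
# (Assumption: your board revision matches -- verify via POST first.)
CONNECTOR_ADDRESS = {1: "A0", 2: "A1", 3: "B0", 5: "B1", 6: "C0", 4: "C1"}

def valid_raid0_pair(conn_a: int, conn_b: int) -> bool:
    """True if the two connectors sit on adjoining addresses (same channel)."""
    addr_a, addr_b = CONNECTOR_ADDRESS[conn_a], CONNECTOR_ADDRESS[conn_b]
    return addr_a[0] == addr_b[0] and addr_a != addr_b

print(valid_raid0_pair(1, 2))  # A0 + A1 -> True
print(valid_raid0_pair(6, 4))  # C0 + C1 -> True
print(valid_raid0_pair(3, 4))  # B0 + C1 -> False (adjoining numbers, wrong addresses)
```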

    Coincidentally, the only connectors whose numbers match their addresses are #1 and #2, which are (A0) and (A1).

    The other 4 connectors do not have addresses that match their connector numbers. Once again,
    I strongly advise using adjoining addresses, and not adjoining connector numbers.

    You have to take into account, however, that when you plug at least two cables in, the order they're
    listed during POST changes (though they are still on the board as I've listed): for example, with 1 drive
    plugged into connector #3 (B0) and 1 drive into #5 (B1), they will now likely be listed in the 2 bottom
    spaces of the POST list.

    This order of listing top to bottom on the POST screen has nothing to do with anything. Use the
    connector-to-address mapping I listed above to decide where to plug things in.

    I eventually ended up using #1 and #2 for my 2 WD Raptors and #6 (C0) and #4 (C1) for my Seagates. For any
    other single drives (I don't think this part matters), just use any empty connector (I used #5 (B1) for my
    SATA CD/DVD burner).

    First, go into the BIOS by pressing the Delete key at boot-up, go to the "Integrated Peripherals" page and then
    the "RAID Config" section, turn on the main "RAID" function (of course), then "Enable" SATA 1,
    2, 4, and 6 (in my case).

    (My #1 and #2 were already on, since I already had 1 array. Getting a 2nd array to work is the entire reason
    for these posts of mine.)

    When you save and exit, the POST will begin again and when it gets to the RAID setup part of the screen, you
    may have some flashing red letters proclaiming you have a RAID problem.

    Well, of course you do, because you just set 2 additional drives (for your 2nd array) in the BIOS as
    RAID-enabled but you never set the array itself up in the "RAID building" screens (what I call the "F10"
    section, because that's the key you have to press to get there).

    Just go through the prompts, making sure to use "Striped" as the type and "Optimal" for the sectors.
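    For anyone curious what "Striped" actually does: standard RAID 0 math (this is the generic technique, not the
    nForce firmware's actual code) maps each chunk of the logical volume round-robin across the member disks. A
    minimal sketch, assuming a hypothetical 64 KB stripe size:

```python
# Generic RAID 0 address mapping (illustrative; stripe size is an assumption).
STRIPE_SIZE = 64 * 1024  # hypothetical 64 KB stripe
N_DISKS = 2

def raid0_locate(offset: int) -> tuple[int, int]:
    """Map a logical byte offset to (disk index, byte offset on that disk)."""
    stripe = offset // STRIPE_SIZE   # which stripe the byte falls in
    disk = stripe % N_DISKS          # stripes alternate between the disks
    disk_stripe = stripe // N_DISKS  # how far down that disk's platters we are
    return disk, disk_stripe * STRIPE_SIZE + offset % STRIPE_SIZE

print(raid0_locate(0))       # (0, 0)      first stripe, disk 0
print(raid0_locate(65536))   # (1, 0)      second stripe, disk 1
print(raid0_locate(131072))  # (0, 65536)  third stripe, back on disk 0
```

    That alternation is where the sequential-read speedup comes from: big reads keep both drives busy at once.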

    Here's a recap:
    I plugged the cables into the proper connectors (taking into account the addresses, not the connector numbers),
    set them up as a new array and booted into Windows.

    I then went to Device Manager and deleted/uninstalled the drivers for the "NVIDIA nForce RAID Controller".
    (Under the "Storage controllers" section, not "Disk drives".)


    The "New Driver Installation" will pop up and install about 9 (for me anyway) "new" drivers. About 6 will say
    "Restart Required" beside them.

    Re-boot again.

    You will probably see an error message during/before the boot into Windows that says "MBR can't be recovered
    on (whatever your new drive letter is)".

    Ignore this, because you've never formatted the drive or anything yet, so of course the MBR can't be recovered.
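    What that error is really telling you: a valid MBR ends with the signature bytes 0x55 0xAA in the last two
    bytes of sector 0, and a brand-new array has nothing there yet. A small illustration of the check (this is the
    standard MBR convention, not the actual NVIDIA tool's logic):

```python
def has_mbr_signature(sector0: bytes) -> bool:
    """Check for the standard 0x55 0xAA boot signature at bytes 510-511."""
    return len(sector0) >= 512 and sector0[510:512] == b"\x55\xaa"

blank = bytes(512)                    # fresh, never-partitioned array
formatted = bytes(510) + b"\x55\xaa"  # sector 0 after partitioning
print(has_mbr_signature(blank))      # False -> "MBR can't be recovered"
print(has_mbr_signature(formatted))  # True
```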

    Now, a couple of problems.

    I started to go into "Disk Management" (right-click on the "Computer" icon on the Desktop and select "Manage",
    then click "Disk Management") to partition, change drive letters, format, etc. my newly working 2nd RAID array.

    When I did, the "Connecting to Virtual Disk Service" stayed on for over 5 minutes.

    I guess I could have waited to see if anything would finally happen, but I got impatient and I have in my arsenal
    plenty of software to partition, format, etc.

    I really favor Acronis Disk Director Suite since it's super easy to use and you basically don't even have to
    know much about what you're doing to use it.

    In the past, I've used the usual free ones (Web addresses for downloads below) such as Diskpart, Data Lifeguard
    Tools (comes with Western Digital drives), SeaTools (comes with new Seagate drives), Ranish Partition Manager,
    GParted, KDE Partition Manager, and 1 or 2 others that I can't think of right now.

    Anyway, I used the bootable Acronis CD and carried out my business and when I re-booted into Windows again, there
    were all my drives and partitions working like a dream.

    As a side note, when I booted into Acronis and first saw the new array/drive (shows up as a single drive
    (of course)), it had all kinds of bizarre "partitions" on it.

    It may just be me, Acronis, the earlier "Connecting to Virtual Disk Service" failure, something about my system
    or some other thing, I don't know and to tell the truth, I don't care. I simply used Acronis to delete them all
    and make my own.

    For example, it had something like 6 different partitions on it: 2x 298 GB partitions, 1x 879 GB
    one (huh?), and 2 or 3 ranging from 10 MB to 50 MB.

    Like I said, bizarre. Very easy to fix though, took about 60 seconds to get rid of them and get my partitions
    and formatting like I wanted them.


    Ok, I listed just about everything possible about how I got my nVidia 680i SLI board (which was never the problem,
    except the part where the manual told you to connect to the wrong connectors) and Windows 7 x64 coordinated and
    able to see and use 2 separate RAID 0 arrays, with a total of 4 hard drives, 2 committed to each array.

    If I left anything out and someone is going through the same hassles I did, just e-mail me at cetus thirty five
    at yagoo dot ???.

    (Silly anti-spamming technique :). Replace the "thirty five with the numbers "35" to get "cetus35" and the "dot" with
    a . and the "yagoo" with "yahoo" and, naturally, the "???" with "com". You get the picture.)

    As always,

    - Acronis Disk Director 11 Home trial (make a bootable CD with it and boot with it versus installing it in Windows
    and using it from there)
    (Click on Free Trial button)

    - Diskpart (a Windows utility) - boot with the Win 7 installation disc and then go to "Recovery" versus
    "Install now", click on "Command Prompt", type in "cd.." (cd dot dot), hit Enter, then type in "cd windows",
    hit Enter, then type "cd system32", hit Enter, then type in "diskpart", hit Enter, and type "diskpart /?"
    to get the options on how to use it.

    - Data Lifeguard Tools (Western Digital hard drive utility). (Unfortunately, Data Lifeguard Tools is now Acronis
    True Image WD Edition Software, which is a pretty good utility but doesn't help with partitioning.)

    - SeaTools (Seagate hard drive utility). (Sadly, this software has also given way to newer utilities, which do
    everything in the world except partitioning.)

    - Ranish Partition Manager - http://www.ranish.com/part

    - GParted (Gnome Partition Manager) probably the best free one out there.
    Under the "GParted Live CD/USB/HD/PXE Bootable Image" section download the "stable release" and burn the .iso
    image to a CD to make a bootable CD. Boot with it and take care of your partitioning needs.

    - KDE Partition Manager (arguably even better than GParted)
    Download the file listed at the right of "Looking for the latest version?" Burn .iso image to CD to make a
    bootable CD. Boot with it and carry on.

    Here are my new benchmarks:
    HD Tach v3
    Burst / Average read / Access time
    301 MB/s / 127 MB/s / 13ms

    HD Tune Pro v5
    Burst / Average read / Access time / Minimum / Maximum
    158 MB/s / 111 MB/s / 13ms / 27 MB/s / 160 MB/s

  2. Thanks for a very informative assessment of RAID on a fairly old MB w/ an old model chipset. So, it was the drivers? Glad you found the solution to the problem.
    I even copied and filed this.