RAID 10 Drive Failure, questions

quicksilverxii

Distinguished
Oct 24, 2010
I'm running four 1TB Seagate drives in RAID 10 using my mobo's onboard hardware RAID. For a few days everything worked fine, but now it says disk four has a critical failure during startup. I've run SeaTools to find the bad disk, but all four get through the long test w/o issue.

I can still RMA the drives, but the issue is knowing which one is causing the problem. How can I find this out?

Also, if I send in one, am I still able to run the RAID without that fourth drive connected while I wait for the replacement?
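
For what it's worth, here's the kind of check I was thinking of trying next: reading each disk's SMART data individually with smartmontools instead of relying on the long test alone. This is just a rough sketch with placeholder device names, and I'm not sure the onboard RAID controller even passes SMART through (it might need smartctl's -d option, or the controller switched to AHCI temporarily):

# Rough sketch: read SMART health/attributes for each member disk via smartmontools.
# Assumes smartmontools is installed and the drives are visible through the
# onboard RAID controller. Device names are placeholders -- on the Windows
# port, /dev/sda../dev/sdd map to \\.\PhysicalDrive0..3.
import subprocess

DRIVES = ["/dev/sda", "/dev/sdb", "/dev/sdc", "/dev/sdd"]

# Attributes that often flag a dying disk even when long self-tests pass.
SUSPECT_ATTRS = ("Reallocated_Sector_Ct",
                 "Current_Pending_Sector",
                 "Offline_Uncorrectable")

for dev in DRIVES:
    result = subprocess.run(["smartctl", "-H", "-A", dev],
                            capture_output=True, text=True)
    print(f"--- {dev} ---")
    for line in result.stdout.splitlines():
        if "overall-health" in line or any(a in line for a in SUSPECT_ATTRS):
            print(line.strip())

The idea is that a disk the controller keeps kicking out will usually show nonzero reallocated or pending sector counts even when the SeaTools long test comes back clean.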

Not sure how much of this is needed, but here are the mobo and HDD specs:
Motherboard: DFI LANParty DK 790FXB-M3H5 (http://www.newegg.com/Product/Product.aspx?Item=N82E16813136067)
HDD: 4x Seagate Barracuda 7200.12 ST31000528AS 1TB (http://www.newegg.com/Product/Product.aspx?Item=N82E16822148433)
OS: Windows 7 Home Premium x64
 

sammyson

Distinguished
Feb 8, 2011
These drives support RAID 0+1, not RAID 10 (1+0).
I'm having the same problem on an ASUS Rampage Extreme motherboard, a high-end server just built with a Core i7, 12 GB of RAM, and Windows 7 64-bit.
Mwave built it and configured the four drives in RAID 10.

I went to Seagate's online spec sheet, and it indicates these drives support RAID 0+1.