I'm running four 1TB Seagate drives in RAID 10 using my motherboard's hardware RAID drivers. Everything worked fine for a few days, but now it reports a critical failure on disk four during startup. I've run SeaTools to find the bad disk, but all four pass the long test without issue.
I can still RMA the drives, but first I need to know which one is actually causing the problem. How can I find that out?
Also, if I send one in, can I still run the RAID without the fourth drive connected while I wait for the replacement?
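In case it's useful, here's a rough sketch of how I could poll each drive's SMART health individually with smartctl from smartmontools (this only works if the RAID controller exposes the member drives to the OS, and the device names below are just placeholders for my setup):

# Rough sketch: query each drive's SMART health self-assessment
# with smartctl (from smartmontools). Device names are assumed --
# adjust for your own system.
import subprocess

drives = ["/dev/sda", "/dev/sdb", "/dev/sdc", "/dev/sdd"]  # placeholder names

for drive in drives:
    # "smartctl -H" prints the drive's overall SMART health verdict.
    result = subprocess.run(["smartctl", "-H", drive],
                            capture_output=True, text=True)
    status = "PASSED" if "PASSED" in result.stdout else "CHECK OUTPUT"
    print(f"{drive}: {status}")

A drive can pass the long self-test and still be logging reallocated or pending sectors, so running "smartctl -A" against each drive to see the raw attribute counters might be worth a look too.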
These drives support RAID 0+1, not RAID 10.
I'm having the same problem on an ASUS Rampage Extreme motherboard in a high-end server I just built: Core i7, 12 GB RAM, Windows 7 64-bit.
Mwave built it and configured the four drives in RAID 10.
But I went to Seagate's online spec sheet, and it indicates these drives support RAID 0+1.