
RAID 10 Drive Failure, questions

Last response: October 26, 2010 3:51:30 PM in Storage

I'm running four 1TB Seagate drives in RAID10 using my mobo's hardware RAID drivers. For a few days, everything worked fine, but now it's saying disk four has a critical failure during startup. I've run Seatools to find the bad disk, but all four go through the long test w/o issue.

I can still RMA the drives, but the issue is knowing which one is causing the problem. How can I find this out?

Also, if I send in one, am I still able to run the RAID without that fourth drive connected while I wait for the replacement?

Not sure how much of this is needed, but here's the mobo and hdd specs:
Motherboard: DFI LANParty DK 790FXB-M3H5
HDD: 4x Seagate Barracuda 7200.12 ST31000528AS 1TB
OS: Windows 7 Home Premium x64
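On the question of running degraded: in a four-drive RAID 10 the disks form two mirrored pairs that are striped together, so every block still has a surviving copy when any single drive is pulled. A minimal sketch of that idea (the pairing and block layout here are hypothetical; your controller's actual port-to-mirror mapping may differ):

```python
# Sketch of a 4-drive RAID 10 layout: two mirrored pairs, striped.
# Drive numbering is hypothetical; controllers map ports differently.
MIRRORS = [(0, 1), (2, 3)]  # mirror pairs

def drives_holding(block: int) -> tuple[int, int]:
    """Return the two drives that each hold a copy of this block."""
    return MIRRORS[block % len(MIRRORS)]  # blocks alternate across the pairs

def readable(block: int, failed: set[int]) -> bool:
    """A block survives if at least one of its two copies is on a live drive."""
    return any(d not in failed for d in drives_holding(block))

# Pull drive 3 (the "fourth" disk) and check the whole array:
failed = {3}
print(all(readable(b, failed) for b in range(1000)))  # True: every block readable
```

So the array should keep running (degraded) with the bad drive unplugged while the RMA replacement is in transit, though a second failure in the same mirror would lose the array.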


October 27, 2010 6:19:35 PM

Go into the RAID setup during boot and view the LD (logical drive) setup. It will tell you which SATA port the defective drive is on. Unhook that drive.

After that, I have no idea.
February 8, 2011 11:23:42 PM

These drives support RAID 0+1, not RAID 10.
I'm having the same problem on an ASUS Rampage Extreme motherboard, a high-end server just built with a Core i7, 12 GB of RAM, and Windows 7 64-bit.
Mwave built it and configured the four drives in RAID 10.

I went to Seagate's online spec sheet, and it indicates these drives support RAID 0+1.