I am after a good 4 port e-sata PCI board with the e-sata ports on the outside.
I am primarily after a good quality RAID 5 implementation. It's all about data recovery. I need to know when a HDD fails, and I need to know that the card can rebuild onto a new HDD with a minimum of fuss.
I have no idea how the quality differs between manufacturers. I would prefer not to have to pay enterprise costs for the board, so good cost for quality is key.
And another question on the RAID implementation. Is the RAID implementation on a hardware RAID card universal? If the card itself has a fault down the track, can I get any current RAID card that supports RAID5, and expect the 4 drives to work? Or do I have to get that exact card again?
Weird, I made a post before my last one, but it seems to have disappeared.
Anyway, I have pretty much settled on the enclosure and the HDDs, but just want to get a good hardware SATA RAID card. I would spend up to $200 USD. I see a lot for well under $100, but am not sure if these are true hardware RAID cards. It is really hard to tell from the advertising whether the RAID the card supports is hardware based, or a piece of software that ships with the card to implement RAID.
I really want easy recovery, and trustworthy recovery from HDD errors in the RAID5 Array.
PCIe 16x is most likely shared with your video, so keep that in mind when benchmarking. Also, any RAID controller you buy is 99% hardware RAID, otherwise it would just be a controller. RAID 5 is the way to go because the online spare gives you the ability to have seamless failover. However, your choice could be better; try doing some more research. There are some kits that come all in one for about the same price you'd spend after getting the controller separately. Also try Intel's RAID 5 NAS; it's not too bad, just no HTTPS support.
I am thinking about the NAS devices. I saw one on this forum that looked good. But the total price was a bit higher than this setup, and I have the server on all of the time anyway. One other major issue is that the majority of the work for this server and storage is for recording and playing TV. I also snip out interesting bits and compress them for keeping.
In order to do this, I have to make a complete pass through the video, to fix the time line, then I can rip out the part worth keeping, and compress it.
Given that this would all be over network traffic, and there is already the original traffic of the TV show itself, possibly times two at a single point in time, and maybe a time-shift play as well, that would all up be about 40 to 50 Mb/s. It should be fine on the 100 Mb/s setup I have, but I am not sure if I want to risk it. I could move it all to gigabit, but there is more cost.
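To sanity-check that estimate, here is a quick back-of-envelope sketch. The ~12 Mb/s per-stream figure is an assumption (roughly a standard-definition MPEG-2 broadcast capture), not a number from the posts:

```python
# Rough tally of the simultaneous streams described above.
# STREAM_MBPS is an assumed per-stream bitrate, not a measured figure.
STREAM_MBPS = 12

streams = {
    "recording (write)": 1,
    "original TV traffic, possibly times two": 2,
    "time-shift play": 1,
}

total_mbps = STREAM_MBPS * sum(streams.values())
print(f"estimated load: {total_mbps} Mb/s on a 100 Mb/s link")  # 48 Mb/s
```

That lands at ~48 Mb/s, in line with the 40 to 50 Mb/s estimate, but it leaves little headroom on a 100 Mb/s link once protocol overhead and any other traffic are counted.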
Out of curiosity, do you have a link to a NAS setup that you think is good quality?
I kinda do like the idea that the NAS setup should be set up ready to go, and should handle the RAID 5 recovery pretty easily. But on the down side, I am not sure how well that whole system will go if I do have troubles.
1) This review says you could potentially put in 750GB drives, but the Intel site for this device says up to 400GB only. I am not sure which is correct.
2) It seems like there are troubles dealing with large files over 800MB. Most of my files will be large encrypted backup files, so this is undesirable.
3) The IBM site does not mention RAID5, which I want to use.
4) Just doing some quick math: sustained read/write speeds of new HDDs are about 75MB per sec. Burst is a lot quicker, but I will forget about that. This is 600Mb per sec. I would have 4 drives in total, in RAID 5. So, theoretically, the total data speed of these drives should be 3 times the single drive speed, assuming no slowdown from the RAID 5 calculations. This is 1800Mb per sec. This would be roughly twice the real speed of any 1Gb network, so there is a lot of speed lost in the network. I would be doing a lot of reading/writing to this from the server itself, for recording television, so this would slow down the network with more traffic that would not happen if the drives were directly attached.
5) The article said that the NAS does not support HTTPS for login. This is not cool. I kinda have full control over my network, but I am not sure it is wise to have unencrypted logons passing around the network. But then again, my router configuration does not allow HTTPS either...
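The arithmetic in point 4 above can be restated as a short sketch; the 940 Mb/s figure for real-world gigabit throughput is my assumption, not from the posts:

```python
# Point 4's math: aggregate RAID 5 read speed vs. a gigabit link.
DRIVE_MB_PER_SEC = 75      # sustained single-drive rate quoted above
DRIVES = 4                 # 4-drive RAID 5 leaves 3 drives' worth of data

single_mbit = DRIVE_MB_PER_SEC * 8            # 600 Mb/s per drive
array_mbit = single_mbit * (DRIVES - 1)       # 1800 Mb/s across the array
gigabit_practical = 940                       # assumed real-world GigE rate

print(f"array: {array_mbit} Mb/s, wire: {gigabit_practical} Mb/s "
      f"({array_mbit / gigabit_practical:.1f}x)")
```

So the array can theoretically feed data about twice as fast as a gigabit link can carry it, which is the bottleneck the posts are weighing.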
If the OS could be updated and upgraded, and particularly if others started an alternative software and OS to run on the NAS device, that could be pretty cool.
I would say the biggest issue right now is the large file problem, and the biggest issue down the track is the slower data speed between my server and the NAS device.
Two separate gigabit connections are for failover (team configs) or bridging connections. You can also use them for content redirection as well.
1) I'm not 100% sure, but I've seen other forums that have the NAS with the 750GB drives in them. I think it may be a driver issue. Just check with (e-mail) Intel to settle the doubt officially.
2) I noticed this as well, but I think it was due to packet loss and collisions that were overlooked in the testing. I believe it was a networking problem.
4) Your network would have to be a full gigabit, wide open. Your theoretical calculations are correct, but not as far as going out over the network is concerned; those rates would be internal. Unless you find a NAS with fiber, this is an industry-wide bottleneck.
5) I hope you have full control, lol. But yes, I did notice that, and would suggest upgrading your router and setting up a free VPN that sits in front of your firewalled server. You can use SSL or SSH as well.
I would not worry about the speed; down the line it's your network you'll be upgrading. Fiber will be as cheap as Cat6 someday (I hope). Try looking at Thecus's products; they have one high-end NAS that has HTTPS support as well as great backing by the Tom's community.
Don't do anything you aren't comfortable with. Ask around; I have seen these NAS units at many client sites as temp storage while upgrades are being rolled out. In most cases they wind up using the extra terabyte for back-office development.
Thanks CcashCow for all your help. I might stick with the 4 HDDs directly connected to my server for now. It would have been really nice to be able to separate them via the network, so they did not have to be co-located, but it is really not going to work for what I need at those slow speeds. Having said that, I am considering rethinking my plan, and leaving the current 500GB in the server for heavy work, and using the NAS as an ultimate backup, which could work really nicely. The backups would be slow, but I could perform them all overnight.
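As a rough check on the overnight-backup idea, here is a sketch of full-copy times over each link speed; the 80% link efficiency and using the 500GB drive as the dataset size are my assumptions:

```python
# Time to copy a full dataset over a network link of a given rated speed.
DATASET_GB = 500               # the 500GB drive kept in the server

def hours_to_copy(gigabytes, link_mbps, efficiency=0.8):
    """Assumes the link sustains `efficiency` of its rated rate."""
    megabits = gigabytes * 8 * 1000        # GB -> Mb (decimal units)
    return megabits / (link_mbps * efficiency) / 3600

print(f"100 Mb/s: {hours_to_copy(DATASET_GB, 100):.1f} h")   # ~13.9 h
print(f"1 Gb/s:   {hours_to_copy(DATASET_GB, 1000):.1f} h")  # ~1.4 h
```

A full 500GB pass would not quite fit in one night at 100 Mb/s, so the plan would lean on incremental backups (or the gigabit upgrade) rather than full copies every night.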
What I meant about control over the network is that I share it with my brother, and therefore have "theoretical" control over it :-)
And thank you SupremeLaw for your info. I have looked at heaps of options for SATA cards, including internal SATA, eSATA, and the multilane options. I like multilane, but the cost does go up. Also, I have only one full size card slot free and one half height PCI Express slot free. So I do not have room to fit a SATA RAID card plus a blanking plate to externalise the SATA cable(s). For that reason, I am looking for an eSATA option in a single card.
What I really need to figure out, is what level of hardware RAID each card offers. Some people are saying you cannot get real hardware RAID support for less than $300 to $400 USD. This is a little steep.
I am assuming that the Intel NAS device uses software RAID, implemented in Linux on its 400MHz chip.