I run a research core lab where users collect large sets of data from multiple instruments. Each user has data folders on the workstations attached to the instruments. Currently I have an OS X server and a RAID drive for archiving their data. Each user has an account on the server, and I have simple SMB scripts that mount each workstation's drive and copy the data over to the individual user folders on the server.
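For context, the copy scripts are of roughly this shape (a minimal sketch; the server name, share name, and paths here are hypothetical examples, not my actual setup):

```shell
#!/bin/sh
# Sketch of a per-workstation archive script.
# Mount the workstation's SMB share (macOS syntax), if not already mounted:
#   mkdir -p /Volumes/microscope1
#   mount_smbfs //labuser@microscope1.local/data /Volumes/microscope1

# Mirror a workstation data folder into a user's archive folder.
# rsync -a only transfers new or changed files, so re-runs are cheap.
sync_user_data() {
    src="$1"   # e.g. /Volumes/microscope1/jsmith
    dest="$2"  # e.g. /Volumes/archive/jsmith/microscope1
    mkdir -p "$dest"
    rsync -a "$src/" "$dest/"
}
```

A cron job (or launchd on OS X) runs one `sync_user_data` call per user per instrument.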
I need a new system: the current server is getting old, and I have >8 TB of data and growing. I am considering a NAS server, but I don't know if that is overkill, since really all I am doing is providing file storage that users can access securely. A web-based UI so they can access data remotely would be ideal, and an easy way to automate pushing or retrieving data at the server would make my job much easier.
I don't have IT support at my institution, so I have to find a solution I can manage myself. I can write simple scripts and get by, but a more user-friendly solution would be ideal.