I run a research core lab whose users collect large sets of data from multiple instruments. Each user has data folders on the workstations attached to the instruments. I currently have an OS X server and a RAID drive for archiving their data. Each user has an account on the server, and I have simple SMB scripts that mount the drive of each workstation and copy the data over to the individual user folders on the server.
I need a new system, since the current server is getting old and I have >8 TB of data and growing. I am considering a NAS server, but I don't know if that is overkill, since really all I am doing is providing file storage that users can access securely, each seeing only their own data. A web-based UI so they can access data remotely would be ideal, and an easy way to automate pushing data to (or retrieving it from) the server would make my job much easier.
I don't have IT support at my institution, so I need a solution I can manage myself. I can write simple scripts and get by, but a more user-friendly solution would be ideal.
Currently, users connect to the server only to copy their data off to local machines for analysis. Data is sent to the server each night by a cron script that mounts each machine, copies the data, unmounts the drive, and moves on to the next one.
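For context, the nightly job is roughly the following sketch. The hostnames, share name, account, and paths are placeholders, and `DRY_RUN=1` just prints each command instead of running it:

```shell
#!/bin/sh
# Sketch of the nightly cron job: mount each workstation's share,
# copy new data, unmount, move to the next. All names are placeholders.
DRY_RUN=${DRY_RUN:-1}           # set DRY_RUN=0 to actually mount and copy

run() {                         # echo instead of execute in dry-run mode
    if [ "$DRY_RUN" -eq 1 ]; then echo "WOULD RUN: $*"; else "$@"; fi
}

ARCHIVE=/Volumes/raid/userdata  # per-user folders live under here
MNT=/tmp/instrument_mnt         # temporary mount point

nightly_pull() {
    mkdir -p "$MNT"
    for host in "$@"; do
        run mount_smbfs "//archiver@${host}/Data" "$MNT"
        # rsync copies only new or changed files, so re-runs are cheap
        run rsync -a "$MNT/" "$ARCHIVE/$host/"
        run umount "$MNT"
    done
}

nightly_pull instrument01 instrument02   # placeholder workstation names
```

On Linux the mount step would be `mount -t cifs` rather than OS X's `mount_smbfs`, but the shape is the same: one loop from cron replaces per-workstation copy scripts.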
Ideally, the data would be uploaded to each user's folder on the server when the user logs out of the application on the instrument, so that it is immediately accessible via FTP or a web UI; that would make things more seamless. I have >400 user accounts, each of which is basically a data folder that only its owner has permission to; no one else who logs in can access it. My instruments generate 10-50 GB a day, and they all run Windows XP or Windows 7.
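The owner-only permission model above is just ownership plus a 0700 mode on the server side, so provisioning 400+ accounts can be scripted. A minimal sketch with placeholder names (the `chown` is commented out because it needs root):

```shell
#!/bin/sh
# Create a data folder that only its owning user can enter or read.
# Base path and username are placeholders.
make_user_folder() {
    base=$1
    user=$2
    mkdir -p "$base/$user"
    # chown "$user" "$base/$user"  # requires root; owner must be that user
    chmod 700 "$base/$user"        # rwx for owner only; other logins see nothing
}

make_user_folder /tmp/archive_demo alice
```

Run in a loop over a list of usernames, this sets up the whole tree; any NAS that exposes home directories over SMB enforces the same per-user isolation.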