Had a question about data integrity. How common is data corruption these days? My concern is that I currently have a Samsung Spinpoint 1TB, very pleased with it, but I plan on backing up a ton of music and movies to it. My budget is very limited and I may supplement my storage with external drives.
My question is... when one transfers large amounts of data to an external source or drive, how do we know it's not changing or being corrupted, even on a figuratively microscopic level? I use dBpoweramp, and I want to make sure that once I rip my music and movies they stay bit-for-bit perfect even if I transfer them to external sources and THEN back again, as many times as I please. Thank you in advance.
There are multiple layers of checks done to ensure that the data received is the same as the data sent, depending on the method you use to transfer it. For starters, if you're using a TCP-based protocol like FTP, each packet carries a checksum that the receiving end verifies before accepting it. On top of that, some transfer tools and protocols can also verify a checksum of the whole file after the copy completes, which catches anything the per-packet checks might miss.
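You can do that whole-file check yourself, too: hash the original, hash the copy, and compare. Here's a minimal sketch in Python; the file names and contents are made up for the demo (in practice you'd point the paths at your ripped tracks on the internal and external drives):

```python
# Sketch: verify a copy is bit-for-bit identical to the original by
# comparing SHA-256 checksums. The demo file below is a stand-in for
# a real ripped track.
import hashlib
import os
import shutil
import tempfile

def sha256_of(path, chunk_size=1 << 20):
    """Hash a file in 1 MiB chunks so multi-GB rips don't fill RAM."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

tmp = tempfile.mkdtemp()
original = os.path.join(tmp, "track01.flac")
copy = os.path.join(tmp, "track01_copy.flac")
with open(original, "wb") as f:
    f.write(os.urandom(4096))       # stand-in for a ripped track
shutil.copyfile(original, copy)     # stand-in for the transfer

ok = sha256_of(original) == sha256_of(copy)
print("bit-perfect copy" if ok else "MISMATCH - recopy the file")
```

If the two hashes match, the copy is identical down to the last bit; if they don't, just copy the file again.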
Nothing is ever 100%. I've seen NTFS itself become corrupt even though the underlying hardware was 100% good. A backup doesn't just mean another copy of the data, but also a way to roll back should you recognize a problem. For your peace of mind, try keeping a week-old image alongside a current image; that way, if you ever identify a problem, you can roll a specific file back up to a week. If you're running Windows Server or something else that supports shadow copies, you can also use those to revert to a previous version of a file without using a tremendous amount of disk space.
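To actually *recognize* a problem before you rotate out the old image, you can keep a checksum manifest with each backup and rescan later. A minimal sketch (the directory, file names, and the simulated bit flip are all fabricated for the demo; in practice the root would be your backup drive and the manifest would be saved to disk):

```python
# Sketch: record a SHA-256 per file at backup time, then rescan later and
# diff against the saved manifest to spot files that changed silently.
import hashlib
import os
import tempfile

def manifest(root):
    """Map each file's path (relative to root) to its SHA-256 hash."""
    out = {}
    for dirpath, _, files in os.walk(root):
        for name in files:
            p = os.path.join(dirpath, name)
            with open(p, "rb") as f:
                out[os.path.relpath(p, root)] = hashlib.sha256(f.read()).hexdigest()
    return out

root = tempfile.mkdtemp()
with open(os.path.join(root, "a.txt"), "wb") as f:
    f.write(b"hello")
saved = manifest(root)                  # taken at backup time, kept on disk

with open(os.path.join(root, "a.txt"), "wb") as f:
    f.write(b"hellp")                   # simulate silent corruption / bit rot

changed = [p for p, h in manifest(root).items() if saved.get(p) != h]
print("changed files:", changed)
```

Any file that shows up in `changed` is one you'd restore from the week-old image before it, too, gets overwritten.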