There are multiple layers of checks done to ensure that the data received is the same data that was sent, depending on the method you use to transfer it. For starters, if you're using a TCP-based protocol like FTP, TCP computes a checksum on each segment to verify the integrity of that segment's contents. On top of this, integrity can also be checked at the file level, either by the transfer tool itself or by comparing a checksum of the whole file on each end.
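As a minimal sketch of that file-level check (the function name and chunk size here are my own choices, not part of any transfer protocol), you can hash the file on both ends and compare the digests:

```python
import hashlib

def file_sha256(path, chunk_size=1 << 20):
    """Hash a file in chunks so large files aren't loaded into memory at once."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

# Run this on the sender and again on the receiver after the transfer;
# the copy is intact only if the two hex digests match.
```

If the digests differ, something was corrupted in transit (or on disk) and the file should be re-sent.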
Nothing is ever 100%; I've seen NTFS itself become corrupt even though the underlying hardware was perfectly good. A backup doesn't just mean another copy of the data, but also a way to roll back should you recognize a problem. For your peace of mind, try keeping both a week-old image and a current image. That way, if you ever identify a problem, you can roll a specific file back up to a week. If you're running Windows Server or something else that supports shadow copies, you can also use those to revert a file to a previous version without using a tremendous amount of disk space.
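The two-image rotation above can be sketched like this (the `current`/`week_old` directory names and the `rotate_backup` helper are hypothetical; real setups would use imaging tools or scheduled tasks rather than a plain copy):

```python
import shutil
from pathlib import Path

def rotate_backup(source: Path, backup_root: Path) -> None:
    """Keep two images of `source`: 'current' and 'week_old'.

    Run weekly: the previous current image slides back one slot,
    so you always have roughly a week-old restore point to fall back on.
    """
    current = backup_root / "current"
    week_old = backup_root / "week_old"
    if week_old.exists():
        shutil.rmtree(week_old)       # drop the oldest image
    if current.exists():
        current.rename(week_old)      # last run's image becomes the week-old copy
    shutil.copytree(source, current)  # take a fresh image of the source
```

The point of the second slot is that corruption often goes unnoticed for days; a single backup that mirrors the source would faithfully copy the corruption over the only good copy.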