Modern filesystems are good at detecting and repairing corruption. They
need to be, because regularly flipped or lost bits would cause serious
problems.

I've never understood why we worry only about corruption in the information
objects we manage. After all, corruption is also possible in the OS, in any
hardware or software component involved in computing the checksums, and in
the checksums themselves, and some of those failures would have far more
profound consequences than a flipped bit in an image, document, etc.

Checksumming with external utilities strikes me as more resource intensive
than necessary in terms of I/O, CPU, and memory. Better to go with ZFS or a
service like Amazon, where integrity is baked in at a lower level, unless
the real concern is detecting modification by humans or software, in which
case you also need a recovery plan for getting an uncorrupted version back.
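
For concreteness, here's a minimal sketch of the kind of external
checksumming I mean, assuming Python and SHA-256; the paths, function
names, and manifest layout are illustrative only, and existing tools like
hashdeep or BagIt cover the same ground with far more care:

#!/usr/bin/env python3
# Minimal fixity sketch: build a SHA-256 manifest of one tree and
# check a second tree (or the same tree later) against it.
import hashlib
import os
import sys

CHUNK = 1024 * 1024  # read files in 1 MiB chunks to keep memory flat


def sha256_of(path):
    """Return the hex SHA-256 digest of a file, streamed in chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for block in iter(lambda: f.read(CHUNK), b""):
            h.update(block)
    return h.hexdigest()


def build_manifest(root):
    """Map relative path -> digest for every regular file under root."""
    manifest = {}
    for dirpath, _dirs, files in os.walk(root):
        for name in files:
            full = os.path.join(dirpath, name)
            manifest[os.path.relpath(full, root)] = sha256_of(full)
    return manifest


def verify(root, manifest):
    """Report files that are missing or whose digest no longer matches."""
    problems = []
    for rel, digest in manifest.items():
        full = os.path.join(root, rel)
        if not os.path.exists(full):
            problems.append(("missing", rel))
        elif sha256_of(full) != digest:
            problems.append(("changed", rel))
    return problems


if __name__ == "__main__":
    source, copy = sys.argv[1], sys.argv[2]
    for kind, rel in verify(copy, build_manifest(source)):
        print(kind, rel)

Running it with the source drive and the copy as the two arguments prints
anything missing or changed. If you also write the manifest out to a file,
rerunning verify() against it later is the same idea as the periodic fixity
checks David is asking about below.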

kyle


On Fri, Mar 31, 2017 at 7:36 AM, Bigwood, David <[log in to unmask]>
wrote:

> Open to suggestions for a fingerprinting tool. I think that's what they
> are called.
>
> We are copying NASA imagery to make available to the public. They send us
> a hard drive with about 10,000 images and we copy them to a drive then
> return the original.
>
> I'd like a tool that would compare the two drives to make sure everything
> was copied correctly. Then something that would be able to tell if a file
> had degraded, gone missing, or been changed over time.
>
> Thanks,
> David Bigwood
> [log in to unmask]
> Lunar and Planetary Institute
>