on Sun, Aug 12, 2001 at 02:46:42AM -0700, patrick q ([EMAIL PROTECTED]) wrote:
> Hi,
>
> I have a lot of important archives, up to 10,000 files per ~10 Meg
> tar.gz tarball, that I like to keep as safe as possible.
>
> I test the archives when I create them, have backups, and off-site
> backups of backups, but I am worried about possible file corruption,
> ie propagating possibly corrupt files through the backup rotation.
>
> Would it not be better to compress the files individually first and
> then tar them into an archive instead of the normal tar.gz operation,
> to have the best chance of recovering as many files as possible?
Explore an alternative file format. afio, cpio, and pax are all intended,
to varying degrees, to supersede tar (none has tar's ubiquity, however).
Among their benefits, compression can be applied to individual files
within the archive, and recovery from fuxnored archives is supposed to be
much better -- that is, if your archive file is botched in one spot, most
of the other data should be recoverable. The tar format doesn't support
this nearly as well. Example invocations are at the end of this message.

Your best bet is multiple, redundant backups, with full verification. I
build same into my own home backup system by:

- Keeping a fairly good backup cycle: every few days. Worst case, with
  typical practices, I lose a week's data, largely email.

- Doing a full verify of my backups. Errors are logged.

- Keeping a set of tapes in rotation, for redundancy.

More data: http://kmself.home.netcom.com/Linux/FAQs/backups.html
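For example, afio's -Z option gzips each file individually as it goes
into the archive, so damage in one region of the archive typically costs
only the files stored there. A minimal sketch, using placeholder paths
(check afio(1) on your system; option details vary between versions):

    # create an archive, compressing each file individually (per-file gzip)
    find /home/patrick/data -print | afio -o -Z /mnt/backup/data.afio

    # list the archive's contents
    afio -t -Z /mnt/backup/data.afio

    # extract, decompressing each file
    afio -i -Z /mnt/backup/data.afio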
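On the verification step: both afio and GNU tar can compare a finished
archive against the filesystem. A hedged sketch, again with placeholder
archive and log paths:

    # afio: verify archive contents against the filesystem; log errors
    afio -r -Z /mnt/backup/data.afio 2>> /var/log/backup-verify.log

    # GNU tar equivalent for an existing tar.gz archive
    tar --compare --gzip --file=data.tar.gz 2>> /var/log/backup-verify.log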
-- 
Karsten M. Self <kmself@ix.netcom.com>      http://kmself.home.netcom.com/
 What part of "Gestalt" don't you understand?       There is no K5 cabal
  http://gestalt-system.sourceforge.net/         http://www.kuro5hin.org
 Free Dmitry! Boycott Adobe! Repeal the DMCA!  http://www.freesklyarov.org
Geek for Hire                   http://kmself.home.netcom.com/resume.html