On Wed, Dec 25, 2019 at 11:07:22AM -0800, David Christensen wrote:
> > I was amazed that nobody yet considered tar.
The best use case for tar is creating a full backup to removable media (magnetic tapes are literally what it was designed for -- the "t" stands for tape).

The drawback of using tar is that it creates an *archive* of files -- that is, a single file (or byte stream) that contains a mashup of metadata and file contents. If you want to extract one file from this archive, you have to read the entire archive from the beginning until you find the file you're looking for. Remember, tar was designed for magnetic tapes, which are read sequentially. It provides no way for a reader to learn that file xyz is at byte offset 31337 and that it should skip ahead to that point if it only wants that one file.

Tar also provides no means to *update* the copy of a file contained within an existing archive. That is, you can't do any kind of incremental or differential backup with it -- not realistically. The closest you could come would be appending a series of binary patches to the end of the existing archive. (Appending to an archive only works if the archive is uncompressed, by the way.)

It's certainly not *wrong* to do backups using tar, but for a lot of people, it's not the strategy they want to employ. For most people, a backup using rsync to a removable *random access* medium (an external hard drive, or USB mass-storage device that acts like a hard drive) is a much better fit for their needs.
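A quick sketch of the append caveat above, using throwaway files in a temp directory (the filenames here are illustrative, not from the thread): appending with `tar -r` works on a plain archive, but fails when the archive has been compressed, because the gzip stream is no longer a tar archive that can be read to the end and extended.

```shell
# Work in a scratch directory so nothing real is touched.
d=$(mktemp -d) && cd "$d"
echo one > a.txt
echo two > b.txt

# Create an uncompressed archive, then append a second file to it.
tar -cf backup.tar a.txt
tar -rf backup.tar b.txt       # append succeeds on a plain archive
tar -tf backup.tar             # lists a.txt and b.txt

# A compressed archive cannot be appended to in place.
tar -czf backup.tar.gz a.txt
tar -rf backup.tar.gz b.txt 2>/dev/null \
  || echo "append to compressed archive fails"
```

To add or refresh files in a compressed archive you effectively have to decompress, append, and recompress the whole thing -- which is why tar alone makes a poor fit for incremental backups.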