Hi,

Richard Owlett wrote:
> Recently it was suggested that I read
>   https://www.gnu.org/software/xorriso/
> and
>   http://scdbackup.sourceforge.net/main_eng.html
> which led to exploring "afio archives" and "zisofs compression".

afio is a sequential archiver. I used it in scdbackup mainly because of
its gzip-per-file compression feature and its higher attribute fidelity
in comparison to mkisofs (which is "not a backup tool" according to its
programmer).
The sequential aspect means that retrieving a single file requires
reading through all archive data which precedes it, until the wanted
file data is finally found.

xorriso with zisofs compression and extended attribute support would
clearly be preferable, because the backup can be mounted by the Linux
kernel and zisofs gets decompressed by the kernel when files are read.
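
A rough sketch of such a run, in case it helps to see how the pieces
fit together (the disk tree, the image file path, and the ISO path
are placeholders, not a recommendation):

  xorriso -for_backup \
          -outdev /mnt/usb_stick/Machine_1_a.iso \
          -map /home/me/projects /projects \
          -zisofs level=9:block_size=64k \
          -set_filter_r --zisofs /projects --

The result can then be mounted read-only like any other ISO image, e.g.
  mount -o loop /mnt/usb_stick/Machine_1_a.iso /mnt/backup
and the kernel decompresses the zisofs files transparently.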

Of course there are lots of other backup systems standing ready to keep
copies of your work disks on other disks or USB sticks.
The answers to the questions below will encourage or discourage their
use, depending on their concepts and properties.

scdbackup will probably not be of much help, as its specialty is
copying disk-sized backups onto a pile of equally sized optical media.


> I'm thinking of creating multiple partitions with human readable partition
> labels such as "Machine_1" ... "Machine_N" and "Project_alpha" ...
> "Project_omega".
> Each partition will then have files named partitionlabel_a.iso ...
> partitionlabel_z.iso.

Yes. Independently of the backup tool you need a backup plan.
- What goes where ?
- Where to look for a particular file which needs to be restored from backup ?
- How to verify the completeness of the backup after its creation,
  and how to verify its valid readability after creation, during storage,
  and at restore time ? (For xorriso, see the sketch after this list.)
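
If xorriso ends up being the tool, the last two questions can be
covered by its own compare and checksum commands. A rough sketch
(the image path and the disk tree are placeholders):

  # Compare the original disk tree with what the image recorded:
  xorriso -for_backup \
          -indev /mnt/usb_stick/home_backups/2019_07_12.iso \
          -compare_r /home/me/projects /projects

  # Read the whole image and check its readability and, if recorded,
  # the MD5 sums:
  xorriso -indev /mnt/usb_stick/home_backups/2019_07_12.iso \
          -check_media --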

If you plan to work with a cigar box full of USB sticks, then you will
need a rugged and clearly readable labeling system for your eyes,
or a brush-like contraption of USB hubs to have them all online.


> Any suggested reading on pros, cons, howtos?

If it is to be done by xorriso, I'd plan for incremental backups from
original file trees which are about half as big as the backup medium or
the planned room for the backup data files.
Depending on how many changes happen to the originals, this will offer
room for a few dozen or several hundred daily updates. All read-only on
filesystem level and each day mountable with its complete file tree.
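
To mount the state of a particular day from such a multi-session image
file, one can look up the start block of that day's session in the
table of content and pass it to mount. A rough sketch (the image path
and the sector number NNNN are placeholders):

  # List the recorded sessions and their start block addresses:
  xorriso -indev /mnt/usb_stick/home_backups/2019_07_12.iso -toc

  # Mount the newest state, or with sbsector= the state of an older day:
  mount -o loop /mnt/usb_stick/home_backups/2019_07_12.iso /mnt/backup
  mount -o loop,sbsector=NNNN /mnt/usb_stick/home_backups/2019_07_12.iso /mnt/backup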

See the man xorriso example "Incremental backup of a few directory trees"
and replace
  -dev /dev/sr0
with something like
  -dev stdio:/dev/sdh
or
  -dev /mnt/usb_stick/home_backups/2019_07_12.iso
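
Adapted to an image file on a mounted USB stick, a daily run might then
look roughly like this. Only a sketch: the directory names, the volume
id pattern, and the mount point of the stick are placeholders, and the
full command sequence is in the man page example:

  xorriso -abort_on FATAL \
          -for_backup \
          -dev /mnt/usb_stick/home_backups/2019_07_12.iso \
          -volid BACKUP_"$(date '+%Y_%m_%d')" \
          -update_r /home/me/projects /projects \
          -update_r /home/me/personal_mail /personal_mail

Each such run adds a new session which records only the changes since
the previous one, while the older states stay mountable.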

About other backup systems I am probably not the right one to ask.
I derived my own strategies from times when a backup consisted of a pile
of QIC tapes with throughput counted in megabytes per minute.
I have gone my own way since I switched to CD in 1998.


Have a nice day :)

Thomas
