> I was wondering if you've considered an option for making file chunks, i.e.,
> if I have a 500 GB file that needs to go over the internet I can choose 1 GB
> chunks (1 GB after compression) - while I do use an ftp with the ability to
> keep going after interruption, it is still helpful to have breakpoints
> to restart from in case of failures it can't recover from.
>
Use "dd bs=1M count=2048 skip=..." or perhaps "split --bytes=2G"
to create the chunks, then compress each chunk with lzip. Process
the chunks in batches of 10 at a time to limit the temporary space
used.
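A minimal sketch of the split-and-reassemble step (chunk size shrunk
to 1 MiB and the file to 5 MiB so it runs quickly; for real use
substitute --bytes=2G, and compress each chunk with lzip as shown in
the comment — file names here are illustrative, not prescribed):

```shell
set -e
tmpdir=$(mktemp -d)

# create a small test file standing in for the 500 GB original
dd if=/dev/urandom of="$tmpdir/big.bin" bs=1M count=5 2>/dev/null

# cut it into fixed-size chunks: chunk.aa, chunk.ab, ...
split --bytes=1M "$tmpdir/big.bin" "$tmpdir/chunk."

# each chunk could now be compressed independently, e.g.:
#   for c in "$tmpdir"/chunk.*; do lzip "$c"; done
# (and decompressed with "lzip -d" on the receiving side)

# reassemble in lexical order and verify against the original
cat "$tmpdir"/chunk.* > "$tmpdir/rebuilt.bin"
if cmp -s "$tmpdir/big.bin" "$tmpdir/rebuilt.bin"; then
    match=yes
else
    match=no
fi

rm -rf "$tmpdir"
```

Because split names the pieces in lexical order, a plain "cat chunk.*"
on the receiving side restores the original byte-for-byte.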
Years of experience with zlib show that even using blocks
as small as 1 MiB increases the compressed size by only about 1%
or less. Only colossal jackpots ("deliberate" repetitions of
length >= 10,000) would make this untrue, and if you have many of
those then you should use some technique based on the original
pieces instead of "blind" compression of the concatenation.
--
_______________________________________________
Lzip-bug mailing list
[email protected]
https://lists.nongnu.org/mailman/listinfo/lzip-bug