On 06/12/13 09:18, Carl Wilhelm Soderstrom wrote:
> On 12/05 06:22 , [email protected] wrote:
>> My question is: can I simply copy /var/lib/backuppc content to a new
>> storage?
> If you mean to do this, use 'dd'. Any file-level copying mechanism will be
> deathly slow due to having to follow all the hardlinks. It might take 4
> hours to do with dd what would take a week with tar or cp.
>
>> What would happen if some files from the old storage are not copied?
>> BackupPC can be automatically aware of this and do a full backup of those
>> files?
> I think BackupPC will handle this gracefully when the hash for the file is
> looked up in the pool. If the hash for the file is not found, a new entry
> will be created.
> I think this setting might also be relevant:
> $Conf{RsyncCsumCacheVerifyProb} = '0.01';
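The block-level copy Carl recommends could be sketched as below. The device names are placeholders (not from the original thread), and the daemon should be stopped so the pool is quiescent; the second half is a runnable demonstration of the same dd invocation on scratch files.

```shell
# Block-level copy of the BackupPC pool, per the advice above.
# /dev/OLD_POOL and /dev/NEW_POOL are placeholder device names --
# substitute your real partitions, and stop BackupPC first:
#
#   systemctl stop backuppc
#   dd if=/dev/OLD_POOL of=/dev/NEW_POOL bs=64M conv=noerror status=progress
#
# dd copies raw blocks, so the millions of hardlinks under cpool/ and pc/
# add no per-file overhead, unlike tar or cp -a.

# Runnable demonstration of the same copy-and-verify on scratch files:
src=$(mktemp); dst=$(mktemp)
printf 'pool data' > "$src"
dd if="$src" of="$dst" bs=1M 2>/dev/null
status=$(cmp -s "$src" "$dst" && echo verified || echo mismatch)
echo "copy $status"
rm -f "$src" "$dst"
```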
Presuming you use rsync with checksum caching. Without looking into too much
detail, I would probably change that to 1.0, run two full backups... just to
make myself feel better, and then change it back to 0.01.

Regards,
Adam

--
Adam Goryachev
Website Managers
www.websitemanagers.com.au
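For reference, the temporary change Adam suggests would look like this in the BackupPC config file (commonly config.pl; the exact path varies by distribution) — a sketch, not an official recipe:

```perl
# Temporarily verify every cached rsync checksum against the pool file,
# then run two full backups with this in effect:
$Conf{RsyncCsumCacheVerifyProb} = 1.0;

# Afterwards, restore the default spot-check rate (1% of files):
# $Conf{RsyncCsumCacheVerifyProb} = 0.01;
```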
