Generally, using git is a bad idea for backups (from what I've read): git stores its data uncompressed and inefficiently for this kind of use. If you are backing up things like configuration files or web pages that change a lot, sure, but for storing binary files I'd recommend against it, since binaries vary greatly from version to version (unlike text files) and you'd just accumulate tons of useless blobs in the repository. Programs like duplicity and rsync are great for backups, though.
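For what it's worth, here is a minimal sketch of the kind of thing I mean; the host names, user, and destination paths are just placeholders for illustration, and the exclude list will need adjusting for your box:

    # mirror the root fs to another host, skipping pseudo-filesystems
    # (-a = recursive + perms/times/owners, -A = ACLs, -X = xattrs, -H = hard links)
    rsync -aAXH --delete \
        --exclude=/proc --exclude=/sys --exclude=/dev --exclude=/run --exclude=/tmp \
        / root@backuphost:/backups/rootfs/

    # or with duplicity: encrypted, incremental archives pushed to a remote target
    duplicity --exclude /proc --exclude /sys --exclude /dev \
        / sftp://backupuser@backuphost//backups/rootfs

rsync gives you a plain browsable copy; duplicity gives you incremental, encrypted archives you can roll back through, which is closer to the "restore points" you're after.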
All in all, the drawbacks outweigh the benefits of using a code management tool to back up entire systems...

On Tue, 2011-11-01 at 23:16 +0200, Andrey Utkin wrote:
> Hi all! Long live the gentoo masters!
> I'd like to hear from anybody who uses (or has tried) git on production
> servers for saving points of possible restore. Please share your
> practices, like commit patterns, .gitignore contents, etc. I began
> using it for that a couple of days ago and have run into some issues.
> I control the whole root fs with git.
> The problematic part is a bunch of files that update frequently, but I
> am not familiar with them and I'm not sure the system will boot without
> them.
> Namely, these are files in /usr/lib64/portage/pym/
> The wtmp and utmp files also hurt - the box likely won't boot without
> them, but they shouldn't be under git control either, since they update
> often.
> Thus, restoring a backup requires not just the git repo, but also some
> tar of the base system?