On Wed, 23 May 2007, Marc wrote:
> I've searched the archives, but I cannot find any relevant information.
> Therefore my question: is it possible to do a database-by-database dump and
> back up these dump files? Because of the database sizes, it would be very
> nice if this could be done database by database. I mean first do the dump
> of database 1, move it to bacula, remove the dump, dump database 2, move it
> to bacula, etc...
I would do this in a shell script, instead of just trying to schedule it
within bacula. Something like:
---BEGIN---
#! /bin/bash
# -N suppresses the column-name header on the 'show databases' output.
for THING in $(mysql -u"$USER" -p"$PASSWORD" -N -e 'show databases;')
do
    mysqldump -u"$USER" -p"$PASSWORD" -l "$THING" > /path/to/temp/"$THING".sql
    # 'wait' keeps bconsole open until the job finishes, so the dump
    # isn't deleted out from under the running backup.
    /etc/bacula/bconsole -c /etc/bacula/bconsole.conf << END_OF_DATA
run job=database yes
wait
END_OF_DATA
    rm -f /path/to/temp/"$THING".sql
done
---END---
That's quick and dirty, and you'd certainly want to test it, but I think
it illustrates the point. You'd have to create a job called 'database',
with a fileset that points to the directory you're dumping the files to,
and you might want to think about piping each dump through gzip to save
space. If there are databases that don't need to be backed up (the
'mysql' and 'lost+found' dbs come to mind), you'll need to filter the
'show databases' output; the -N (--skip-column-names) flag to mysql takes
care of suppressing the column-name header.
I have a single 650GB MySQL 4.1 db that I do a full backup of every two
weeks. Takes about 19 hours to dump all of the tables and gzip them, and
the backups end up around 36GB.
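Combining the gzip and filtering ideas above, the dump loop could look
like this sketch -- the temp path and the skip list are assumptions you'd
adjust for your setup:

```shell
#! /bin/bash
# Databases we never want to dump; extend the pattern as needed.
SKIP='^(information_schema|mysql|lost\+found)$'

# -N suppresses the column-name header on 'show databases' output.
for DB in $(mysql -u"$USER" -p"$PASSWORD" -N -e 'show databases;' | grep -Ev "$SKIP")
do
    # gzip on the fly so the uncompressed .sql never touches disk.
    mysqldump -u"$USER" -p"$PASSWORD" -l "$DB" | gzip > /path/to/temp/"$DB".sql.gz
done
```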
-- D
_______________________________________________
Bacula-users mailing list
[email protected]
https://lists.sourceforge.net/lists/listinfo/bacula-users