On 19/05/2021 14:48, fk+bacula--- via Bacula-users wrote:
Hi there,
I am running a daily incremental backup with a FileSet of multiple
directories.
The daily incremental backup time fluctuates between 4 and 16 hours,
during which between 3 and 10 GB of data are collected.
To optimize the process, it would be nice to find out how much time
the FD has taken for each directory, and how many files and how much
data are sent to the SD.
Any useful hints on how to monitor this?
Thanks for any help, Frank
Fiddling with the code in the file daemon for this probably won't happen
unless there's a massive outcry for it, but you can get the list of
files by running the equivalent file search on the system at the same time.
Something like "find / -mtime -1 -o -ctime -1" would be a starting point.
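To get per-directory figures, that find could be run once per FileSet
directory, counting files and summing sizes. A rough sketch (the
directory list is a placeholder for your actual FileSet paths, and it
assumes GNU find for -printf):

```shell
#!/bin/sh
# For each backed-up directory, count files changed in the last day
# and sum their sizes -- a rough proxy for what the incremental sends.
# Replace the placeholder paths with the directories in your FileSet.
for dir in /home /var/www; do
    find "$dir" -type f \( -mtime -1 -o -ctime -1 \) -printf '%s\n' \
    | awk -v d="$dir" '{ n++; bytes += $1 }
        END { printf "%s: %d files, %.1f MB\n", d, n, bytes / 1048576 }'
done
```

Running this alongside the backup (or timing each iteration with
"time") would show which directories contribute most of the churn.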
Asking the people who are generating the data, "Why is it so?", might
also be helpful.
Cheers,
Gary B-)
_______________________________________________
Bacula-users mailing list
[email protected]
https://lists.sourceforge.net/lists/listinfo/bacula-users