Unfortunately I indeed can't send you the directory names, and I don't
know of a way to create an anonymized directory listing. But attached is
a script which should create a similar structure: 420 directories in
total (nested over multiple levels), with 280,000 files in the
lowest-level directories. The files are only 13 bytes each (1.1 GB in
total), while the original files were around 2 KB each, I think... But I
guess the most important factor is the number of files and directories.
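For illustration, the effect of such a script could be sketched in Python (a minimal sketch only; the parameter values and file/directory names below are placeholders, not taken from the attached Perl script -- scale `levels`, `dirs_per_level`, and `files_per_leaf` up to approximate the ~420 directories and ~280,000 files described above):

```python
import os

def create_tree(root, levels=2, dirs_per_level=4, files_per_leaf=10,
                file_size=13):
    """Create a nested directory tree with many tiny files.

    Returns the list of lowest-level ("leaf") directories, which are
    the only ones that receive files.
    """
    payload = b"x" * file_size  # 13-byte files, as in the test setup
    leaves = [root]
    for _ in range(levels):
        next_leaves = []
        for parent in leaves:
            for i in range(dirs_per_level):
                d = os.path.join(parent, "dir%02d" % i)
                os.makedirs(d, exist_ok=True)
                next_leaves.append(d)
        leaves = next_leaves
    for leaf in leaves:
        for i in range(files_per_leaf):
            with open(os.path.join(leaf, "file%05d" % i), "wb") as f:
                f.write(payload)
    return leaves
```

With the defaults this makes a small tree; the point is only that directory count and file count, not file size, dominate the total.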

I will run sbackup on this directory hierarchy tonight to see how long
it takes and whether it faithfully reproduces the problem.

Regarding profiling: isn't there some very simple profiling module to
start with? Just to see roughly where the time is spent? Maybe even some
fine-grained debug messages (more like trace messages) with timestamps
would be useful.
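Since sbackup is written in Python, the standard library's cProfile module might be enough for that first rough pass. A sketch (the `scan` function here is only a stand-in for whatever sbackup entry point is actually being timed):

```python
import cProfile
import io
import pstats

def scan(n):
    # Stand-in workload; in practice you would profile the real
    # sbackup backup run instead of this dummy loop.
    return sum(len(str(i)) for i in range(n))

profiler = cProfile.Profile()
profiler.enable()
scan(100000)
profiler.disable()

# Print the ten functions with the highest cumulative time.
out = io.StringIO()
stats = pstats.Stats(profiler, stream=out)
stats.sort_stats("cumulative").print_stats(10)
print(out.getvalue())
```

Alternatively, running `python -m cProfile -s cumulative script.py` profiles a whole script without any code changes.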


** Attachment added: "Perl script to create a huge directory structure"
   http://librarian.launchpad.net/7400684/createdirs.pl

-- 
sbackup is very slow when working on many files
https://bugs.launchpad.net/bugs/102577
You received this bug notification because you are a member of Ubuntu
Bugs, which is the bug contact for Ubuntu.

-- 
ubuntu-bugs mailing list
ubuntu-bugs@lists.ubuntu.com
https://lists.ubuntu.com/mailman/listinfo/ubuntu-bugs
