I've done restores of ~50 TB (~3,500,000 files) with v5.0.3 under Ubuntu
10.04 LTS against SQLite3 databases here with no problems (taking minutes to
create the tree and to do a mark *). I'm running on a dual-CPU X5680 system
with 24 GB RAM, if that helps as a data point.
-----Original Message-----
From: ted [mailto:[email protected]]
Sent: Friday, June 24, 2011 03:29 PM
To: [email protected]
Subject: [Bacula-users] restore of large data set takes extremely long to
build dir tree,
Hi,
I have an issue when trying to restore a large data set consisting of about
6.5 TB and 35,000,000 files: the console takes an extremely long time to build
the directory tree, over an hour. After the tree is built I typed "mark *",
and this command ran for about 18 hours before I hit ctrl-c; the MySQL server
showed no activity and neither bconsole nor bacula-dir was using any CPU,
so I believe the process just petered out. I was wondering if anyone has been
successful in restoring large multi-TB data sets, either ones containing a
large number of individual files, or ones of 6 TB or larger consisting of a
small number of large files?
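For reference, the sequence I'm describing is the standard bconsole tree-based restore; a rough sketch of the session (client name and selection are placeholders, not my actual configuration) looks like this:

```
# Hypothetical bconsole session illustrating the steps above;
# "bigclient" is a placeholder client name.
* restore client=bigclient select current
# ... the director builds the directory tree here (the slow step) ...
$ mark *
# ... this is the command that ran ~18 hours before I gave up ...
$ done
```

The "mark *" step has to walk and flag every entry in the in-memory tree, which is where I suspect it stalls with 35M files.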
A little more system info:
bacula-dir Version: 5.0.3 (director and storage reside on separate systems)
OS FreeBSD 8.2-RELEASE #0
Thanks in advance for any shared results!
_______________________________________________
Bacula-users mailing list
[email protected]
https://lists.sourceforge.net/lists/listinfo/bacula-users