Hi All,
Sorry for the length here.
I'm trying to move out of the experimental stage with Bacula and have
hit a snag: I cannot restore data backed up from remote clients.
(I can restore data backed up from the host system with no problem.)
Judging by the volume of data reported as moved into the backup
archive, data are being pulled in from the client machines
successfully, and the client files all show up in the catalog when
setting up a restore. I have three machines, each with a different
FileSet in bacula-dir.conf.
My test setup is a Linux Bacula master (Gentoo 2.2.16 r2), a Linux
remote client (FC4 2.6.17.2139 x86_64), and a Mac OS X (10.3.9, pre-ACL)
remote client. I'm running Bacula version 1.38.11, compiled from source,
with the bacula-fds on the client machines also compiled from source.
Restores of the host machine to itself work as expected. Restores
of the clients progress to the point of displaying messages
similar to:
16-Aug 14:25 master-dir: Start Restore Job Restore-master.2006-08-16_14.25.01
16-Aug 14:25 master-sd: Ready to read from volume "TOPPDataBack0002" on device "FileStorage" (/mnt/bigdisc0/TOPPDataBack)
All the references in those messages are correct. At that point,
however, a remote restore just sits forever, apparently waiting for
something. Eventually I get a message containing something like:
16-Aug 14:18 master-dir: Restore-whaleshark.2006-08-16_14.02.17 Fatal error: Network error with FD during Restore: ERR=No data available
16-Aug 14:18 master-dir: Restore-whaleshark.2006-08-16_14.02.17 Fatal error: No Job status returned from FD.
16-Aug 14:18 master-dir: Restore-whaleshark.2006-08-16_14.02.17 Error: Bacula 1.38.11 (28Jun06):
16-Aug-2006 14:18:40
JobId:                  149
Job:                    Restore-whaleshark.2006-08-16_14.02.17
Client:                 master-fd
Start time:             16-Aug-2006 14:02:19
End time:               16-Aug-2006 14:18:40
Files Expected:         627
Files Restored:         0
Bytes Restored:         0
Rate:                   0.0 KB/s
FD Errors:              0
FD termination status:  Error
SD termination status:  Canceled
Termination:            *** Restore Error ***
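One thing I did check: during a restore the client's bacula-fd has to
open a TCP connection to the storage daemon (port 9103 in my setup),
just as it does during a backup, so a "Network error with FD" can be
a firewall or routing issue on the client side even when backups work.
Here is the quick probe I used from each client to confirm the master's
director and SD ports are reachable (hostname and ports are the ones
from my config below; adjust for your site):

```python
import socket

def probe(host, port, timeout=5.0):
    """Return True if a TCP connection to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Example (run on each remote client; mola.stanford.edu is the master
# named in my Storage resource, 9101 = director, 9103 = storage daemon):
#   probe("mola.stanford.edu", 9101)
#   probe("mola.stanford.edu", 9103)
```

Both ports tested reachable from both clients here, so in my case the
basic connectivity at least looks fine.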
This is the case no matter how I adjust the restore using "mod",
including whether I direct the restore to the host machine or to the
remote client machines.
Any thoughts on what I may be messing up would be appreciated!
I've pasted a goodly portion of my bacula-dir.conf below for
anyone who may want to skim through it for any glaring errors.
Thanks!
Alan Swithenbank
Database Programmer
Hopkins Marine Station
Pacific Grove, California
[EMAIL PROTECTED]
Director { # define myself
Name = master-dir
DIRport = 9101 # where we listen for UA connections
QueryFile = "/etc/bacula/query.sql"
WorkingDirectory = "/var/bacula"
PidDirectory = "/var/run"
Maximum Concurrent Jobs = 1
Password = # Console password
Messages = Daemon
}
#====== Job Definitions:
JobDefs {
Name = "DefaultJob"
Type = Backup
Level = Incremental
Client = master-fd
FileSet = "Master Set"
Schedule = "QuarterlyCycle"
Storage = File
Messages = Standard
Pool = Default
Priority = 10
}
JobDefs {
Name = "MacrocephalusJob"
Type = Backup
Level = Incremental
Client = macrocephalus-fd
FileSet = "Macrocephalus Set"
Schedule = "QuarterlyCycle"
Storage = File
Messages = Standard
Pool = Default
Priority = 10
}
JobDefs {
Name = "WhalesharkJob"
Type = Backup
Level = Incremental
Client = whaleshark-fd
FileSet = "Whaleshark Set"
Schedule = "QuarterlyCycle"
Storage = File
Messages = Standard
Pool = Default
Priority = 10
}
#====== Jobs:
Job {
Name = "master"
JobDefs = "DefaultJob"
Write Bootstrap = "/var/bacula/master.bsr"
}
Job {
Name = "macrocephalus"
Client = macrocephalus-fd
JobDefs = "MacrocephalusJob"
Write Bootstrap = "/var/bacula/macrocephalus.bsr"
}
Job {
Name = "whaleshark"
Client = whaleshark-fd
JobDefs = "WhalesharkJob"
Write Bootstrap = "/var/bacula/whaleshark.bsr"
}
Job {
Name = "BackupCatalog"
JobDefs = "DefaultJob"
Level = Full
FileSet="Catalog"
Schedule = "QuarterlyCycleAfterBackup"
# This creates an ASCII copy of the catalog
RunBeforeJob = "/etc/bacula/make_catalog_backup bacula bacula"
# This deletes the copy of the catalog
RunAfterJob = "/etc/bacula/delete_catalog_backup"
Write Bootstrap = "/var/bacula/BackupCatalog.bsr"
Priority = 11 # run after main backup
}
Job {
Name = "Restore-master"
Type = Restore
Client=master-fd
FileSet="Master Set"
Storage = File
Pool = Default
Messages = Standard
Where = /mnt/bigdisc1/bacula-restores
}
Job {
Name = "Restore-macrocephalus"
Type = Restore
Client=macrocephalus-fd
Bootstrap = "/var/bacula/macrocephalus.bsr"
FileSet="Macrocephalus Set"
Storage = File
Pool = Default
Messages = Standard
Where = /tmp/bacula-restores
}
Job {
Name = "Restore-whaleshark"
Type = Restore
Client=whaleshark-fd
Bootstrap = "/var/bacula/whaleshark.bsr"
FileSet="Whaleshark Set"
Storage = File
Pool = Default
Messages = Standard
Where = /tmp/bacula-restores
}
#====== File Sets:
# List of local files to be backed up
FileSet {
Name = "Master Set"
Include {
Options {
Compression=GZIP
signature = MD5
}
File = /TOPP
File = /Data
File = /cpdata
File = /postgres
File = /argos
}
}
FileSet {
Name = "Macrocephalus Set"
Include {
Options {
Compression=GZIP
signature = MD5
}
File = /usr/sbin
}
}
FileSet {
Name = "Whaleshark Set"
Include {
Options {
Compression=GZIP
signature = MD5
}
File = /usr/sbin
}
}
FileSet {
Name = "Catalog"
Include {
Options {
Compression=GZIP
signature = MD5
}
File = /var/bacula/bacula.sql
}
}
#====== Schedules:
Schedule {
Name = "QuarterlyCycle"
Run = Full June October February 3rd sun at 23:05
Run = Differential June August October December February April 4th sun at 23:05
Run = Incremental sun-sat at 23:45
}
# This schedule does the catalog. It starts after the QuarterlyCycle
Schedule {
Name = "QuarterlyCycleAfterBackup"
Run = Full sun-sat at 23:55
}
#====== Clients:
Client {
Name = master-fd
Address = mola.stanford.edu
FDPort = 9102
Catalog = MyCatalog
Password = # password for FileDaemon
File Retention = 6 months # six months (for catalog, not archive)
Job Retention = 12 months # 12 months (for catalog, not archive)
AutoPrune = yes # Prune expired Jobs/Files
}
Client {
Name = macrocephalus-fd
Address = macrocephalus.stanford.edu
FDPort = 9102
Catalog = MyCatalog
Password = # password for FileDaemon 2
File Retention = 30 days # 30 days
Job Retention = 6 months # six months
AutoPrune = yes # Prune expired Jobs/Files
}
Client {
Name = whaleshark-fd
Address = whaleshark.stanford.edu
FDPort = 9102
Catalog = MyCatalog
Password = # password for FileDaemon 2
File Retention = 30 days # 30 days
Job Retention = 6 months # six months
AutoPrune = yes # Prune expired Jobs/Files
}
#====== Storage Devices:
Storage {
Name = File
# Do not use "localhost" here
Address = mola.stanford.edu # N.B. Use a fully qualified name here
SDPort = 9103
Password =
Device = FileStorage
Media Type = File
}
#====== Catalogs:
# Generic catalog service
Catalog {
Name = MyCatalog
dbname = bacula; user = bacula; password = "xservbackup;;;"
}
#====== Pools:
# Default pool definition
Pool {
Name = Default
Pool Type = Backup
Recycle = yes # Bacula can automatically recycle Volumes
AutoPrune = yes # Prune expired volumes
Volume Retention = 365 days # one year
Accept Any Volume = yes # write on any volume in the pool
LabelFormat = "TOPPDataBack" # base for automatic volume labeling
}
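Since the config is getting long, I also ran a throwaway sanity check
over it, just to rule out a typo'd name. This is emphatically not a
real Bacula config parser (it assumes the flat, one-level resource
blocks shown above and only looks at directives before the first
nested brace); `resources` and `check_restore_jobs` are just my own
helper names:

```python
import re

def resources(conf, kind):
    """Very naive extraction of 'Kind { ... }' blocks from bacula-dir.conf
    text. Stops at the first '}', so it only sees directives that appear
    before any nested block -- enough for the Name/Client/FileSet lines
    in the flat config above, but not a real Bacula parser."""
    out = []
    for m in re.finditer(kind + r'\s*\{(.*?)\}', conf, re.S):
        entry = {}
        for line in m.group(1).splitlines():
            line = line.split('#', 1)[0].strip()  # drop comments
            if '=' in line:
                key, _, val = line.partition('=')
                entry[key.strip().lower()] = val.strip().strip('"')
        out.append(entry)
    return out

def check_restore_jobs(conf):
    """List Restore jobs that reference an undefined Client or FileSet."""
    clients = {c.get('name') for c in resources(conf, 'Client')}
    filesets = {f.get('name') for f in resources(conf, 'FileSet')}
    problems = []
    for job in resources(conf, 'Job'):
        if job.get('type', '').lower() != 'restore':
            continue
        name = job.get('name', '?')
        if job.get('client') not in clients:
            problems.append('%s: unknown Client %s' % (name, job.get('client')))
        if job.get('fileset') not in filesets:
            problems.append('%s: unknown FileSet %s' % (name, job.get('fileset')))
    return problems
```

Running it over the paste above reports nothing, so at least the
Restore jobs' Client and FileSet names all resolve.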
_______________________________________________
Bacula-users mailing list
[email protected]
https://lists.sourceforge.net/lists/listinfo/bacula-users