On Sun, Sep 22, 2002 at 12:29:08PM +0100, Robin Cragg wrote:
>
> $MAXSIZE = 5000000;
> $size = 0;
> @Zip_Now = ();
> $a = 1;
>
> foreach (@Files_to_zip) {
>     $size += (stat $_)[7];
>     if ($size > $MAXSIZE) {
>         # system, not exec, so the script keeps running after each chunk
>         system "tar -vr -T @Zip_Now -f $tar_file$a";
>         # then burn this to CD
>         @Zip_Now = ();
>         $size = (stat $_)[7];   # start counting the next chunk
>         $a++;
>     }
>     push (@Zip_Now, $_);
> }
> system "tar -vr -T @Zip_Now -f $tar_file$a";
>
>
>
Thanks. I also made a mistake when I typed "tar -vr -T", since the -r
means add to the existing tar file, which in this case we don't want to
do.
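For each chunk, what I actually want is probably closer to this (untested, and just a sketch reusing the variables from the snippet above): -c creates a fresh archive for the chunk instead of appending, and the list form of system passes the file names straight to tar instead of through -T, so odd file names don't trip up the shell.

# create a new archive per chunk (-c, not -r); list form of system avoids shell quoting
my $fail = system "tar", "-cvf", "$tar_file$a", @Zip_Now;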
Your solution is so simple, I can't believe I didn't think of it myself.
I was thinking of something really complicated.
Thanks again!
Paul
>
> Hi Paul,
>
> I think this will do the trick...
>
>
> $MAXSIZE = 5000000;
> $size = 0;
> @Zip_Now = ();
>
> foreach (@Files_to_zip) {
>     $size += (stat $_)[7];
>     if ($size > $MAXSIZE) {
>         # system, not exec, so the loop carries on after each chunk
>         system "tar -vr -T @Zip_Now -f $tar_file";
>         # then burn this to CD
>         @Zip_Now = ();
>         $size = (stat $_)[7];   # this file starts the next chunk
>         push (@Zip_Now, $_);
>     } else {
>         push (@Zip_Now, $_);
>     }
> }
> if (scalar @Zip_Now) {
>     system "tar -vr -T @Zip_Now -f $tar_file";
> }
>
>
>
> If you want to background the zip or the burn, then just fork your script.
> If you use IDE disks though, it may not be a great idea to run a burn
> and a zip at the same time, as you are likely to get buffer underruns in
> your CD burn.
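(For the fork Robin mentions, something like the following is roughly what I will try. It is only a sketch, and $burn_cmd is a placeholder for whatever command does the actual burn.)

my $pid = fork();   # fork a child to run the burn while the parent keeps tarring
die "fork failed: $!" unless defined $pid;
if ($pid == 0) {
    exec $burn_cmd;          # child: replace itself with the burn command
    die "exec failed: $!";   # reached only if exec itself fails
}
# parent carries on here; before reusing the image, wait for the burn to finish:
waitpid($pid, 0);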
>
>
> R
>
> -----Original Message-----
> From: Paul Tremblay [mailto:[EMAIL PROTECTED]]
> Sent: 21 September 2002 22:48
> To: [EMAIL PROTECTED]
> Subject: background process
>
>
> I am writing a script in Perl to back up my system, and would like to run
> a background process to burn each CD as it is created.
>
> Right now I use this command
>
> my $fail=system "tar -vr -T $files_to_back -f $tar_file";
>
> to create a tar file. If the tar file is bigger than 650 MB, then I will
> have to use split to split it into chunks. Needless to say, if backing
> up my whole hard drive, I will have many chunks. In fact, if my hard drive
> contains 10 GB of info, I would need 10 GB of extra space just to run my
> script.
>
> So I want to create a background process. (I believe this is what I have
> to do, anyway.) I want tar to create 650 MB of data, and then stop while I
> create a disk image, burn the image, and then remove the image.
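(To be concrete, the burn step I had in mind looks roughly like this. It is only a sketch; the mkisofs/cdrecord options, the dev= address, and the image path are guesses for my particular setup.)

# build an ISO image holding the chunk, burn it, then delete the image
my $image = "/tmp/backup.iso";
system("mkisofs", "-r", "-J", "-o", $image, $tar_file) == 0 or die "mkisofs failed";
system("cdrecord", "dev=0,0,0", "speed=4", $image)     == 0 or die "cdrecord failed";
unlink $image or warn "could not remove $image: $!";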
>
> I have looked in *Perl Cookbook,* but I couldn't really find any way to
> do this.
>
> I believe doing what I want is possible. There is a relatively simple
> script called backuponcd that does just this. But the script is written
> as a bash script, and I can't quite figure out what is going on.
>
> Thanks
>
> Paul
>
> PS I feel like I am re-inventing the wheel. I am sure there are a
> million good backup scripts and programs out there. But I either
> can't get them to run, or they don't quite offer the ability to
> customize that I want.
>
> I would like the ability to append new files to old ones.
> For example, if I am working on a document called "my_story.txt", I will
> edit this story every day for several weeks. I want each version to be
> on a CD--in other words, there would be 21 copies of this story if I
> edited every day for three weeks. After all, I might do some bad editing
> on day 18 and really wish that I had a copy of the story that I did on
> day 15.
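(One way to keep every day's copy would be to stamp the archive name with the date, so each day's tar file is distinct. Just a sketch, assuming POSIX strftime and a name pattern I made up.)

use POSIX qw(strftime);
my $stamp    = strftime("%Y-%m-%d", localtime);   # e.g. 2002-09-22
my $tar_file = "backup-$stamp.tar";               # a new archive name per day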
>
>
> Anyone know of a *well-documented* perl script that does what I want?
>
> --
>
> ************************
> *Paul Tremblay *
> *[EMAIL PROTECTED]*
> ************************
>
--
************************
*Paul Tremblay *
*[EMAIL PROTECTED]*
************************
--
To unsubscribe, e-mail: [EMAIL PROTECTED]
For additional commands, e-mail: [EMAIL PROTECTED]