Rob Hudson <[EMAIL PROTECTED]> wrote:
>I wrote a little perl script [1] that gets the names of the files from
>a tarball, then removes all the files and directories found inside the
>tarball.
>
>It comes in real handy when a tar archive dumps into the current
>directory and makes a big mess. Of course you can untar in a temp dir
>or use the 't' option to look inside first, but sometimes I put too
>much trust in where the tarballs are going to dump and get screwed.
>
>[1] http://www.cogit8.org/download/tarball-clean.txt
I'd rather you used Perl's built-in unlink() and rmdir() functions; using system() might end up calling the shell, since filenames in tarballs can contain shell metacharacters. They can also contain spaces, which will confuse your current script. It'll also be a great deal faster if you don't have to fork one or two new processes for each entry in the tarball.

You also need to change the if() after the assignment to $tarball so that it checks the length of @ARGV as well: "if (@ARGV && $ARGV[0] ne '')".

Finally, you should check the return codes of calls to the operating system: 'unlink $file or die "Couldn't unlink $file: $!";'.

Other than that, it looks like a handy script. Cheers!

--
Colin Watson                                        [EMAIL PROTECTED]
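
P.S. To make that concrete, here's a rough, untested sketch of the sort of thing I mean; the structure and names are mine, not taken from Rob's actual script:

#!/usr/bin/perl
# Hypothetical sketch: list a tarball's contents and remove them from
# the current directory using Perl built-ins instead of system().
use strict;
use warnings;

die "Usage: $0 tarball\n" unless @ARGV && $ARGV[0] ne '';
my $tarball = $ARGV[0];

# Listing the archive still needs an external tar, but the list form of
# open() bypasses the shell, so metacharacters in the name are harmless.
open my $tar, '-|', 'tar', '-tf', $tarball
    or die "Couldn't run tar: $!";
chomp(my @entries = <$tar>);
close $tar or die "tar -tf $tarball failed\n";

# Remove plain entries first, with unlink() rather than forking rm,
# and check every return code.
for my $file (grep { !m{/$} } @entries) {
    next unless -e $file || -l $file;
    unlink $file or warn "Couldn't unlink $file: $!\n";
}

# Then remove directories (tar lists them with a trailing slash),
# deepest first, with rmdir().
for my $dir (sort { length $b <=> length $a } grep { m{/$} } @entries) {
    (my $d = $dir) =~ s{/+$}{};
    next unless -d $d;
    rmdir $d or warn "Couldn't rmdir $d: $!\n";
}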