Hi. Re: your 10000 files.
Here is a quick answer in two parts.

Part 1 - for an individual file, you can search and replace text with:

cat <filename> | perl -e 'while (<>) { $_ =~ s|http://209\.155\.163\.97/smile\.gif|http://www.cproda.com/smile.gif|g; print $_; }'

(All on one line. Note the s|...|...| delimiters - with the usual s/.../.../ form every slash in the URLs would have to be escaped.)

Part 2 - now you need to pump all your files through it. If you really have 10000 files then a foreach or similar loop won't work, because the expanded list of filenames will be too long for one command line. (If you were exaggerating, and only have 100 or so, then there are shell commands that will loop over all the files. Take a look at the foreach command under csh or tcsh. There is a similar command under bash - a plain for loop; there is a sketch in the P.S. below. Very handy stuff.)

When I have that many files to process, I use awk to write a script that will then process them. Like so:

First make a place to stick all the new files:

mkdir newfiles

Now write a little script...

ls | awk '{printf("cat %s | perl -e '"'"'while (<>) { $_ =~ s|http://209\\.155\\.163\\.97/smile\\.gif|http://www.cproda.com/smile.gif|g; print $_; }'"'"' > newfiles/%s\n", $1, $1);}' > script.sh

(The '"'"' mess is how you sneak a literal single quote into a single-quoted shell argument, and the doubled backslashes come out of awk's string handling as \. in the generated script.)

Now run the script...

source script.sh

IMPORTANT NOTE: I just cranked this out off the cuff, tried it on a little test file and it seemed to work. I may not - probably don't - have it exactly right, but you should be able to take it from here. Make a backup, and test before trusting it.

Clearly I spend way too much of my life frigging around in the shell... You may find it easier to put the perl commands in a file, and the awk commands in a file, and get them right there, rather than slapping it all out on a command line, escaped quotes and all (there is a sketch of that in the second P.S.).

Cheers
Rupert
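P.S. If it turns out you only have the 100-odd files, the bash equivalent of csh's foreach is a plain for loop. A rough sketch, assuming all the files sit together in the current directory and you want the originals left untouched:

mkdir newfiles
for f in *; do
    # skip anything that is not a plain file (e.g. the newfiles directory itself)
    [ -f "$f" ] || continue
    cat "$f" | perl -e 'while (<>) { $_ =~ s|http://209\.155\.163\.97/smile\.gif|http://www.cproda.com/smile.gif|g; print $_; }' > newfiles/"$f"
done

It is the same perl as Part 1, just wrapped in bash's for instead of csh's foreach.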
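P.P.S. The put-it-in-a-file version looks something like this (fix.pl and makescript.awk are just example names - call them whatever you like):

fix.pl:

# fix.pl (example name) - rewrite the smiley URL on every line of stdin
while (<>) {
    $_ =~ s|http://209\.155\.163\.97/smile\.gif|http://www.cproda.com/smile.gif|g;
    print $_;
}

makescript.awk:

# makescript.awk (example name) - turn each filename on stdin into one shell command
{ printf("cat %s | perl fix.pl > newfiles/%s\n", $1, $1); }

Then:

mkdir newfiles
ls | awk -f makescript.awk > script.sh
source script.sh

No escaped quotes anywhere, which makes it much easier to get the regex right before letting it loose on 10000 files.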