On Mon, Feb 15, 2016 at 9:44 PM, Marc Chantreux <kha...@phear.org> wrote:
> I finally wrote
>
>     a-very-long-list-of-files | perl -lnE 'print if -f && /[.]scm$/'
The solution I'd suggest for this constrained problem is:

    sed -e 's_^\([^/]\)_./\1_' < mylist |
      xargs -d '\n' sh -c 'find "$@" -regex ".*\.scm$" -type f' sh |
      sed -e 's_^\./__'

(The first sed prefixes "./" to every relative name so that a name
beginning with "-" can't be mistaken by find for an option; the final
sed strips that prefix back off.  The stripping pattern needs to be
anchored and escaped as "^\./", since an unanchored "s_./__" would
delete the first <any character>/ occurring anywhere in the path.)

I use -d in preference to -L because -L leaves quote processing
enabled.  If -d is not available, you could use -L, but you will need
to worry about quote and EOF processing (since the xargs you're using
may default to having an EOF string, which is annoying but
POSIX-compliant).

I use -regex in preference to -name because there is no
precomputation for fnmatch patterns in GNU find (yet...), so -regex
is slightly faster.

If we can relax the constraints (in the way the problem is set up) a
bit, I'd suggest using a NUL-terminated list of file names, since
file names can contain newlines.  Sometimes it will also be a win to
sort the input (to better take advantage of directory entry caches).

Quite honestly though, I'd say that using Perl is a perfectly fine
way to solve this problem.

> this isn't a problem on a BSD system as find has the wonderful -f.
> According to http://netbsd.gw.com/cgi-bin/man-cgi?find+1+NetBSD-current
>
>      -f      Specifies a file hierarchy for find to traverse.  File
>              hierarchies may also be specified as the operands
>              immediately following the options.
>
> It would be nice to get this feature into the gnu tools. I hope you
> like the idea.

AFAICS though, this strategy only works well if you are willing to
invoke a separate instance of find for each item in that very long
list.  That seems inefficient.  This strategy also doesn't work for
path names containing newlines.

Thanks for the suggestions.

James.
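P.S. Since the -d/-L quote-processing difference bites people
regularly, here is a quick way to see it for yourself (GNU xargs
assumed; the file name is invented):

    # -d '\n' disables quote processing, so the apostrophe is just
    # another character in the argument echo receives:
    printf '%s\n' "don't panic.scm" | xargs -d '\n' echo

    # -L 1 leaves quote processing enabled, so the lone apostrophe
    # opens a single-quoted string that never closes, and GNU xargs
    # reports an unmatched-quote error instead of running echo:
    printf '%s\n' "don't panic.scm" | xargs -L 1 echo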
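And here is roughly what the relaxed, NUL-terminated version of the
pipeline could look like.  This is only a sketch: it assumes the list
lives in a NUL-terminated file I'll call mylist0 (an invented name),
and that your sed and sort are GNU versions with -z/NUL support:

    # same ./ prefixing and stripping as before, but every stage
    # treats NUL, not newline, as the record separator, so names
    # containing newlines survive intact:
    sed -z -e 's_^\([^/]\)_./\1_' < mylist0 |
      sort -z |
      xargs -0 sh -c 'find "$@" -regex ".*\.scm$" -type f -print0' sh |
      sed -z -e 's_^\./__'

The sort -z stage is the "sometimes a win" part: it groups names that
share a directory, which should make better use of the directory
entry cache.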