On Mon, Oct 11, 2010 at 08:12:23AM -0400, Greg Wooledge wrote:
> On Sat, Oct 09, 2010 at 12:06:21AM +0200, Sven Mascheck wrote:
> > On Mon, Sep 20, 2010 at 09:14:15AM -0400, Greg Wooledge wrote:
> > > unset array
> > > while IFS= read -r -d '' f; do array+=("$f"); done \
> > >   < <(find . -name '*.c' -print0)
> > > vi "${array[@]}"
>
> > As you mention -exec yourself, what about simply
> >
> > find . -type f -name '*.c' -exec sh -c 'vi "$@"' find-sh {} +
>
> > find . -type f -name '*.c' -exec vi {} +
>
> If there is an absolute requirement to put *all* the files on a single
> command, "-exec +" may fail to satisfy it.  It might break up the files
> into groups.
Yes, but the same limit (ARG_MAX) gets hit as soon as you call an
external command, like vi "${array[@]}" in the example above.  If you
can get by without any external command, then avoiding ARG_MAX is a
noteworthy advantage of shell-only handling (loops + variables).
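
A rough sketch of what I mean by shell-only handling (the skip pattern
and the counting are just arbitrary stand-ins for whatever per-file
work you actually need, not anything from the examples above):

    # Process the find results purely inside the shell; the file names
    # never appear on an external command line, so ARG_MAX is
    # irrelevant no matter how many files match.
    count=0
    while IFS= read -r -d '' f; do
        case $f in
            */old/*) continue ;;    # arbitrary example filter
        esac
        count=$(( count + 1 ))
    done < <(find . -type f -name '*.c' -print0)
    printf '%d .c files matched\n' "$count"

find itself is still an external command, of course, but the file names
reach the loop through a pipe rather than through an argument list, so
the limit never applies to them.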