On Mon, Oct 11, 2010 at 09:02:25AM -0400, Greg Wooledge wrote:
> Also, if you truly must have them all in one invocation, it may
> be better to fail and get a nice friendly error, than to have the
> command run with only a subset of the files.
Good point, thanks.
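[Editor's note: a small sketch of the batching behaviour discussed above. The scratch directory and the `sh -c 'echo "$#"'` counter are illustrative stand-ins for a real command; `-exec ... {} +` is allowed to split the file list into several invocations when the argument list would grow too large, so a command that silently runs on a subset never notices.]

```shell
#!/bin/bash
# Count how many arguments each invocation spawned by -exec ... {} +
# actually receives. With a handful of small names this is normally
# a single batch; with enough files, find may split into several.
dir=$(mktemp -d)
for i in 1 2 3 4 5; do : > "$dir/file$i.c"; done

# Each batch runs sh once; "$#" is the number of files in that batch.
find "$dir" -name '*.c' -exec sh -c 'echo "$#"' sh {} +

rm -rf "$dir"
```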
On Mon, Oct 11, 2010 at 02:56:50PM +0200, Sven Mascheck wrote:
> > If there is an absolute requirement to put *all* the files on a single
> > command line, "-exec +" may fail to satisfy it. It might break up the
> > files into groups.
>
> Yes, but the same limit (ARG_MAX) gets hit, as soon as you call
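[Editor's note: the limit in question is queryable. ARG_MAX is enforced by the kernel at execve() time, so it applies identically whether the argument list is assembled by find, by xargs, or by the shell expanding "${array[@]}"; only the batching strategy differs. A quick check:]

```shell
#!/bin/bash
# ARG_MAX is the per-exec ceiling on the combined size of the argument
# list and the environment. Any tool that execs a child is bound by it.
limit=$(getconf ARG_MAX)
echo "ARG_MAX on this system: $limit bytes"
```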
On Mon, Oct 11, 2010 at 08:12:23AM -0400, Greg Wooledge wrote:
> On Sat, Oct 09, 2010 at 12:06:21AM +0200, Sven Mascheck wrote:
> > On Mon, Sep 20, 2010 at 09:14:15AM -0400, Greg Wooledge wrote:
> > > unset array
> > > while IFS= read -r -d '' f; do array+=("$f"); done \
> > > < <(find . -name '*.c' -print0)
On Sat, Oct 09, 2010 at 12:06:21AM +0200, Sven Mascheck wrote:
> On Mon, Sep 20, 2010 at 09:14:15AM -0400, Greg Wooledge wrote:
> > unset array
> > while IFS= read -r -d '' f; do array+=("$f"); done \
> > < <(find . -name '*.c' -print0)
> > vi "${array[@]}"
> As you mention -exec yourself, what
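[Editor's note: the idiom quoted above, written out in full for reference. It reads NUL-delimited names from `find -print0` into a bash array, which copes with filenames containing spaces and even newlines; `printf` stands in for the interactive `vi` here so the example runs unattended.]

```shell
#!/bin/bash
# Collect matching files into an array, NUL-delimited so that any
# legal filename (including ones with embedded newlines) survives.
unset array
while IFS= read -r -d '' f; do
    array+=("$f")
done < <(find . -name '*.c' -print0)

# One invocation with every file; printf stands in for vi.
printf '%s\n' "${array[@]}"
```

Note that this still hits ARG_MAX if the expansion of "${array[@]}" exceeds the kernel limit, which is exactly the point raised above: collecting into an array guarantees a single invocation or a loud failure, never a silent subset.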