> That also depends on a few other factors like e.g. "env headroom", see
> https://git.sv.gnu.org/cgit/findutils.git/tree/xargs/xargs.c#n427
> from line 427 until line 510.
>
> I don't have Mac OS X, so I can't step through what's the limiting factor
> in your case.
>
> What's your actual concern? I mean, depending on what COMMAND does, the
> overhead of calling ~10 instead of 1 process in the ideal GNU/Linux case
> for exactly that amount of arguments still looks okay to me.
I'd like to avoid external commands like xargs as much as possible, to spare users from having to think about how slow a script is. Also, in the following example, if I keep adding environment variables, even `xargs --help` can no longer be called, yet the variables are still accessible in bash. In other words, xargs itself can fail, so it is not a reliable way to figure out the limit. Is there a shell-native way to know the limit?

$ cat main.sh
#!/usr/bin/env bash
# vim: set noexpandtab tabstop=2:

echo 10000
for i in {1..10000}; do eval "export x$i=1"; done
xargs --show-limits --no-run-if-empty < /dev/null
echo

echo 20000
for i in {10001..20000}; do eval "export x$i=1"; done
xargs --help
echo

declare -p x20000
$ ./main.sh
10000
Your environment variables take up 85520 bytes
POSIX upper limit on argument length (this system): 174576
POSIX smallest allowable upper limit on argument length (all systems): 4096
Maximum length of command we could actually use: 89056
Size of command buffer we are actually using: 131072
Maximum parallelism (--max-procs must be no greater): 2147483647

20000
./main.sh: line 11: /usr/local/bin/xargs: Argument list too long

declare -x x20000="1"

--
Regards,
Peng
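For what it's worth, the numbers above line up: 174576 - 85520 = 89056, i.e. the usable command length is roughly the system's upper limit minus the space the environment already occupies. Below is a minimal sketch of estimating that headroom from the shell. It assumes getconf(1) is available (still an external utility, just a much cheaper one than spawning the target command), and the per-entry overhead counted by the kernel varies between systems, so the result is only an approximation, not what xargs computes internally in the xargs.c lines referenced above.

#!/usr/bin/env bash
# Rough estimate of the room left for command-line arguments.
# ARG_MAX is the kernel limit on the combined size of argv + environ.
arg_max=$(getconf ARG_MAX)

# Approximate the bytes the exported environment already uses:
# each entry is roughly "NAME=VALUE" plus a terminating NUL.
# compgen -e lists exported variable names, one per line; names cannot
# contain whitespace, so unquoted expansion is safe here.
env_bytes=0
for name in $(compgen -e); do
  value=${!name}
  env_bytes=$(( env_bytes + ${#name} + ${#value} + 2 ))
done

printf 'ARG_MAX:              %s\n' "$arg_max"
printf 'environment (approx): %s\n' "$env_bytes"
printf 'headroom (approx):    %s\n' $(( arg_max - env_bytes ))

Since pointer overhead and any slack reserved by the kernel are not accounted for, it is safer to treat the reported headroom as an upper bound and leave a generous margin before deciding that a single exec will fit.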