Follow-up Comment #17, bug #60383 (project findutils):

> it seems just to ensure that the next argv[i] is treated as a starting point
> (regardless whether it looks like an option or not).
> Furthermore, -f seems to change the handling of the '--' argument.
>
> This seems to be a solution for the 80% case, but if one has file names
> like '--', then ...
It doesn't change the handling of --; the reference to -- in the BUGS section of the manual is a bit misleading. -f is an option there, like -H or -L, except that it's an option that takes an argument; it's not a predicate. So it has to be on the left-hand side of --. As -f takes a required argument (as -ffile or -f file, as usual with getopt()), it will handle any arbitrary string, including --. Whatever follows -f is taken as its argument, whether it starts with - or looks like a predicate.

It's the same as with grep: grep -F "$string" won't work properly for values of $string that start with -; you need grep -F -e "$string", where "$string" is the argument to the -e option. With grep, you can also use grep -F -- "$string", where "--" marks the end of options. But using -- doesn't help with find: while "--" does mark the end of options (meaning that -L, -H, -f... can't be used afterwards), arguments starting with - (and [!()]) still cause problems, because after that -- find still expects either file names or predicates. So -- doesn't help at all for find.

Note that I was not suggesting GNU find implement the -f option *instead* of the -files0-from predicate, just that it add -f for compatibility with the BSDs, so that we can eventually have a portable way to pass arbitrary file names to find.

I agree -files0-from has advantages over -f:

- the list can be fed slowly and find can start processing it as it comes
- no need to allocate and pass around a potentially huge list
- the list is not exposed in the output of ps
- not affected by the E2BIG execve() limit

-f has advantages too, though:

- for the very common case of invoking find on one directory, find -f "$dir" ... is less cumbersome than find -files0-from <(printf '%s\0' "$dir")
- stdin is preserved (compared to using cmd | find -files0-from -)
- cmd | find -files0-from - is potentially dangerous if cmd ends up being aborted, for instance because it reaches some resource limit.
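As a concrete illustration of the grep point above (the file name and sample lines here are made up for the demo):

```shell
#!/bin/sh
# Sketch: a fixed string that begins with "-" is misparsed as an option
# unless it is passed as the argument to -e.
printf '%s\n' '-v' 'other line' > grep-demo.txt   # demo file (hypothetical name)

# grep -F '-v' grep-demo.txt would treat -v as grep's "invert match" option.
# Making the string the argument of -e keeps it a pattern:
grep -F -e '-v' grep-demo.txt    # prints: -v

rm -f grep-demo.txt
```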
For instance, if cmd was going to output "/tmp/x/foo\0/tmp/x/bar\0" but got aborted just after it wrote "/tmp/x/foo\0/", find will end up searching /; combined with -mtime +30 -delete, for instance, that could have disastrous consequences. Maybe find should only consider fully delimited records.

For the E2BIG limit, note that BSDs support:

  printf -- '-f%s\0' "${large_list[@]}" | xargs -r0 -J % find % ...

(here assuming a builtin printf). That -J would be a welcome addition to GNU xargs as well.

_______________________________________________________

Reply to this item at:

  <https://savannah.gnu.org/bugs/?60383>

_______________________________________________
Message sent via Savannah
https://savannah.gnu.org/
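[Editor's addendum: GNU xargs lacks -J, but a similar "insert the batch in the middle of the command line" effect can be sketched by handing the batched arguments to the positional parameters of an sh -c wrapper; the directory names below are invented for the demo.]

```shell
#!/bin/sh
# Approximating BSD's xargs -J with GNU xargs (a sketch): the batched
# arguments become "$@" of the sh -c wrapper, so fixed predicates can
# follow them on find's command line.
mkdir -p demo/a demo/b    # demo directories (hypothetical)

printf '%s\0' demo/a demo/b |
  xargs -r0 sh -c 'find "$@" -maxdepth 0 -type d' sh
# prints: demo/a and demo/b, one per line

rm -r demo
```

This avoids the E2BIG issue in the same way as -J, since xargs splits the list into as many find invocations as needed.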