Bug in Bash on syntax error in functions?
Hi! I am using GNU bash, version 5.0.17(1)-release (x86_64-pc-linux-gnu).
Here is what looks to be a bug:

--
$ cat testfail1
#!/bin/bash
echo 'a'
fail_command
echo 'b'

$ cat testfail2
#!/bin/bash
echo 'a'
echo "$[ 1 + ]"
echo 'b'

$ cat testfail3
#!/bin/bash
function1(){
  echo 'a'
  fail_command
  echo 'b'
}
function1
echo "exit: $?"

$ cat testfail4
#!/bin/bash
function1(){
  echo 'a'
  echo "$[ 1 + ]"
  echo 'b'
}
function1
echo "exit: $?"

$ ./testfail1
a
./testfail1: line 3: fail_command: command not found
b
$ ./testfail2
a
./testfail2: line 3: 1 + : syntax error: operand expected (error token is "+ ")
b
$ ./testfail3
a
./testfail3: line 4: fail_command: command not found
b
exit: 0
$ ./testfail4
a
./testfail4: line 4: 1 + : syntax error: operand expected (error token is "+ ")
exit: 1
$
--

Here is what is inconsistent:

1) testfail4 performs differently from testfail3 in that the function
immediately terminates when a syntax error is found, even though both
scripts have a failing command on the same line. Note that testfail3
continues to execute whereas testfail4 does not, even in the case of
nested functions [1]

2) testfail4 performs differently from testfail2 in that when the syntax
error is found in a function, that function terminates, whereas if the
same syntax error is in the main part of the script, the script continues
to execute

[1] Nested functions example: same outcome; all functions are immediately
terminated

--
$ cat testfail5
#!/bin/bash
function2(){
  function1
  echo 'c'
}
function1(){
  echo 'a'
  echo "$[ 1 + ]"
  echo 'b'
}
function2
echo "exit: $?"

$ ./testfail5
a
./testfail5: line 8: 1 + : syntax error: operand expected (error token is "+ ")
exit: 1
--

Please confirm whether this is a bug and whether another ticket is needed.
Thank you.

My proposed fix would be to treat functions the same way as the main part
of the script: always keep executing, no matter what the error is (a syntax
error, command not found, or anything else). This cost a few hours of
debugging in a large script running on a test server that oddly 'skipped
code' when there was a calculation error, with the skipped code creating
some undefined behaviour. Thank you.
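As an aside for anyone bitten by this in a large script: one way to keep
the rest of a function running past a failing arithmetic expansion is to
do the expansion in a subshell, so only the child shell aborts. A minimal
sketch, using $(( ... )) rather than the obsolete $[ ... ] and with
made-up names (calc, raw, result):

--
#!/bin/bash
# Sketch only: calc, raw, and result are placeholder names.
calc(){
  local raw='1 + '    # stand-in for untrusted input
  local result
  # The bad expansion happens in the command substitution's subshell,
  # so only that child aborts; this function keeps executing.
  result=$(echo "$(( raw ))") || result=0
  echo "result: $result"
  echo 'b'            # still runs, unlike 'b' in testfail4
}
calc
--

Running this prints the expansion error, then "result: 0" and 'b', instead
of silently abandoning the rest of the function.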
Re: Bug in Bash on syntax error in functions?
On 4/8/22 2:42 AM, Roel Van de Paar via Bug reports for the GNU Bourne
Again SHell wrote:

> Hi! I am using GNU bash, version 5.0.17(1)-release (x86_64-pc-linux-gnu).
> Here is what looks to be a bug:

It's not. There are a couple of misconceptions here. Let's go through them.

> --
> $ cat testfail1
> #!/bin/bash
> echo 'a'
> fail_command
> echo 'b'

First, command not found is not a shell syntax error or word expansion
error. It simply causes $? to be set to 127.

> $ cat testfail2
> #!/bin/bash
> echo 'a'
> echo "$[ 1 + ]"
> echo 'b'

Next, this is not a shell syntax error either; it is a word expansion error
(the error is that the arithmetic expression is missing an operand; the
word expansion is the obsolete $[ ... ]).

> $ cat testfail3
> #!/bin/bash
> function1(){
>   echo 'a'
>   fail_command
>   echo 'b'
> }
> function1
> echo "exit: $?"
>
> $ cat testfail4
> #!/bin/bash
> function1(){
>   echo 'a'
>   echo "$[ 1 + ]"
>   echo 'b'
> }
> function1
> echo "exit: $?"

These simply wrap the above in a brace group command that is the shell
function body. So the difference is between a command not found

> $ ./testfail1
> a
> ./testfail1: line 3: fail_command: command not found
> b
> $ ./testfail2
> a
> ./testfail2: line 3: 1 + : syntax error: operand expected (error token is "+ ")
> b

and a word expansion error. Command-not-found errors don't really have any
effect on execution unless you have the `-e' option set. Word expansion
errors cause the shell to abort the currently-executing command (in this
case, a simple command) and return to the top level read-execute loop.

(And before someone pipes up here, POSIX requires a non-interactive shell
to exit immediately on a word expansion error, which bash does in posix
mode.)

> $ ./testfail3
> a
> ./testfail3: line 4: fail_command: command not found
> b
> exit: 0
> $ ./testfail4
> a
> ./testfail4: line 4: 1 + : syntax error: operand expected (error token is "+ ")
> exit: 1
> $
> --
>
> Here is what is inconsistent:
>
> 1) testfail4 performs differently from testfail3 in that the function
> immediately terminates when a syntax error is found, even though both
> scripts have a failing command on the same line. Note that testfail3
> continues to execute whereas testfail4 does not, even in the case of
> nested functions [1]

The word expansion error causes the shell to abort execution of the
current command (the group command that is the function body) and return
to the top level read-execute loop. The command-not-found error isn't a
shell error and does not have that effect.

> 2) testfail4 performs differently from testfail2 in that when the syntax
> error is found in a function, that function terminates, whereas if the
> same syntax error is in the main part of the script, the script continues
> to execute

See above.

> [1] Nested functions example: same outcome; all functions are immediately
> terminated

Same.

> This cost a few hours of debugging in a large script running on a test
> server that oddly 'skipped code' when there was a calculation error, with
> the skipped code creating some undefined behaviour.

This is not a `calculation error'; this is a word expansion error.

--
``The lyf so short, the craft so long to lerne.'' - Chaucer
``Ars longa, vita brevis'' - Hippocrates
Chet Ramey, UTech, CWRU    c...@case.edu    http://tiswww.cwru.edu/~chet/
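To see the posix-mode behaviour mentioned above, a small sketch (it
assumes writing a scratch file under /tmp is acceptable): the same
expansion error lets the script continue in default mode but terminates
it in POSIX mode.

--
#!/bin/bash
# Write a tiny script whose middle line has a word expansion error.
cat > /tmp/expfail.sh <<'EOF'
echo 'a'
echo "$(( 1 + ))"
echo 'b'
EOF

# Default mode: the failing simple command is abandoned, but the rest of
# the script still runs, so 'b' is printed.
bash /tmp/expfail.sh

# POSIX mode: the non-interactive shell exits on the expansion error,
# so 'b' is never printed.
bash --posix /tmp/expfail.sh
--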
Re: Bug in Bash on syntax error in functions?
On Fri, Apr 8, 2022, at 3:23 PM, Chet Ramey wrote:
> So the difference is between a command not found
>
>> $ ./testfail1
>> a
>> ./testfail1: line 3: fail_command: command not found
>> b
>> $ ./testfail2
>> a
>> ./testfail2: line 3: 1 + : syntax error: operand expected (error token is "+ ")
>> b
>
> and a word expansion error. Command-not-found errors don't really have any
> effect on execution unless you have the `-e' option set. Word expansion
> errors cause the shell to abort the currently-executing command (in this
> case, a simple command) and return to the top level read-execute loop.

Notably, these don't execute "echo b" either, demonstrating that this
isn't actually about functions at all.

{
  echo a
  echo "$[ 1 + ]"
  echo b
}

echo a; echo "$[ 1 + ]"; echo b

> (And before someone pipes up here, POSIX requires a non-interactive shell
> to exit immediately on a word expansion error, which bash does in posix
> mode.)

*pipes* :)

Consequently, a modernized testfail2 using $((...)) notation doesn't
execute "echo b" when run with ksh, dash, yash, or POSIX-mode bash (or
even native-mode zsh).

--
vq
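A sketch of that cross-shell comparison, assuming dash, ksh, and yash
happen to be installed (trim the list to whatever is available):

--
#!/bin/bash
# A modernized testfail2 using $(( ... )); POSIX-style shells stop before 'b'.
cat > /tmp/testfail2b <<'EOF'
echo 'a'
echo "$(( 1 + ))"
echo 'b'
EOF

for sh in 'bash' 'bash --posix' 'dash' 'ksh' 'yash'; do
  printf '== %s ==\n' "$sh"
  $sh /tmp/testfail2b    # deliberately unquoted so 'bash --posix' splits
done
--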
Re: Bug in Bash on syntax error in functions?
On 4/8/22 4:33 PM, Lawrence Velázquez wrote:

> Notably, these don't execute "echo b" either, demonstrating that this
> isn't actually about functions at all.

I may have been too obscure saying the function body was a brace group
command.

> {
>   echo a
>   echo "$[ 1 + ]"
>   echo b
> }

--
``The lyf so short, the craft so long to lerne.'' - Chaucer
``Ars longa, vita brevis'' - Hippocrates
Chet Ramey, UTech, CWRU    c...@case.edu    http://tiswww.cwru.edu/~chet/
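One way to make the point concrete: bash stores a function body as a brace
group command, which declare -f will print back. A small sketch (the
output shown in the comments is approximate):

--
#!/bin/bash
function1(){
  echo 'a'
  echo 'b'
}

# declare -f prints the stored definition; the body comes back as a brace
# group, the compound command that a word expansion error aborts as a whole.
declare -f function1

# Expected output, roughly:
# function1 ()
# {
#     echo 'a';
#     echo 'b'
# }
--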
Re: Bug in Bash on syntax error in functions?
Understood. Thank you both.

On Sat, Apr 9, 2022 at 6:46 AM Chet Ramey wrote:
> On 4/8/22 4:33 PM, Lawrence Velázquez wrote:
>
> > Notably, these don't execute "echo b" either, demonstrating that
> > this isn't actually about functions at all.
>
> I may have been too obscure saying the function body was a brace group
> command.
>
> > {
> >   echo a
> >   echo "$[ 1 + ]"
> >   echo b
> > }
>
> --
> ``The lyf so short, the craft so long to lerne.'' - Chaucer
> ``Ars longa, vita brevis'' - Hippocrates
> Chet Ramey, UTech, CWRU    c...@case.edu    http://tiswww.cwru.edu/~chet/
Re: Bash regexp parsing would benefit from safe recursion limit
willi1337 bald writes:

> A deeply nested and incorrect regex expression can cause exhaustion of
> stack resources, which crashes the bash process.

Further, you could construct a deeply nested regex that is correct but
would still crash the process. It's hard to define what should happen in a
way that is implementable -- there are innumerable programs that are
theoretically correct but exhaust the stack if you try to execute them.

More or less what you want is some sort of "checkpoint" of the status of
the overall computation (shell process) that you would return to. (A
continuation!) Your suggestion is effectively that the checkpoint is "when
bash prompts for the next command". But in a sense, that's what crashing
the process is, too -- you return to the checkpoint "before you started
the shell". If you had to worry about this in practice, you'd turn

$ command1
$ command2
$ command3

into

$ bash -c command1
$ bash -c command2
$ bash -c command3

Dale
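A sketch of that per-command isolation applied to the regex case; the
wrapper name and both variables are hypothetical, and the pattern is only
a stand-in for input that might exhaust the stack:

--
#!/bin/bash
# Hypothetical wrapper: do the =~ match in a child bash, so a crash
# (e.g. from stack exhaustion) only kills that child process.
risky_match(){
  # $1 = string, $2 = pattern (left unquoted in [[ ]] so it is a regex)
  bash -c '[[ $1 =~ $2 ]]' _ "$1" "$2"
}

input='abc'            # hypothetical data
pattern='((((a))))'    # stand-in for a possibly pathological regex

if risky_match "$input" "$pattern"; then
  echo 'matched'
else
  echo 'no match, or the child shell failed'
fi
--

If the pattern is nested deeply enough to crash the child, risky_match
simply returns non-zero and the calling shell survives.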