On Sat, 14 Oct 2023 at 06:33, Robert Elz <k...@munnari.oz.au> wrote:
> The issue we have (which possibly might be similar in bash, but only
> possibly - but it would explain the symptoms) is that when one does
>
> VAR=value command
>
> "VAR" is essentially made a local variable for command, so its value
> in the outlying environment is unchanged by the assignment of value.
> ...
> But when the command is a function, or a shell builtin (or the '.'
> command, which is almost both of those) then we have some strange
> effects.
> ...
> But that's wrong, all "VAR=foo command" is supposed to do, is to put
> VAR into the environment of command, without altering it in the shell
> environment that is executing command. If command is a function, or
> a '.' script (or a shell builtin, which was the context in which I
> first considered this issue) which alters VAR, the global VAR should
> be altered

Respectfully, I must disagree. This aspect of Bash's behaviour has a very long historical precedent. Back when I used the Bourne Shell we didn't have `local`, so we wrote `var= func` precisely so that `func` couldn't mess with *our* `var`.

Given that "put in the environment" actually means "create a shell variable and mark it as exported", it's difficult to see how "put it into the environment of the command, but don't make it a local variable" could work without making the semantics even more contorted and confusing.

It seems to me that what's needed is a new model for variables, where the entire scope chain can be inspected and modified where necessary, and where the existing declare/local/export/typeset and unset are simply shorthands for more comprehensive operations.
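To make that precedent concrete, here is a minimal sketch (the names `var` and `func` are purely illustrative; the comments describe the behaviour I observe in current bash, which is exactly what the `var= func` idiom relies on, and which the reading quoted above would change):

    var=outer
    func() {
        declare -p var    # in bash this typically shows var with the
                          # export attribute (declare -x), i.e. it really
                          # has been "put in the environment" of func
        var=inner         # try to modify it from inside the function
    }

    var=temp func
    echo "$var"           # bash prints "outer": the assignment preceding
                          # the call is scoped to func, so the change made
                          # inside does not leak out.  Under the semantics
                          # argued for above, the global would be altered
                          # and this would print "inner".

The classic `IFS= read -r line` depends on the same scoping: in bash, IFS is changed only for the duration of the `read`.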