Re: minor language RFE(s)

2015-10-08 Thread Andreas Schwab
Linda Walsh  writes:

> If I am using a var as an index in a for loop
> as in:
>
>   for ((i=0; i<10; ++i)); do :; done
>
> or 2) as an iterator as in
>
>   for i in {1..10}; do :; done
>
> **and** such usage is in a function,
>
> the default is to promote 'i' to 'global' status,

This behaviour is shared with all uses of shell variables.

> which is usually not needed nor desired (that doesn't mean
> there aren't cases where one wants it global,
> but I'm referring to the most common case).
>
> The common workaround is to put the onus on the user
> of 'i' to first declare it as local.  That's not easily
> changed w/o potential chaos... however,
>
> I was thinking ... let's say we had 1 or 2 abbreviation
> keywords, at least 1 being "int=declare -i",
> and ease-of-use "my=declare"
>
> that could then allow the "declare" of the 'for' iterator
> as local, in-line.
>
> i.e. instead of predeclaring them w/'declare -i' or 'declare'
> one could write:
>
>   for ((int i=0; i<10; ++i)); do :; done
>
> or 2)
>
>   for int i in {1..10}; do :; done
>   for my i in {a..z}; do :; done

If you want perl you know where to get it.

Andreas.

-- 
Andreas Schwab, SUSE Labs, sch...@suse.de
GPG Key fingerprint = 0196 BAD8 1CE9 1970 F4BE  1748 E4D4 88E3 0EEA B9D7
"And now for something completely different."



Re: extglob syntax error in function definition

2015-10-08 Thread Greg Wooledge
On Wed, Oct 07, 2015 at 10:44:20PM -0500, Eduardo A. Bustamante López wrote:
> > Repeat-By:
> > shopt -u extglob
> > isnum () ( shopt -s extglob; case "$1" in  [1-9]*([0-9])) return 0 ;; 
> > *) return 1 ;; esac; )
> 
> Remember that bash parses and interprets the script line-by-line. If you want
> to change the parser's operation (for example, have it recognize the extglob
> patterns), you have to do it in a different line than where you're using the
> special syntax.

Even more: bash parses an entire function all at once.  If you want
extglob syntax to be permitted inside a function, the extglob option
must be turned on BEFORE the function is parsed.  You can't flip it
inside a function.

The normal recommendation is that you should put shopt -s extglob
right at the top of your script, directly beneath the shebang.

#!/usr/local/bin/bash
shopt -s extglob

That way extglob is enabled for the entire script, functions and all.
I'm not aware of any negative consequences for doing this.  In fact,
bash has a compile-time option to enable extglob.  This isn't the
default (yet), but perhaps some day it will be.
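
For instance, the reporter's isnum could be laid out along these lines
(a minimal sketch of that recommendation; the brace body and the test
values are illustrative, not from the original report):

  #!/usr/local/bin/bash
  shopt -s extglob

  # Defined after extglob is enabled, so the extended pattern parses cleanly.
  isnum () {
      case "$1" in
          [1-9]*([0-9])) return 0 ;;
          *)             return 1 ;;
      esac
  }

  isnum 42 && echo "42 is a number"
  isnum 4x || echo "4x is not"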



Re: extglob syntax error in function definition

2015-10-08 Thread Chet Ramey
On 10/6/15 8:27 AM, bash...@jonkmans.nl wrote:

> Bash Version: 4.3
> Patch Level: 30
> Release Status: release
> 
> Description:
>   The shell gives a syntax error when defining a function that uses the 
> extended pattern matching operators.

This is fundamental to how the shell works.  The shell always parses a
complete command before executing any of it.  A shell function definition
is a compound command, so the shell parses the entire function definition
at definition time rather than parsing it piece-by-piece on the fly at
execution time.  Since extglob changes parsing behavior to recognize the
extended pattern matching operators -- which are otherwise syntax errors --
it needs to be enabled before parsing the function definition.


>   I would have expected that i could encapsulate the setting of extglob, 
> by using a subshell-like function:
>   shopt -u extglob
>   isnum () ( shopt -s extglob; case "$1" in  [1-9]*([0-9])) 
> return 0 ;; *) return 1 ;; esac; )

It's not reasonable to expect the `shopt -s' to be executed as part of the
function definition, without running the function at all.

If you want to use the extended pattern matching syntax, you need to have
extglob enabled before you try to parse any commands using it.  That's
just, as I said, fundamental.
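
One way to get the encapsulation the original report was after, while
still respecting that constraint, is to defer parsing of the pattern
until run time.  This is only a sketch, not something suggested in this
thread:

  # The subshell body keeps the shopt change local to the function call;
  # eval delays parsing of the extglob pattern until after extglob has
  # been enabled inside that subshell.
  isnum () (
      shopt -s extglob
      eval 'case "$1" in
                [1-9]*([0-9])) return 0 ;;
                *)             return 1 ;;
            esac'
  )

The caller's extglob setting is untouched, since the shopt takes effect
only in the function's subshell.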

There is a compile-time option that enables extglob by default.

Chet

-- 
``The lyf so short, the craft so long to lerne.'' - Chaucer
 ``Ars longa, vita brevis'' - Hippocrates
Chet Ramey, ITS, CWRU    c...@case.edu    http://cnswww.cns.cwru.edu/~chet/



Re: minor language RFE(s)

2015-10-08 Thread Chet Ramey
On 10/7/15 7:38 PM, Linda Walsh wrote:

> I was thinking ... let's say we had 1 or 2 abbreviation
> keywords, at least 1 being "int=declare -i",
> and ease-of-use "my=declare"
> 
> that could then allow the "declare" of the 'for' iterator
> as local, in-line.
> 
> i.e. instead of predeclaring them w/'declare -i' or 'declare'
> one could write:
> 
>   for ((int i=0; i<10; ++i)); do :; done
> 
> or 2)
> 
>   for int i in {1..10}; do :; done
>   for my i in {a..z}; do :; done

These change the syntax of the shell in incompatible ways.  The
arithmetic `for' command takes arithmetic expressions, not shell
commands, and the `for' command takes a name (identifier), not a
shell command.  Aside from any syntactic sugar (`int', `my'), these
are not consistent with how the shell grammar is formed, and this
isn't a good enough reason to change the grammar that dramatically.
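
For reference, a minimal sketch of the common workaround mentioned
earlier -- declare the loop variable before the loop so it stays local
to the function (the function name here is just a placeholder):

  f () {
      local -i i              # or: declare -i i
      for ((i=0; i<10; ++i)); do :; done

      local ch
      for ch in {a..z}; do :; done
  }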

-- 
``The lyf so short, the craft so long to lerne.'' - Chaucer
 ``Ars longa, vita brevis'' - Hippocrates
Chet Ramey, ITS, CWRU    c...@case.edu    http://cnswww.cns.cwru.edu/~chet/



Re: minor language RFE(s)

2015-10-08 Thread Linda Walsh



Chet Ramey wrote:
These change the syntax of the shell in incompatible ways. 

---
I think I did treat compatibility as important, and flagged incompatible
changes as something to avoid: "That's not easily changed w/o potential
chaos..."


The arithmetic `for' command takes arithmetic expressions, not shell
commands, and the `for' command takes a name (identifier), not a
shell command.  Aside from any syntactic sugar (`int', `my'), these
are not consistent with how the shell grammar is formed, and this
isn't a good enough reason to change the grammar that dramatically.

---
Yeah, I think I mentioned that case:

  I've no idea of the difficulty level to do this, but
  was thinking if not too difficult...  and if it is...
  well keep it on a pile of ideas if bash ever got
  refactored such that implementation became easier..?

I understand the problems of working with 10+ year old code
that's been patched through the roof and trying to add _anything_
to the design.  Thus the proposal of keeping the idea around
if bash was ever refactored such that implementing a change like
this wouldn't be a big deal


Andreas Schwab wrote: 

If you want perl you know where to get it.

---
Actually I have no idea where to get a version of perl
that could even process POSIX compatible shell-script, let
alone bash's extensions (not to mention having a mechanism
to write in perl as well).

Perhaps you could enlighten me.

There are things about perl that I don't fancy as well -- 
much of what I want to change in bash or perl involves 
reducing repetitive typing -- as in this minor language RFE,

but perl is way too fossilized with maintenance being dominated
by an even more conservative core team.

So it is very unlikely that Perl could be extended to allow for
greater language efficiency or compatibility, as they can't even keep
new versions of perl backward compatible with older versions.

Bash has several fairly good reasons for slow evolution -- one being
maintenance of POSIX compatibility, and another the fact that maintenance
and development are still being coordinated by the original author.
Compare that to the perl case, which has a constantly changing cast, no
standards to adhere to (though some have been published, they aren't
followed), and a coordination effort that is reminiscent of trying
to herd cats.

It's obvious that suse is moving away from simplicity with most of
the commands to manage the system being 2-3 times as long as previously.
So I wouldn't expect language efficiency to rate high on your priority 
list.


However, consider this:

"the number of bugs in code is proportional to the number of source
lines, not the number of ideas expressed."  --  So a language that lets
you express an idea in one line may be roughly twice as reliable as one
that forces you to split it across two.


There are several articles on programming-language conciseness and the
benefits to the programmer of being able to express and write more
concepts in less space and in less time (APL, for example, is thrown
out as being too slow to program in due to needing a specialized
character set -- so conciseness taken all the way down to special
symbols isn't considered a good thing).

Besides doing your own search, I thought this article did a fairly
good job of comparing various factors:

http://redmonk.com/dberkholz/2013/03/25/programming-languages-ranked-by-expressiveness/

*cheers*
-l



Re: minor language RFE(s)

2015-10-08 Thread Chet Ramey
On 10/8/15 1:48 PM, Linda Walsh wrote:

>> The arithmetic `for' command takes arithmetic expressions, not shell
>> commands, and the `for' command takes a name (identifier), not a
>> shell command.  Aside from any syntactic sugar (`int', `my'), these
>> are not consistent with how the shell grammar is formed, and this
>> isn't a good enough reason to change the grammar that dramatically.
> ---
> Yeah, I think I mentioned that case:
> 
>   I've no idea of the difficulty level to do this, but
>   was thinking if not too difficult...  and if it is...
>   well keep it on a pile of ideas if bash ever got
>   refactored such that implementation became easier..?
> 
> I understand the problems of working with 10+ year old code
> that's been patched through the roof and trying to add _anything_
> to the design.  Thus the proposal of keeping the idea around
> if bash was ever refactored such that implementing a change like
> this wouldn't be a big deal

You misunderstand.  The implementation difficulty, such as it is,
is secondary to whether or not changing the grammar like that is a
good idea in the first place.  I don't think it is, and I don't
think that adding syntactic sugar is a compelling reason to change
that.

-- 
``The lyf so short, the craft so long to lerne.'' - Chaucer
 ``Ars longa, vita brevis'' - Hippocrates
Chet Ramey, ITS, CWRU    c...@case.edu    http://cnswww.cns.cwru.edu/~chet/



Re: command substitution is stripping set -e from options

2015-10-08 Thread Christoph Gysin
> I think you're overlooking what I referred to above: that the exit status
> of a command substitution doesn't have any effect on whether the parent's
> command succeeds or fails except in one case: the right-hand-side of an
> assignment statement that is the last assignment in a command consisting
> only of assignment statements.  To say that it `disables the whole point
> of set -e' is a considerable overstatement.

Well, I do see your point. But my understanding was that if I want
to run all my bash code with set -e error checking, I can do so by
avoiding a couple of corner cases, namely:

Instead of:

  local var=$(cmd)

I use:

  local var
  var=$(cmd)

and instead of:

  command $(cmd)

I use:

  var=$(cmd)
  command $var

etc.

But this issue brings a new corner case:

  func() {
cmd1
cmd2
  }

  var=$(func)

This won't work, because set -e is stripped inside the substitution,
so the whole function runs without error checking.

So while set -e has its issues, it is still very useful to have all
commands error checked if one is aware of the corner cases. Well, all
except the one I described, which seems impossible to work around.

The only workaround I can think of is to put set -e in the beginning
of every function. That would make it possible to catch errors in the
example above. But it is really ugly. I would like to set -e once in
the beginning of the script, and not all over in every function.
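
A sketch of that per-function workaround, using the placeholder names
from the example above:

  func() {
    set -e    # re-enable errexit; the $(...) subshell dropped it
    cmd1
    cmd2      # no longer runs if cmd1 fails
  }

  var=$(func)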

Is there another way to achieve this?

Is it out of the question to change this behaviour?

If so, would you accept a patch that adds an option to enable that behaviour?

Thanks,
Chris
-- 
echo mailto: NOSPAM !#$.'<*>'|sed 's. ..'|tr "<*> !#:2" org@fr33z3



Re: command substitution is stripping set -e from options

2015-10-08 Thread Chet Ramey
On 10/5/15 5:37 PM, Christoph Gysin wrote:
>> The parent shell (the one that enabled -e) should be the one to make the
>> decision about whether or not the shell exits.  The exit status of the
>> command substitution doesn't make a difference except in one special case,
>> so inheriting errexit and exiting (possibly prematurely) doesn't really
>> help the parent decide whether or not to exit.
> 
> I'm not sure I fully understand.
> 
> The parent shell should be the one to decide if the script is supposed
> to abort on any unsuccessful exit status. Command substitution should
> not change that. The parent shell decided via set -e that it wants to
> exit immediately on error.

Sure, the parent shell should make this decision.  Consider that, other
than one special case, the exit status of a command substitution has no
effect on whether a command succeeds or fails, and whether or not the
parent shell exits.  If you have something like

set -e
command -f $(command that generates a filename) -opts other arguments

and the command substitution inherits the -e option and exits
prematurely, the parent shell will not inspect its exit status and will
happily execute the command, possibly with faulty data.  This is simply
how command substitution works.

> If you don't want to fix this for backwards compatibility, is there
> anyway we could change that behaviour explicitly? I.e. with another
> option? Avoiding command substitution isn't really an option, and this
> essentially disables the whole point of set -e.

I think you're overlooking what I referred to above: that the exit status
of a command substitution doesn't have any effect on whether the parent's
command succeeds or fails except in one case: the right-hand-side of an
assignment statement that is the last assignment in a command consisting
only of assignment statements.  To say that it `disables the whole point
of set -e' is a considerable overstatement.
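
To make the distinction concrete, here is a small illustration (not
from the original exchange):

  set -e

  echo "got: $(false)"   # substitution status is discarded; echo succeeds
  var=$(false)           # assignment-only command: its status is the
                         # substitution's (1), so the shell exits here
  echo "never reached"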

-- 
``The lyf so short, the craft so long to lerne.'' - Chaucer
 ``Ars longa, vita brevis'' - Hippocrates
Chet Ramey, ITS, CWRU    c...@case.edu    http://cnswww.cns.cwru.edu/~chet/



syntactic sugar = fewer bugs, better maintenance

2015-10-08 Thread Linda Walsh



Chet Ramey wrote:

I understand the problems of working with 10+ year old code that's
been patched through the roof and trying to add _anything_ to the
design.  Thus the proposal of keeping the idea around if bash was
ever refactored such that implementing a change like this wouldn't be
a big deal


You misunderstand.  The implementation difficulty, such as it is, is
secondary to whether or not changing the grammar like that is a good
idea in the first place.  I don't think it is, and I don't think that
adding syntactic sugar is a compelling reason to change that.




That URL I mentioned included *shell*, by a hair, as being the *most
wordy* language for getting anything done in the top-tier popularity
group.  The median difference between the low- and high-expressiveness
groups was 31X (i.e. the low-expressiveness langs took 31 times as many
lines to do the same thing as the high-expressiveness langs).  The main
thing that helped shell was *consistency*, though it was also one of the
more wordy, less readable and less maintainable languages.

(URL = http://redmonk.com/dberkholz/2013/03/25/programming-languages-ranked-by-expressiveness/)

Shell script made the bottom of the cutoff for tier-one popularity.

No tier-one languages fall in the top 25 on both metrics, although
5 make the cut on consistency alone. Of the tier-one languages,
lower-level ones tend to be both inconsistent and overly wordy, while
higher-level ones have intermediate wordiness and very strong
consistency.

The most consistent languages are Python, Objective-C, Perl, C#, and
***shell***, with the presence of *Perl* and *shell* supporting
the initial assertion that:

  **expressiveness has little to do with readability or maintainability**. 


 I.e. shell is at the bottom in terms of readability and
 maintainability. 


According to this study and the rosetta-stone study (comparing same-problem
implementations in different languages), *SHELL* is in drastic need of
improvements in readability and maintainability --- which is exactly
what we are talking about in creating syntax simplifications that make
it more readable and increase maintainability.

Your dismissal of readability and maintainability improvements as
"syntactic sugar" is unfortunate -- especially *IF* implementing them
became easier due to refactoring.

The difficulties in creating such simple extensions themselves speak
to maintainability.

I well understand that point -- inasmuch as I wrote a 'snapshot'
program in shell -- at 938 lines.  It worked, but the first time something
needed to change, it was unmaintainable and inextensible, despite my
best efforts to the contrary.  It was a state machine -- it kept track
of what it had done -- so that it wouldn't lose data or overwrite data.

Sections of code called checks for dependencies and added flags for
positive checks.

I eventually rewrote it in perl (only slightly more maintainable) -- but
with considerably more error checking and checkpointing, so if a problem
arose, usually simply re-running the script found the last good step
and recovered from there.  Due to all the extra transaction recording
and checking, the perl version's main program ran 2343 lines (not
including the multiple libraries that got written).

What you and Andreas seem to dismissively call syntactic sugar is
increased ease of use, maintainability, and a lower overall bug count
due to needing fewer lines.  "Syntactic sugar" isn't gratuitous sugar.






Re: command substitution is stripping set -e from options

2015-10-08 Thread Christoph Gysin
> I know you don't want to hear this, but you really need to stop thinking
> of set -e as "error checking".  It is an obsolescent historical anomaly
> that bash is required to support because POSIX specifies it.  It isn't
> useful for any purpose, and people who insist on using it are simply
> causing extra pain for themselves.

This is simply not true. It provides the automatic "|| exit 1" if you
know what special cases you need to avoid. I believe I am aware of all
those special cases. I'm willing to take that pain for the advantage
of having the script fail when any command called unexpectedly fails.
It saves me from the even greater pain of debugging the root cause
without any hint where it started going wrong.
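
Roughly, the behaviour being relied on is this (an illustrative sketch
only -- the path is a placeholder, and set -e's usual compound-command
and conditional-context exceptions still apply):

  set -e
  cd /some/build/dir    # behaves roughly like: cd /some/build/dir || exit
  make                  # and: make || exit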

> If you dislike this, then switch your project to a programming language
> that *has* automatic error checking.  Bash is just a shell, and there
> are many other languages that may better suit your project.

Unfortunately bash is the default system shell on a gazillion
devices out there, so switching "my project" is not as trivial as it
might sound.

Chris
-- 
echo mailto: NOSPAM !#$.'<*>'|sed 's. ..'|tr "<*> !#:2" org@fr33z3



Re: command substitution is stripping set -e from options

2015-10-08 Thread Greg Wooledge
On Thu, Oct 08, 2015 at 09:36:59PM +0300, Christoph Gysin wrote:
> But this issue brings a new corner case:
> 
>   func() {
> cmd1
> cmd2
>   }
> 
>   var=$(func)
> 
> This won't work, because set -e is stripped inside the substitution,
> so the whole function runs without error checking.

I know you don't want to hear this, but you really need to stop thinking
of set -e as "error checking".  It is an obsolescent historical anomaly
that bash is required to support because POSIX specifies it.  It isn't
useful for any purpose, and people who insist on using it are simply
causing extra pain for themselves.

Bash does not have automatic error checking.  Instead, it uses the C
model: you need to check for erroneous results yourself, after every
command whose results you actually care about.
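
In practice that means explicit checks along these lines (the file and
directory names are only placeholders):

  if ! cd /var/backups; then
      echo "cd failed" >&2
      exit 1
  fi

  cp important.conf important.conf.bak || { echo "backup failed" >&2; exit 1; }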

If you dislike this, then switch your project to a programming language
that *has* automatic error checking.  Bash is just a shell, and there
are many other languages that may better suit your project.