Having discovered 'trap', I scripted this:
declare -a queue
function q() {
    queue[${#queue[@]}]="cd $(pwd) && $*"
}
function runq() {
    if [ ${#queue[@]} -gt 0 ]; then
        local command=${queue[0]}
        queue=("${queue[@]:1}")
        bash -c "($command; kill -33 $$)" &
    fi
}
trap 'runq' 33
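Putting the same idea together as a self-contained demo (a sketch, not the poster's exact script: it substitutes SIGUSR1 for the numeric real-time signal 33 for portability, and the two `echo` commands and the temp file are stand-ins):

```shell
#!/usr/bin/env bash
# Queue commands and run them strictly one after another: each
# finished job signals this shell, and the trap starts the next job.

declare -a queue

function q() {
    # Append a command, remembering the current directory.
    queue[${#queue[@]}]="cd $(pwd) && $*"
}

function runq() {
    if [ ${#queue[@]} -gt 0 ]; then
        local command=${queue[0]}
        queue=("${queue[@]:1}")                  # pop the first entry
        # $$ is expanded here, in this shell, so the child signals us.
        bash -c "($command; kill -USR1 $$)" &
    fi
}

trap 'runq' USR1

# Demo: two stand-in commands appending to a temp file.
tmp=$(mktemp)
q "echo first >> $tmp"
q "echo second >> $tmp"
runq    # start the chain
# Wait until no background job remains; each wait is interrupted
# by the trap when a job finishes, so we loop.
while [ -n "$(jobs -r)" ]; do wait; done
```

Because the second job is only launched after the first one has written its output and signalled, the lines land in the file in queue order.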
which almost works.
Mårten Segerkvist <[EMAIL PROTECTED]> wrote:
> i. e. being able to split a one-liner like:
>
> command1 && command2 && command3
>
> into several, separate command lines:
You can write that one-liner on multiple lines:
command1 &&
command2 &&
command3
paul
Mårten Segerkvist wrote:
> command1 &
> %1 && command2 &
> %2 && command3
>
> (where the second command line awaits the execution of the first etc.)
In a script you can grab the process id of the last background job
with $!. Then you can wait on that pid:
command1 &
wait $! && command2
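A self-contained sketch of that pattern (here `sleep` and a variable assignment stand in for command1 and command2). One caveat: appending `&` to the `wait` line would run the wait in a subshell, and a subshell cannot wait on another shell's children, so keep the wait in the foreground:

```shell
#!/usr/bin/env bash
# Run command2 only after the backgrounded command1 has exited
# successfully; wait's exit status is command1's exit status.
sleep 0.2 &                   # stand-in for command1
pid=$!                        # pid of the last background job
if wait "$pid"; then          # blocks until command1 finishes
    result="command2 ran"     # stand-in for command2
fi
echo "$result"
```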
Don't know if this is the right place for this sort of thing, but is it
somehow possible to do some sort of consecutive job processing in bash,
i. e. being able to split a one-liner like:
command1 && command2 && command3
into several, separate command lines:
command1 &
%1 && command2 &
%2 && command3