On Sun, 2007-02-04 at 11:42 -0700, Bob Proulx wrote:
>
> BTW... Awk is used so much on a system that most likely it is already
> in ram and is probably not as heavy of a system impact as you imply.
I don't think I was even considering the disk read time to load it. I
think I was purely considering the cost of the fork/exec itself.
On Sat, 2007-02-03 at 23:30 -0500, Paul Jarc wrote:
> "Brian J. Murrell" <[EMAIL PROTECTED]> wrote:
> > < <(cat $file)
>
> http://partmaps.org/era/unix/award.html
LOL. Too right. I am just so used to using process redirection to
solve the old "but my variables don't maintain their value after my
while loop" problem.
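The variable-persistence problem being alluded to is that a pipeline runs the while loop in a subshell, so assignments made inside it vanish when the loop ends; process substitution keeps the loop in the current shell. A minimal bash sketch (the filenames and data here are illustrative, not from the thread):

```shell
# Pipeline form: the loop body runs in a subshell, so count is lost.
count=0
printf 'a\nb\nc\n' | while read -r line; do
  count=$((count+1))
done
echo "$count"    # prints 0

# Process-substitution form: the loop runs in the current shell.
count=0
while read -r line; do
  count=$((count+1))
done < <(printf 'a\nb\nc\n')
echo "$count"    # prints 3
```

Note that `< <(...)` is a bashism; it is not available in POSIX sh.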
Brian J. Murrell wrote:
> Bob Proulx wrote:
> > echo one two three four five six seven | awk '{print$2,$NF}'
> > two seven
>
> That one always drives me nuts. Why fork/exec such a heavy process
> for something bash can do itself:
In the end the real answer is that to me it's simpler, less typing.
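For completeness, the awk one-liner above can indeed be reproduced with bash alone. A minimal sketch using `read -a` to split the line into an array (the variable names are my own):

```shell
# Pure-bash equivalent of: awk '{print $2, $NF}'
line="one two three four five six seven"
read -ra fields <<< "$line"               # split on whitespace into an array
echo "${fields[1]} ${fields[${#fields[@]}-1]}"   # prints: two seven
```

Bash array subscripts are evaluated arithmetically, so `${#fields[@]}-1` indexes the last field, mirroring awk's `$NF`.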
"Brian J. Murrell" <[EMAIL PROTECTED]> wrote:
> < <(cat $file)
http://partmaps.org/era/unix/award.html
paul
___
Bug-bash mailing list
Bug-bash@gnu.org
http://lists.gnu.org/mailman/listinfo/bug-bash
On Sat, 2007-02-03 at 10:59 -0700, Bob Proulx wrote:
>
> echo one two three four five six seven | awk '{print$2,$NF}'
> two seven
That one always drives me nuts. Why fork/exec such a heavy process
for something bash can do itself:
cat $file | while read column1 rest; do
echo $column1
done
flowdimow wrote:
> I have a text file (a table) and I want to read it and output only a
> defined number of colums. Does anybody know how to do this?
Depending on if the data is by character column or by whitespace
separated field I would use either cut or awk.
echo abcdefghijklmnopqrstuvwxyz
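The demonstration above appears to have been truncated; a sketch of both approaches being recommended, using made-up sample data (cut for character columns, awk for whitespace-separated fields):

```shell
# By character position: columns 3 through 5 of the alphabet.
echo abcdefghijklmnopqrstuvwxyz | cut -c3-5        # prints: cde

# By whitespace-separated field: fields 1 and 3.
echo "one two three" | awk '{print $1, $3}'        # prints: one three
```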