Configuration Information [Automatically generated, do not change]:
Machine: i486
OS: linux-gnu
Compiler: gcc
Compilation CFLAGS: -DPROGRAM='bash' -DCONF_HOSTTYPE='i486' -DCONF_OSTYPE='linux-gnu' -DCONF_MACHTYPE='i486-pc-linux-gnu' -DCONF_VENDOR='pc' -DLOCALEDIR='/usr/share/locale' -DPACKAGE='bash' -DSHELL -DHAVE_CONFIG_H -I. -I../bash -I../bash/include -I../bash/lib -g -O2 -Wall
uname output: Linux vampiric 2.6.32-3-686-bigmem #1 SMP Thu Feb 25 06:54:30 UTC 2010 i686 GNU/Linux
Machine Type: i486-pc-linux-gnu
Bash Version: 4.1
Patch Level: 5
Release Status: release

Description:
	When used in a script that iterates over several thousand lines of
	logs or similar data, bash's string-replacement parameter expansions
	appear to leak memory. The Repeat-By steps use "ls -lR" to generate
	input, but any sufficiently large input will do (try your system
	logs).

Repeat-By:
	Start a shell, and start "top" or some other resource monitoring
	tool. Then try one of the following and watch bash's memory usage
	grow:

	while read line; do test=${line%%/*}; done < <(ls -lR)
	while read line; do test=${line//a/b}; done < <(ls -lR)
	while read line; do test=${line#\ }; done < <(ls -lR)

	Also, geirha in #bash on irc.freenode.net suggested this, which
	prints the shell's size and vsize after every 1000 expansions:

	var=x; for j in {1..50}; do ps -osize=,vsize= -p $$; for i in {1..1000}; do var=${var%%*/}; done; done;

Fix:
	Running the expansion in a subshell works around the leak, but it is
	slow: it takes about 50% more time for 50000 lines in my test.

	while read line; do test=$(echo ${line//a/b}); done < <(ls -lR)
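geirha's one-liner can be unrolled into a small measurement script. This is only a sketch: the `rss_kb` helper name is made up for illustration, and the growth is expected only on affected builds such as the bash 4.1.5 reported here.

```shell
#!/usr/bin/env bash
# Sketch of a measurement harness based on geirha's one-liner.
# rss_kb is a hypothetical helper, not part of the original report.
rss_kb() { ps -o rss= -p $$; }

var=x
before=$(rss_kb)
for j in {1..50}; do
    for i in {1..1000}; do
        var=${var%%*/}      # the expansion under test; "x" has no "/", so var never changes
    done
done
after=$(rss_kb)
echo "RSS before: ${before} kB, after: ${after} kB"
```

On an affected build the "after" figure keeps climbing with the iteration count even though `var` itself stays one byte long; on a fixed bash it stays flat.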