It could be "logical leaks", or whatever the correct English term for them
is: memory that is allocated and kept track of, but never used again, and
not freed until the program shuts down. Memory usage increases steadily.
I have a process using 3 gigs now, and it just runs one of those test
cases (on a lot more data).
On 13/06/10 22:21, Chet Ramey wrote:
On 6/13/10 11:17 AM, oyvi...@dhampir.no wrote:
Configuration Information [Automatically generated, do not change]:
Machine: i486
OS: linux-gnu
Compiler: gcc
Compilation CFLAGS: -DPROGRAM='bash' -DCONF_HOSTTYPE='i486'
-DCONF_OSTYPE='linux-gnu' -DCONF_MACHTYPE='i486-pc-linux-gnu'
-DCONF_VENDOR='pc' -DLOCALEDIR='/usr/share/locale' -DPACKAGE='bash' -DSHELL
-DHAVE_CONFIG_H -I. -I../bash -I../bash/include -I../bash/lib -g -O2 -Wall
uname output: Linux vampiric 2.6.32-3-686-bigmem #1 SMP Thu Feb 25 06:54:30 UTC
2010 i686 GNU/Linux
Machine Type: i486-pc-linux-gnu
Bash Version: 4.1
Patch Level: 5
Release Status: release
Description:
When used in a script that iterates over several thousand lines of logs or
similar data, bash's string replacement parameter expansions seem to leak
memory. The Repeat-By section below uses "ls -lR" to generate input, but any
data will do (try your system logs).
Repeat-By:
Start a shell, and start "top" or some other resource monitoring tool.
Try one of the following:
while read line; do test=${line%%/*}; done < <(ls -lR)
while read line; do test=${line//a/b}; done < <(ls -lR)
while read line; do test=${line#\ }; done < <(ls -lR)
Also, geirha in #bash on irc.freenode.net suggested:
var=x; for j in {1..50}; do ps -o size=,vsize= -p $$; for i in {1..1000}; do var=${var%%*/}; done; done
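To put numbers on the growth without watching top by hand, here is a minimal sketch (assuming Linux, where per-process memory is exposed under /proc, with a ps-based fallback elsewhere) that samples the shell's resident set size before and after one of the loops above; the helper name rss_kb is mine, not part of the report:

```shell
#!/usr/bin/env bash
# Sketch: sample this shell's resident set size (in kB) before and after
# one test loop. A value that keeps climbing across repeated runs matches
# the behaviour described in the report.
rss_kb() {
    if [[ -r /proc/$$/status ]]; then
        # Linux: VmRSS line of /proc/<pid>/status, second field is kB
        awk '/^VmRSS/ {print $2}' "/proc/$$/status"
    else
        # Fallback for systems without /proc (e.g. BSD/macOS ps)
        ps -o rss= -p $$ | tr -d ' '
    fi
}

before=$(rss_kb)
while read -r line; do test=${line//a/b}; done < <(ls -lR /etc 2>/dev/null)
after=$(rss_kb)
echo "VmRSS: ${before} kB -> ${after} kB"
```

The input here is bounded to /etc just so the run finishes quickly; any large directory tree reproduces the growth.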
valgrind detects no memory leaks for any of these cases.
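That is consistent with the "logical leak" theory: memory that is still referenced at exit is reported by valgrind as "still reachable", not as lost, and the default leak summary does not flag it. A sketch of re-running one of the cases with still-reachable blocks reported as well (flag names from valgrind's manual; guarded in case valgrind is not installed):

```shell
# Sketch: re-run one test case under valgrind, asking it to report blocks
# that are still reachable at exit. Tracked-but-never-freed memory shows
# up under "still reachable" rather than "definitely lost".
case_cmd='while read -r line; do test=${line//a/b}; done < <(ls -lR /etc 2>/dev/null)'
if command -v valgrind >/dev/null 2>&1; then
    valgrind --leak-check=full --show-reachable=yes bash -c "$case_cmd"
else
    bash -c "$case_cmd"   # no valgrind available; just run the loop
fi
rc=$?
```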