Hey there,

I just encountered some dubious behaviour. Feeding a large data set into a "for i in ..." loop that does nothing causes enormous memory usage (far larger than the data itself), and that memory is never freed, although it does seem to be re-used over multiple runs. For instance, iterating over 68M of small numbers (1..256) separated by newlines makes bash grow over 1.6G, even when all it does inside the loop is call true. The only way I can free up the memory is to leave the shell.

You can test this easily with the following command (you might want to limit your memory with ulimit first to avoid thrashing your system):

$ for i in `seq 1 10000000`; do true; done

On my test system this requires 1G of memory, and memory climbs a bit higher on additional runs but settles at 1.1G (it doesn't seem to leak any memory past this point).

This is running:

GNU bash, version 4.1.7(2)-release (i486-slackware-linux-gnu) on Slackware 13.1

I also verified this behaviour on:

GNU bash, version 4.1.5(1)-release (x86_64-pc-linux-gnu) (Ubuntu Lucid)

Is this normal or expected?

Thanks

--
Thomas
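
For comparison (and this is just my guess at the mechanism), reading the numbers line by line instead of expanding them into one giant word list keeps memory flat here:

$ seq 1 10000000 | while read i; do true; done   # streams the input; bash never builds the full word list

So I suspect the "for" version is paying for holding all ten million words in memory at once, and that bash simply never returns that allocation to the OS afterwards.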
I just encountered some dubious behaviour... Feeding a large data set in a "for i in..." loop doing nothing causes exponentially large memory usage, and that memory is never freed (although is seems to be re-used over multiple runs...) For instance I have 68M of small numbers (1..256) separated by newlines and that makes bash grow over 1.6G, even when ann it does inside the loop is calling true. The only way I can free up the memory is to leave the shell. You can test easily with this command (you might want to limit your memory with ulimit first to avoid trashing your system...): $ for i in `seq 1 10000000`; do true; done On my test system this requires 1G of memory, and memory climbs a bit higher on additional runs but settles at 1.1G (it doesn't seem to leak any memory part this point. This is running: GNU bash, version 4.1.7(2)-release (i486-slackware-linux-gnu) On Slackware-13.1 I also verified this behaviour on: GNU bash, version 4.1.5(1)-release (x86_64-pc-linux-gnu) (Ubuntu Lucid) Is this normal or expected? Thanks - -- Thomas -----BEGIN PGP SIGNATURE----- Version: GnuPG v1.4.10 (GNU/Linux) Comment: Using GnuPG with Mozilla - http://enigmail.mozdev.org/ iEYEARECAAYFAkygDXQACgkQ6dZ+Kt5BchbETwCgxflyi4SH3S2j9e+tpPz6gUay dU0An10OakENErpDNNFZNQgoQbZwsJ+B =ZlLd -----END PGP SIGNATURE-----