On 22 Jun 2003, Fabio Alemagna wrote:
> That happens when using a construct like this:
>
> define function
> target: very long dependency list
> whatever
> endef
>
> and then passing that to $(eval) through $(call). By reading the mailing
> list archives I got to know that this is a pretty old bug, which had
> supposedly been fixed in a release which, at the time, was about to be
> made. However, it appears that there have not been any more releases since
> then... Am I to assume that this bug will never be fixed (I read about the
> patch, but I also read it doesn't work as it should)?
>
> Fabio Alemagna
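(Concretely, the construct being described looks something like the sketch
below. The template name, target, dependency names and recipe are
placeholders, since the actual makefile isn't shown; the failure is reported
to appear only once the text expanded through $(call) and handed to $(eval)
becomes very long.)

# Placeholder names throughout; imagine hundreds of entries in this list.
very_long_dependency_list := dep_a dep_b dep_c dep_d

define function
$(1): $(very_long_dependency_list)
	@echo building $(1)
endef

# Dummy empty rules for the placeholder prerequisites so the sketch runs.
$(very_long_dependency_list): ;

# Expanding the template with $(call) and feeding the result to $(eval)
# is the step that reportedly exhausts virtual memory when the list is long.
$(eval $(call function,target))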
Hello Fabio,

I get a message like this on Cray systems when running my make process in
an NQS (qsub) script with a prescribed amount of memory. Changing the
amount of memory allotted changes the point at which the error occurs.

My version of the "make: *** virtual memory exhausted.  Stop." bug comes
from trying to include thousands of files. It doesn't matter how large the
files are; what matters is the length of the files' pathnames. If I allow
128 mega words (1 GB) for the qsub job, the error occurs at about 5000
included files with pathnames of about 80 characters each. If I allow
16 mega words (128 MB), the error occurs at about 1500 included files,
again with pathnames of about 80 characters each.

In earlier correspondence with Paul Smith, he indicated that he thought
there might be a memory leak in read.c. I have verified that the bug has
not been fixed in current CVS. It sure would be nice if somebody with
access to Purify could run it on GNU make ...

Ted

--
Ted Stern                             Applications Group
Cray Inc.                             office: 206-701-2182
411 First Avenue South, Suite 600     cell:   206-383-1049
Seattle, WA 98104-2860                FAX:    206-701-2500

 Frango ut patefaciam -- I break that I may reveal
 (The Paleontological Society motto, equally apropos for debugging)

_______________________________________________
Bug-make mailing list
[EMAIL PROTECTED]
http://mail.gnu.org/mailman/listinfo/bug-make
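(For anyone without a Cray who wants to poke at this, a makefile roughly
like the one below mimics the setup Ted describes. The directory prefix,
the count of 5000, and the file-generation scheme are invented to match
the figures he quotes, not taken from his build. With a make of that
vintage and a small enough virtual-memory limit, e.g. set with "ulimit -v"
in the shell before running it, reading the includes is where the failure
would be expected to show up.)

# Reproduction sketch only: the directory prefix, the count, and the
# file-generation scheme below are made up, not taken from the real build.
N      := 5000
PREFIX := build/deps/some_fairly_long_component_name/generated_dependency_
FILES  := $(foreach i,$(shell seq 1 $(N)),$(PREFIX)$(i).mk)

# Create empty include files at parse time so every include succeeds;
# only their (long-ish) pathnames matter, not their contents.
$(shell mkdir -p $(dir $(PREFIX)); for i in `seq 1 $(N)`; do touch $(PREFIX)$$i.mk; done)

include $(FILES)

all:
	@echo included $(words $(FILES)) files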