Thanks for responding.
I don't think it's that simple. That's a soft limit; the hard limit
is "unlimited."
The results of gc() in the original post indicated that R could
utilize more than 32 MB of RAM.
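Something like the following (assuming gc() accepts the reset
argument, as in current R) makes that peak visible:

> gc(reset = TRUE)   # zero the "max used (Mb)" columns
> xx <- matrix(rep(1e+10, 1e7), nrow = 1e4, ncol = 1e3)
> gc()               # "max used (Mb)" for Vcells now shows a peak well above 32 MB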
My sysadmin had already increased my memory limits prior to my
posting.
Just to confirm, here are the results with ulimit -m set to unlimited
prior to calling R.
> xx <- matrix(rep(1e+10,1e7),nrow=1e4,ncol=1e3)
> object.size(xx)/1024^2
[1] 76.29405
> system("ulimit -m")
unlimited
> tmp.df <- as.data.frame(cbind(xx,xx,xx))
Error: cannot allocate vector of size 228.9 Mb
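That 228.9 Mb is the size of the cbind() result itself: three copies
of xx, i.e. 3 * 1e7 doubles at 8 bytes each:

> 3 * 1e7 * 8 / 1024^2
[1] 228.8818

So the allocation fails even though ulimit now reports unlimited.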
----- Original Message -----
From: "Hin-Tak Leung" <[EMAIL PROTECTED]>
To: "Jason Barnhart" <[EMAIL PROTECTED]>
Cc: <[EMAIL PROTECTED]>
Sent: Monday, May 21, 2007 11:02 AM
Subject: Re: [Rd] AIX testers needed
> Jason Barnhart wrote:
>> Thank you for responding.
>>
>> I should have added -a to my ulimit command. Here are its results,
>> which I believe are not the limiting factor.
>>
>> %/ > ulimit -a
>> core file size (blocks, -c) 1048575
>> data seg size (kbytes, -d) unlimited
>> file size (blocks, -f) unlimited
>> max memory size (kbytes, -m) 32768
>> open files (-n) 2000
>> pipe size (512 bytes, -p) 64
>> stack size (kbytes, -s) hard
>> cpu time (seconds, -t) unlimited
>> max user processes (-u) 128
>> virtual memory (kbytes, -v) unlimited
>
> you think max memory = 32768k (or 32MB) is not limiting?
> Please think again...
>
> HTL
>
______________________________________________
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel