Well... what makes you think you are hitting memory constraints,
then? If you have significantly less than 3 GB of data, it shouldn't
surprise you that R never needs more than 3 GB of memory.
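
You can check what R is actually holding with base tools (a quick
sketch -- my_big_object is a made-up stand-in for one of your
datasets):

gc()                                             # heap usage so far, incl. the max reached
object.size(my_big_object)                       # bytes used by a specific object
print(object.size(my_big_object), units = "Mb")  # same, in megabytes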

You could just be running your scripts inefficiently... it's an
extreme example, but all the memory and gigaflopping in the world
can't speed this up (by much):

for (i in seq_len(1e6)) Sys.sleep(10)  # 10 million seconds of sleeping; CPU and RAM sit idle

Perhaps you should look into profiling tools or parallel
computation... if you can post a representative example of your
scripts, we might be able to give performance pointers.
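
For instance, Rprof() (in the base distribution) will show where the
time actually goes, and if the slow part is independent per-item
work, the parallel package (shipped with R since 2.14) can spread it
over your eight cores. A rough sketch -- my_analysis.R and
slow_step() are made-up stand-ins for your code:

Rprof("profile.out")          # start the sampling profiler
source("my_analysis.R")       # your script here (hypothetical name)
Rprof(NULL)                   # stop profiling
summaryRprof("profile.out")   # which functions dominate the run time?

library(parallel)
slow_step <- function(x) { Sys.sleep(0.1); x^2 }  # stand-in for real per-item work
res <- mclapply(1:100, slow_step, mc.cores = 8)   # forked workers; works fine on OS X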

Michael

On Fri, Mar 23, 2012 at 1:33 AM, Kurinji Pandiyan
<kurinji.pandi...@gmail.com> wrote:
> Yes, I am.
>
> Thank you,
> Kurinji
>
> On Mar 22, 2012, at 10:27 PM, "R. Michael Weylandt" 
> <michael.weyla...@gmail.com> wrote:
>
>> Use 64bit R?
>>
>> Michael
>>
>> On Thu, Mar 22, 2012 at 5:22 PM, Kurinji Pandiyan
>> <kurinji.pandi...@gmail.com> wrote:
>>> Hello,
>>>
>>> I have a 32 GB RAM Mac Pro with a 2*2.4 GHz quad core processor and 2TB
>>> storage. Despite having so much memory, I am not able to get R to
>>> utilize much more than 3 GB. Some of my scripts take hours to run, but I
>>> would think they would be much faster if more memory were utilized. How do I
>>> optimize the memory usage on R by my Mac Pro?
>>>
>>> Thank you!
>>> Kurinji
>>>
