> From: Kurinji Pandiyan [kurinji.pandi...@gmail.com]
> Sent: 27 March 2012 18:14
> To: R. Michael Weylandt
> Cc: r-help@r-project.org
> Subject: Re: [R] Memory Utilization on R
>
> Thank you for the modified script! I have now tried it on different datasets
> and it works very well and is dramatically faster than my original script!
Note that you can actually drop the line defining the big list "x". I
thought it would be needed, but it turns out to be unnecessary after
cleaning up the second half: cutting off that allocation might save
you even more time.
Best,
Michael
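(The loop itself never appears in the archive, so the following is only a
minimal sketch of what dropping the list could look like, assuming the loop
was just collecting the rows of x.data whose first column matched poi; the
names keep and result are hypothetical.)

keep   <- x.data[[1]] %in% poi   # rows whose ID appears in poi
result <- x.data[keep, ]         # vectorised subset, no list and no loop needed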
Thank you,
-Alex
From: r-help-boun...@r-project.org [r-help-boun...@r-project.org] on behalf of
Kurinji Pandiyan [kurinji.pandi...@gmail.com]
Sent: 27 March 2012 18:14
To: R. Michael Weylandt
Cc: r-help@r-project.org
Subject: Re: [R] Memory Utilization on R
Thank you for the modified script! I have now tried it on different datasets
and it works very well and is dramatically faster than my original script!
I really appreciate the help.
Kurinji
On Fri, Mar 23, 2012 at 1:33 PM, R. Michael Weylandt <
michael.weyla...@gmail.com> wrote:
Taking a look at your script, there are some potential optimizations
you can make:
# Fine
poi <- as.character(top.GSM396290)   # 5000 characters
x.data <- h1[, c(1, 7:9)]            # 485577 obs. of 4 variables

# Pre-allocate the space
x <- vector("list", 485577)          # instead of x <- list()

# Do the "a" stuff once outside the loop
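(The body of the original loop is not shown in the thread, so the following is
only a sketch of the pattern being suggested; the membership test stored in
"a" and the per-row work are assumptions, while poi and x.data are the objects
defined above.)

a <- x.data[[1]] %in% poi            # the invariant "a" stuff, computed once
x <- vector("list", nrow(x.data))    # pre-allocated instead of grown
for (i in seq_len(nrow(x.data))) {
  if (a[i]) x[[i]] <- x.data[i, ]    # hypothetical per-row work
}
# Growing x with x <- list(); x[[length(x) + 1]] <- ... copies the list
# repeatedly, which is what makes the unoptimised version slow.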
Yes, I am.
Thank you,
Kurinji
On Mar 22, 2012, at 10:27 PM, "R. Michael Weylandt" wrote:
> Use 64bit R?
>
> Michael
>
> On Thu, Mar 22, 2012 at 5:22 PM, Kurinji Pandiyan
> wrote:
>> Hello,
>>
>> I have a 32 GB RAM Mac Pro with a 2*2.4 GHz quad core processor and 2TB
>> storage. Despite having so much memory, I am not able to get R to utilize
>> much more than 3 GB.
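(Whether a session really is a 64-bit build can be confirmed from within R;
a quick base-R check, not taken from the thread:)

R.version$arch            # e.g. "x86_64" on a 64-bit build
.Machine$sizeof.pointer   # 8 on 64-bit R, 4 on 32-bit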
Well... what makes you think you are hitting memory constraints then?
If you have significantly less than 3GB of data, it shouldn't surprise
you if R never needs more than 3GB of memory.
You could just be running your scripts inefficiently... it's an extreme
example, but all the memory and gigaflops in the world won't make a badly
written loop finish quickly.
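(One way to see how much memory a script actually touches, using only base R;
x.data is just the object from the thread, used here as an example:)

gc(reset = TRUE)                  # reset the "max used" counters
# ... run the slow part of the script ...
gc()                              # "max used" column = peak memory since the reset
print(object.size(x.data), units = "MB")   # footprint of a single object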
Hello,
I have a 32 GB RAM Mac Pro with a 2*2.4 GHz quad core processor and 2TB
storage. Despite having so much memory, I am not able to get R to utilize
much more than 3 GB. Some of my scripts take hours to run, but I would think
they would be much faster if more memory were utilized. How do I get R to
make use of more of the available memory?
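(Before assuming memory is the limiting factor, profiling shows where the time
actually goes; a generic base-R sketch, with a stand-in workload since the
original script is not shown:)

Rprof("profile.out")
res <- replicate(50, sort(runif(1e5)))   # stand-in for the real slow section
Rprof(NULL)
summaryRprof("profile.out")$by.self      # functions ranked by time spent in them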