Suggestions...
Post in plain text (sending HTML mail reduces your own chances of getting
feedback)
Provide sample data and code
Buy more RAM
Use the data.table package and fread()
Load and analyze subsets of the data
Put the data into a database (e.g. SQLite?)
If these sugge
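A minimal sketch of two of the suggestions above (fread and reading subsets); the file name and column names are placeholders, not from the original thread:

```r
library(data.table)

# data.table::fread is typically much faster and more memory-efficient
# than read.table for large flat files
dt <- fread("bigfile.csv")

# Or load only part of the data: pick a few columns and limit the rows
dt_part <- fread("bigfile.csv", select = c("id", "value"), nrows = 100000)
```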
Well, I'm no expert on these topics, but if it's 2.7 GB and R can use at most
2 GB, then the easiest solution would be giving R more memory. Did you read
through help(memory.size) as the error suggested?
Try calling memory.size(T) or memory.limit(3000) and see if it works.
I don't have any ex
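For reference, the calls mentioned above look like this; note these are Windows-only functions (they are no-ops or errors elsewhere), and the 3000 is just an example value:

```r
memory.size()        # memory currently in use by R (MB)
memory.size(TRUE)    # maximum memory obtained from the OS so far (MB)
memory.limit()       # current memory limit (MB)
memory.limit(3000)   # try to raise the limit to ~3 GB (needs 64-bit Windows
                     # or the /3GB boot switch on 32-bit)
```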
Hello Vignesh, we did not get any attachments, maybe you could upload them
somewhere?
On 19.10.2012, at 09:46, Vignesh Prajapati wrote:
> As I found the memory problem with local machine/micro instance(amazon) for
> building SVM model in R on large dataset(2,01,478 rows with 11 variables),
> the
On 02/03/2012 23:36, steven mosher wrote:
1. How much RAM do you have? (Looks like 2 GB.) If you have more than 2 GB,
then you can allocate
more memory with memory.size()
Actually, this looks like 32-bit Windows (unstated), so you cannot. See
the rw-FAQ for things your sysadmin can do even
1. How much RAM do you have? (Looks like 2 GB.) If you have more than 2 GB,
then you can allocate
more memory with memory.size()
2. If you have 2 GB or less, then you have a couple of options:
a) make sure your session is clean of unnecessary objects.
b) Don't read in all the data if you don't
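One way to follow (a) and (b) above; the object name, file name, and column classes are illustrative, not from the original post:

```r
rm(big_intermediate)   # (a) remove unnecessary objects from the session
gc()                   # run the garbage collector to reclaim the memory

# (b) read only what you need; colClasses avoids costly type guessing,
# and nrows caps how much comes in at once
x <- read.table("big.txt", header = TRUE, nrows = 50000,
                colClasses = c("integer", "numeric", "character"))
```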
Let's see...
You could delete objects from your R session.
You could buy more RAM.
You could see help(memory.size).
You could try googling to see how others have dealt with memory
management in R, a process which turns up useful information like
this: http://www.r-bloggers.com/memory-management-in
Thanks for the constructive comments. I was very careful when I wrote the code. I
wrote many functions and then wrapped them up into a single function.
Originally, I used optim() to get the MLE; it was at least 10 times slower than
the code based on Newton's method. I also vectorized all objects whenever
possib
Hi:
Are you running 32-bit or 64-bit R? For memory-intensive processes like
these, 64-bit R is almost a necessity. You might also look into more
efficient ways to invert the matrix, especially if it has special properties
that can be exploited (e.g., symmetry). More to the point, you want to
compu
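To illustrate the point about exploiting special properties: for a symmetric positive-definite matrix, chol2inv(chol(A)) is cheaper and more stable than a general solve(A), and often you don't need the explicit inverse at all. A minimal sketch with a toy matrix:

```r
set.seed(1)
A <- crossprod(matrix(rnorm(9), 3, 3)) + diag(3)  # symmetric positive-definite

# Invert via the Cholesky factor instead of a general LU solve
Ainv <- chol2inv(chol(A))
max(abs(Ainv %*% A - diag(3)))  # should be near 0

# Better still: to compute A^{-1} %*% b, solve the system directly
b <- rnorm(3)
x <- solve(A, b)
```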
Dear Alex,
Has manual garbage collection had any effect?
Sincerely,
KeithC.
-Original Message-
From: Alex van der Spek [mailto:do...@xs4all.nl]
Sent: Wednesday, May 05, 2010 3:48 AM
To: r-help@r-project.org
Subject: [R] Memory issue
Reading a flat text file 138 Mbyte large into R with
Thank you all,
No offense meant. I like R tremendously but I admit I am only a
beginner. I did not know about gc(), but it explains my confusion about
rm() not doing what I expected it to do.
I suspected that .RData was a compressed file. Thanks for the
confirmation. As for Windows, unfortun
On Wed, 5 May 2010, Alex van der Spek wrote:
Reading a flat text file 138 Mbyte large into R with a combination of scan
(to get the header) and read.table. After conversion of text time stamps to
POSIXct and conversion of integer codes to factors I convert everything into
one data frame and re
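The pipeline described above might look roughly like this; the file name, column names, and timestamp format are guesses for illustration, not taken from the original message:

```r
# Grab the header line with scan(), then read the body with read.table()
hdr <- scan("data.txt", what = character(), nlines = 1, quiet = TRUE)
raw <- read.table("data.txt", skip = 1, col.names = hdr)

# Convert text time stamps to POSIXct and integer codes to factors
raw$time <- as.POSIXct(raw$time, format = "%Y-%m-%d %H:%M:%S")
raw$code <- factor(raw$code)

df <- as.data.frame(raw)
```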
I had similar issues with memory occupancy. You should explicitly call
gc() to run the garbage collector (the free-memory routine) after you
rm() the big objects.
D.
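A tiny demonstration of the rm()/gc() pattern above: rm() only drops the binding, while gc() actually reclaims the memory (and, on Windows, may return it to the OS):

```r
big <- numeric(1e7)   # ~80 MB of doubles
rm(big)               # the binding is gone, but memory may still be held
g <- gc()             # collect; returns (and prints) a usage summary matrix
g
```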
__
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
Daniel Brewer wrote:
I have a script that sometimes produces the following error:
Error in assign(".target", met...@target, envir = envir) :
formal argument "envir" matched by multiple actual arguments
Do you think this is a memory issue? I don't know what else it could be
as it doesn't alwa