On Mon, 22 Dec 2008, iamsilvermember wrote:
> dim(data)
> [1] 22283 19
> dm=dist(data, method = "euclidean", diag = FALSE, upper = FALSE, p = 2)
> Error: cannot allocate vector of size 1.8 Gb
That would be an object of size 1.8Gb.
See ?"Memory-limits"
> Hi Guys, thank you in advance for helping. :-D
>
> Recently I ran into the "cannot allocate vector of size 1.8GB" error. I am
> pretty sure this is not a hardware limitation, because it happens whether I
> run the R code on a 2.0 GHz Core Duo Mac with 2 GB RAM or on a Linux server
> with two quad-core 2.0 GHz Intel Xeons and 8 GB RAM.
Why? Both will have a 3GB address-space limit unless the Xeon box is
64-bit. And this works on my 64-bit Linux boxes.
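For what it is worth, a quick way to check which kind of build is in use (a
sketch, not part of the original exchange): R reports its pointer size, which
is 8 bytes on a 64-bit build and 4 on a 32-bit one.

.Machine$sizeof.pointer   # 8 on a 64-bit build of R, 4 on a 32-bit one
R.version$arch            # e.g. "x86_64" on a 64-bit build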
> I also tried to clear the workspace before running the code, but it
> didn't seem to help...
>
> The weird thing, though, is that once in a while it will work, but then
> when I run clustering on the above result
>
> hc=hclust(dm, method = "complete", members=NULL)
>
> it gives me the same error...
See ?"Memory-limits" for the first part.
> I have searched around already, but the memory.limit and memory.size
> functions do not seem to help. What can I do to resolve this problem?
What are you going to do with an agglomerative hierarchical clustering of
22283 objects? It will not be interpretable.
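If, as the dimensions suggest, the 22283 rows are microarray probes and the
19 columns are samples (an assumption; the thread never says what `data`
holds), one common way around both the memory and the interpretability
problem is to cluster the 19 samples instead:

## cluster the 19 columns rather than the 22283 rows:
## dist() then needs only 19*18/2 = 171 entries
dm_small <- dist(t(data), method = "euclidean")
hc <- hclust(dm_small, method = "complete")
plot(hc)   # a dendrogram with 19 leaves is actually readable

The transposed distance object is trivially small, so no memory tuning is
needed at all.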
> Thank you so much for your help.
--
Brian D. Ripley,                  rip...@stats.ox.ac.uk
Professor of Applied Statistics,  http://www.stats.ox.ac.uk/~ripley/
University of Oxford,             Tel: +44 1865 272861 (self)
1 South Parks Road,                    +44 1865 272866 (PA)
Oxford OX1 3TG, UK                Fax: +44 1865 272595