(1) That's an old version of RStudio, although I doubt that it's
the source of your problem.
(2) What is your session info?
> sessionInfo()
or
> devtools::session_info()
I just allocated a numeric vector of size 2.5e9 on a 16GB linux box (R
3.4.3). It worked, but it pretty much exhausted the memory on the machine.
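A rough sketch of the arithmetic behind that (not the poster's code): a
numeric (double) vector takes about 8 bytes per element, so
2.5e9 * 8 / 2^30    # ~18.6 GiB for a length-2.5e9 double vector
which is more than the 16 GB of physical RAM, so the allocation can only
succeed by leaning on swap.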
[R] Cannot allocate vector of size x
How much memory do you have on your system? How large are the vectors you are
creating? How many other large vectors do you have in memory? Remove all
unused objects and call gc() to reclaim some of the memory. Remember that all
objects are kept in memory, and you have to understand how large they are and
how much memory they take up.
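For example, a quick way to see which objects are taking the space (a sketch;
the object name in rm() is hypothetical):
sort(sapply(ls(), function(nm) object.size(get(nm))), decreasing = TRUE)  # bytes per object
rm(some_big_object)   # hypothetical name: remove whatever you no longer need
gc()                  # then let R return the freed memory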
Check one of the examples in
?try
It has this heading:
## run a simulation, keep only the results that worked.
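A minimal sketch in that spirit (a toy simulation, not the help-page example
verbatim):
doit <- function() {                  # toy simulation that sometimes fails
  if (runif(1) < 0.5) stop("simulation failed") else mean(rnorm(30))
}
res  <- lapply(1:100, function(i) try(doit(), silent = TRUE))
ok   <- !vapply(res, inherits, logical(1), what = "try-error")
kept <- unlist(res[ok])               # keep only the runs that worked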
If your system is Windows, you can also try to increase the memory available
to a single application, to work around the problem.
Do a search for the "3GB switch".
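On the R side (Windows-only, and only in R versions of that era), you can
inspect or try to raise the per-process limit; a sketch:
memory.limit()              # current limit, in Mb
memory.limit(size = 3072)   # request ~3 GB; on 32-bit Windows this also needs the OS-level switch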
HTH
Dr. Ruben H. Roa-Ur
Thanks again Hugo,
2011/1/19 Hugo Mildenberger :
> Dear Mauricio,
>
> what I do not understand at all is the message:
>
>> Error: cannot allocate vector of size 476.2 Mb
>
> Have you tried to allocate a big matrix in between, with
> the R statement not being shown in the output?
If I understoo
Dear Mauricio,
what I do not understand at all is the message:
> Error: cannot allocate vector of size 476.2 Mb
Have you tried to allocate a big matrix in between, with
the R statement not being shown in the output? Probably
not. If not, your local R version is buggy for sure. The test shows
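A minimal sketch of that kind of in-between allocation test (the size is just
the one from the error message):
n <- ceiling(476.2 * 2^20 / 8)     # number of doubles in a 476.2 Mb vector
x <- try(numeric(n))               # does a single contiguous allocation of that size succeed?
if (!inherits(x, "try-error")) print(object.size(x), units = "Mb")
rm(x); gc()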
Dear Hugo,
I tried your memory test program (without further modifications) just
after the gc() command:
print(gc())
print(gcinfo(TRUE))
system("/mypath/memorytest.out")
and the result that I got was:
Number of simulations read from ' Particles.txt ' : 9000
-
> I got the following warning:
> memorytest.c: In function ‘main’:
> memorytest.c:5: warning: return type of ‘main’ is not ‘int’
> Is this important?
Hello Mauricio,
No, your gcc version is unduly puristic here. The traditional return
type of the main function in "C" should be "int", and it makes no difference
for this test program.
Thank you very much Hugo for your answer.
Yesterday I was out of my office and I couldn't test the advice you
gave me. Today I'll do it.
I have never used C, so I have to ask.
To create an executable file from your code, I copied and pasted
the text into a text file, and then tried to compile it from the command line.
Mauricio,
I tried your matrix allocation on Gentoo-hardened 32 and
64 bit systems. Both work ok, using R-2.11.1 and R-2.12.2 respectively,
and both use a recent 2.6.36 kernel revision.
This is from the 32 bit system with 512 MB physical memory:
>system("free")
total
Following the advice of a colleague, I put the gc() and gcinfo(TRUE)
commands just before the line where I got the problem, and their output
was:
           used (Mb) gc trigger  (Mb) max used  (Mb)
Ncells   471485 12.6    1704095  45.6  7920371 211.5
Vcells  6408885 48.9  113919753 869.2 34765159
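As a sanity check on those numbers (assuming a 32-bit build, where each Ncell
takes 28 bytes and each Vcell 8 bytes):
471485 * 28 / 2^20    # ~12.6 Mb for the Ncells in use
6408885 * 8 / 2^20    # ~48.9 Mb for the Vcells in use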
Thanks for your answer Martin, but unfortunately the decision to install
a 32-bit OS on the 64-bit machine was made by the IT staff at my workplace,
not by me.
By the way, due to strong restrictions on software installation at my
workplace, this problem didn't happen in Ubuntu, but in
> "MZ" == Mauricio Zambrano
> on Mon, 17 Jan 2011 11:46:44 +0100 writes:
MZ> Dear R community,
MZ> I'm running 32-bit R on a 64-bit machine (with 16 GB of RAM) using a
MZ> PAE kernel, as you can see here:
MZ> $ uname -a
MZ> Linux mymachine 2.6.18-238.el5PAE #1 SMP
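As an aside: PAE only lets the kernel address more than 4 GB in total; a
single 32-bit process is still limited to roughly 3 GB of address space, and
each vector needs one contiguous block. A quick sketch for checking which
build is running and how big the failing request is:
.Machine$sizeof.pointer    # 4 = 32-bit R, 8 = 64-bit R
476.2 * 2^20 / 8           # ~62 million doubles in the vector R failed to allocate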
I would suggest two things here:
check the size of the other objects you may have stored in memory, and get rid
of what you don't need:
?ls
?rm
Also, consider running garbage collection to help free up memory in R:
gc()
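A small sketch of that with a made-up object, showing that rm() drops the
object and gc() then reclaims the memory:
big <- matrix(0, 5000, 5000)            # hypothetical large object, ~190 Mb of doubles
print(object.size(big), units = "Mb")
rm(big)                                 # drops the binding...
gc()                                    # ...and gc() reclaims and reports the freed memory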
I hope this helps!
A
On Tue, Aug 31, 2010 at 1:56 AM, rusers.sh wrote:
>
On Mon, 15 Dec 2008, tsunhin wong wrote:
Dear R Users,
I was running some data analysis scripts and ran into this error:
Error: cannot allocate vector of size 27.6 Mb
Doing a "memory.size(max=TRUE)" will give me:
[1] 1506.812
The current situation is:
I'm working on a Windows Vista 32-bit laptop
Also, each of the 1500 data.frames acting as data sources floating in
the global environment ranges in size from 2000x36 to 9000x36.
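As a rough estimate (assuming mostly numeric columns at ~8 bytes per cell),
those data sources alone account for something like:
1500 * 2000 * 36 * 8 / 2^30   # ~0.8 GiB at the small end
1500 * 9000 * 36 * 8 / 2^30   # ~3.6 GiB at the large end
and the upper end is well beyond the ~2 GB a 32-bit Windows process gets by
default.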
Please help...! THANKS!!!
- John
On Mon, Dec 15, 2008 at 1:12 PM, tsunhin wong wrote:
> Dear R Users,
>
> I was running some data analysis scripts and ran into
See the FAQs and ?Memory.
Uwe Ligges
silvia narduzzi wrote:
Hi list,
I have a memory problem while running the step() function:
Error: cannot allocate vector of size 50.9 Mb
I've tried with:
memory.size(max = FALSE)
[1] 803.4714
#which should be the amount of memory currently in use
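For reference, a sketch of what the two variants of that Windows-only call
report (they are defunct in current R, so treat this as historical):
memory.size(max = FALSE)   # Mb currently in use by R (the 803.47 printed above)
memory.size(max = TRUE)    # Mb R has obtained from the OS so far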