On 23.01.2013 23:41, Cláudio Brisolara wrote:
Hello R-users
I am getting error messages when I require some packages or execute some
procedures, like these below:
> require(tseries)
Loading required package: tseries
Error in get(Info[i, 1], envir = env) :
cannot allocate memory block of size 2.7 Gb
> require (TSA)
Loading required
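When loading a package fails with a message like that, the session is usually already at or near its allocation ceiling before require() is ever called. A few quick checks, sketched for a Windows build of R (memory.size() and memory.limit() are Windows-only, and defunct in current R; the workspace scan at the end is only an illustration):

gc()                       # force a collection and report current usage
memory.size(max = TRUE)    # most memory obtained from Windows so far, in MB
memory.limit()             # current allocation cap, in MB
## largest objects currently in the workspace, biggest first
sort(sapply(ls(), function(x) object.size(get(x))), decreasing = TRUE)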
On 16.02.2011 22:38, poisontonic wrote:
> Uwe Ligges-3 wrote:
>> If the available space got too fragmented, there is no single 3.8 GB block
>> of memory available any more
> Is there anything I can do to prevent this?
If you did it after a fresh reboot: I don't see a way to prevent it.
Neverth
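If the failure really is address-space fragmentation, one workaround that sometimes helps is to allocate the single largest object first, in a completely fresh session, before anything else has carved up the address space. A rough sketch only; the dimension below is illustrative and not taken from the thread (a 22000 x 22000 double matrix is on the order of the 3.8 GB mentioned above):

## fresh R session, nothing else loaded yet
n <- 22000                            # illustrative size
big <- matrix(0, nrow = n, ncol = n)  # requests one contiguous block of roughly 3.9 GB
## ... fill big in place rather than growing objects piecewise ...
gc()                                  # confirm what is actually committed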
Uwe Ligges-3 wrote:
>
> If the available space got too fragmented, there is no single 3.8 GB block
> of memory available any more
>
Is there anything I can do to prevent this? I've restarted and rerun the
whole thing straight up, and still the error...?
Ben
On 15.02.2011 21:05, poisontonic wrote:
Hi, I'm using the latest version of 64-bit R for Windows: R x64 2.12.1
I'm using it because I currently need to do hierarchical clustering on a
very large object (too big for MATLAB, which I normally use).
When I try to cluster my distance matrix d (obtained using dist on my design
matrix):
hc <-
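For n rows, dist() alone stores n*(n-1)/2 doubles, and hclust() needs further working copies on top of that, so the size can be estimated before the call. A minimal sketch with made-up data; the real matrix in this thread is of course far larger, and the clustering method shown is just a default, not taken from the post:

## illustrative data standing in for the poster's design matrix
X <- matrix(rnorm(1000 * 50), nrow = 1000)
n <- nrow(X)
n * (n - 1) / 2 * 8 / 1024^2          # approximate size of the dist object, in MB
d  <- dist(X)                         # lower triangle of the distance matrix
hc <- hclust(d, method = "complete")  # needs additional working memory beyond d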
Thanks for the responses
@Patrik Burns
I'm going to try running on a 64-bit machine. Unfortunately R isn't
installed properly on it yet and our admin guy is away, so it'll have to
wait.
@Uwe Ligges
Unless the program suddenly starts generating masses and masses of data, I
don't think this is t
davew wrote:
Hi all,
I'm running an analysis with the random forest tool. It's being applied to a
data matrix of ~60,000 rows and between about 40 and 200 columns. I get the
same error with all of the data files (Cannot allocate vector of size
428.5MB).
I found dozens of threads regarding this problem, but
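If the allocation fails while the forest is being grown, one thing worth trying is to grow it as several smaller forests and merge them with combine(); that lowers the peak inside any single call, although whether it helps depends on where the 428.5 MB request actually happens. A sketch with made-up data standing in for the ~60,000-row matrix:

library(randomForest)

## made-up data; stands in for the real 60,000 x 40-200 matrix
x <- matrix(rnorm(5000 * 20), nrow = 5000)
y <- factor(sample(c("a", "b"), 5000, replace = TRUE))

## 500 trees grown as ten forests of 50, then merged
pieces <- lapply(1:10, function(i) randomForest(x, y, ntree = 50))
rf <- do.call(combine, pieces)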
DumpsterBear wrote:
I am getting "Error: cannot allocate vector of size 197 MB".
I know that similar problems were discussed a lot already, but I
didn't find any satisfactory answers so far!
Details:
*** I have XP (32-bit) with 4 GB RAM. At the time when the problem
appeared I had 1.5 GB of available physical memory.
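On 32-bit Windows the binding limit is the process address space, not the installed RAM: roughly 2 GB per process by default, about 3 GB if Windows is booted with the /3GB switch, and a 197 MB vector additionally needs that much contiguous free address space. The calls below are Windows-only (and defunct in current R); the object name in the rm() line is made up:

memory.limit()                    # current cap in MB; cannot exceed ~2 GB on 32-bit XP without /3GB
memory.limit(size = 3000)         # raise the cap; only effective with the /3GB boot switch
gc()                              # collect and report usage before retrying
rm(list = setdiff(ls(), "fit"))   # keep only what is still needed ("fit" is a made-up name)
gc()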