Thanks to you all for your replies. I didn't realize bigmemory is only
available in Unix environments - when I saw
> install.packages('bigmemory')
Installing package into C:/Users/BenC/Documents/R/win-library/3.0
(as lib is unspecified)
--- Please select a CRAN mirror for use in this session ---
On 29/04/2013 23:46, Benjamin Caldwell wrote:
Dear helpers,
Does anyone have information on the status of bigmemory and R3.0? Will it
just take time for the devs to re-code for the new environment? Or is there
an alternative for this new version?
What are you asking about? 'bigmemory' has bee
On 29 April 2013 at 15:46, Benjamin Caldwell wrote:
| Dear helpers,
|
| Does anyone have information on the status of bigmemory and R3.0? Will it
| just take time for the devs to re-code for the new environment? Or is there
| an alternative for this new version?
It just works, with R 3.0.0 and o
Dear helpers,
Does anyone have information on the status of bigmemory and R3.0? Will it
just take time for the devs to re-code for the new environment? Or is there
an alternative for this new version?
Thanks
Ben Caldwell
OK, did a test where I did both - wrote a ~6Mx58 double matrix as a .txt file
(write.big.matrix), but also left the backing file + descriptor file as-is
(rather than deleting it as I usually do). Opened a different R session.
Compared contents of first 100 rows of both, they seem identical.
Size-wi
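The experiment described above can be sketched roughly as follows (a minimal sketch, assuming the bigmemory package; the file names and small dimensions are made up, while the real test used a ~6Mx58 matrix):

```r
library(bigmemory)

# Create a small file-backed big.matrix (stand-in for the ~6Mx58 one).
x <- big.matrix(nrow = 100, ncol = 5, type = "double",
                backingfile = "test.bin", descriptorfile = "test.desc")
x[, ] <- rnorm(100 * 5)

# Also write the same contents out as a plain text file.
write.big.matrix(x, "test.txt")

# In a different R session: re-attach via the descriptor and compare.
y   <- attach.big.matrix("test.desc")
txt <- read.big.matrix("test.txt", type = "double")
all.equal(y[1:100, ], txt[1:100, ])
```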
Hi,
Does the backing file of a big.matrix store the contents of entire matrix?
Or does it store the portion of it that is not stored in RAM? In other
words, can the backing file be treated as a file containing the matrix's
full data?
I have been writing my big.matrix objects to disk (write.big.mat
Hi Allie,
When you are working with the ff package, the counterpart of a data.frame is
called an ffdf (ff data frame). It can handle the types you are talking
about - factor and integer - though character columns will be stored as
factors. This means that your columns do not all have to be of one type.
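A minimal ffdf sketch along these lines (assuming the ff package; the column names and data are made up):

```r
library(ff)

# Mixed column types in one ffdf; note the character data goes in
# as a factor, since ffdf stores character columns as factors.
d <- ffdf(id    = ff(1:10),
          group = ff(factor(rep(c("a", "b"), 5))),
          value = ff(rnorm(10)))
```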
I believe ff has a data frame class. As for your object data I'm less
clear - how big is it?
On Oct 18, 2012 12:45 PM, "Alexander Shenkin" wrote:
> Hi Folks,
>
> I've been bumping my head against the 4GB limit for 32-bit R. I can't
> go to 64-bit R due to package compatibility issues (ROBDC - possib
System Info:
R 2.14.2
Windows 7 Pro x64 SP1
8GB RAM
On 10/18/2012 3:42 PM, Alexander Shenkin wrote:
> Hi Folks,
>
> I've been bumping my head against the 4GB limit for 32-bit R. I can't
> go to 64-bit R due to package compatibility issues (ROBDC - possible but
> painful, xlsReadWrite - not possi
Hi Folks,
I've been bumping my head against the 4GB limit for 32-bit R. I can't
go to 64-bit R due to package compatibility issues (ROBDC - possible but
painful, xlsReadWrite - not possible, and others). I have a number of
big dataframes whose columns contain all sorts of data types - factor,
character,
Hi Jay,
Thanks for the reply:)
Could you show me a link of the c++ examples? Thank you very much.
From: Jay Emerson
Date: 2012-05-11 20:12
To: xinxi813
CC: r-help
Subject: Re: bigmemory
R internally uses 32-bit integers for indexing (though this may change). For
this and other reaso
R internally uses 32-bit integers for indexing (though this may change).
For this and other reasons these external objects with specialized purposes
(larger-than-RAM, shared memory) simply can't behave exactly as R objects.
Best case, some R functions will work. Others would simply break. Others
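The 32-bit indexing limit mentioned here is visible directly in base R:

```r
# R's classic index limit is a signed 32-bit integer:
.Machine$integer.max               # 2147483647
.Machine$integer.max == 2^31 - 1   # TRUE
```

Anything whose indices must exceed 2^31 - 1 cannot be addressed the way an ordinary R vector or matrix is, which is one reason external objects like big.matrix cannot behave exactly like native R objects.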
Hi Jay,
I have a question about your reply.
You mentioned that "the more serious problem is that you can't expect to run
just any R function on a big.matrix (or on an ff object, if you check out ff
for some nice features). "
I am confused why the packages could not communicate with each ot
To answer your first question about read.big.matrix(), we don't know what
your acc3.dat file is, but it doesn't appear to have been detected as a
standard file (like a CSV file) or -- perhaps -- doesn't even exist (or
doesn't exist in your current directory)?
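When auto-detection fails, spelling out the file layout explicitly sometimes helps (a sketch only; the separator and header settings for acc3.dat are assumptions):

```r
library(bigmemory)

# Be explicit about the separator and header instead of relying on
# read.big.matrix's format detection.
x <- read.big.matrix("acc3.dat", sep = ",", header = FALSE,
                     type = "double",
                     backingfile = "acc3.bin",
                     descriptorfile = "acc3.desc")
```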
Next:
> In addition, I am planning to
Hi all,
In addition, I am planning to do a multiple imputation with MICE package
using the data read by bigmemory package.
So usually, the multiple imputation code is like this:
> imp=mice(data.frame,m=50,seed=1234,print=F)
the data.frame is required. How can I change the big.matrix class
g
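One possibility, if the data fit in RAM once loaded, is to materialize the big.matrix as an ordinary matrix and wrap it in a data frame before calling mice() (a hedged sketch, not from the original thread; x is assumed to be the big.matrix read earlier):

```r
library(bigmemory)
library(mice)

# x[, ] pulls the full contents of the big.matrix into an ordinary
# R matrix -- this only works if the data fit in memory.
df  <- as.data.frame(x[, ])
imp <- mice(df, m = 50, seed = 1234, print = FALSE)
```

This of course gives up the out-of-core benefit of bigmemory for the imputation step.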
Hi all,
I have a question about using bigmemory package.
Here is my code:
>
x=read.big.matrix("acc3.dat",backingfile="acc3.bin",descriptorfile="acc3.desc",type="double")
Error in filebacked.big.matrix(nrow = nrow, ncol = ncol, type = type, :
A big.matrix must have at least one row and one
Hi, all,
I have a really big matrix that I want to run k-means on.
I tried:
>data <-
read.big.matrix('mydata.csv',type='double',backingfile='mydata.bin',descriptorfile='mydata.desc')
I'm using doMC to register multicore.
>library(doMC)
>registerDoMC(cores=8)
>ans<-bigkmeans(data,k)
In system moni
Thanks again for your help.
I've been able to add several packages, bigmemory seems to be the only one
to fail and it
fails on isinf.
Is there a way I can download the code and change it to include an isinf
function or definition?
I'm using the GNU compiler; should I have been using the SUN St
At one point we might have gotten something working (older version?) on
Solaris x86, but were never successful on Solaris sparc that I remember --
it isn't a platform we can test and support. We believe there are problems
with BOOST library compatibilities.
We'll try (again) to clear up the other
By far the easiest way to achieve this would be to use the bigmemory
C++ structures in your program itself. However, if you do something
on your own (but fundamentally have a column-major matrix in shared
memory), it should be possible to play around with the pointer with
R/bigmemory to
Hi,
Is it possible for me to read data from shared memory created by a vc++ program
into R using bigmemory?
__
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read th
elf:
>
> x <- big.matrix(nrow=100, ncol=10, ... other options .)
>
> Make sure it works, then increase the size until you get a failure. This
> sort of exercise is extremely helpful in situations like this.
>
> Jay
>
>
> Subject: [R] Bigmemory: Error Run
ize until you get a failure. This
sort of exercise is extremely helpful in situations like this.
Jay
Subject: [R] Bigmemory: Error Running Example
Message-ID:
>
Content-Type: text/plain
Hi,
I am trying to run the bigmemory example provided on the
http://www.bigmemory.org/
The exa
Hi,
I am trying to run the bigmemory example provided on the
http://www.bigmemory.org/
The example runs on the "airline data" and generates summary of the csv
files:-
library(bigmemory)
library(biganalytics)
x <- read.big.matrix("2005.csv", type="integer", header=TRUE,
backingfile="airline.bin",
I am experiencing a problem trying to use a method from the bigmemory.
Specifically, I am using attach.big.matrix but the problem manifests when
trying to call attach.resource. My environment is as follows:
> sessionInfo()
R version 2.10.1 (2009-12-14)
x86_64-pc-linux-gnu
locale:
[1] LC_CTYPE=
Jay, thanks a bunch. New package seems to work just fine and great
improvement in docs by the way:). I tried the same example, new version
deals with it smoothly. In terms of usefulness of my sample code -- sure i
am writing same stuff to disk many times with only one handle -- it was some
toy cod
ldn't be a problem with the new version.
The CRAN update should take place early next week, along with some revised
documentation.
Regards,
Jay
---
Message: 125
Date: Fri, 23 Apr 2010 13:51:32 -0800 (PST)
From: zerdna
To: r-help@r-project.org
Subject: [R] bigmemory package woe
I have pretty big data sizes, like matrices of .5 to 1.5GB so once i need to
juggle several of them i am in need of disk cache. I am trying to use
bigmemory package but getting problems that are hard to understand. I am
getting seg faults and machine just hanging. I work by the way on Red Hat
Linu
. skewness)
>>> at the moment you should just extract a single column (variable) at a
>>> time into R, study it, then get the
>>> next column, etc... . We will not be implementing all of R's
>>> functions directly with big.matrix objects.
>>> We will be cr
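The column-at-a-time workflow suggested above might look like this (a sketch; x is assumed to be an attached big.matrix):

```r
library(bigmemory)

# Extract one column (variable) at a time into ordinary R memory,
# study it, then move on to the next.
for (j in seq_len(ncol(x))) {
  v <- x[, j]                  # an ordinary numeric vector
  cat(j, mean(v), sd(v), "\n")
}
```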
ge.
Feel free to email us directly with bugs, questions, etc...
Cheers,
Jay
--
From: utkarshsinghal
Date: Tue, Jun 2, 2009 at 8:25 AM
Subject: [R] bigmemory - extracting submatrix from big.matrix object
To: r help
I am using the library(bi
etc...
Cheers,
Jay
--
From: utkarshsinghal
Date: Tue, Jun 2, 2009 at 8:25 AM
Subject: [R] bigmemory - extracting submatrix from big.matrix object
To: r help
I am using the library(bigmemory) to handle large datasets, say 1 GB,
and facing following problems. Any hints from an
I am using the library(bigmemory) to handle large datasets, say 1 GB,
and facing the following problems. Any hints from anybody would be helpful.
Problem 1:
I am using "read.big.matrix" function to create a filebacked big matrix
of my data and get the following warning:
> x =
read.big.matrix("
Hello,
I am running into memory boundaries and would like to try to make use of the
bigmemory (or any other memory enabling) library.
Can anyone help with suggestions as to how this might work?
> library(reshape)
> s <- melt( d[,1:62], id=c(1) )
Error: cannot allocate vector of size 16.0 Mb
>