I'm not an expert in its use, but I believe the bigmemory package offers
the functionality you are looking for (or at least similar functionality
that can be co-opted for your use-case). See the sub.big.matrix function.
Depending on what you mean by "huge" it may offer other benefits as well.
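A rough sketch of what I have in mind (untested, so treat it as an illustration rather than working code; the file names and dimensions are made up, and the chunk loop assumes you want to walk over rows):

```r
library(bigmemory)

# A file-backed big.matrix lives on disk, so R only pages in the pieces
# you actually touch rather than the whole object.
x <- filebacked.big.matrix(nrow = 1e6, ncol = 10,
                           backingfile    = "arrays.bin",
                           descriptorfile = "arrays.desc")

chunk <- 100
for (start in seq(1, nrow(x), by = chunk)) {
  end <- min(start + chunk - 1, nrow(x))
  # sub.big.matrix returns a view on rows start:end of the backing file,
  # without loading the rest of the matrix into memory.
  s <- sub.big.matrix(describe(x), firstRow = start, lastRow = end)
  # apply your per-cell function to s[, ] here, then move on; the view
  # goes away when s is garbage-collected.
}
```

The same idea should extend to several backing files, one per array, with one sub-matrix view per array inside the loop.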

HTH,
~G

On Fri, Aug 31, 2012 at 6:47 AM, Damien Georges
<damien.georg...@gmail.com> wrote:

> Hi all,
>
> I'm working with some huge arrays in R, and I need to load several of them
> to apply a function that requires the values from all of my arrays for each
> cell...
>
> To make this possible, I would like to load only a part (for example, 100
> cells) of each array, apply my function, free the loaded cells, load the
> following cells, and so on.
>
> Is it possible to unserialize (or load) only a defined part of an R array?
> Do you know of any tools that might help me?
>
> Finally, I did a lot of research into how arrays (and other R objects) are
> serialized into binary form, but I found nothing that really explains the
> algorithms involved. If anyone has information on this topic, I would be
> interested in it.
>
> I hope my request is understandable.
>
> All the best,
>
> Damien.G
>
> ______________________________________________
> R-devel@r-project.org mailing list
> https://stat.ethz.ch/mailman/listinfo/r-devel
>



-- 
Gabriel Becker
Graduate Student
Statistics Department
University of California, Davis


