Hi Amit,
Is the file gzipped or extracted?
If you have the plain-text file, try gzipping it and then calling read.table() on the
gzipped file. read.table() can handle gzipped files, at least on Linux and macOS;
I am not sure about Windows.
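For example, a minimal sketch (the file name and the read.table() arguments are just placeholders):

# compress the plain-text file once (e.g. with gzip on the command line), then:
dat <- read.table("mydata.txt.gz", header = TRUE)

Recent versions of R detect the compression automatically when a file name is passed to read.table(), so nothing else in the call changes.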
cheers
Peter
> On 2. May 2017, at 18:59, Amit Sengupta via R-help wrote:
Hi, I was unable to read a 2.4 GB file into an R object using read.table in a 64-bit R environment. Please let me have your suggestions.
Amit Sengupta
On 32-bit Windows I think (it's been a while) you can push it to 3 GB, but to go
beyond that you need to run R on 64-bit Windows (the same rule applies to all
software, not just R). I'm pretty sure this is already documented in the R
documentation.
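For what it is worth, on a Windows build of R you can check and (within what the OS allows) raise the limit with memory.limit(); a quick sketch, values in MB:

memory.limit()      # report the current limit
memory.limit(3000)  # try to raise it towards the 3 GB boundary on 32-bit Windows

On 64-bit R the practical ceiling is simply the RAM installed in the machine.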
Henrik
On Nov 22, 2016 19:49, "Ista Zahn" wrote:
Not conveniently.
Ah, you also need to use a 64-bit operating system. Depending on the age of
your hardware, this may also mean you need a new computer.
There are ways to process data on disk for certain algorithms, but you will be
glad to leave them behind once the opportunity arises, so you might as well do so.
Not conveniently. Memory is cheap; you should buy more.
Best,
Ista
On Nov 22, 2016 12:19 PM, "Partha Sinha" wrote:
> I am using R 3.3.2 on Windows 7, 32-bit, with 2 GB RAM. Is it possible to use
> a dataset of more than 2 GB?
>
> Regards
> Partha
>
Yes.
If you cannot read the dataset with the usual means, using functions like
read.table or read.csv, try the ff package:
https://cran.r-project.org/web/packages/ff/index.html
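A rough sketch of that route, with a made-up file name (read.csv.ffdf keeps the data on disk and only pulls chunks into RAM):

library(ff)
bigdat <- read.csv.ffdf(file = "big_file.csv", header = TRUE,
                        next.rows = 100000)  # read 100,000 rows per chunk
dim(bigdat)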
Best,
On Tue, Nov 22, 2016 at 2:16 PM, Partha Sinha wrote:
> I am using R 3.3.2 on Windows 7, 32-bit, with 2 GB RAM. Is it possible to use a dataset of more than 2 GB?
It depends on how you use it. For example, it can be stored on disk and worked
with in pieces, or some packages work with virtual memory, I believe.
However, it is certainly not possible to read it all into memory at once. In fact,
you probably won't be able to handle more (and maybe much less) than about
500 MB in R.
Cheers,
I am using R 3.3.2 on Windows 7, 32-bit, with 2 GB RAM. Is it possible to use a
dataset of more than 2 GB?
Regards
Partha
Dear Sir,
Yes, I am using plyr, and at the end I am writing the output to a data.frame.
Earlier I had a problem with processing time, so I made some changes in the
code; now I am fetching all the required inputs needed for the valuation using
ddply and storing the results in a data.frame.
Dear Sir,
Thanks for the guidance. Will check. And yes, at the end of each simulation a
large result is stored.
Regards
Amelia
On Wednesday, 6 April 2016 5:48 PM, jim holtman wrote:
It is hard to tell from the information that you have provided. Do you have a
list of the sizes of all the objects that you have in memory?
As Jim has indicated, memory usage problems can require very specific
diagnostics and code changes, so generic help is tough to give.
However, in most cases I have found the dplyr package to be more memory
efficient than plyr, so you could consider that. Also, you can be explicit
about only keeping the objects you actually need.
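For instance, a rough sketch of the plyr-to-dplyr translation (the data frame and column names are made up, not taken from your code):

library(dplyr)
# plyr version:  ddply(trades, .(currency), summarise, exposure = sum(value))
# dplyr equivalent, usually lighter on memory:
result <- trades %>%
  group_by(currency) %>%
  summarise(exposure = sum(value))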
You say it is "getting stored"; is this in memory or on disk? How are you
processing the results of the 1,000 simulations?
So some more insight into the actual process would be useful. For example,
how are the simulations being done, are the results stored in memory or
written out to a file, and what are you doing with the results?
It is hard to tell from the information that you have provided. Do you
have a list of the sizes of all the objects that you have in memory? Are
you releasing large objects at the end of each simulation run? Are you
using 'gc' to garbage collect any memory after deallocating objects?
Collect some of this information and post it back to the list.
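For example, something along these lines inside the simulation loop (the object name is illustrative):

# the largest objects currently in the workspace, in bytes
head(sort(sapply(ls(), function(x) object.size(get(x))), decreasing = TRUE))
rm(sim_result)  # drop a big intermediate you no longer need
gc()            # release the memory and print a usage summary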
Dear R Forum,
I have about 2000+ FX forward transactions and I am trying to run 1000
simulations. If I use a smaller number of simulations, I am able to get the
desired results. However, when I try to use more than 1000 simulations, I get
the following error:
> sorted2 <- ddply(sorted, .(currency_from_exch
I didn't write them out because I thought it would be too long. I am using the
HPbayes package. I changed the mp8.mle function. Two functions depend on
this one, loop.optim and prior.likewts, so I changed and renamed them as well.
The memory problem arises when applying the new loop.optim function, named
loop.optim_m.
> Subject: [R] Memory problem when changing a function
>
> I changed a function in a package and I want to run this new function.
> It always gives the error "Error in memory: couldn't allocate a
> vector of 15.3 Gb", although the built-in function doesn't give this error.
I changed a function in a package and I want to run this new function.
It always gives the error "Error in memory: couldn't allocate a
vector of 15.3 Gb", although the built-in function doesn't give this error.
My system is Windows 10, 8 GB RAM, AMD quad-core processor.
I've read about memory problems ...
Hello List,
I solved the problem by using the code from the answer with 31 votes at
http://stackoverflow.com/questions/1358003/tricks-to-manage-the-available-memory-in-an-r-session
On Sat, Jul 13, 2013 at 6:15 AM, Elaine Kuo wrote:
> Hello List,
>
> This is Elaine.
> I am running betadiver for a dataset of 4873 rows and 2749 columns.
Hello List,
This is Elaine.
I am running betadiver on a dataset of 4873 rows and 2749 columns
(4873 rows = 4873 grid cells of the study region and 2749 columns for the
bird species).
The dataset was produced by combining 5 dbf files.
When running the code, an error message jumped out, saying
"Error: cannot allocate vector of size ..."
Hi all, I am running the MNP multinomial probit model package in R. It
gives me the following error instead of the results:
Erreur : impossible d'allouer un vecteur de taille 137.9 Mo (in English: cannot
allocate a vector of size 137.9 Mb).
I have already increased the memory size ...
On Aug 1, 2011, at 3:04 AM, Dimitris.Kapetanakis wrote:
Thanks a lot for the help.
Actually, I am using a Mac (R for Mac OS X GUI 1.40-devel Leopard
build, 32-bit (5751)), but I think I can get access to Windows 7 64-bit.
I don't think that was what Holtman was advising. You just need a 64-bit build of R.
Thanks a lot for the help.
Actually, I am using a Mac (R for Mac OS X GUI 1.40-devel Leopard
build, 32-bit (5751)), but I think I can get access to Windows 7 64-bit. What
I am trying to do is a maximization through a grid search (because I am not
sure that any of the optim() methods works sufficiently well).
My advice to you is to get a 64-bit version of R. Here is what it
does on my 64-bit Windows 7 version:
> N<-250
> x<-matrix(c(rnorm(N,-1.5,1), rnorm(N,1,1), rbinom(N,1,0.5)), ncol=3)
> my.stats(1)
1 (1) - Rgui : 22:30:20 <0.7 78.6> 78.6 : 20.5MB
> start<-(-1)
> end<-3
> step<-10^(-2)
> n.steps<-(end-start)/step+1
Dear all,
I am trying to do some matrix operations (whose size I think is smaller
than what R allows), but the operations are not feasible when they run in one
session, while they are feasible if they run separately, even though each
operation is totally independent of the other. I run the code in one session.
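If the operations really are independent, one workaround is to release each operand before allocating the next one; a sketch with made-up sizes:

A <- matrix(rnorm(5000 * 5000), nrow = 5000)  # about 200 MB of doubles
res1 <- crossprod(A)                          # first operation
rm(A); gc()                                   # free A before the next big allocation
B <- matrix(rnorm(5000 * 5000), nrow = 5000)
res2 <- tcrossprod(B)                         # second, independent operation
rm(B); gc()

This mimics running the operations in separate sessions: at no point do both large inputs sit in memory at the same time.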
avsha38 wrote:
Hi,
when I run the following code I get this message:
"The instruction at 0x... referenced memory at 0x###; the memory cannot be "read"."
and then I have to close R.
What is the problem and how can I solve it?
The problem is a bug in the underlying C (or other compiled) code.
Hi,
when I run the following code I get this message:
"The instruction at 0x... referenced memory at 0x###; the memory cannot be "read"."
and then I have to close R.
What is the problem and how can I solve it?
Thanks in advance
Avi
my code:
# frailtypack
library(frailtypack)
cgd.ag <
On Jul 28, 2010, at 9:53 AM, Brandon Hurr wrote:
It was my understanding that R wasn't really the best thing for absolutely
huge datasets. 17.5 million points would probably fall under the category of
"absolutely huge."
I'm on a little netbook right now (Atom/R32) and it failed, but I'll try it
on my MacBookPro/R64 later and see if it's able to handle it.
On 07/28/2010 06:13 AM, Edwin Husni Sutanudjaja wrote:
Dear all,
I have a memory problem in making a scatter plot of my dataset of 17.5
million pairs.
My intention is to use the "ggplot" package and "bin2d". Please find the
attached script for more details.
Could somebody please give me any clues or tips to solve my problem?
It was my understanding that R wasn't really the best thing for absolutely
huge datasets. 17.5 million points would probably fall under the category of
"absolutely huge."
I'm on a little netbook right now (Atom/R32) and it failed, but I'll try it
on my MacBookPro/R64 later and see if it's able to handle it.
Dear all,
I have a memory problem in making a scatter plot of my dataset of 17.5
million pairs.
My intention is to use the "ggplot" package and "bin2d". Please find the
attached script for more details.
Could somebody please give me any clues or tips to solve my problem?? please ...
Just for your information ...
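For what it is worth, binning usually sidesteps the memory problem because only the bin counts are drawn, not the 17.5 million points themselves; a minimal sketch (the data frame and column names are placeholders, since the script was attached rather than shown):

library(ggplot2)
# df has columns x and y holding the 17.5 million pairs
ggplot(df, aes(x = x, y = y)) +
  geom_bin2d(bins = 200)  # a 200 x 200 grid of counts instead of raw points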
On Mon, 5 Jul 2010, Daniel Wiesmann wrote:
Dear All
I am trying to fit a multinomial logistic regression to a data set of size
94279 by 14. The data frame has one "sample" column, which is the categorical
variable, and the number of different categories is 9. The size of the data
set (as a csv file) is less than 10 MB.
Dear All
I am trying to fit a multinomial logistic regression to a data set of size
94279 by 14. The data frame has one "sample" column, which is the categorical
variable, and the number of different categories is 9. The size of the data
set (as a csv file) is less than 10 MB.
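One common route for a model like this is nnet::multinom; a minimal sketch, assuming the outcome column is "sample" and the remaining columns are the predictors (MaxNWts often needs to be raised when factor predictors expand into many dummy columns):

library(nnet)
fit <- multinom(sample ~ ., data = dat, MaxNWts = 5000, maxit = 200)
summary(fit)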
>>> jim holtman 08/02/2010 14:09:52 >>>
> Typically R does not have macros;
I know exactly why Jim Holtman said that; R doesn't have a separate
'macro' construct with separate 'macro variables'.
But it is perhaps a bit misleading to say that R doesn't have macros
without saying a bit more about how ordinary functions, together with tools
such as paste(), get() and substitute(), can do the same job.
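For example, a small sketch of the kind of task SAS macro variables are often used for, done with an ordinary R function (all names here are made up):

# build an object name from a 'macro variable', fetch the object, summarise it
summarise_year <- function(year) {
  nm  <- paste0("sales_", year)  # the would-be macro variable
  dat <- get(nm)                 # look up the object with that name
  summary(dat)
}

sales_2009 <- rnorm(100)
summarise_year(2009)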
What exactly is your definition of "macro"? What do you want to do?
What is the problem that you are trying to solve? Why do you think
macros will help? Typically R does not have macros; I assume that
idea is a holdover from SAS.
On Mon, Feb 8, 2010 at 4:30 AM, Meenakshi wrote:
>
> Hi,
>
> Can I use macro variables in R?
Hi,
Can I use macro variables in R? If we can, where can I find such programs
or macros in R books?
Have you tried gc() to see if any memory is released? How big was the
file that you read in? I don't see any large objects in your workspace.
Is there some type of processing that you did after reading in the data?
You might want to intersperse the following command in your script so you
can see where the memory use grows.
These are my object sizes:

                       Size  Mode
asa_Condition           912  list
asa_GatedCommunity    9,912  list
asa_Neighbourhood     2,872  list
asa_Security            832  list
asa_Storeys             800  list
Conditi...
Here is a function I use to get the size of the objects in my
workspace. Let us know the output of this command:

my.object.size <- function (pos = 1, sorted = FALSE)
{
    # size (in bytes) of every object in the given environment
    .result <- sapply(ls(pos = pos, all.names = TRUE),
                      function(..x) object.size(eval(as.symbol(..x),
                                                     envir = as.environment(pos))))
    if (sorted) {
        .result <- rev(sort(.result))
    }
    # return the sizes together with a grand total
    c(.result, TOTAL = sum(.result))
}
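For example, a call such as the following (output depends on your workspace) lists every object, largest first when sorted, together with a grand total:

my.object.size(sorted = TRUE)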
Hi,
After I get the error message, my main file size is 1.05 MB and the other
objects are within 400 bytes only.
Thanks.
Hi,
I am using R version 2.10.1.
Before running any statements/functions, the gc() report is:

         used (Mb) gc trigger (Mb) max used (Mb)
Ncells 124352  3.4     350000  9.4   350000  9.4
Vcells  81237  0.7     786432  6.0   310883  2.4

After I run the repeat statement, I got the following error message:
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.
How about providing information on your operating system and version
of R. Also provide a list of all the objects in your workspace and
the size of them.
Hi,
I have to run the repeat loop more than 50 times continuously, but it runs
only 20 to 30 times; after that the memory problem appears. My dataset is
only 6321 KB. How can I solve this problem?
Meenakshi
On 02/02/2010 09:33 PM, Meenakshi wrote:
Hi,
When I run a repeat loop in R on a large dataset, I get a memory problem.
How can I solve this problem?
1) Wait 2^m years, where m is the power of 2 that approximates the
multiple of your current amount of RAM that would accommodate your
problem.
On 02.02.2010 11:33, Meenakshi wrote:
Hi,
When I run a repeat loop in R on a large dataset, I get a memory problem.
How can I solve this problem?
Buy more memory, a bigger machine, program more efficiently, import only
the relevant data, use specific tools, ... or in other words: it depends.
Hi,
When I run a repeat loop in R on a large dataset, I get a memory problem.
How can I solve this problem?
You were asked to provide details, but so far have not.
--
David.
On Jan 27, 2010, at 2:17 AM, prem_R wrote:
Yes, I think this is the explanation of the problem I faced. Could you
please help me to solve it?
Yes, I think this is the explanation of the problem I faced. Could you
please help me to solve it?
prem_R writes:
> I'm running predictive analytics using R, and to calibrate my model I
> adjust the variables used in the model; the problem happens here.
> R just runs out of memory. I tried garbage collection as well.
I'm analyzing an 8 GB data set using R, so it can certainly handle large data sets.
> ... created and run for 50 iterations.
> The problem occurs here: after running a few iterations it runs out of space.
>
> I'm using R 2.10.0.
>
> If you need any other clarification I shall provide it. Help me to
> solve this.
... created and run for 50 iterations.
The problem occurs here: after running a few iterations it runs out of space.
I'm using R 2.10.0.
If you need any other clarification I shall provide it. Help me to
solve this.
To: Ambrosi Alessandro
Cc: r-help@r-project.org
Subject: Re: [R] memory problem on Suse
On Dec 11, 2009, at 6:24 AM, Ambrosi Alessandro wrote:
>
> Dear all, I am meeting some problems with memory allocation. I know
> it is an old issue, I'm sorry.
> I looked for a solution in the FAQs and manuals, and in the mails, but without finding a working answer.
Ask on the Bioconductor mailing list, where you will be directed to
several solutions for analyzing what I guess are hundreds of CEL files:
http://bioconductor.org
--
Martin Morgan
On Dec 11, 2009, at 8:02 AM, Marc Schwartz wrote:
On Dec 11, 2009, at 6:24 AM, Ambrosi Alessandro wrote:
Dear
On Dec 11, 2009, at 6:24 AM, Ambrosi Alessandro wrote:
Dear all, I am meeting some problems with memory allocation. I know
it is an old issue, I'm sorry.
I looked for a solution in the FAQs and manuals, and in the mails, but without
finding a working answer.
I really hope you can help me.
For instance, if I try to read microarray data I get:
Dear all, I am meeting some problems with memory allocation. I know it is an
old issue, I'm sorry.
I looked for a solution in the FAQs and manuals, and in the mails, but without
finding a working answer.
I really hope you can help me.
For instance, if I try to read microarray data I get:
> mab = ReadA...
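If the arrays are Affymetrix CEL files, one memory-friendlier route is justRMA() from the affy package, which computes RMA expression values without keeping the full probe-level object around; a sketch that assumes the CEL files sit in the working directory:

library(affy)
eset <- justRMA()  # reads the CEL files and returns an ExpressionSet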
> Hi,
>
> Yesterday I was surprised to find that I could not load the package "ca" on R
> 2.7.0; it said it could not find the required package rgl although it was there. So
> today I upgraded to 2.7.1 patched and I got the following error:
>
>> local({pkg <- select.list(sort(.packages(all.available =
On Fri, 21 Mar 2008, Georgios Marentakis wrote:
> Dear all,
> I am having a memory problem when analyzing a rather large data set with
> nested factors in R.
> The model is of the form X ~ A*B*(C/D/F), with A, B, C, D, F being the independent
> variables, some of which are nested.
> The problem occurs when using aov, but also when using glm or lme.
Dear all,
I am having a memory problem when analyzing a rather large data set with
nested factors in R.
The model is of the form X ~ A*B*(C/D/F), with A, B, C, D, F being the independent
variables, some of which are nested.
The problem occurs when using aov, but also when using glm or lme.
In particular I get the following error:
Dear all,
First I apologize for cross-posting, but I think that this could be of
interest to BioC users, too.
For DNA copy-number analysis I have downloaded PLASQ500K from:
http://genome.dfci.harvard.edu/~tlaframb/PLASQ/
After creating sub-directories SND and STD containing 3 Sty Mapping arrays
Elena,
Page 23 of the R Installation Guide provides some memory guidelines
that you might find helpful.
There are a few things you could try in R, at least to get up and running:
- Look at fewer tumors at a time using standard R, as you have been.
- Look at the ff package, which leaves the data on disk rather than holding it all in RAM.
On Thu, 31 Jan 2008, Eleni Christodoulou wrote:
> Hello R users,
>
> I am trying to run a Cox model for the prediction of relapse of 80 cancer
> tumors, taking into account the expression of 17000 genes. The data are
> large and I get an error:
> "Cannot allocate vector of 2.4 Mb". I increase the memory.limit to 4000
> (which is the largest supported by my system).
I have a similar problem, saying "cannot allocate vector of size 300 MB".
I would also appreciate it if someone could offer some suggestions on this.
Best,
Shige
On Jan 31, 2008 2:48 PM, Eleni Christodoulou <[EMAIL PROTECTED]> wrote:
> Hello R users,
>
> I am trying to run a Cox model for the prediction of relapse of 80 cancer tumors ...
Hello R users,
I am trying to run a Cox model for the prediction of relapse of 80 cancer
tumors, taking into account the expression of 17000 genes. The data are
large and I get an error:
"Cannot allocate vector of 2.4 Mb". I increase the memory.limit to 4000
(which is the largest supported by my system).
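One way to keep the memory footprint small is to screen the genes one at a time instead of fitting a single model on all 17000 of them; a rough sketch (the survival package is assumed, and time, status and the expression matrix expr are placeholder names):

library(survival)
pvals <- apply(expr, 2, function(g) {
  fit <- coxph(Surv(time, status) ~ g)
  summary(fit)$coefficients[1, "Pr(>|z|)"]
})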
Hi All,
There is something I don't quite understand about R memory management.
I have the following function (it uses the RODBC package for odbcConnect()
and sqlQuery()):

function (AdGroupId)
{
    print(memory.size())
    channel <- odbcConnect("RDsn", uid = "", case = "tolower", pwd = "xx")
    Tree1 <- sqlQuery(channel, "exec SelectAdg
I am trying to make a predicted vegetation map using the predict()
function and am running into an issue with memory size.
Specifically, I am building a random forest classification (stored in the
object "vegmap.rf") using the randomForest library, and then I am trying to
apply the results from that to construct the predicted map.
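One way to keep predict() from exhausting memory is to run it over the prediction grid in chunks; a rough sketch in which vegmap.rf and grid_data are placeholder names:

library(randomForest)
chunk_size <- 100000
idx <- split(seq_len(nrow(grid_data)),
             ceiling(seq_len(nrow(grid_data)) / chunk_size))
pred <- unlist(lapply(idx, function(i)
  as.character(predict(vegmap.rf, newdata = grid_data[i, ]))))

If the predictors live in raster layers, the raster package's predict() method does this kind of block-wise processing for you.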