Are you running the 32-bit or 64-bit version of R? The 32-bit version
cannot allocate that much space; on Windows, the maximum contiguous space
that can ever be allocated in a 32-bit process is a little over 1Gbyte, on
Unix it's larger but cannot go over the 32-bit address space limit of
4Gbytes.
I don't think so. I removed all variables except the data I was going to use
and tried gc() to release some memory. But the error still happened.
Regards,
Jasmine
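A minimal sketch of that clean-up, assuming the only object worth keeping is the
data frame 'origin' (the name is illustrative, not taken from the session):
rm(list = setdiff(ls(), "origin"))  # drop everything except the data frame
gc()                                # ask R to return the freed memory to the OS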
On 10 Mar, 2015 10:49 pm, "Uwe Ligges" wrote:
On 10.03.2015 04:16, 李倩雯 wrote:
Hi all,
*Problem Description*
I encountered the *Error: cannot allocate vector of size 64.0 Mb* when I
was using read.zoo to convert a data.frame called 'origin' to a zoo object
named 'target'.
*About the Data & Code*
My data frame (origin) contains 5340191 obs. of
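For reference, a minimal sketch of the conversion being attempted; the index
column position and the timestamp format are assumptions, not details from the
original post:
library(zoo)
target <- read.zoo(origin, index.column = 1,
                   format = "%Y-%m-%d %H:%M:%S", tz = "UTC")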
Hello,
That is not the right way to read a NetCDF file (judging by the
extension) in R. Please have a look at the "ncdf4" package. The
"raster" package is also able to read this kind of file.
Regards,
Pascal
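A minimal sketch of both suggestions, with placeholder file and variable names:
library(ncdf4)
nc <- nc_open("yourfile.nc")    # placeholder file name
print(nc)                       # lists the variables stored in the file
x  <- ncvar_get(nc, "varname")  # placeholder variable name
nc_close(nc)
# or, for gridded data, via the raster package (which uses ncdf4 underneath)
library(raster)
r <- raster("yourfile.nc")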
On Fri, Mar 21, 2014 at 1:25 AM, eliza botto wrote:
> Dear R family,
> I am trying to
5.2 won't go into 4 but there may be more problems.
32-bit or 64-bit operating system?
RAM is cheap but will your motherboard support more than 4 GB?
And don't forget there are other processes that need to run while you are
using R.
Clint Bowman    INTERNET: cl...@ecy.wa
On 22/08/13 21:57, Michael Weylandt wrote:
On Aug 22, 2013, at 7:39, Ben Harrison wrote:
No idea about the problem specifics but what are your OS and version of R? You
might be limited there.
I have 64-bit Ubuntu 12.04, R version 3.0.1.
More likely, however, is that your problem is j
On Aug 22, 2013, at 7:39, Ben Harrison wrote:
> I have a 70363 x 5 double matrix that I am playing with.
>
> > head(df)
>          GR       SP         SN         LN       NEUT
> 1 1.458543 1.419946 -0.2928088 -0.2615358 -0.5565227
> 2 1.432041 1.418573 -0.2942713 -0.2634204 -0.5927334
> 3 1.4066
On 13-06-14 7:02 PM, Dan Keshet wrote:
I am using xtable version 1.7-1 built for R 3.0.1 on:
R version 3.0.1 (2013-05-16)
Platform: i686-pc-linux-gnu (32-bit)
Sometimes, not every time, when I load xtable or attempt to load the
help, I get an error such as this "Error: cannot allocate vector of
Hi Arun,
But I am using Windows(XP).
From: arun kirshna [via R]
[mailto:ml-node+s789695n4639435...@n4.nabble.com]
Sent: Tuesday, August 07, 2012 10:49 PM
To: Akkara, Antony (GE Energy, Non-GE)
Subject: Re: ERROR : cannot allocate vector of size (in MB & GB)
Hi,
If you are using Linux
How is it possible to split a .csv file by size (in kilobytes)?
-Original Message-
From: jim holtman [mailto:jholt...@gmail.com]
Sent: Tuesday, July 24, 2012 11:30 PM
To: Akkara, Antony (GE Energy, Non-GE)
Cc: r-help@r-project.org
Subject: Re: [R] ERROR : cannot allocate vector
Thank you Jim. Its working fine !.
Thanks a lot.
- Antony.
-Original Message-
From: jim holtman [mailto:jholt...@gmail.com]
Sent: Tuesday, July 24, 2012 11:30 PM
To: Akkara, Antony (GE Energy, Non-GE)
Cc: r-help@r-project.org
Subject: Re: [R] ERROR : cannot allocate vector of size (in
Hi,
You can try using dbLoad() from the hash package to load the data. Also, if you need
to chunk the data, you can use the ff package.
A.K.
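A minimal sketch of the ff part of that suggestion (the file name and chunk
sizes are placeholders): the csv is read in chunks into an on-disk ffdf, so
the whole table never has to sit in RAM at once:
library(ff)
big <- read.csv.ffdf(file = "yourLargeCSV.csv", header = TRUE,
                     first.rows = 100000, next.rows = 500000)
dim(big)  # behaves much like a data.frame, but the data live on disk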
- Original Message -
From: Rantony
To: r-help@r-project.org
Cc:
Sent: Tuesday, July 24, 2012 9:45 AM
Subject: [R] ERROR : cannot allocate vector of siz
However, this wouldn't help much with Win XP, as this only allows for 2GB
(maximum of 3 GB):
http://cran.r-project.org/bin/windows/base/rw-FAQ.html#There-seems-to-be-a-limit-on-the-memory-it-uses_0021
If you want to use more RAM with windows you need to use a 64bit Version.
Cheers,
Henrik
Am 2
Sure, get more RAM. 2GB is a tiny amount if you need to load files of
1GB into R, and as you've discovered won't work.
You can try a few simpler things, like making sure there's nothing
loaded into R except what you absolutely need.
It looks like there's no reason to read the entire file into R a
try this:
input <- file("yourLargeCSV", "r")
fileNo <- 1
repeat{
    myLines <- readLines(input, n = 100000)  # read the next 100K lines (one output file's worth)
    if (length(myLines) == 0) break          # stop at end of input
    writeLines(myLines, sprintf("output%03d.csv", fileNo))
    fileNo <- fileNo + 1
}
close(input)
On Tue, Jul 24, 2012 at 9:45 AM, R
As the error message suggests, see ?memory.size, and you'll find that the
problem is arising because R is running out of memory. If you were able to
run this analysis before, then one possible reason why it now fails is that
the workspace has increased in size in the interim - more objects and
resu
You probably have more objects in your workspace than you did
previously. Clean them out (or just use a new R session) and things
should go back to normal.
You might also want to follow up on the help(memory.size) hint though
-- doesn't Windows impose a memory limit unless you ask it for more?
Mi
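For readers hitting this on Windows, the functions being referred to look like
this (they existed in the Windows builds of R current at the time; recent R
versions no longer support them):
memory.size()              # MB currently used by this R session (Windows only)
memory.size(max = TRUE)    # maximum MB obtained from the OS so far
memory.limit()             # current limit in MB
memory.limit(size = 3000)  # request a higher limit, up to what the OS allows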
Paul
I tested your suggestion to use the biglm package. With this package, the model
runs without any problem.
Regards
2011/9/22 David Winsemius
>
> On Sep 22, 2011, at 5:00 PM, Mario Montecinos Carvajal wrote:
>
> Michael and Paul
>>
>> Thanks for your quick response.
>>
>> Michael, I am runnin
David
Thanks for the time you spent reading and understanding my mail, as well
as for your response and recommendations. I apologize if my attempt to put
comments in my code was not enough. I really appreciate your suggestion and
I will take care to change the variable name "length" to avoid co
On Sep 22, 2011, at 5:00 PM, Mario Montecinos Carvajal wrote:
Michael and Paul
Thanks for your quick response.
Michael, I am running a 32bit version of R
sessionInfo()
R version 2.11.1 (2010-05-31)
i386-pc-mingw32
Paul, the dimension of the data frame I am working with is
dim(d)
[1] 7017
Michael and Paul
Thanks for your quick response.
Michael, I am running a 32bit version of R
> sessionInfo()
R version 2.11.1 (2010-05-31)
i386-pc-mingw32
Paul, the dimension of the data frame I am working with is
dim(d)
> [1] 7017411
And the size of the file that contains the data is 2946
On 09/22/2011 04:00 AM, R. Michael Weylandt
wrote:
> Are you running a 32bit or 64bit version of R? Type sessionInfo() to see.
>
> Michael
...in addition, how large is your dataset? Please provide us with a
self-contained example which reproduces this problem. You could take a look
at the biglm
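A minimal sketch of the biglm route, with a placeholder formula and data frame
names; biglm keeps only a small summary in memory, and update() lets you feed
further chunks of rows:
library(biglm)
fit <- biglm(y ~ x1 + x2, data = first_chunk)  # placeholder formula and data
fit <- update(fit, moredata = next_chunk)      # add further chunks of rows
summary(fit)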
Are you running a 32bit or 64bit version of R? Type sessionInfo() to see.
Michael
On Sep 21, 2011, at 10:41 PM, Mario Montecinos Carvajal
wrote:
> Hi
>
> I am a new user of the mail list.
>
> My problem occurs when I try to fit a linear model (lm), because I get the
> message "Error: cann
Thank you for replying. When I tried to run the R syntax on a 64-bit
computer, the problem was solved. Thank you for helping out. I totally agree
with your advice.
I would like to answer all your questions in case other people meet the same
problem. The data contains one timestamp column with time zo
Assuming that your columns are numeric, you would need 4GB of memory
just to store one copy of the object. If this is 5 years, then you
would need almost 1GB for a copy, but the processing probably will use
up twice as much as it is processing. Try reading a month's worth and
see how much you use.
Thank you Jholtman.
Now count is 46001902. I was trying to retrieve one-year data, but I still
receive the following message:
"Error: cannot allocate vector of size 64.0 Mb"
select count(*) from yourData
On Tue, Jun 28, 2011 at 3:07 PM, xin123620 wrote:
> Thank you Jeff. You are absolutely right. I just edited the R and computer
> info: R is 32-bit; the computer is Windows XP, 32-bit Intel(R)
> Core(TM) e8...@3.ghz, 2.99GHz, 2.95GB of RAM.
>
> The data
Thank you Jeff. You are absolutely right. I just edited the R and computer
info: R is 32-bit; the computer is Windows XP, 32-bit Intel(R)
Core(TM) e8...@3.ghz, 2.99GHz, 2.95GB of RAM.
The data I am trying to retrieve is through postgre from a university
server. I checked the postgre
A) You haven't mentioned your OS which indicates you haven't followed the
posting guide noted at the bottom of each email.
B) You cannot load an "unknown" number of rows... although you may not specify
the number, it is finite and its value can be determined for the purposes of
debugging your i
On 25.03.2011 23:32, mipplor wrote:
I run a model, but it turns out like this. I have run this model days
ago and it worked well.
What's going on here? Any suggestions?
If it worked exactly the same way before on the same machine, you probably
have objects in your workspace that are too large.
Uwe
Without more detailed information I would say that R runs out of
memory...and furthermore:
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.
cheers,
Paul
On 03/25/2011 11:32 PM, mipplor wrote:
i run
On Tue, 23 Nov 2010, derek eder wrote:
Hello,
I am facing the dreaded "Error: cannot allocate vector of size x Gb" and
don't understand
enough about R (or operating system) memory management to diagnose and solve
the problem
-- despite studying previous posts and relevant R help -- e.g.:
"E
On 23.11.2010 09:26, derek eder wrote:
Hello,
I am facing the dreaded "Error: cannot allocate vector of size x Gb" and
don't understand
enough about R (or operating system) memory management to diagnose and
solve the problem
-- despite studying previous posts and relevant R help -- e.g.:
"Err
Thank you all for the suggestions. We do intend to get more RAM space.
Meanwhile I shall have a look at the ShortRead package features.
On 09/14/2010 08:02 AM, Marc Schwartz wrote:
> On Sep 14, 2010, at 9:47 AM, John1983 wrote:
>
>>
>> Yes I see. So I typed as you mentioned and I get an 8 (therefore
>> this is a 64-bit R).
>>
>> Is there anything else I need to check to remove this error?
>
>
> 1. Add more RAM.
>
> 2. Dependi
On Sep 14, 2010, at 9:47 AM, John1983 wrote:
>
> Yes I see.
> So I typed as you mentioned and I get an 8 (therefore this is a 64-bit R).
>
> Is there anything else I need to check to remove this error?
1. Add more RAM.
2. Depending upon what you are doing relative to data management/analysis
On 14.09.2010 16:47, John1983 wrote:
Yes I see.
So I typed as you mentioned and I get an 8 (therefore this is a 64-bit R).
Is there anything else I need to check to remove this error?
Yes: If the amount of RAM in your machine is sufficient
Best,
Uwe
Yes I see.
So I typed as you mentioned and I get an 8 (therefore this is a 64-bit R).
Is there anything else I need to check to remove this error?
On Sep 14, 2010, at 9:09 AM, John1983 wrote:
>
> Hi,
>
> I am working with a file (900MB in size) that has around 10 million records
> (in particular FASTQ records).
> I am able to read in the file as an object of BStringSet. When I start to
> manipulate the data, after almost 4 hours, I get the
2010/8/31 나여나
>
>Hi, All
>
> I have a problem of R memory space.
>
> I am getting "Error: cannot allocate vector of size 198.4 Mb"
>
>
>
You may want to check circle 2 of the R inferno (found here:
http://www.burns-stat.com/pages/Tutor/R_inferno.pdf).
Hi,
On Mon, Aug 30, 2010 at 9:17 PM, 나여나 wrote:
>
> Hi, All
>
> I have a problem of R memory space.
>
> I am getting "Error: cannot allocate vector of size 198.4 Mb"
It's a RAM thing: you don't have enough.
The OS said "nice try" when R asked for that last 198.4 MB chunk of RAM
On Aug 5, 2010, at 3:53 AM, Ralf B wrote:
I am dealing with very large data frames, artificially created with
the following code, that are combined using rbind.
When running memory.limit() I am getting this:
memory.limit()
[1] 2047
Which shows me that I have 2 GB of memory available. What
Hi
I am not an expert in such issues (I have never really run into problems with
memory size).
From what I have read in previous posts on this topic (and there are
numerous), the simplest way would be to go to a 64-bit system (Linux, Win
Vista, 7), where the size of objects is limited by the amount of memory o
Thank you for such a careful and thorough analysis of the problem and
your comparison with your configuration. I very much appreciate it.
For completeness and (perhaps) further comparison, I have executed
'version' and sessionInfo() as well:
> version
_
platform i386-pc-mingw32
On Thu, Aug 05, 2010 at 03:53:21AM -0400, Ralf B wrote:
> > a <- rnorm(500)
> Error: cannot allocate vector of size 38.1 Mb
>
> When running memory.limit() I am getting this:
>
> memory.limit()
> [1] 2047
>
> Which shows me that I have 2 GB of memory available. What is wrong?
> Shouldn't 38
"An" R script is apparently either working on a big dataset or wasting
memory. What script? What dataset? How much is your current memory
limit? How much did you try to increase it?
On Fri, Jun 18, 2010 at 7:20 PM, harsh yadav wrote:
> PLEASE do read the posting guide http://www.R-project.org/p
Hi!
Thanks for your reply! After running the command below I am certain I am
using a 64-bit R. I am running R through a Linux cluster system where R is
globally available for all users. I have asked the system administrators if
they would update their version of R, but they are not receptive to maki
At first, I'd try with an R version from 2010 rather than one from 2007.
Next, I'd try to be sure to really have a 64-bit version of R rather
than a 32-bit one, which is what I suspect.
Best,
Uwe Ligges
On 20.05.2010 20:10, Yesha Patel wrote:
I've looked through all of the posts about this is
On Wed, 11 Nov 2009, Larry Hotchkiss wrote:
Hi,
I'm responding to the question about the storage error, trying to read a
3,000,000 x 100 dataset into a data.frame.
I wonder whether you can read the data as strings. If the numbers are all one
digit, each cell would require just 1 byte instead of 8.
Hi,
I'm responding to the question about the storage error, trying to read a
3,000,000 x 100 dataset into a data.frame.
I wonder whether you can read the data as strings. If the numbers are all one
digit, each cell would require just 1 byte instead of 8. That makes 300MB
instead of 2.4GB. You can ru
Hi Peng,
the major problem about your specific case is that when creating the
final object, we need to set dimnames() appropriately. This triggers a
copy of the object and that's where you get the error you describe.
With the current release, unfortunately, there isn't much to do
(unless
For me with ff - on a 3 GB notebook - 3e6x100 works out of the box even without
compression: doubles consume 2.2 GB on disk, but the R process remains under
100MB, rest of RAM used by file-system-cache.
If you are under windows, you can create the ffdf files in a compressed folder.
For the rando
Cool! Thanks for the sampling and ff tips! I think I've figured it out now
using sampling...
I'm getting a quad-core, 4GB RAM computer next week, will try it again using
a 64 bit version :)
Thanks for your time!!!
Maja
tlumley wrote:
>
> On Tue, 10 Nov 2009, maiya wrote:
>
>>
>> OK, it's
On Tue, 10 Nov 2009, maiya wrote:
OK, it's the simple math that's confusing me :)
So you're saying 2.4GB, while windows sees the data as 700KB. Why is that
different?
Your data are stored on disk as a text file (in CSV format, in fact), not as
numbers. This can take up less space.
And let
Check out:
http://www.mail-archive.com/r-h...@stat.math.ethz.ch/msg79590.html
for sampling a large file.
On Tue, Nov 10, 2009 at 8:32 AM, maiya wrote:
>
> OK, it's the simple math that's confusing me :)
>
> So you're saying 2.4GB, while windows sees the data as 700KB. Why is that
> different?
>
maiya wrote:
OK, it's the simple math that's confusing me :)
So you're saying 2.4GB, while windows sees the data as 700KB. Why is that
different?
700_MB_, I assume!
In a nutshell, a single column and a spacer takes 2 bytes per subject,
but a floating point variable takes 8, and R is not good
OK, it's the simple math that's confusing me :)
So you're saying 2.4GB, while windows sees the data as 700KB. Why is that
different?
And let's say I could potentially live with e.g. 1/3 of the cases - that
would make it 0.8GB, which should be fine? But then my question is if there
is any way to sa
A little simple math. You have 3M rows with 100 items on each row.
If read in, this would be 300M items. If numeric, at 8 bytes/item, this
is 2.4GB. Given that you are probably using a 32-bit version of R,
you are probably out of luck. A rule of thumb is that your largest
object should consume at m
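The arithmetic spelled out, for anyone checking the numbers:
rows <- 3e6; cols <- 100
rows * cols * 8          # 2.4e9 bytes for a single numeric copy
rows * cols * 8 / 2^30   # about 2.2 GiB, before R makes any working copies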
ok, i'll take a look at this and get back to you during the week. b
On Nov 7, 2009, at 1:19 PM, Peng Yu wrote:
Most of the 8GB was available, when I run the code, because R was the
only computation session running.
On Sat, Nov 7, 2009 at 7:51 AM, Benilton Carvalho
wrote:
you haven't answere
Most of the 8GB was available, when I run the code, because R was the
only computation session running.
On Sat, Nov 7, 2009 at 7:51 AM, Benilton Carvalho wrote:
> you haven't answered how much resource you have available when you try
> reading in the data.
>
> with the mouse exon chip, the math i
you haven't answered how much resource you have available when you try
reading in the data.
with the mouse exon chip, the math is the same i mentioned before.
having 8 GB, you should be able to read in 70 samples of this chip. if
you can't, that's because you don't have enough resources when
On Fri, Nov 6, 2009 at 8:19 PM, Benilton Carvalho wrote:
> this is converging to bioc.
>
> let me know what your sessionInfo() is and what type of CEL files you're
> trying to read, additionally provide exactly how you reproduce the problem.
Here is my sessionInfo(). pname is 'moex10stv1cdf'.
>
t;high"),ordered=T)
poacc2 <- accumcomp(PoCom, y=PoEnv, factor="test", method="exact")
works like a charm for me.
In the future, check carefully which type of arguments are asked for,
and use the function str() to check if they really are what you think
they are.
Kind re
Hi Joris,
thanks for spotting that one. This little mistake got in when I was
trying desperate things with the analysis (factor1 is used in diversitycomp).
Nevertheless, here is the result:
> poacc2 <- accumcomp(PoCom, y=PoEnv, factor="HM_sprem", method="exact")
Error in if (p == 1) { : ar
Hi Roman,
that throws a different light on the problem. It goes wrong from the
start, so it has little to do with the bootstrap or jackknife
procedures. R.huge won't help you either.
Likely your error comes from the fact that "factor1" is not an
argument of the function accumcomp. The argument is
Hello joris,
this is the traceback() output. Hopefully you can make some sense out of it.
Thank you for the tips as well (R.huge looks promising)!
> traceback()
7: vector("integer", length)
6: integer(nbins)
5: tabulate(bin, pd)
4: as.vector(data)
3: array(tabulate(bin, pd), dims, dimnames = dn)
Dear Roman,
could you give us the trace given by traceback() ? I suspect the error
is resulting from the permutations and/or jackknife procedure in the
underlying functions specaccum and specpool.
You can take a look at the package R.huge, but that one is deprecated
already. There are other packa
R-helpers,
I thank Jonathan Greenberg and David Winsemius for their responses. I
will keep R64.app in mind but I found that by deleting some large
objects that I didn't need I was able to do my computations using R
2.9.1. (This is consistent with Winsemius's experiment on a 10GB
machine
On Jul 1, 2009, at 4:43 PM, Jonathan Greenberg wrote:
By the way, you'll probably have to reinstall some or all of your
packages (and dependencies) if you are using R64.app, probably
downgrading them in the process.
--j
This really ought to be on the r-sig-mac list. I am copying that lis
By the way, you'll probably have to reinstall some or all of your
packages (and dependencies) if you are using R64.app, probably
downgrading them in the process.
--j
Steve Ellis wrote:
Dear R-helpers,
I am running R version 2.9.1 on a Mac Quad with 32Gb of RAM running
Mac OS X version 10.5.
Steve:
Are you running R64.app? If not, grab it from here:
http://r.research.att.com/R-2.9.0.pkg
(http://r.research.att.com/ under "Leopard build") .
As far as I know (and I actually just tried it this morning), the
standard R 2.9.1 package off the CRAN website is the 32 bit version,
> What are you going to do with an agglomerative hierarchical clustering of
22283 objects? It will not be interpretable.
As a matter of fact I was asked to do a clustering analysis on gene
expression. Something
http://www.ncbi.nlm.nih.gov/projects/geo/gds/analyze/analyze.cgi?datadir=UCorrelationUP
On Mon, 22 Dec 2008, iamsilvermember wrote:
dim(data)
[1] 22283    19
dm=dist(data, method = "euclidean", diag = FALSE, upper = FALSE, p = 2)
Error: cannot allocate vector of size 1.8 Gb
That would be an object of size 1.8Gb.
See ?"Memory-limits"
Hi Guys, thank you in advance for h
I hope the following info will help, thanks again!
> sessionInfo()
R version 2.7.1 (2008-06-23)
x86_64-redhat-linux-gnu
locale:
LC_CTYPE=en_US.UTF-8;LC_NUMERIC=C;LC_TIME=en_US.UTF-8;LC_COLLATE=en_US.UTF-8;LC_MONETARY=C;LC_MESSAGES=en_US.UTF-8;LC_PAPER=en_US.UTF-8;LC_NAME=C;LC_ADDRESS=C;LC_TE
ram basnet wrote:
Dear R users,
I am using the randomForest package. While using this package, I got the
"Error: cannot allocate vector of size 117.3 Mb" message.
I had this problem earlier too but could not manage it. Is there any way to solve
this problem or to increase vec
Dear John,
note that you "Date" is a factor rather than some date object.
If you convert it to some date object, just a few megabytes will suffice!
Best wishes,
Uwe
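A minimal sketch of the conversion being suggested; the data frame name and the
date format are assumptions, not details from the original post:
d$Date <- as.Date(as.character(d$Date), format = "%Y-%m-%d")
# or, if the column also carries a time of day:
d$Date <- as.POSIXct(as.character(d$Date), format = "%Y-%m-%d %H:%M:%S", tz = "UTC")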
John wrote:
> On Friday 28 March 2008 14:28, Daniel Nordlund wrote:
>>> -Original Message-
>>> From: [EMAIL PROTECTED] [
On Friday 28 March 2008 14:28, Daniel Nordlund wrote:
> > -Original Message-
> > From: [EMAIL PROTECTED] [mailto:[EMAIL PROTECTED]
> > On Behalf Of John
> > Sent: Friday, March 28, 2008 12:04 PM
> > To: r-help@r-project.org
> > Subject: [R] Error: cannot allocate vector of size 3.0 Gb
> >
>
Can you make the data.frame available somewhere? Actually, I am
surprised it needs that huge amount of memory to do the plot.
Best,
Uwe Ligges
John wrote:
> Hello,
>
> I have read recent posts on this topic (Dr. Ronnen Levinson's Monday 02:39:55
> pm), but before I install a 64 bit system, and
> -Original Message-
> From: [EMAIL PROTECTED] [mailto:[EMAIL PROTECTED] On Behalf
> Of John
> Sent: Friday, March 28, 2008 12:04 PM
> To: r-help@r-project.org
> Subject: [R] Error: cannot allocate vector of size 3.0 Gb
>
> Hello,
>
> I have read recent posts on this topic (Dr. Ronnen Lev
Looks like you attach() the data frame before you try to plot. Note
that in the "Details" section of ?attach, it says:
"... The database is not actually attached. Rather, a new environment is
created on the search path and the elements of a list (including columns
of a data frame) or objects in
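One way to avoid the copies that attach() creates is to pass the data frame to
the call directly; a minimal sketch with placeholder object and column names:
with(df, plot(x, y))   # nothing is copied onto the search path
plot(df$x, df$y)       # equivalent, fully explicit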
Rod wrote:
> On Jan 8, 2008 3:40 PM, Duncan Murdoch <[EMAIL PROTECTED]> wrote:
>
>> On 1/8/2008 8:49 AM, Rod wrote:
>>
>>> On Jan 8, 2008 12:41 PM, Duncan Murdoch <[EMAIL PROTECTED]> wrote:
>>>
Rod wrote:
> Hello,
>
> I have a memory problem when I run
On Jan 8, 2008 3:40 PM, Duncan Murdoch <[EMAIL PROTECTED]> wrote:
>
> On 1/8/2008 8:49 AM, Rod wrote:
> > On Jan 8, 2008 12:41 PM, Duncan Murdoch <[EMAIL PROTECTED]> wrote:
> >>
> >> Rod wrote:
> >> > Hello,
> >> >
> >> > I have a memory problem when I run package WinBUGS with R (2.6.1).
> >> > Unt
On 1/8/2008 8:49 AM, Rod wrote:
> On Jan 8, 2008 12:41 PM, Duncan Murdoch <[EMAIL PROTECTED]> wrote:
>>
>> Rod wrote:
>> > Hello,
>> >
>> > I have a memory problem when I run package WinBUGS with R (2.6.1).
>> > Until now I have been using a Pentium IV 3.2Ghz computer with 512Mb of
>> > RAM memory
On Jan 8, 2008 12:41 PM, Duncan Murdoch <[EMAIL PROTECTED]> wrote:
>
> Rod wrote:
> > Hello,
> >
> > I have a memory problem when I run package WinBUGS with R (2.6.1).
> > Until now I have been using a Pentium IV 3.2Ghz computer with 512Mb of
> > RAM memory (with Windows XP Pro SP2), and I hadn't h
Rod wrote:
> Hello,
>
> I have a memory problem when I run package WinBUGS with R (2.6.1).
> Until now I have been using a Pentium IV 3.2Ghz computer with 512Mb of
> RAM memory (with Windows XP Pro SP2), and I hadn't had any problem.
> Now I have a new computer with the following characteristics: I
Maura E Monville wrote:
> I read the subject message in a number of R archived emails.
> Since I am experiencing the same problem:
>
>> upfmla
> A ~ T + cosP + cos2P + cos4P + cos5P + sin3P + sin5P + cosP2 +
> sinP3 + P2
>> glmod <- gls(upfmla,correlation=corAR1(),method="ML")
> Error: canno
Jittima Piriyapongsa <[EMAIL PROTECTED]> writes:
> Hi,
>
> I want to change an .RDA file to a text file. So I did as follows.
>
>>load("my.rda")
>>ls() ---> then it showed [1] exprs
>>write.table(exprs,"C:\\my.txt",sep="\t")
>
> I was successful with the first .RDA file. Then I used the same
> comman