Hello Jim,
It sounds like a good time to go read about the packages bigmemory and/or ff.
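(A rough sketch of what that might look like, assuming the CSV holds a single atomic type such as integer; read.big.matrix() is from bigmemory, read.csv.ffdf() is from ff, and the path and backing-file names below are placeholders, not from this thread:)

library(bigmemory)
x <- read.big.matrix("/path/to/file.csv", sep = ",", header = FALSE,
                     type = "integer",
                     backingfile = "file.bin",      # on-disk backing, so the
                     descriptorfile = "file.desc")  # matrix need not fit in RAM

library(ff)
y <- read.csv.ffdf(file = "/path/to/file.csv", header = FALSE)  # disk-based data frame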
Best,
Tal
Contact Details:---
Contact me: tal.gal...@gmail.com | 972-52-7275845
Read me: www.talgalili.com (Hebrew) | www.biostatistics.co.
You are trying to create an object with 1G elements. Given that these
are integers, this will require about 4GB of space. If you are
running on a 32-bit system, which has a total physical limit of 2-3GB
depending on what options you are running (at least on Windows), then
you have exceeded the limit.
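The back-of-the-envelope arithmetic behind that (assuming R's 4-byte integers):

1e9 * 4 / 2^30                                  # ~3.73 GiB, matching the "3.7 Gb" error below
as.numeric(object.size(integer(1000))) / 1000   # ~4 bytes per integer element, plus a little overhead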
Might there be a limit ?
> c <- matrix(1:1e7, ncol=200)
> dim(c)
[1] 50000   200
> c <- matrix(1:1e9, ncol=200)
Error: cannot allocate vector of size 3.7 Gb
-
An R learner.
Hi, thank you both for your answers.
I did verify that the number of rows in the csv is actually ~207,000,
both in the MySQL output and then directly in the csv file. Line
128328 looks exactly like all the others above and below it.
I tried the comment.char and quote parameters and it turns out that
You might also try setting the following parameters on read.csv:
comment.char='', quote=''
If you have a "#", this might cause missing data; also an unbalanced
quote will cause missing lines.
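For instance, something along these lines (the path and nrows are just the values from the original post):

base <- read.csv("/path/to/file.csv", header = FALSE, sep = ",",
                 quote = "", comment.char = "", nrows = 206720)
nrow(base)   # should report the full ~206,720 rows if a stray quote or '#' was the culprit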
On Sat, May 22, 2010 at 2:12 AM, Erik Iverson wrote:
> Alex Ruiz E. wrote:
>>
>> Dear R helpers,
Alex Ruiz E. wrote:
Dear R helpers,
I created a somewhat big database (+206,700 rows) in MySQL and have
exported into a csv file, but I can't open the whole thing in R. I am
using:
base <- read.csv("/path/to/file.csv", header=F, sep=",", nrows=206720)
R doesn't complain but it only opens 128,328 observations.
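(A hypothetical way to check whether quoting is the problem, not something suggested in the thread: count records the way read.csv's defaults would see them versus with quoting disabled; a large gap points at an unbalanced quote swallowing lines.)

length(count.fields("/path/to/file.csv", sep = ",", quote = "\"", comment.char = ""))  # records as read.csv sees them
length(count.fields("/path/to/file.csv", sep = ",", quote = "",   comment.char = ""))  # one record per physical line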