Hi all,

I am having problems importing a VERY large dataset into R. I have looked into
the ff package, which seems to suit my needs, but from all the examples I have
seen it either requires creating the database manually, or it relies on a
read.table-style step. Since this is survey data, the file is big (roughly
20,000 rows by 50,000 columns, about 1.2 GB as plain text); the memory I have
isn't enough for a read.table call, and my computer freezes every time :(
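For what it's worth, my reading of the ff documentation is that read.csv.ffdf performs the read.table step in chunks rather than all at once, so only one block needs to fit in RAM. Something like the following sketch (the file name and chunk size are made up):

```r
# Chunked import with ff: read.csv.ffdf reads next.rows lines at a time
# and appends them to an on-disk ffdf object, so memory stays bounded.
library(ff)

big <- read.csv.ffdf(file = "survey.csv",   # hypothetical file name
                     header = TRUE,
                     next.rows = 5000,      # rows per chunk; tune to your RAM
                     colClasses = NA)       # let R guess the column types
dim(big)  # should report the full table, stored on disk rather than in RAM
```

But I have not managed to get this working on my machine, hence the question.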

So far I have managed to import the required subset of the data by using a
"cheat": I used GRETL to read an equivalent Stata file (released by the same
source that offers the csv file), manipulate it, and export it in a format
that R can read into memory. Easy! But I am wondering: how can this be done
entirely in R, from scratch?
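In case it helps frame the question: the closest I have come to a pure-R approach is reading the csv in blocks through a file connection (which remembers its position between reads) and keeping only the columns I need. Roughly like this, with the file name, chunk size, and column indices invented for the example:

```r
# Chunked read with base R only: repeated read.table calls on an open
# connection walk through the file one block at a time.
con <- file("survey.csv", open = "r")             # hypothetical file name
header <- strsplit(readLines(con, n = 1), ",")[[1]]
keep <- 1:100                                     # columns to retain (example)

chunks <- list()
repeat {
  block <- tryCatch(
    read.table(con, sep = ",", nrows = 2000, stringsAsFactors = FALSE),
    error = function(e) NULL)                     # NULL once the file is exhausted
  if (is.null(block) || nrow(block) == 0) break
  chunks[[length(chunks) + 1]] <- block[, keep]   # drop unwanted columns early
}
close(con)

mydata <- do.call(rbind, chunks)
names(mydata) <- header[keep]
```

Is this the idiomatic way, or is there something better?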

Thanks
-- 
View this message in context: 
http://www.nabble.com/How-to-import-BIG-csv-files-with-separate-%22map%22--tp24484588p24484588.html
Sent from the R help mailing list archive at Nabble.com.

______________________________________________
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.
