What tools do you like for working with tab-delimited text files
up to 1.5 GB (under Windows 7 with 8 GB RAM)?
The standard tools for smaller data sets sometimes grab all the
available RAM, after which CPU usage drops to 3% ;-)
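For instance, even with the usual hints, read.table wants to build the
whole data frame in memory at once (the file name below is just a
placeholder):

x <- read.table("big.tsv", header = TRUE, sep = "\t",
                quote = "", comment.char = "",
                colClasses = "character")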
The "bigmemory" project won the 2010 John Chambers Award but "is
not available (for R version 3.1.0)".
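Had I been able to install it, I would have tried something like the
following (read.big.matrix is bigmemory's file reader; note that a
big.matrix holds a single numeric type, so mixed character/numeric
columns wouldn't fit anyway):

library(bigmemory)
X <- read.big.matrix("big.tsv", sep = "\t", header = TRUE,
                     type = "double",
                     backingfile = "big.bin",
                     descriptorfile = "big.desc")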
findFn("big data", 999) downloaded 961 links in 437 packages.
Those results include tools for data in PostgreSQL and other formats,
but I couldn't find anything for large tab-delimited text files.
Absent a better idea, I plan to write a function getField to extract a
specific field from the file, then use that field to split the data
into 4 smaller files, each of which should be small enough for what I
want to do. A rough sketch of the idea follows.
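Something along these lines, untested at scale (getField and splitFile
are just my working names, "big.tsv" and the hash rule are
placeholders, and I'm assuming a header row with no embedded tabs or
quotes; splitFile folds the field extraction into the splitting pass):

getField <- function(filename, field, sep = "\t", chunkSize = 1e5L) {
    ## Extract one named column, reading the file in chunks so that
    ## at most 'chunkSize' lines are in memory at any time.
    con <- file(filename, "r")
    on.exit(close(con))
    header <- strsplit(readLines(con, n = 1L), sep, fixed = TRUE)[[1]]
    j <- match(field, header)
    if (is.na(j)) stop("field not found: ", field)
    out <- list()
    repeat {
        lines <- readLines(con, n = chunkSize)
        if (length(lines) == 0L) break
        out[[length(out) + 1L]] <-
            vapply(strsplit(lines, sep, fixed = TRUE),
                   `[`, character(1L), j)
    }
    unlist(out)
}

splitFile <- function(filename, field, nparts = 4L, sep = "\t",
                      chunkSize = 1e5L) {
    ## Split 'filename' into 'nparts' smaller files in one chunked
    ## pass, routing each data line by a crude hash of its 'field'
    ## value so equal values always land in the same part.
    con <- file(filename, "r")
    on.exit(close(con))
    outfiles <- sprintf("part%d.txt", seq_len(nparts))
    outcons <- lapply(outfiles, file, open = "w")
    on.exit(for (oc in outcons) close(oc), add = TRUE)
    header <- readLines(con, n = 1L)
    j <- match(field, strsplit(header, sep, fixed = TRUE)[[1]])
    if (is.na(j)) stop("field not found: ", field)
    for (oc in outcons) writeLines(header, oc)  # repeat header in each part
    repeat {
        lines <- readLines(con, n = chunkSize)
        if (length(lines) == 0L) break
        key <- vapply(strsplit(lines, sep, fixed = TRUE),
                      `[`, character(1L), j)
        part <- vapply(key, function(k) sum(utf8ToInt(k)) %% nparts,
                       integer(1L)) + 1L
        for (g in seq_len(nparts)) {
            idx <- which(part == g)
            if (length(idx)) writeLines(lines[idx], outcons[[g]])
        }
    }
    invisible(outfiles)
}

## e.g., with hypothetical file and field names:
## key <- getField("big.tsv", "subject")
## splitFile("big.tsv", "subject")   # writes part1.txt ... part4.txt

Each part should then be small enough for read.table.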
Thanks,
Spencer