Dear R-Listers,

I am a Windows user (R 2.6.2) using the development version of sqldf to try to read a 3GB file originally stored in .sas7bdat format. I convert it to comma-delimited ASCII with StatTransfer before trying to import just the rows I need into R. The problem is that I get this error:
> f <- file("hugedata.csv")
> DF <- sqldf("select * from f where C_OPR like 'KKA2%'",
+             file.format = list(header = T, row.names = F))
Error in try({ :
  RS-DBI driver: (RS_sqlite_import: hugedata.csv line 1562740
  expected 52 columns of data but found 19)
Error in sqliteExecStatement(con, statement, bind.data) :
  RS-DBI driver: (error in statement: no such table: f)

Now, I know that my SAS-using colleagues are able to work with this file in SAS, so I was wondering whether converting it with StatTransfer to the SAS XPORT format (which can be read with the 'read.xport' function in the 'foreign' package) would be a better approach. The problem is that I don't know whether, or how, I can do that with sqldf at all. I tried various things like

f <- file(read.xport("hugedata.xport"))

but I consistently got an error message from the sqldf command. Unfortunately, I don't recall the exact message. Can anybody tell me whether it is possible at all to read files in a non-ASCII format without having to load them into R's memory?

Thank you for your assistance.

Peter.
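
P.S. In case it helps with diagnosing the column-count error, here is a minimal sketch (untested on the full file, and assuming the data really are plain comma-separated text that should have 52 fields per line) of how the offending line could be inspected without loading the data into memory:

n <- count.fields("hugedata.csv", sep = ",")   # number of fields on each line
which(n != 52)                                 # line numbers without 52 columns

con <- file("hugedata.csv", "r")
bad <- scan(con, what = "", sep = "\n",        # pull out line 1562740 as raw text
            skip = 1562739, nlines = 1, quiet = TRUE)
close(con)
bad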
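
P.P.S. Regarding the XPORT idea: if I understand correctly, read.xport() reads the whole file into an R data frame (or a list of data frames if the .xpt file holds several datasets), so wrapping it in file() cannot work, because file() expects a path or a connection, not a data frame. A sketch of what would work instead, but only if the data fit in memory, which is exactly what I am trying to avoid:

library(foreign)
DF <- read.xport("hugedata.xport")    # reads the entire file into RAM;
                                      # assumes it holds a single dataset
keep <- which(substr(DF$C_OPR, 1, 4) == "KKA2")  # rows matching "like 'KKA2%'"
DF2 <- DF[keep, ]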