Hello everybody out there using R,

I'm using R to analyse biological data and write the results up in LaTeX,
both on a notebook running Linux.
I've already tried two options for importing my data:
1. Import from a SQLite database
2. Import from individual CSV files edited with sed, awk and sort.
Both methods work very well for me, since I don't need advanced
features like multi-user network access to the data.
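For reference, this is roughly how the two import routes look on my side at the
moment (only a minimal sketch; the file, database and table names are made-up
placeholders):

# Option 1: import from a SQLite database
library(RSQLite)
con <- dbConnect(SQLite(), dbname = "experiments.db")
measurements <- dbGetQuery(con, "SELECT * FROM measurements")
dbDisconnect(con)

# Option 2: import from a CSV file prepared with sed/awk/sort
measurements <- read.csv("measurements.csv", header = TRUE)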
My data sets are tables with up to 20 columns and 1000 rows, containing
mostly numerical values and strings. I might also have to handle
microarray data, but I'm not sure about that yet. In addition, I need to
organise tags for a collection of photos, but that data is of course not
analysed with R.
I'm now beginning to work on a larger project and have to decide
whether it is better to use SQLite or CSV files for handling my data.
I fear it might get difficult to switch between the two systems after
having accumulated the data, adapted software for backups and revision
control, written Makefiles, etc.
Could any of you give me a hint about the additional benefits of
importing data into R from a SQLite database, compared with the simpler
approach of organising the data in CSV files? Is it, for example, possible
to select the rows whose values in one column fall within a certain range
from a CSV file using awk?
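To make the kind of selection I mean concrete, this is how I would do it in R
with each approach (again only a sketch; the column name "concentration" and
the range are invented):

library(RSQLite)
con <- dbConnect(SQLite(), dbname = "experiments.db")
# SQLite route: let the database do the filtering
in_range <- dbGetQuery(con,
    "SELECT * FROM measurements WHERE concentration BETWEEN 0.5 AND 2.0")
dbDisconnect(con)

# CSV route: read the whole file, then filter the data frame in R
d <- read.csv("measurements.csv", header = TRUE)
in_range <- d[d$concentration >= 0.5 & d$concentration <= 2.0, ]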

Thanks in advance,
Juliet Jacobson

