If all your entries are double precision, each one takes 8 bytes, so 20,000*m entries are just 160,000*m bytes, i.e. less than 160*m KB. With m = 100 variables that comes to 16 MB, which is not that much (especially if you pre-allocate the matrix only once). So just use the matrix and don't worry!
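You can check this directly in R; a quick sketch, where the 100 variables are just an assumed example:

  ## pre-allocate the full double-precision matrix once
  x <- matrix(0, nrow = 20000, ncol = 100)
  object.size(x)   # roughly 16e6 bytes (20000 * 100 * 8)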
--- dxc13 <[EMAIL PROTECTED]> wrote:

> useR's,
>
> I am writing a program in which the input can be multidimensional. As of
> now, to hold the input, I have created an n by m matrix where n is the
> number of observations and m is the number of variables. The data that I
> could potentially use can contain well over 20,000 observations.
>
> Can a simple matrix be used for this, or would it be better and more
> efficient to create an external database to hold the data? If so, should
> the database be created using C, and how would I do this (seeing as I
> have never programmed in C)?
>
> Any help would be greatly appreciated. Thank you
>
> Derek