This is the same data set I have been working with, for those in the know: a matrix of ~174 columns and ~70,000 rows. I have it as a zoo object, but I could read it in as a plain matrix as long as the date-time stamp won't be corrupted.
Here is an example of what a column looks like:

1/1/06 12:00, 1, 2, 3, 4, 5, 6, 7, 8, 9, 0, 1, 2, 3, 4, 5, 6, 7, NA

# read in with the following (thanks to Gabor), using chron
> library(zoo)
> library(chron)
> fmt.chron <- function(x) {
+   chron(sub(" .*", "", x), gsub(".* (.*)", "\\1:00", x))
+ }
> z1 <- read.zoo("all.csv", sep = ",", header = TRUE, FUN = fmt.chron)

This part works just fine, and I can plot and analyze the data to my heart's content.

I now need to replace the NAs (~700,000+ of them) with the numeric value 9999 for a beam-forming exercise that we are conducting with a geophysicist in MATLAB. I can't get the data into Excel in its present form because it is too big, so I was wondering if there is an easy way to do this in R.

I tried the following:

dat <- sapply(z1, function(x) { x[is.na(x)] <- 9999; x })

and got the following error:

Error in array(unlist(answer, recursive = FALSE), dim = c(common.len,  :
  'dim' specifies too large an array

Thanks for the help,
Stephen
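
P.S. For what it's worth, the sort of thing I was hoping exists is sketched below. It is untested on the real data set; na.fill(), coredata(), and write.zoo() are from the zoo package, and the output file name is just a placeholder.

# untested sketch: fill every NA with 9999 in one step, then write the
# whole zoo object out as csv for MATLAB
library(zoo)
z2 <- na.fill(z1, fill = 9999)
write.zoo(z2, file = "all_9999.csv", sep = ",")

# or, working on the underlying data matrix directly
m <- coredata(z1)
m[is.na(m)] <- 9999     # plain matrix indexing, no sapply needed
coredata(z1) <- m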