Hi R user,
I am trying to fit a logistic regression model for each of 4001 dependent
variables using 15 environmental variables (45,000 rows), and then to use the
fitted models to predict under future scenarios. I used the code below, but it
ran for a very long time and consumed 100% of the PC's memory, and even then
the analysis did not finish. I got the following message:
"Reached total allocation of 8098Mb: see help(memory.size)". I increased the
memory limit to 8 GB, but I still could not complete the analysis.
Any suggestions for reducing memory use and getting this large data set
through the computation would be much appreciated.

#------------------------------------------------------------------
models <- list()
PredictModelsCur <- list()
PredictModelsA1 <- list()
PredictModelsA2 <- list()
PredictModelsA3 <- list()
dvnames <- paste("V", 2:4003, sep = "")                  # dependent variable columns
ivnames <- paste("env", 1:15, sep = "", collapse = "+")  # env1 + env2 + ... + env15

for (y in dvnames) {
     form <- formula(paste(y, "~", ivnames))
     models[[y]] <- glm(form, data = spec.Env, family = "binomial")
     # fitted values under current conditions
     PredictModelsCur[[y]] <- predict(models[[y]], type = "response")
     # future scenarios: predict.glm() takes newdata =, not data =
     PredictModelsA1[[y]] <- predict(models[[y]], newdata = a1.Futute,
                                     type = "response")
     PredictModelsA2[[y]] <- predict(models[[y]], newdata = a2.Futute,
                                     type = "response")
     PredictModelsA3[[y]] <- predict(models[[y]], newdata = a3.Futute,
                                     type = "response")
}

write.csv(PredictModelsCur, "PredictModelsCur.csv", row.names=F)
write.csv(PredictModelsA1, "PredictModelsA1.csv", row.names=F)
write.csv(PredictModelsA2, "PredictModelsA2.csv", row.names=F)
write.csv(PredictModelsA3, "PredictModelsA3.csv", row.names=F)
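
One thing I have been sketching (not sure it is the right fix, and it assumes
the same spec.Env, a1.Futute, a2.Futute and a3.Futute data frames as above) is
to avoid keeping 4000+ fitted glm objects in a list: each one stores a copy of
the model frame, which seems to be what eats the memory. Fitting with
model = FALSE, x = FALSE, y = FALSE, predicting straight away and then
discarding the fit keeps only the prediction vectors:

#------------------------------------------------------------------
dvnames <- paste("V", 2:4003, sep = "")
ivnames <- paste("env", 1:15, sep = "", collapse = "+")

PredictModelsCur <- list()
PredictModelsA1 <- list()
PredictModelsA2 <- list()
PredictModelsA3 <- list()

for (y in dvnames) {
     form <- formula(paste(y, "~", ivnames))
     # don't keep copies of the model frame, design matrix or response
     fit <- glm(form, data = spec.Env, family = "binomial",
                model = FALSE, x = FALSE, y = FALSE)
     PredictModelsCur[[y]] <- predict(fit, type = "response")
     PredictModelsA1[[y]] <- predict(fit, newdata = a1.Futute, type = "response")
     PredictModelsA2[[y]] <- predict(fit, newdata = a2.Futute, type = "response")
     PredictModelsA3[[y]] <- predict(fit, newdata = a3.Futute, type = "response")
     rm(fit)    # keep only the predictions, not the fitted models
}
#------------------------------------------------------------------

If even the prediction lists get too big, I suppose each column could be
appended to the CSV files inside the loop instead, or maybe packages like
speedglm or biglm would help, but I have not tried them.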

                                          