Hi Arun,

The second method is indeed much faster; it ran quickly on my 600,000-row file. However, I have two bigger files where processing of the second statement becomes an issue, even though I have lots of memory (32 GB):

res2 <- reshape(dat2, idvar = "newCol", varying = list(2:26), direction = "long")
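(For reference, a toy version of that call; the made-up dat2 below only mirrors the shape implied by the call, i.e. the id column newCol followed by 25 wide columns in positions 2:26.)

# made-up data with the same layout as the real dat2:
# one id column "newCol" plus 25 wide columns in positions 2:26
dat2 <- data.frame(newCol = 1:3,
                   matrix(rnorm(3 * 25), nrow = 3,
                          dimnames = list(NULL, paste0("V", 1:25))))

# the statement that gets slow on the bigger files
res2 <- reshape(dat2, idvar = "newCol",
                varying = list(2:26), direction = "long")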
Would data.table also use less memory? Speeding things up further would be welcome as well; how would I do it? I also think splitting the data frame before the reshape and combining the pieces afterwards might be an option, roughly along the lines of the untested sketch below. Any ideas on that?

Regards,
Dirk
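(Untested sketch of what I was imagining, assuming a data.table version that provides melt(); only "newCol" and the 2:26 column range come from my current call, the chunk size is arbitrary.)

library(data.table)

# wide-to-long with melt() instead of reshape(); same id column and
# the same 2:26 measure columns as in the call above
dt2 <- as.data.table(dat2)
res2_dt <- melt(dt2, id.vars = "newCol", measure.vars = 2:26)

# the splitting idea: process the rows in chunks and stack the
# results with rbindlist(); the chunk size here is arbitrary
chunk_size <- 100000L
idx    <- seq_len(nrow(dt2))
chunks <- split(idx, ceiling(idx / chunk_size))
res2_chunks <- rbindlist(
  lapply(chunks, function(i)
    melt(dt2[i], id.vars = "newCol", measure.vars = 2:26))
)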