You could break the data into chunks: cbind and write, say, 50,000 observations at a time, appending each chunk to the output file. That should be much less taxing on your machine's memory.
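A minimal sketch of that approach, assuming table1 and table2 are the data frames from your example (the column indices 1 and 5 and the chunk size of 50,000 are carried over as illustration):

chunk.size <- 50000
n <- 1000000
for (s in seq(1, n, by = chunk.size)) {
  e <- min(s + chunk.size - 1, n)
  z <- cbind(x = table1[s:e, 1], y = table2[s:e, 5])
  write.table(z, "out.txt",
              append = s > 1,        # append everything after the first chunk
              col.names = s == 1,    # write the header only once
              row.names = FALSE, quote = FALSE)
}

Only one chunk of 50,000 rows is ever held in memory at a time, and write.table(..., append = TRUE) adds each chunk to the end of the file.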
-----Original Message-----
From: r-help-boun...@r-project.org [mailto:r-help-boun...@r-project.org] On Behalf Of Mary Kindall
Sent: Friday, January 06, 2012 12:43 PM
To: r-help@r-project.org
Subject: [R] cbind alternate

I have two one-dimensional lists of elements and want to cbind them and then write the result to a file. There are more than a million entries in both lists, and R is taking a lot of time to perform this operation. Is there an alternate way to perform cbind?

x <- table1[1:1000000, 1]
y <- table2[1:1000000, 5]
z <- cbind(x, y)  # hangs the machine
write.table(z, 'out.txt')

--
-------------
Mary Kindall
Yorktown Heights, NY
USA