Hi, I am new to Python. I work in computational biology and have to deal with huge text files. I know how to read a text file line by line; what I want to know is the best method in *python3* to load the entire file into RAM and do the operations there (since this saves time). I currently load my text file like this:
    import io

    f = open("output.txt")
    content = io.StringIO(f.read())
    f.close()

But I have found that this method uses roughly four times the size of the text file: if output.txt is 1 GB, the total RAM usage of the code is approx 3.5 GB :( . Kindly suggest a better way to do this.

Working on Python 3.3.1, Ubuntu 13.04 (Linux 3.8.0-29-generic x64).

Thanks
-- 
*AMAL THOMAS*
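For reference, a minimal sketch of the usual alternative: reading the file once into a single str and dropping the StringIO wrapper, which avoids holding a second copy of the data. The demo file created here is a stand-in for the poster's actual output.txt, so the sketch runs standalone:

    import os
    import tempfile

    # Create a small throwaway file so the sketch is self-contained;
    # in practice this would be the real "output.txt".
    fd, path = tempfile.mkstemp(suffix=".txt")
    with os.fdopen(fd, "w") as f:
        f.write("line1\nline2\nline3\n")

    # Read the whole file into ONE str. io.StringIO(f.read()) first
    # builds this same str and then copies it again into the StringIO
    # buffer, so skipping StringIO halves the peak string storage.
    with open(path) as f:
        content = f.read()

    # splitlines() gives the same line-by-line view a StringIO would:
    lines = content.splitlines()
    print(lines)  # ['line1', 'line2', 'line3']

    os.remove(path)

Whether this alone gets RAM usage down to the file size depends on the workload; if the lines are only ever scanned once, iterating over the open file object directly is usually both faster and far lighter on memory.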
_______________________________________________
Tutor maillist - Tutor@python.org
To unsubscribe or change subscription options:
https://mail.python.org/mailman/listinfo/tutor