On Sun, Jul 18, 2010 at 16:11, Alan Gauld <alan.ga...@btinternet.com> wrote:

> "Richard D. Moores" <rdmoo...@gmail.com> wrote
>
>>>> I earlier reported that my laptop couldn't handle even 800 million.
>>>
>>> What do you mean, "couldn't handle"? Couldn't handle 800 million of
>>> what? Obviously not bytes,
>>
>> I meant what the context implied. Bytes. Look back in this thread to
>> see my description of my laptop's problems.
>
> But you stored those in a list and then joined the list, which meant
> that at one point you actually had two copies of the data, one in the
> list and one in the string - that's >1.6 billion bytes.
>
> And these tests suggest you only get about 2 billion bytes of memory
> to use, which maybe explains why you were pushed to the limit at
> 800 million.
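Alan's point about the temporary duplicate can be demonstrated at a smaller scale (a sketch of my own, not code from the thread; the one-million figure is just illustrative):

```python
import sys

# Build a list of one million one-character strings, then join them.
# During the join, the list AND the finished string exist at the same
# time, so the peak memory footprint is roughly the sum of both --
# the same doubling that bites at the 800-million scale.
chunks = ["7"] * 1_000_000
digits = "".join(chunks)

print(f"list object alone: {sys.getsizeof(chunks):,} bytes")
print(f"joined string:     {sys.getsizeof(digits):,} bytes")
```

On a 64-bit build the list alone costs about 8 bytes per element just for the pointers, before counting the string objects themselves.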
Ah. Maybe it does. Thanks, Alan.

Following Dave Angel's info about the swap file, I took a look at my 64-bit Vista paging file size. The custom setting was 5944 MB. Without really thinking that it would help, I changed the size to 8000 MB and rebooted. I then tried my searching script (<http://tutoree7.pastebin.com/rJ1naffZ>) on the 1 billion random digits file, changing script lines 32 and 48 appropriately. As before, this froze my laptop and I had to do an abrupt, forced shutdown. Don't want to do too many of those.

Now, I don't really care what a billion random digits might contain. The first billion digits of pi, maybe -- but the most I have so far is 100 million. If I can get gmpy to give me 900 million more, then I'll try to figure out how to do your trick of getting one or two small chunks at a time for searching. I do want to learn that stuff (tell, seek, etc.), but I need to follow along in a good elementary book first, I think.

Dick

_______________________________________________
Tutor maillist  -  Tutor@python.org
To unsubscribe or change subscription options:
http://mail.python.org/mailman/listinfo/tutor
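As an aside on getting pi digits without gmpy: for modest digit counts, a pure-Python spigot (Gibbons' unbounded algorithm -- my addition, not something from the thread) produces them one at a time with only integer arithmetic. It is far too slow for hundreds of millions of digits, but it shows the idea:

```python
def pi_digits(n):
    """Return the first n decimal digits of pi as a string, using
    Jeremy Gibbons' unbounded spigot algorithm (integers only)."""
    digits = []
    q, r, t, k, m, x = 1, 0, 1, 1, 3, 3
    while len(digits) < n:
        if 4 * q + r - t < m * t:
            # The next digit is settled; emit it and rescale.
            digits.append(m)
            q, r, m = (10 * q, 10 * (r - m * t),
                       (10 * (3 * q + r)) // t - 10 * m)
        else:
            # Not enough precision yet; consume another term.
            q, r, t, k, m, x = (q * k, (2 * q + r) * x, t * x, k + 1,
                                (q * (7 * k + 2) + r * x) // (t * x), x + 2)
    return "".join(map(str, digits))

print(pi_digits(20))  # 31415926535897932384
```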
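Alan's small-chunks trick could look roughly like this (a sketch of my own, not the pastebin script; the function name and chunk size are made up). Sequential read() calls do the job here -- seek()/tell() would let you jump around instead -- and overlapping chunks by len(target) - 1 characters guards against a match that straddles a chunk boundary:

```python
def find_in_file(path, target, chunk_size=1_000_000):
    """Return the offset of the first occurrence of the string
    `target` in the file at `path`, reading one chunk at a time
    so the whole file never has to fit in memory; -1 if absent."""
    overlap = len(target) - 1
    offset = 0   # characters consumed so far
    carry = ""   # tail of the previous chunk, for boundary matches
    with open(path, "r") as f:
        while True:
            chunk = f.read(chunk_size)
            if not chunk:
                return -1
            buf = carry + chunk
            pos = buf.find(target)
            if pos != -1:
                # `carry` starts at offset - len(carry) in the file.
                return offset - len(carry) + pos
            carry = buf[-overlap:] if overlap else ""
            offset += len(chunk)
```

With a chunk size of a million characters, peak memory stays near 1 MB no matter how large the digits file is.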