At 11:29 AM 10/12/2007, Alan Gauld wrote:
>"Dick Moores" <[EMAIL PROTECTED]> wrote
>
> >>But that's not an efficient approach if you have a big list; the
> >>best route is to build a new list, preferably using a list
> >>comprehension.
> >
> > Alan, here's a test I made up. It doesn't show your contention is
> > correct, but I imagine you or someone else will show me I don't know
> > what I'm doing. :-)
>
>Efficiency isn't always measured in terms of speed!
>
>Think about the memory resources. Imagine you have a list
>of a few million entries. With your approach you make
>a copy (doubling the memory usage) and then delete the ones
>you don't need - possibly most of them. But with an LC you
>start with the original list and build a new list containing only
>the elements you need. Thus the maximum memory use is
>the final product.
>
>The reason that's significant is that on most OSes, processes
>don't release memory back to the OS until after they die.
>So your approach will leave your program consuming
>twice the size of the list forever (until it dies), whereas
>the other approach only uses List1 + List2...
>
>Of course, for
>1) a short-lived process on a PC
>2) with lots of RAM and
>3) a single user
>
>or for small lists,
>
>that won't make much difference.
>But if any of those conditions is not true, it could be significant.
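[A sketch of the two approaches Alan contrasts — not Dick's actual test script, which isn't shown here. The filter condition (keeping even numbers) and the use of `tracemalloc` to observe peak allocation are my own illustrative choices:]

```python
import tracemalloc

def filter_by_copy(data):
    # Copy-then-delete: peak memory briefly holds the original
    # list PLUS a full copy before any deletions happen.
    result = data[:]
    for i in range(len(result) - 1, -1, -1):  # iterate backwards so
        if result[i] % 2:                     # deletions don't shift
            del result[i]                     # the indices still to visit
    return result

def filter_by_comprehension(data):
    # List comprehension: the new list only ever holds the kept
    # elements, so the peak is original + survivors, not 2x original.
    return [x for x in data if not x % 2]

def peak_bytes(func, data):
    # Measure the peak additional allocation made by one call.
    tracemalloc.start()
    func(data)
    _, peak = tracemalloc.get_traced_memory()
    tracemalloc.stop()
    return peak

data = list(range(20_000))
print("copy-then-delete peak:", peak_bytes(filter_by_copy, data), "bytes")
print("comprehension peak:   ", peak_bytes(filter_by_comprehension, data), "bytes")
```

Both functions return the same list; the difference Alan is pointing at is the transient high-water mark, which the copy-then-delete version pushes to roughly twice the size of the data before it starts shrinking.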
Aha! See, I figured you'd tell me I was all wet. Thanks, Alan.

Dick

_______________________________________________
Tutor maillist  -  Tutor@python.org
http://mail.python.org/mailman/listinfo/tutor