Hi,

I'm doing some image processing using PIL and SciPy. Individual images are
2000x2000 pixels at 16 bits per pixel, so a single image is around 7.6 MB
of raw pixel data (2000 x 2000 x 2 bytes).

I've noticed that while my code is running, the amount of memory Python uses
(as reported by Windows Task Manager) gradually increases.  It continues to
grow if I re-run the same code from within IPython (every time I re-run the
code, memory use increases by about 300 MB).  So after re-running the code
several times (while debugging and/or developing) I eventually get a memory
exception (<type 'exceptions.MemoryError'>).
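In case it helps the diagnosis, here is a minimal sketch (my guess at a way
to measure this, using the standard gc module) for checking whether live
Python objects are actually piling up between runs:

import gc

gc.collect()                   # force a full garbage-collection pass
print len(gc.get_objects())    # rough count of objects the collector tracks
print len(gc.garbage)          # objects gc could not free (cycles with __del__)

If those counts climb by a similar amount on every re-run, something is
holding references; if they stay flat while Task Manager keeps growing, the
memory must be held somewhere below the Python object level.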

Does this mean I've somehow created a memory leak?  From what I've read,
creating a true memory leak in Python 2.x is hard to do.  Perhaps I'm just
doing a poor job of coding and am constantly creating new objects that take
up memory?
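The one classic leak pattern I've read about in Python 2.x (treat this as an
assumption on my part, not something I've confirmed in my own code) is a
reference cycle between objects that define __del__, which the cycle
collector refuses to free:

import gc

class Node(object):
    def __del__(self):    # __del__ makes cycles uncollectable in Python 2.x
        pass

a = Node()
b = Node()
a.other = b               # build a reference cycle: a -> b -> a
b.other = a
del a, b                  # drop our names; the cycle remains

gc.collect()
print gc.garbage          # the two Node objects are stranded here

I don't think my code does anything like this, but I mention it in case it
rings a bell.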

My basic code structure is below.  Does anyone have suggestions on how to
improve the efficiency of my code or on how to debug this problem?

Thank you in advance,
Keith

import numpy

def function1(filename):
    image1 = im2array(filename)
    # im2array is a function I wrote to open a TIFF image as a SciPy array
    profile = image1.mean(axis=0)            # average the 2000x2000 2-D array into a 2000-element 1-D array
    # ... perform some scaling on the 1-D array ...
    spectrum = abs(numpy.fft.rfft(profile))  # take the FFT of the 1-D array
    fftpeak = spectrum.argmax()              # location of the strongest frequency peak
    fftmaxamplitude = spectrum.max()         # amplitude at that peak
    result = [fftpeak, fftmaxamplitude]      # output is just these two values
    return result
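For reference, here is a minimal stand-in for im2array that matches its
description above (open a TIFF with PIL, return a SciPy array); treat the
exact PIL calls as an assumption, not my actual code:

import numpy
from PIL import Image

def im2array(filename):
    # hypothetical stand-in: open the TIFF with PIL and convert it
    # to a 16-bit numpy/scipy array via PIL's array interface
    img = Image.open(filename)
    return numpy.asarray(img, dtype=numpy.uint16)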

filenames = ['file1.tif', 'file2.tif', ...]  # about 40 image filenames in this list

results = []
for filename in filenames:
    results.append(function1(filename))
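One diagnostic I plan to try (again, an assumption about where to look, not
a known fix) is forcing a garbage-collection pass after each image and
watching whether Task Manager still climbs:

import gc

results = []
for filename in filenames:
    results.append(function1(filename))
    freed = gc.collect()    # collect after every image; returns unreachable count
    print filename, 'unreachable objects:', freed

If memory still grows with this in place, the references must be held
somewhere longer-lived, e.g. module-level names or values kept alive by
IPython's output history (Out).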





       