Number of objects grows unbounded... Memory leak

2014-05-03 Thread ptb
Hello all,

I'm using Python 3.4 and am seeing the memory usage of my program grow 
unbounded.  Here's a snippet of the loop driving the main computation:

from itertools import product, repeat
import gc

opt_dict = {'interior': cons_dict['int_eq'],
            'lboundary': cons_dict['lboundary'],
            'rboundary': cons_dict['rboundary'],
            'material_props': {'conv': 0.9, 'diff': 0.01},
            'file_ident': ident, 'numeric': True, 'file_set': files}

# this produces roughly 25,000 elements
args = product(zip(repeat(nx[-1]), ib_frac), nx, subs)

for i, arg in enumerate(args):
    my_func(a=arg[0], b=arg[1], c=arg[2], **opt_dict)
    gc.collect()                     # force a full collection every pass
    print(i, len(gc.get_objects()))  # number of objects the GC tracks

A few lines of output:

progress
0 84883
1 95842
2 106655
3 117576
4 128444
5 139309
6 150172
7 161015
8 171886
9 182739
10 193593
11 204455
12 215284
13 226102
14 236922
15 247804
16 258567
17 269386
18 280213
19 291032
20 301892
21 312701
22 323536
23 334391
24 345239
25 356076
26 366923
27 377701
28 388532
29 399321
30 410127
31 420917
32 431732
33 442489
34 453320
35 464147
36 475071
37 485593
38 496068
39 506568
40 517040
41 527531
42 538099
43 548658
44 559205
45 569732
46 580214
47 590655
48 601165
49 611656
50 622179
51 632645
52 643186
53 653654
54 664146
...

As you can see, the number of objects keeps growing, and my memory usage grows 
proportionately.  Also, my_func doesn't return any values; it simply writes 
data to a file.
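
For what it's worth, tallying the live objects by type shows what is actually 
accumulating; a minimal diagnostic sketch (independent of the program above):

import gc
from collections import Counter

gc.collect()
counts = Counter(type(obj).__name__ for obj in gc.get_objects())
print(counts.most_common(10))  # the fastest-growing types point at the leak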

I was under the impression that this sort of thing couldn't happen in Python.  
Can someone explain (1) how this is possible, and (2) how I can fix it?

Hopefully that's enough information.

Thanks for your help,
Peter


Re: Number of objects grows unbounded... Memory leak

2014-05-03 Thread ptb
It turns out one of the libraries I am using has a cache system.  If I shut it 
off, my problem goes away...
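
For anyone who hits the same symptom: anything held by a module-level cache 
stays reachable, so gc.collect() can never reclaim it.  A minimal sketch of 
the effect (the memoized decorator below is my own illustration, not the 
library's actual code):

import gc

_cache = {}  # module-level cache: everything stored here stays reachable

def memoized(func):
    def wrapper(*args):
        if args not in _cache:
            _cache[args] = func(*args)  # one new entry per unseen argument
        return _cache[args]
    return wrapper

@memoized
def work(n):
    return list(range(n))

for i in range(3):
    work(i)
    gc.collect()  # reclaims nothing: the cache still references every result
    print(i, len(gc.get_objects()))  # the object count climbs each iteration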

On Saturday, May 3, 2014 7:15:59 AM UTC-6, ptb wrote:
> [full quote of the original message snipped]



C-API proper initialization and deallocation of subclasses

2014-06-12 Thread ptb
Hello all,

I decided to play around with the C-API and have gotten stuck.  I went through 
the Shoddy example 
(https://docs.python.org/3/extending/newtypes.html#subclassing-other-types) in 
the docs and tried to extend it by adding a function which creates and returns a 
shoddy instance.  I dug around to find ways to allocate and initialize my 
shoddy instance and that seems to work well.  However, I get segfaults when I 
try to delete my instance.  The code is in the gist: 
https://gist.github.com/pbrady/f2daf50761e458bbe44a

The magic happens in the make_a_shoddy function.

Here's a sample session (Python 3.4.1)

>>> from shoddy import make_a_shoddy
>>> shd = make_a_shoddy()
tup build
shd allocated
list style allocation successful
Py_SIZE(list) : 5
Py_SIZE(shoddy) : 5
>>> type(shd)
<class 'shoddy.Shoddy'>
>>> shd[:]
[1, 2, 3, 4, 5]
>>> shd.increment()
1
>>> shd.increment()
2
>>> del shd
Segmentation fault (core dumped)

This happens even if I don't set the destructor.  Any ideas on what I am doing 
wrong?

Thanks,
Peter.

 


Re: C-API proper initialization and deallocation of subclasses

2014-06-13 Thread ptb
While there doesn't appear to be much interest in this question, I thought I 
would post the solution.  I had to modify shoddy by adding the 
Py_TPFLAGS_HAVE_GC flag and tp_traverse/tp_clear methods so that cyclic 
garbage collection is handled properly.  I'm not quite sure why this was 
necessary, since my shoddy instance did not contain any cyclic references, but 
I wonder if it was needed because shoddy subclasses list, which already has 
all the cyclic GC machinery in place.  I suspect I won't fully understand the 
answer, but if someone with more knowledge can clear things up, that would be 
great.
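
My best guess at the why (my reading of CPython's behavior, so take it with a 
grain of salt): list instances are tracked by the cyclic collector, and list's 
deallocator untracks them, so an instance whose type never opted into GC 
tracking crashes when the inherited deallocator runs.  From pure Python the 
machinery is inherited automatically, which a quick check shows:

import gc

# Pure-Python subclasses of list get the cyclic-GC machinery for free;
# a C extension type must opt in with Py_TPFLAGS_HAVE_GC plus
# tp_traverse/tp_clear before the base list deallocator is safe to run.
class PyShoddy(list):
    pass

shd = PyShoddy([1, 2, 3, 4, 5])
print(gc.is_tracked(shd))  # True: the instance is tracked by the collector
del shd                    # deallocation untracks it without a crash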

On Thursday, June 12, 2014 10:39:27 PM UTC-6, ptb wrote:
> [full quote of the original message snipped]



[RELEASE] fastcache

2014-06-25 Thread ptb
Hello all,

I am pleased to announce the release of fastcache v0.1.  It is intended to be a 
drop-in replacement for functools.lru_cache, but it's written in C, so it's 
5-10x faster.  Currently Python >= 3.3 is supported.
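
Usage mirrors the standard decorator; a quick sketch (this assumes the C 
implementation is exposed as clru_cache, as described on the project page):

from fastcache import clru_cache

@clru_cache(maxsize=256)
def fib(n):
    return n if n < 2 else fib(n - 1) + fib(n - 2)

print(fib(100))  # repeat calls with seen arguments hit the C-level cache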

It is available on pip via:

pip install fastcache

Or on github:

https://github.com/pbrady/fastcache.git

More details related to benchmarking and testing can also be found on the 
github page.  

Feel free to direct any issues or requests to the github page.

I look forward to your feedback!

Thanks,
Peter.


Re: [RELEASE] fastcache

2014-06-27 Thread ptb
The 0.2 release is out!  Python versions 2.6, 2.7, 3.2, 3.3, and 3.4 are now 
supported.

On Wednesday, June 25, 2014 12:15:02 PM UTC-6, ptb wrote:
> [full quote of the original message snipped]



Re: JIT compilers for Python, what is the latest news?

2013-04-05 Thread ptb
Have you looked into numba?  I haven't checked to see if it's Python 3 
compatible.
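
For reference, a minimal sketch of what numba usage looks like (the API 
details here are from memory and may differ across versions):

import numpy as np
from numba import jit

# nopython=True compiles to machine code and errors out rather than
# falling back to Python objects.
@jit(nopython=True)
def dot(a, b):
    total = 0.0
    for i in range(a.shape[0]):
        total += a[i] * b[i]
    return total

x = np.arange(1000.0)
print(dot(x, x))  # first call triggers compilation; later calls run at C speed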