Re: [Numpy-discussion] numpy.any segfaults for large object arrays

2008-03-24 Thread Stéfan van der Walt
Hi Martin, please file a bug on the Trac page: http://projects.scipy.org/scipy/numpy. You may mark memory errors as blockers for the next release. Confirmed under latest SVN. Thanks, Stéfan. On Mon, Mar 24, 2008 at 2:05 PM, Martin Manns <[EMAIL PROTECTED]> wrote: > Hello, I am encountering a p

Re: [Numpy-discussion] numpy.any segfaults for large object arrays

2008-03-24 Thread Joris De Ridder
On 24 Mar 2008, at 18:27, Martin Manns wrote: >> I cannot confirm the problem on my intel macbook pro using the same Python and Numpy versions. Although any(numpy.array(large_none)) takes a significantly longer time than any(numpy.array(large_zero)), the former does not segfault on

Re: [Numpy-discussion] numpy.any segfaults for large object arrays

2008-03-24 Thread Charles R Harris
On Mon, Mar 24, 2008 at 2:00 PM, Bruce Southey <[EMAIL PROTECTED]> wrote: > Hi, True, I noticed on my system (with 8 GB memory) that using works but not 1. Also, a two-dimensional array crashes if the size is large enough: large_m = numpy.vstack((large_none, large_n

Re: [Numpy-discussion] numpy.any segfaults for large object arrays

2008-03-24 Thread Bruce Southey
Hi, True, I noticed on my system (with 8 GB memory) that using works but not 1. Also, a two-dimensional array crashes if the size is large enough: large_m = numpy.vstack((large_none, large_none)) Bruce Martin Manns wrote: > Bruce Southey <[EMAIL PROTECTED]> wrote: > Hi, >
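
A minimal sketch of the two-dimensional case Bruce describes (the array length and the final .any() call are assumptions for illustration; his exact values are truncated in the archive):

    import numpy

    N = 10**7  # illustrative size only
    large_none = numpy.array([None] * N, dtype=object)

    # Stacking two copies gives a (2, N) object array; reducing it with any()
    # was the call reported to crash numpy 1.0.4 when N is large enough.
    large_m = numpy.vstack((large_none, large_none))
    print(large_m.shape)
    print(large_m.any())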

Re: [Numpy-discussion] numpy.any segfaults for large object arrays

2008-03-24 Thread Martin Manns
Bruce Southey <[EMAIL PROTECTED]> wrote: > Hi, This also crashes with numpy 1.0.4 under Python 2.5.1. I am guessing it may be due to numpy.any() probably not understanding None. I doubt that, because I get the segfault for all kinds of object arrays that I try out: ~$ python Python 2.4.5
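
Martin's point that the crash is not specific to None can be illustrated with a sketch like the following (the particular fill objects and the array size are hypothetical; his actual test cases are cut off in the archive):

    import numpy

    N = 10**6  # illustrative size only
    for fill in (None, "x", {}, object()):
        # Each of these produces a large object-dtype array.
        arr = numpy.array([fill] * N, dtype=object)
        print(type(fill).__name__, numpy.any(arr))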

Re: [Numpy-discussion] numpy.any segfaults for large object arrays

2008-03-24 Thread Bruce Southey
Hi, This also crashes with numpy 1.0.4 under Python 2.5.1. I am guessing it may be due to numpy.any() probably not understanding None. Bruce Martin Manns wrote: >> On 24 Mar 2008, at 14:05, Martin Manns wrote: >>> Hello, I am encountering a problem (a bug?) with the numpy

Re: [Numpy-discussion] numpy.any segfaults for large object arrays

2008-03-24 Thread Martin Manns
> On 24 Mar 2008, at 14:05, Martin Manns wrote: >> Hello, I am encountering a problem (a bug?) with the numpy any function. Since the python any function behaves in a slightly different way, I would like to keep using numpy's. > I cannot confirm the problem on my intel macbo

Re: [Numpy-discussion] numpy.any segfaults for large object arrays

2008-03-24 Thread Joris De Ridder
I cannot confirm the problem on my intel macbook pro using the same Python and Numpy versions. Although any(numpy.array(large_none)) takes a significantly longer time than any(numpy.array(large_zero)), the former does not segfault on my machine. J. On 24 Mar 2008, at 14:05, Martin Manns
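
The timing difference Joris mentions presumably comes from large_none becoming an object-dtype array (any() has to truth-test each Python object) while large_zero becomes a plain integer array. A rough way to see it, assuming list-of-None and list-of-zero inputs of an arbitrary size:

    import numpy, time

    N = 10**6  # illustrative size only
    for name, data in (("large_zero", [0] * N), ("large_none", [None] * N)):
        arr = numpy.array(data)
        t0 = time.time()
        result = numpy.any(arr)
        print(name, arr.dtype, result, "%.3f s" % (time.time() - t0))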

[Numpy-discussion] numpy.any segfaults for large object arrays

2008-03-24 Thread Martin Manns
Hello, I am encountering a problem (a bug?) with the numpy any function. Since the python any function behaves in a slightly different way, I would like to keep using numpy's. Here is the problem: $ python Python 2.5.1 (r251:54863, Jan 26 2008, 01:34:00) [GCC 4.1.2 (Gentoo 4.1.2)] on linux2 Typ
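
A sketch of the reported reproduction, assuming list inputs like the large_none and large_zero names used later in the thread (the size and exact calls are assumptions; Martin's interpreter session is truncated in the archive):

    import numpy

    N = 10**7  # illustrative size only
    large_zero = [0] * N     # numpy.array() turns this into an integer array
    large_none = [None] * N  # numpy.array() turns this into an object-dtype array

    print(numpy.any(numpy.array(large_zero)))  # False
    # The next call is the one reported to segfault with numpy 1.0.4:
    print(numpy.any(numpy.array(large_none)))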