Re: [Numpy-discussion] Porting numpy to Python3

2011-02-25 Thread Algis Kabaila
On Saturday 26 February 2011 02:58:19 Bruce Southey wrote: > On 02/25/2011 02:01 AM, Algis Kabaila wrote: > > I just built numpy and scipy from source so I do not know how > you get Python 3 or which Ubuntu versions include recent > numpy versions (there is an upcoming release that will > probably

Re: [Numpy-discussion] Memory error with numpy.loadtxt()

2011-02-25 Thread Chris Colbert
On Fri, Feb 25, 2011 at 12:52 PM, Joe Kington wrote: > Do you expect to have very large integer values, or only values over a > limited range? > > If your integer values will fit into the 16-bit range (or even 32-bit, if > you're on a 64-bit machine, the default dtype is float64...) you can > pote

Re: [Numpy-discussion] Memory error with numpy.loadtxt()

2011-02-25 Thread Joe Kington
Do you expect to have very large integer values, or only values over a limited range? If your integer values will fit into the 16-bit range (or even 32-bit, if you're on a 64-bit machine, the default dtype is float64...) you can potentially halve your memory usage. I.e. something like: data = nump
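A minimal sketch of the dtype trick being suggested (the in-memory strings below stand in for the real file): passing a smaller integer dtype to numpy.loadtxt cuts each element from 8 bytes (float64, the default) to 2 bytes (int16), assuming every value fits in that range.

```python
import io
import numpy as np

# Toy stand-in for the real text file.
text = "1 2 3\n4 5 6\n"

# Default parse: every value becomes a float64 (8 bytes each).
data64 = np.loadtxt(io.StringIO(text))

# Same data parsed as int16 (2 bytes each): a quarter of the memory.
data16 = np.loadtxt(io.StringIO(text), dtype=np.int16)

print(data64.nbytes, data16.nbytes)  # int16 array uses 4x less memory
```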

[Numpy-discussion] When memory access is a bottleneck

2011-02-25 Thread Keith Goodman
A topic that often comes up on the list is that arr.sum(axis=-1) is faster than arr.sum(axis=0). For C ordered arrays, moving along the last axis moves the smallest amount in memory. And moving small amounts in memory keeps the data in cache longer. Can I use that fact to speed up calculations alon
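A sketch of the idea being floated (not a benchmark, and whether it actually wins depends on array size, the copy cost, and NumPy's internal reduction strategy): to sum along axis 0 of a C-ordered array with contiguous memory access, sum the Fortran-ordered copy instead, where axis 0 is the fast axis. Both give identical results.

```python
import numpy as np

rng = np.random.default_rng(0)
arr = rng.random((1000, 1000))  # C-ordered: axis -1 is contiguous in memory

# Strided access: summing down columns jumps 1000*8 bytes between reads.
col_sums = arr.sum(axis=0)

# Contiguous access: in the Fortran-ordered copy, axis 0 is the fast axis.
# The copy itself costs time and memory, so this only pays off for
# repeated reductions over the same array.
col_sums_f = np.asfortranarray(arr).sum(axis=0)
```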

Re: [Numpy-discussion] Porting numpy to Python3

2011-02-25 Thread Bruce Southey
On 02/25/2011 02:01 AM, Algis Kabaila wrote: > On Friday 25 February 2011 18:54:13 Scott Sinclair wrote: >> On 25 February 2011 06:22, Algis Kabaila > wrote: >>> On Friday 25 February 2011 14:44:07 Algis Kabaila wrote: >>> PS: a little investigation shows that my version of numpy >>> is 1.3.0 and

[Numpy-discussion] Memory error with numpy.loadtxt()

2011-02-25 Thread Jaidev Deshpande
Hi Is it possible to load a text file 664 MB large with integer values and about 98% sparse? numpy.loadtxt() raises a MemoryError. If it's not possible, what alternatives could I have? The usable RAM on my machine running Windows 7 is 3.24 GB. Thanks.
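For a file that is ~98% zeros, one alternative (a sketch, not from the thread) is to stream the file line by line and keep only the nonzero entries in a scipy.sparse matrix, so memory scales with the nonzeros rather than the full dense size. The in-memory string below stands in for the real 664 MB file.

```python
import io
import numpy as np
from scipy import sparse

# Toy stand-in for the real whitespace-delimited file.
text = io.StringIO("0 0 5 0\n0 0 0 0\n7 0 0 0\n")

rows, cols, vals = [], [], []
n_rows, n_cols = 0, 0
for i, line in enumerate(text):          # one line at a time: O(row) memory
    tokens = line.split()
    n_rows, n_cols = i + 1, len(tokens)
    for j, tok in enumerate(tokens):
        v = int(tok)
        if v:                            # store only the ~2% nonzeros
            rows.append(i)
            cols.append(j)
            vals.append(v)

mat = sparse.coo_matrix((vals, (rows, cols)),
                        shape=(n_rows, n_cols), dtype=np.int32)
```

Converting to CSR afterwards (`mat.tocsr()`) would make row slicing and arithmetic efficient if the data is used that way.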

Re: [Numpy-discussion] Condensing array...

2011-02-25 Thread Fred
Gaël, Olivier, I finally got it working. I don't compute the nearest value but the mean. Works like a charm ;-) Thanks anyway. Cheers, -- Fred

Re: [Numpy-discussion] Condensing array...

2011-02-25 Thread Olivier Grisel
2011/2/25 Gael Varoquaux : > On Fri, Feb 25, 2011 at 10:36:42AM +0100, Fred wrote: >> I have a big array (44 GB) I want to decimate. > >> But this array has a lot of NaN (only 1/3 has value, in fact, so 2/3 of >> NaN). > >> If I "basically" decimate it (a la NumPy, ie data[::nx, ::ny, ::nz], for >>

Re: [Numpy-discussion] Condensing array...

2011-02-25 Thread Gael Varoquaux
On Fri, Feb 25, 2011 at 10:52:09AM +0100, Fred wrote: > > What exactly do you mean by 'decimating'. To me it seems that you are > > looking for matrix factorization or matrix completion techniques, which > > are trendy topics in machine learning currently. > By decimating, I mean this: > input arr

Re: [Numpy-discussion] Condensing array...

2011-02-25 Thread Fred
On 25/02/2011 10:42, Gael Varoquaux wrote: > What exactly do you mean by 'decimating'. To me it seems that you are > looking for matrix factorization or matrix completion techniques, which > are trendy topics in machine learning currently. By decimating, I mean this: input array data.shape = (

Re: [Numpy-discussion] Condensing array...

2011-02-25 Thread Gael Varoquaux
On Fri, Feb 25, 2011 at 10:36:42AM +0100, Fred wrote: > I have a big array (44 GB) I want to decimate. > But this array has a lot of NaN (only 1/3 has value, in fact, so 2/3 of > NaN). > If I "basically" decimate it (a la NumPy, ie data[::nx, ::ny, ::nz], for > instance), the decimated array wi

[Numpy-discussion] Condensing array...

2011-02-25 Thread Fred
Hi there, I have a big array (44 GB) I want to decimate. But this array has a lot of NaN (only 1/3 has value, in fact, so 2/3 of NaN). If I "basically" decimate it (a la NumPy, ie data[::nx, ::ny, ::nz], for instance), the decimated array will also have a lot of NaN. What I would like to have
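A sketch of the NaN-aware decimation that thread converges on (block averaging with numpy.nanmean, as Fred later reports using the mean; the helper name is hypothetical): instead of plain striding `data[::nx, ::ny, ::nz]`, which keeps whatever NaNs land on the stride, each (nx, ny, nz) block is reduced to the mean of its non-NaN values.

```python
import numpy as np

def nan_decimate(data, nx, ny, nz):
    """Reduce a 3-D array by block factors (nx, ny, nz), averaging
    the non-NaN values in each block. A block that is all-NaN still
    yields NaN (and numpy emits a RuntimeWarning for it)."""
    sx, sy, sz = (s // n for s, n in zip(data.shape, (nx, ny, nz)))
    # Trim edges so each axis divides evenly into blocks.
    trimmed = data[:sx * nx, :sy * ny, :sz * nz]
    blocks = trimmed.reshape(sx, nx, sy, ny, sz, nz)
    return np.nanmean(blocks, axis=(1, 3, 5))
```

For a genuinely 44 GB array this would have to be applied chunk by chunk (e.g. via numpy.memmap) rather than in one call, but the per-block reduction is the same.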

Re: [Numpy-discussion] Problems with numpy

2011-02-25 Thread Mustapha BOUIKHIF
Pauli Virtanen wrote: > Thu, 24 Feb 2011 16:47:14 +, Pauli Virtanen wrote: >> Another possible reason is that Numpy was installed wrong (as the >> numpy.__config__ module is apparently missing). Numpy needs to be >> installed via "python setup.py install", manually copying the "numpy" >> direct

Re: [Numpy-discussion] Porting numpy to Python3

2011-02-25 Thread Algis Kabaila
On Friday 25 February 2011 18:54:13 Scott Sinclair wrote: > On 25 February 2011 06:22, Algis Kabaila wrote: > > On Friday 25 February 2011 14:44:07 Algis Kabaila wrote: > > PS: a little investigation shows that my version of numpy > > is 1.3.0 and scipy is 0.7.2 - so Ubuntu binaries are way > > b