Re: [Numpy-discussion] passing arrays between processes

2009-06-14 Thread Bryan Cole
On Sun, 2009-06-14 at 15:50 -0500, Robert Kern wrote:
> On Sun, Jun 14, 2009 at 14:31, Bryan Cole wrote:
> > I'm starting work on an application involving cpu-intensive data
> > processing using a quad-core PC. I've not worked with multi-core systems
> > previously

Re: [Numpy-discussion] passing arrays between processes

2009-06-14 Thread Bryan Cole
> In fact, I should have specified previously: I need to
> deploy on MS-Win. On first glance, I can't see that mpi4py is
> installable on Windows.

My mistake. I see it's included in Enthon, which I'm using.

Bryan

Re: [Numpy-discussion] passing arrays between processes

2009-06-14 Thread Bryan Cole
> You may want to look at MPI, e.g. mpi4py is convenient for this kind of
> work. For numerical work across processes it is close to a de facto
> standard.
>
> It requires an MPI implementation set up on your machine though (but for
> single-machine use this isn't hard to set up, typically
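
As a rough sketch of what the reply describes (not taken from the thread), passing a numpy array between two MPI ranks with mpi4py might look like the following; the array size and tags are made up.

# Minimal mpi4py sketch: rank 0 sends a numpy array to rank 1 using the
# buffer-based Send/Recv, which avoids pickling.
# Run with e.g. "mpiexec -n 2 python this_script.py".
import numpy as np
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank = comm.Get_rank()

if rank == 0:
    data = np.arange(10, dtype=np.float64)
    comm.Send(data, dest=1, tag=0)        # send the raw array buffer
elif rank == 1:
    buf = np.empty(10, dtype=np.float64)  # receiver must pre-allocate
    comm.Recv(buf, source=0, tag=0)
    print(buf)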

[Numpy-discussion] passing arrays between processes

2009-06-14 Thread Bryan Cole
I'm starting work on an application involving cpu-intensive data processing using a quad-core PC. I've not worked with multi-core systems previously and I'm wondering what is the best way to utilise the hardware when working with numpy arrays. I think I'm going to use the multiprocessing package, b
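
For what it's worth, a minimal sketch of the multiprocessing approach mentioned here (the worker function and chunking below are purely illustrative, not from the thread) could look like this.

# Hypothetical sketch: spread CPU-bound numpy work over the cores with a
# process pool. The __main__ guard is required on Windows.
import numpy as np
from multiprocessing import Pool

def process_chunk(chunk):
    # stand-in for the real per-chunk computation
    return np.fft.rfft(chunk).real.sum()

if __name__ == '__main__':
    data = np.random.random((8, 2048))      # 8 independent chunks
    with Pool(processes=4) as pool:         # one worker per core
        results = pool.map(process_chunk, data)
    print(results)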

[Numpy-discussion] Generalised Ufunc list

2009-05-24 Thread Bryan Cole
Which (if any) existing ufuncs support the new generalised looping system? I'm particularly interested in a "vectorised" matrix multiply. BC
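
For readers coming to this thread now: in present-day numpy a "vectorised" matrix multiply can be spelled with matmul (the @ operator) or einsum, both of which broadcast over the leading dimensions. This is a later addition and not necessarily what was available at the time of the post.

# Batched matrix multiply over a stack of matrices, pairwise in one call.
import numpy as np

a = np.random.random((10, 4, 4))
b = np.random.random((10, 4, 4))

c1 = a @ b                              # shape (10, 4, 4)
c2 = np.einsum('nij,njk->nik', a, b)    # equivalent spelling
assert np.allclose(c1, c2)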

Re: [Numpy-discussion] array of matrices

2009-03-31 Thread Bryan Cole
> I think dot will work, though you'll need to work a little bit to get the
> answer:
>
> >>> import numpy as np
> >>> a = np.array([[1,2], [3,4]], np.float)
> >>> aa = np.array([a,a+1,a+2])
> >>> bb = np.array((a*5, a*6, a*7, a*8))
> >>> np.dot(aa, bb).shape
> (3, 2, 4, 2)
> >>> for i, a_ in
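
To make the quoted suggestion concrete (this sketch is not from the thread): when both stacks have the same length, np.dot computes every cross-product aa[i].dot(bb[j]), and the pairwise products are the i == j "diagonal" of that result.

import numpy as np

a = np.array([[1., 2.], [3., 4.]])
aa = np.array([a, a + 1, a + 2])       # three 2x2 matrices
bb = np.array([a * 5, a * 6, a * 7])   # three 2x2 matrices

full = np.dot(aa, bb)                  # shape (3, 2, 3, 2): all cross-products
pairwise = full[np.arange(3), :, np.arange(3), :]   # keep i == j, shape (3, 2, 2)

assert np.allclose(pairwise, np.einsum('nij,njk->nik', aa, bb))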

[Numpy-discussion] array of matrices

2009-03-27 Thread Bryan Cole
I have a number of arrays of shape (N,4,4). I need to perform a vectorised matrix-multiplication between pairs of them, i.e. matrix-multiplication rules for the last two dimensions and the usual element-wise rule for the 1st dimension (of length N). (How) is this possible with numpy? thanks, BC
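
One loop-free spelling of this (a sketch, not the thread's answer) uses broadcasting and a sum over the contracted axis; it works even in numpy versions without einsum or matmul, at the cost of an (N, 4, 4, 4) temporary.

import numpy as np

N = 5
A = np.random.random((N, 4, 4))
B = np.random.random((N, 4, 4))

# broadcast to shape (N, 4, 4, 4) and sum out the contracted axis
C = (A[:, :, :, None] * B[:, None, :, :]).sum(axis=2)

# check against an explicit per-matrix loop
for n in range(N):
    assert np.allclose(C[n], np.dot(A[n], B[n]))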

Re: [Numpy-discussion] TypeError in dtype.__eq__()

2009-01-11 Thread Bryan Cole
> > However, also note
> > that with ndarray's rich comparisons, such membership testing will
> > fail with ndarrays, too.
>
> This poses a similarly big problem. I can't understand this behaviour
> either:

OK, I can now. After equality testing each item, the result must be cast to bool. This is
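
A short illustration of the behaviour being described (not from the thread): == on an ndarray is element-wise, so the comparison returns an array, and the bool() coercion that a membership test performs on it raises.

import numpy as np

a = np.array([1, 2, 3])
print(a == 2)            # array([False,  True, False]), not a single bool

try:
    a in [np.array([1, 2, 3]), np.array([4, 5, 6])]
except ValueError as err:
    print(err)           # "The truth value of an array ... is ambiguous"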

Re: [Numpy-discussion] TypeError in dtype.__eq__()

2009-01-11 Thread Bryan Cole
> > What's the consensus on this? Is the current dtype behaviour broken?
>
> It's suboptimal, certainly. Feel free to fix it.

Thank you.

> However, also note
> that with ndarray's rich comparisons, such membership testing will
> fail with ndarrays, too.

This poses a similarly big problem.

[Numpy-discussion] TypeError in dtype.__eq__()

2009-01-11 Thread Bryan Cole
Dtype objects throw an exception if compared for equality against other objects, e.g.

>>> import numpy
>>> numpy.dtype('uint32')==1
Traceback (most recent call last):
  File "", line 1, in
TypeError: data type not understood
>>>

After some googling, I think python wisdom (given in the Python do
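
The Python convention the post appears to be referring to can be illustrated with a toy class (the class below is made up, not numpy code): __eq__ returns NotImplemented for objects it cannot interpret instead of raising, so == and membership tests simply evaluate to False.

class DTypeLike(object):
    def __init__(self, name):
        self.name = name

    def __eq__(self, other):
        if not isinstance(other, DTypeLike):
            return NotImplemented      # let Python fall back gracefully
        return self.name == other.name

    def __ne__(self, other):
        result = self.__eq__(other)
        if result is NotImplemented:
            return result
        return not result

print(DTypeLike('uint32') == 1)                         # False, no exception
print(DTypeLike('uint32') in [1, DTypeLike('uint32')])  # True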

Re: [Numpy-discussion] Optimization of loops

2008-10-23 Thread Bryan Cole
> > spikes = [(0, 2.3),(1, 5.6),(3, 2.5),(0, 5.2),(3, 10.2),(2, 16.2)]
> > mysort(spikes)
> > should return:
> > [[2.3, 5.2], [5.6], [16.2], [2.5, 10.2]]
>
> Intuitively, the simplest way to do that is to append elements while going
> through all the tuples of the main list (called spikes)
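
One straightforward spelling of the grouping described above (not necessarily the answer given later in the thread) uses a defaultdict of lists.

from collections import defaultdict

def mysort(spikes):
    groups = defaultdict(list)
    for index, time in spikes:
        groups[index].append(time)    # append preserves the original order
    return [groups[i] for i in range(max(groups) + 1)]

spikes = [(0, 2.3), (1, 5.6), (3, 2.5), (0, 5.2), (3, 10.2), (2, 16.2)]
print(mysort(spikes))   # [[2.3, 5.2], [5.6], [16.2], [2.5, 10.2]]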

Re: [Numpy-discussion] question about optimizing

2008-05-17 Thread Bryan Cole
> From the response, the answer seems to be no, and that I should stick
> with the python loops for clarity. But also, the words of Anne
> Archibald make me think that I have made a bad choice by inheriting
> from ndarray, although I am not sure what a convenient alternative
> would be.
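
The "convenient alternative" usually suggested is composition rather than subclassing; the sketch below is purely illustrative (the class and method names are made up) of holding the array as an attribute and delegating to it.

import numpy as np

class SpikeTrain(object):
    def __init__(self, times):
        self.times = np.asarray(times, dtype=float)   # the wrapped array

    def rate(self, duration):
        return len(self.times) / float(duration)

    def __getitem__(self, idx):
        return self.times[idx]                        # delegate indexing

st = SpikeTrain([0.1, 0.5, 2.3])
print(st.rate(10.0), st[1])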

Re: [Numpy-discussion] very simple iteration question.

2008-05-03 Thread Bryan Cole
On Wed, 2008-04-30 at 21:09 +0200, Gael Varoquaux wrote:
> On Wed, Apr 30, 2008 at 11:57:44AM -0700, Christopher Barker wrote:
> > I think I still like the idea of an iterator (or maybe making rollaxis a
> > method?), but this works pretty well.
>
> Generally, in object oriented programming, you
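
For context, the idiom under discussion is roughly the following (a sketch, not from the thread): numpy iterates over the first axis, so rollaxis is used to bring the axis of interest to the front before looping.

import numpy as np

a = np.arange(24).reshape(2, 3, 4)

for plane in np.rollaxis(a, 2):      # iterate along axis 2
    print(plane.shape)               # (2, 3), printed 4 times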

Re: [Numpy-discussion] Loading a > GB file into array

2007-11-30 Thread Bryan Cole
> Well, one thing you could do is dump your data into a PyTables_
> ``CArray`` dataset, which you may afterwards access as if it was a
> NumPy array to get slices which are actually NumPy arrays. PyTables
> datasets have no problem in working with datasets exceeding memory size.
> For instanc
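
A rough sketch of the suggestion, using the modern PyTables API (the 2007-era spelling was openFile/createCArray); the file name and shape are made up.

import numpy as np
import tables

with tables.open_file('big_data.h5', mode='w') as f:
    carr = f.create_carray(f.root, 'data',
                           atom=tables.Float64Atom(),
                           shape=(1000000, 128))     # lives on disk, not in RAM
    carr[:10] = np.random.random((10, 128))          # write a chunk

with tables.open_file('big_data.h5', mode='r') as f:
    chunk = f.root.data[5:8]     # slicing returns an ordinary numpy array
    print(type(chunk), chunk.shape)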

[Numpy-discussion] memory leak w/ numpy & Numeric together

2007-03-21 Thread Bryan Cole
I'm not sure where best to post this, but I get a memory leak when using code with both numpy and FFT (from Numeric) together, e.g.

>>> import numpy
>>> import FFT
>>> def test():
...     while 1:
...         data = numpy.random.random(2048)
...         newdata = FFT.real_fft(data)
>>> test()

and m
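
One obvious workaround (not necessarily what the thread concluded) is to stay within numpy and use its own FFT instead of Numeric's FFT module, e.g.

import numpy

def test(n_iterations=1000):
    for _ in range(n_iterations):
        data = numpy.random.random(2048)
        newdata = numpy.fft.rfft(data)   # numpy's equivalent of FFT.real_fft
    return newdata

test()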

Re: [Numpy-discussion] the neighbourhood of each element of an array

2007-02-23 Thread Bryan Cole
On Fri, 2007-02-23 at 17:38 +0100, [EMAIL PROTECTED] wrote:
> Hi,
>
> Given a (possibly masked) 2d array x, is there a fast(er) way in Numpy to
> obtain the same result as the following few lines?
>
> d = 1 # neighbourhood 'radius'
> Nrow = x.shape[0]
> Ncol =
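
The quoted code is cut off, so only the intent can be guessed at; assuming something like a mean over the (2*d+1) x (2*d+1) neighbourhood of each element, scipy.ndimage gives a loop-free version (note this sketch ignores the mask of a masked array).

import numpy as np
from scipy import ndimage

x = np.random.random((100, 100))
d = 1                                        # neighbourhood 'radius'

neigh_mean = ndimage.uniform_filter(x, size=2 * d + 1, mode='constant')
print(neigh_mean.shape)                      # (100, 100)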