On 16/feb/2011, at 00:04, numpy-discussion-requ...@scipy.org wrote:
>
> I'm sorry I don't have any example code for you, but you
> probably need to break the problem down if you can't fit it into
> memory: http://en.wikipedia.org/wiki/Overlap-add_method
>
> Jonathan
Thanks! You saved my day!
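
For the archives, here is a rough sketch of the overlap-add idea. The block size, the function name, and the assumption that the kernel h is much shorter than the signal x are all mine, not from Jonathan's link; recent SciPy also ships this as scipy.signal.oaconvolve.

import numpy as np

def overlap_add_convolve(x, h, block_size=2**20):
    # Convolve a long signal x with a much shorter kernel h block by
    # block, so only one FFT-sized chunk is processed at a time.
    m = len(h)
    n_fft = block_size + m - 1           # linear-convolution length per block
    H = np.fft.rfft(h, n_fft)            # kernel spectrum, computed once
    out = np.zeros(len(x) + m - 1)
    for start in range(0, len(x), block_size):
        block = x[start:start + block_size]
        seg = np.fft.irfft(np.fft.rfft(block, n_fft) * H, n_fft)
        # overlapping tails of adjacent blocks add up in the accumulator
        out[start:start + len(block) + m - 1] += seg[:len(block) + m - 1]
    return out

The result matches np.convolve(x, h) in 'full' mode, but the FFT work arrays are only block-sized (for truly huge outputs the accumulator out could itself be memory-mapped).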
Hi all,
I have to work with huge numpy arrays (up to 250 M elements each) and I have to
compute either np.correlate or np.convolve between them.
The process only runs on big-memory machines, and it takes ages. I'm writing
to get some hints on how to speed things up (at the cost of some precision, maybe...)
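
For concreteness, a minimal sketch of the usual FFT route (the array sizes below are placeholders, not the real data): direct np.convolve is O(N*M), while scipy.signal.fftconvolve is O(N log N), and correlation can be written as convolution with the reversed, conjugated second array.

import numpy as np
from scipy.signal import fftconvolve

a = np.random.rand(10**7)   # placeholders for the real huge arrays
b = np.random.rand(10**6)

conv = fftconvolve(a, b, mode='full')               # FFT-based, much faster
corr = fftconvolve(a, b[::-1].conj(), mode='full')  # correlation via convolution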
Hi all,
Is there a fast numpy way to find the peak boundaries in a (looong, millions of
points) smoothed signal? I've found some approaches, like this:
import numpy as np   # data is assumed to be the 1-D smoothed signal

z = data[1:-1]   # every interior point
l = data[:-2]    # its left neighbour
r = data[2:]     # its right neighbour
f = np.greater(z, l) & np.greater(z, r)   # True at strict local maxima
boundaries = np.nonzero(f)[0] + 1         # +1 maps back to indices into data
but it is too sensitive... it detects every little bump in the noise as a peak.
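
One common fix, sketched here with placeholder data, is to require a minimum peak prominence via scipy.signal.find_peaks (the 0.5 threshold below is an illustrative value to tune, not something from the original signal):

import numpy as np
from scipy.signal import find_peaks

x = np.linspace(0, 100, 10**6)
data = np.sin(x) + 0.1 * np.random.randn(x.size)   # stand-in noisy signal

# keep only peaks that rise at least `prominence` above their surroundings
peaks, props = find_peaks(data, prominence=0.5)

peaks holds the indices of the surviving maxima; smoothing the signal more aggressively before the neighbour comparison also helps.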