The grow_array class is interesting. I think that what I have for now is
sufficient, but I will keep your offer in mind. :)
--Slaunger
Skipper, I have used memmap before, and this may work, but the number of
elementary AND operations needed (although hidden under the hood of a chunked
logical_and) will still be about a factor of 1000 larger than what is actually
needed, due to the sparsity in the "roots" of the logical functions I act on.
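To make the sparsity argument concrete, here is a small, purely illustrative
sketch of what expanding a change-point representation into a dense boolean
array costs. The values, the domain size, and the start-at-False toggle
convention are my assumptions, not taken from the original problem:

import numpy as np

# Hypothetical change-point representation: the function starts at False
# and toggles at each listed index.
changes_at = np.array([2, 3, 39, 41, 93, 102])   # illustrative values
length = 100_000                                  # illustrative domain size

# Dense expansion touches every one of the `length` positions:
delta = np.zeros(length, dtype=np.int8)
delta[changes_at] = 1
dense = (np.cumsum(delta) % 2).astype(bool)   # True where an odd number of
                                              # change points lie at or below x

# A chunked logical_and over arrays like `dense` therefore performs on the
# order of `length` elementary AND operations, even though only
# len(changes_at) positions carry any information.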
Jaime Fernández del Río wrote
> On Wed, Mar 26, 2014 at 1:28 PM, Slaunger <Slaunger@...> wrote:
>
> See if you can make sense of the following. It is a little cryptic, but it
> works:
>
> f_change = np.array([2, 3, 39, 41, 58, 59, 65, 66, 93, 102, 145])
>
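For readers trying to decode the hint above: here is one way an AND of two
such functions can be computed directly on the change-point arrays. This is
my own reconstruction of the general idea, not the code from the full quoted
message, and it assumes each function starts at False and toggles at every
index in its sorted, duplicate-free change array:

import numpy as np

def value_at(change, x):
    # Value of a toggling boolean function (starting at False) at positions x:
    # True wherever an odd number of change points lie at or below x.
    return np.searchsorted(change, x, side='right') % 2 == 1

def and_change(f_change, g_change):
    # Change points of (f AND g), given the change points of f and g.
    candidates = np.union1d(f_change, g_change)   # the only places the result can flip
    h = value_at(f_change, candidates) & value_at(g_change, candidates)
    h_prev = np.concatenate(([False], h[:-1]))    # value just before each candidate
    return candidates[h != h_prev]

# and_change(np.array([2, 3, 39]), np.array([2, 94])) -> array([ 2,  3, 39, 94])

The work scales with the number of change points rather than with the length
of the expanded arrays, which is exactly the factor-1000 saving mentioned
earlier in the thread.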
jseabold wrote
> IIUC,
>
> In [1]: np.logical_and([True, False, True], [False, False, True])
> Out[1]: array([False, False,  True], dtype=bool)
>
> You can avoid looping over k since they're all the same length
>
> In [3]: np.logical_and([[True, False], [False, True], [False, True]],
>         [[Fals…
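Building on the quoted suggestion: if the k boolean arrays all have the same
length, they can be stacked into a single 2-D array and reduced in one call,
so the Python-level loop over k disappears entirely (the arrays below are
made-up placeholders):

import numpy as np

masks = np.array([[True, False, True],
                  [False, False, True],
                  [True,  True,  True]])      # k = 3 same-length masks

combined = np.logical_and.reduce(masks, axis=0)
# combined -> array([False, False,  True]); a column-wise AND over all k masks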
Jaidev Deshpande wrote
> Can you provide a link to the problem itself?
>
> --
> JD
I'd rather not state the problem number, since it should not be too easy to
search for it and find this thread, but I can state that, at the time being,
it is the problem with the highest problem number (released most recently).

…without expanding it into the full arrays?
I have tried looping over each element in the changes_at arrays and building
up the sums, but that is too inefficient, as I then have an inner for loop
containing conditional branching code.
Thanks in advance, Slaunger
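For comparison, a rough sketch of the kind of per-element loop being ruled
out above; the names changes_at and length are hypothetical, and the
start-at-False toggle convention is again my assumption:

import numpy as np

def expand_slow(changes_at, length):
    # Expand one change-point array into a dense boolean array, one element
    # at a time, flipping state whenever a change index is reached.
    full = np.zeros(length, dtype=bool)
    state = False
    j = 0
    for i in range(length):                              # inner loop over every element
        if j < len(changes_at) and changes_at[j] == i:   # conditional branching
            state = not state
            j += 1
        full[i] = state
    return full

Doing this for every changes_at array and then combining the dense results
element by element is what makes the pure-Python route so slow.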
…I use several real-world examples, and the performance boost is enormous!
Whereas my previous method never got me processing speeds higher than
413 kB/s, I am now at, hold on, 47400 kB/s checksum processing speed, which is
about a 100x performance boost. Once again I see that numpy rules! No need to
do explicit looping through a thousand individual elements in pure Python.