Hi Wolfgang,

  I think you are looking for reduceat(), in particular np.add.reduceat().
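
Something along these lines (a rough sketch using the example data from your
mail; I'm assuming "normalize" means dividing each data set by its sum):

    import numpy as np

    data_array = np.array([1, 2, 1, 2, 3, 4, 1, 2, 3], dtype=np.float64)
    start_pointer = np.array([0, 2, 6])
    length_data = np.array([2, 4, 3])

    # add.reduceat sums each slice [start_pointer[i], start_pointer[i+1])
    # in a single vectorized call, so there is no Python loop over the data sets.
    segment_sums = np.add.reduceat(data_array, start_pointer)

    # Repeat each sum out to the length of its data set and normalize in place.
    data_array /= np.repeat(segment_sums, length_data)

If you normalize by something other than the sum, other ufuncs have
reduceat() as well, e.g. np.maximum.reduceat().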

-- srean

On Thu, May 31, 2012 at 12:36 AM, Wolfgang Kerzendorf
<wkerzend...@gmail.com> wrote:
> Dear all,
>
> I have an ndarray that conceptually consists of many arrays stacked one after
> another (in reality it's a plain 1d float64 array).
> I have a second array that gives the start index of each data set within the
> 1d float64 array, and a third array that gives its length.
> Example:
>
> data_array = (conceptually) [[1,2], [1,2,3,4], [1,2,3]] = in reality
> array([1, 2, 1, 2, 3, 4, 1, 2, 3], dtype=float64)
> start_pointer = [0, 2, 6]
> length_data = [2, 4, 3]
>
> I now want to normalize each of the individual data sets. I wrote a simple
> for loop over start_pointer and length_data, grabbed each data set, normalized
> it, and wrote it back to the big array. That's slow. Is there an elegant numpy
> way to do that, or do I have to go the cython way?
