Gaël, Olivier,
I finally got it working. I don't compute the nearest value but the mean.
Works like a charm ;-)
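(For the archives, here is a minimal sketch of what such NaN-ignoring block-mean decimation could look like. This is a toy reconstruction, not Fred's actual code: the array size, the factors nx/ny/nz, and the reshape trick are all assumptions, and the reshape only works when each dimension is a multiple of its factor. A real 44 GB volume would have to be processed in memory-mapped chunks rather than in one in-memory array.)

```python
import warnings
import numpy as np

# Toy stand-in for the real 44 GB volume (sizes are made up): ~2/3 NaN.
rng = np.random.default_rng(0)
data = rng.random((8, 8, 8))
data[rng.random(data.shape) > 1 / 3] = np.nan

nx, ny, nz = 2, 2, 2  # decimation factors (assumed)

# Group the array into (nx, ny, nz) blocks and average the non-NaN
# values in each block, instead of picking one strided sample.
# The reshape trick requires each dimension to be a multiple of its factor.
blocks = data.reshape(
    data.shape[0] // nx, nx,
    data.shape[1] // ny, ny,
    data.shape[2] // nz, nz,
)
with warnings.catch_warnings():
    # Blocks that are entirely NaN trigger a RuntimeWarning and stay NaN.
    warnings.simplefilter("ignore", category=RuntimeWarning)
    decimated = np.nanmean(blocks, axis=(1, 3, 5))

print(decimated.shape)  # (4, 4, 4)
```

Most blocks contain at least one valid sample, so the decimated array has far fewer NaN than the input.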
Thanks anyway.
Cheers,
--
Fred
___
NumPy-Discussion mailing list
NumPy-Discussion@scipy.org
http://mail.scipy.org/mailman/listinfo/numpy
On 25/02/2011 10:42, Gael Varoquaux wrote:
> What exactly do you mean by 'decimating'? To me it seems that you are
> looking for matrix factorization or matrix completion techniques, which
> are currently trendy topics in machine learning.
By decimating, I mean this:
input array data.shape = (
On Fri, Feb 25, 2011 at 10:36:42AM +0100, Fred wrote:
> I have a big array (44 GB) I want to decimate.
> But this array has a lot of NaN: only 1/3 of it holds values, so 2/3
> is NaN.
> If I "basically" decimate it (a la NumPy, ie data[::nx, ::ny, ::nz], for
> instance), the decimated array wi
Hi there,
I have a big array (44 GB) I want to decimate.
But this array has a lot of NaN: only 1/3 of it holds values, so 2/3
is NaN.
If I "basically" decimate it (a la NumPy, ie data[::nx, ::ny, ::nz], for
instance), the decimated array will also have a lot of NaN.
What I would like to have
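(To make the problem concrete, a toy illustration; the sizes and NaN pattern are made up. Plain strided decimation just subsamples the grid, so the result inherits whatever values, NaN or not, happen to sit on the sampled positions, and the NaN fraction stays close to that of the input.)

```python
import numpy as np

# Toy stand-in for the 44 GB volume (sizes are made up): ~2/3 NaN.
rng = np.random.default_rng(0)
data = rng.random((8, 8, 8))
data[rng.random(data.shape) > 1 / 3] = np.nan

# "Basic" decimation: strided subsampling keeps whichever value sits
# on the coarse grid, NaN included.
nx, ny, nz = 2, 2, 2
decimated = data[::nx, ::ny, ::nz]

print(decimated.shape)  # (4, 4, 4)
# NaN fraction stays high: the subsample inherits the input's NaNs.
print(np.isnan(decimated).mean())
```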