2008/2/4, Lars Friedrich <[EMAIL PROTECTED]>:
>
> Hi,
>
> > 2) Is there a way to use another algorithm (at the cost of performance)
> > that uses less memory during calculation so that I can generate bigger
> > histograms?
>
> You could work through your array block by block. Simply fix the range and
> generate a histogram for each slice of 10
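The block-by-block suggestion above can be sketched as follows (array sizes, bin counts, and names are illustrative, not from the thread). Fixing `bins` and `range` up front guarantees every chunk is binned on identical edges, so the partial histograms can simply be summed:

```python
import numpy as np

# Illustrative data: 300,000 points in 3 dimensions.
data = np.random.uniform(0, 1, size=(300_000, 3))

# Fix bins and range up front so every chunk uses identical bin edges.
bins = (50, 50, 50)
ranges = ((0, 1), (0, 1), (0, 1))

# Accumulate counts slice by slice instead of histogramming all points
# at once. Each call still allocates one float64 grid of shape `bins`,
# but the per-sample intermediate arrays now scale with the chunk size
# rather than with the full data set.
counts = np.zeros(bins, dtype=np.int32)
chunk = 50_000
for start in range(0, len(data), chunk):
    h, _ = np.histogramdd(data[start:start + chunk], bins=bins, range=ranges)
    counts += h.astype(np.int32)
```
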
Hi Lars,

[...]

2008/2/1, Lars Friedrich <[EMAIL PROTECTED]>:
>
> 1) How can I tell histogramdd to use another dtype than float64? My bins
> will be very sparsely populated, so an int16 should be sufficient. Without
> normalization, an integer dtype makes more sense to me.

There is no way you'll b
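For reference, `numpy.histogramdd` takes no dtype argument and returns float64 counts. A hedged sketch of the usual workaround is to cast the result after the fact; note this only shrinks what you store, not the peak memory used while the histogram is being computed:

```python
import numpy as np

# Illustrative data; sizes are made up for the example.
data = np.random.uniform(0, 1, size=(10_000, 3))

# histogramdd has no dtype parameter; the counts come back as float64.
counts, edges = np.histogramdd(data, bins=(10, 10, 10), range=((0, 1),) * 3)

# Cast afterwards to reduce storage of the *result* only.
counts_i16 = counts.astype(np.int16)
```
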
Hello,

I use numpy.histogramdd to compute three-dimensional histograms with a
total number of bins in the order of 1e7. It is clear to me that such a
histogram will take a lot of memory. For a dtype=N.float64, it will take
roughly 80 megabytes. However, I have the feeling that during the
hist
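The 80-megabyte estimate above is simply the number of bins times the element size of a float64; an int16 count array would need a quarter of that:

```python
import numpy as np

n_bins = 10**7  # total number of bins, as stated above

# 8 bytes per float64 bin -> roughly 80 MB for the histogram alone
assert np.dtype(np.float64).itemsize == 8
print(n_bins * 8 / 1e6, "MB")  # 80.0 MB

# int16 needs 2 bytes per bin -> about 20 MB
print(n_bins * np.dtype(np.int16).itemsize / 1e6, "MB")  # 20.0 MB
```
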