On 06/22/2010 02:58 PM, josef.p...@gmail.com wrote:
> On Tue, Jun 22, 2010 at 10:09 AM, Tom Durrant wrote:
>
>>> the basic idea is in "polyfit on multiple data points" on the
>>> numpy-discussion mailing list, April 2009
>>>
>>> In this case, calculations have to be done by groups
>>>
>>> subtract mean (this needs to be replaced by group demeaning)
>>> modeldm = model - model.mean()
On 06/22/2010 09:13 AM, Tom Durrant wrote:
>
> What exactly are you trying to fit? It is rather bad practice to fit
> a model to summarized data, as you lose the uncertainty in the
> original data.
> If you define your boxes, you can loop through directly on each box and
> even fit the equation:
>
> model = mu + beta1*obs
>
> The
>
> the basic idea is in "polyfit on multiple data points" on the
> numpy-discussion mailing list, April 2009
>
> In this case, calculations have to be done by groups
>
> subtract mean (this needs to be replaced by group demeaning)
> modeldm = model - model.mean()
> obsdm = obs - obs.mean()
>
> xx =
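The quoted code cuts off here; as a reading aid, here is a minimal runnable
sketch of the group-demeaning straight-line fit it describes. Everything
below (names like group_idx and ngroups) is an illustrative reconstruction,
not code from the list:

import numpy as np

def grouped_linefit(obs, model, group_idx, ngroups):
    # Fit model ~ a + b*obs separately for each group, without looping.
    # group_idx: integer group label (0 .. ngroups-1) for each point.
    counts = np.bincount(group_idx, minlength=ngroups).astype(float)
    obs_mean = np.bincount(group_idx, weights=obs, minlength=ngroups) / counts
    mod_mean = np.bincount(group_idx, weights=model, minlength=ngroups) / counts
    # group demeaning: subtract each point's own group mean
    obsdm = obs - obs_mean[group_idx]
    moddm = model - mod_mean[group_idx]
    # per-group OLS slope on demeaned data: sum(x*y) / sum(x*x)
    sxy = np.bincount(group_idx, weights=obsdm * moddm, minlength=ngroups)
    sxx = np.bincount(group_idx, weights=obsdm * obsdm, minlength=ngroups)
    slope = sxy / sxx
    intercept = mod_mean - slope * obs_mean
    return intercept, slope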
On Sun, Jun 20, 2010 at 10:57 PM, Tom Durrant wrote:
>
>> are you doing something like np.polyfit(model, obs, 1) ?
>>
>> If you are using polyfit with deg=1, i.e. fitting a straight line,
>> then this could also be calculated using the weights in histogram2d.
>>
>> histogram2d (histogramdd) uses np.digitize and np.bincount, so I'm
>> surprised if the hi
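Since the message is truncated, a hedged sketch of that idea: with deg=1
the per-cell fit needs only the sums of 1, x, y, x*x, and x*y over each
cell, all of which histogram2d can accumulate via its weights argument.
The function and edge names below are illustrative assumptions:

import numpy as np

def gridded_linefit(lat, lon, obs, model, lat_edges, lon_edges):
    # Per-grid-cell OLS fit of model ~ a + b*obs from weighted histograms.
    bins = (lat_edges, lon_edges)
    n   = np.histogram2d(lat, lon, bins=bins)[0]
    sx  = np.histogram2d(lat, lon, bins=bins, weights=obs)[0]
    sy  = np.histogram2d(lat, lon, bins=bins, weights=model)[0]
    sxx = np.histogram2d(lat, lon, bins=bins, weights=obs * obs)[0]
    sxy = np.histogram2d(lat, lon, bins=bins, weights=obs * model)[0]
    with np.errstate(divide='ignore', invalid='ignore'):
        slope = (n * sxy - sx * sy) / (n * sxx - sx ** 2)
        intercept = (sy - slope * sx) / n
    return intercept, slope   # cells with < 2 points come out as nan/inf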
On Sun, Jun 20, 2010 at 4:24 AM, Tom Durrant wrote:
> Hi All,
> I have a problem involving lat/lon data. Basically, I am evaluating
> numerical weather model data against satellite data, and trying to produce
> gridded plots of various statistics. There are various steps involved with
> this, but basically, I get to the point where I have four arrays of th
On Wed, Jun 2, 2010 at 11:40 AM, Andreas Hilboll wrote:
> Hi there,
>
> I'm interested in the solution to a special case of the parallel thread
> '2D binning', which is going on at the moment. My data is on a fine global
> grid, say .125x.125 degrees. I'm looking for a way to do calculations on
>
Hello Andreas,
Please see this as a side remark.
A colleague of mine made me aware of a very beautiful thing about
covering spheres by evenly spaced points:
http://healpix.jpl.nasa.gov/
Since you want to calculate mean and stddev, to my understanding a
grid in longitude/latitude is without prop
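The sentence is cut off, but the usual caveat is that equal-angle lon/lat
cells shrink toward the poles, so unweighted means and stds over such a
grid are biased. A sketch of the standard cos(latitude) area weighting
(an illustration, not code from the thread):

import numpy as np

# field on a regular 0.125 x 0.125 degree lat/lon grid, shape (nlat, nlon)
lats = np.linspace(-89.9375, 89.9375, 1440)
field = np.random.rand(1440, 2880)

# cell area on a lon/lat grid is proportional to cos(latitude)
w = np.cos(np.radians(lats))[:, np.newaxis]
area_weighted_mean = (field * w).sum() / (w.sum() * field.shape[1])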
On Wed, Jun 2, 2010 at 6:18 PM, Stephen Simmons wrote:
>
> On 1/06/2010 10:51 PM, Wes McKinney wrote:
> >
> > This is a pretty good example of the "group-by" problem that will
> > hopefully work its way into a future edition of NumPy.
>
> Wes (or anyone else), please can you elaborate on any plans for groupby?
> I've made my own modification to numpy.bincount f
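For readers landing here, a tiny self-contained example of the
bincount-based group-by pattern the thread keeps circling around
(a sketch; not Stephen's actual modification):

import numpy as np

labels = np.array([0, 1, 0, 2, 1])           # group index per observation
values = np.array([1.0, 2.0, 3.0, 4.0, 5.0])

counts = np.bincount(labels)                 # observations per group
sums = np.bincount(labels, weights=values)   # sum of values per group
means = sums / counts                        # -> [2.0, 3.5, 4.0]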
On 6/2/2010 2:32 PM, Mathew Yeates wrote:
> Nope. This version didn't work either.
>
>
>
> If you're on Python 2.6 the binary on here might work for you:
>
> http://www.lfd.uci.edu/~gohlke/pythonlibs/
>
> It looks recent enough to have the rewritten ndimage
On Wed, Jun 2, 2010 at 2:09 PM, Mathew Yeates wrote:
> I'm on Windows, using a precompiled binary. I never built numpy/scipy on
> Windows.

The ndimage measurements have recently been rewritten. ndimage is very
fast, but the old version has insufficient type checking and may crash on
wrong inputs.
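A minimal sketch of the ndimage route under discussion; casting the labels
to a plain integer dtype is an assumption about what avoids the
"data type not supported" error on older builds:

import numpy as np
from scipy import ndimage as ndi

lats = np.array([10, 10, 11, 11, 10])        # illustrative integer coords
lons = np.array([20, 21, 20, 20, 20])
data = np.array([1.0, 2.0, 3.0, 4.0, 5.0])

# combine lat/lon into one integer label per point; keep labels as int
labels = (lats.astype(np.intp) * 360 + lons).astype(np.intp)
index = np.unique(labels)
means = ndi.mean(data, labels=labels, index=index)  # average per label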
On Wed, Jun 2, 2010 at 1:23 PM, Mathew Yeates wrote:
> thanks. I am also getting an error in ndi.mean
> Were you getting the error
> "RuntimeError: data type not supported"?
>
> -Mathew
>
> On Wed, Jun 2, 2010 at 9:40 AM, Wes McKinney wrote:
>>
>> On Wed, Jun 2, 2010 at 3:41 AM, Vincent Schut wr
Hi there,
I'm interested in the solution to a special case of the parallel thread
'2D binning', which is going on at the moment. My data is on a fine global
grid, say .125x.125 degrees. I'm looking for a way to do calculations on
coarser grids, e.g.
* calculate means()
* calculate std()
* ...
on
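Since Andreas's message is truncated: for integer coarsening factors, a
common reshape trick gives means and stds per coarse box without loops.
The factor of 8 (0.125 degrees to 1 degree) is an assumed example:

import numpy as np

def block_stats(fine, factor):
    # mean/std over factor x factor blocks; grid dims must divide by factor
    ny, nx = fine.shape
    blocks = fine.reshape(ny // factor, factor, nx // factor, factor)
    return blocks.mean(axis=(1, 3)), blocks.std(axis=(1, 3))

fine = np.random.rand(1440, 2880)        # 0.125-degree global grid
means, stds = block_stats(fine, 8)       # -> two (180, 360) arrays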
Why not simply use a set?
uniquePoints = set(zip(lats, lons))
Ben Root
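With modern NumPy, the same uniqueness test can stay in arrays instead of
Python tuples (a sketch; note the axis argument to np.unique requires
NumPy >= 1.13, well after this thread):

import numpy as np

lats = np.array([10, 10, 11, 10])
lons = np.array([20, 21, 20, 20])

points = np.column_stack((lats, lons))
unique_points = np.unique(points, axis=0)   # one row per distinct (lat, lon)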
On 06/02/2010 04:52 AM, josef.p...@gmail.com wrote:
> On Tue, Jun 1, 2010 at 9:57 PM, Zachary Pincus
> wrote:
>>> I guess it's as fast as I'm going to get. I don't really see any
>>> other way. BTW, the lat/lons are integers)
>>
>> You could (in c or cython) try a brain-dead "hashtable" with no
>
On Tue, Jun 1, 2010 at 9:57 PM, Zachary Pincus wrote:
>> I guess it's as fast as I'm going to get. I don't really see any
>> other way. BTW, the lat/lons are integers)
>
> You could (in c or cython) try a brain-dead "hashtable" with no
> collision detection:
>
> for lat, long, data in dataset:
>     bin = (lat ^ long) % num_bins
>     hashtable[bin] = update_incr
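A runnable plain-Python version of that accumulation, using a dict keyed
on the (lat, lon) pair so there are no hash collisions to handle (all
names are illustrative, not from the thread):

from collections import defaultdict

lats = [10, 10, 11, 10]
lons = [20, 21, 20, 20]
data = [1.0, 2.0, 3.0, 4.0]

acc = defaultdict(lambda: [0.0, 0])          # (sum, count) per key
for lat, lon, value in zip(lats, lons, data):
    acc[(lat, lon)][0] += value
    acc[(lat, lon)][1] += 1

averages = {k: s / n for k, (s, n) in acc.items()}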
Hi
Can anyone think of a clever (non-looping) solution to the following?
I have a list of latitudes, a list of longitudes, and a list of data values.
All lists are the same length.
I want to compute an average of data values for each lat/lon pair, e.g. if
lat[1001], lon[1001] == lat[2001], lon[2001
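A non-looping answer in the spirit the thread converges on, sketched with
illustrative data: encode each (lat, lon) pair as one integer key, then
average per key with np.unique and np.bincount:

import numpy as np

lats = np.array([10, 10, 11, 10, 11])        # integer, per the thread
lons = np.array([20, 21, 20, 20, 20])
data = np.array([1.0, 2.0, 3.0, 4.0, 5.0])

# injective integer key per pair (shift lons first if they can be negative)
key = lats * 100000 + lons
uniq, idx = np.unique(key, return_inverse=True)
means = np.bincount(idx, weights=data) / np.bincount(idx)
# uniq[i] is the encoded pair whose average is means[i]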