On Thu, Sep 9, 2010 at 8:44 PM, wrote:
> On Thu, Sep 9, 2010 at 11:32 PM, Keith Goodman wrote:
>> On Thu, Sep 9, 2010 at 8:07 PM, Keith Goodman wrote:
>>> On Thu, Sep 9, 2010 at 7:22 PM, cpblpublic
>>> wrote:
I am looking for some really basic statistical tools. I have some
sample data, some sample weights for those measurements, and I want to
calculate a mean and a standard error of the mean.
I was, of course, just thinking of the incremental work of inverting
the initial argsort, but you are completely correct in pointing out
that the overall complexity is O(n*log(n)) either way. As it turns
out, both approaches run in the same amount of time for my problem.
Thanks,
Alex
On Thursday,
On Thu, Sep 9, 2010 at 11:32 PM, Keith Goodman wrote:
> On Thu, Sep 9, 2010 at 8:07 PM, Keith Goodman wrote:
>> On Thu, Sep 9, 2010 at 7:22 PM, cpblpublic
>> wrote:
>>> I am looking for some really basic statistical tools. I have some
>>> sample data, some sample weights for those measurements, and I want to
>>> calculate a mean and a standard error of the mean.
On Thu, Sep 9, 2010 at 8:07 PM, Keith Goodman wrote:
> On Thu, Sep 9, 2010 at 7:22 PM, cpblpublic wrote:
>> I am looking for some really basic statistical tools. I have some
>> sample data, some sample weights for those measurements, and I want to
>> calculate a mean and a standard error of the mean.
On Thu, Sep 9, 2010 at 7:22 PM, cpblpublic wrote:
> I am looking for some really basic statistical tools. I have some
> sample data, some sample weights for those measurements, and I want to
> calculate a mean and a standard error of the mean.
How about using a bootstrap?
Array and weights:
>>
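The preview cuts off right at the start of the snippet. A rough sketch of what a weighted bootstrap of the mean could look like (the sample values and the resampling scheme are my assumptions, not necessarily Keith's):

import numpy as np

data = np.array([2.0, 3.5, 1.0, 4.2, 3.3])     # hypothetical sample
weights = np.array([1.0, 0.5, 2.0, 1.0, 1.5])  # hypothetical weights

nboot = 10000
n = len(data)
boot_means = np.empty(nboot)
for i in range(nboot):
    idx = np.random.randint(0, n, size=n)      # resample with replacement
    boot_means[i] = np.average(data[idx], weights=weights[idx])

# The spread of the bootstrap distribution estimates the standard error
# of the weighted mean.
std_err = boot_means.std()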
Hello everyone,
My numpy-based image processing toolbox has just had a new release: 0.5
New features are:
Distance transform
bwperim()
freeimage interface [borrowed and improved from scikits.image]
Zernike moment computation
There were some fixes to the namespace (in particular, fun
On Thu, Sep 9, 2010 at 10:22 PM, cpblpublic wrote:
>
>
> I am looking for some really basic statistical tools. I have some
> sample data, some sample weights for those measurements, and I want to
> calculate a mean and a standard error of the mean.
>
> Here are obvious places to look:
>
> numpy
> scipy.stats
> statsmodels
Excerpts from cpblpublic's message of Thu Sep 09 22:22:05 -0400 2010:
> I am looking for some really basic statistical tools. I have some
> sample data, some sample weights for those measurements, and I want to
> calculate a mean and a standard error of the mean.
>
> Here are obvious places to look:
I am looking for some really basic statistical tools. I have some
sample data, some sample weights for those measurements, and I want to
calculate a mean and a standard error of the mean.
Here are obvious places to look:
numpy
scipy.stats
statsmodels
It seems to me that numpy's "mean" and "average"
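For what it's worth, a minimal sketch of the weighted mean plus one common approximation of its standard error using numpy.average; the data are invented and the effective-sample-size formula is only one of several conventions for interpreting the weights:

import numpy as np

data = np.array([2.0, 3.5, 1.0, 4.2, 3.3])     # hypothetical sample
weights = np.array([1.0, 0.5, 2.0, 1.0, 1.5])  # hypothetical weights

mean = np.average(data, weights=weights)       # weighted mean

# Weighted variance around the weighted mean.
variance = np.average((data - mean) ** 2, weights=weights)

# Kish effective sample size; treats the weights as reliability weights.
n_eff = weights.sum() ** 2 / (weights ** 2).sum()

sem = np.sqrt(variance / n_eff)                # approximate standard error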
On Thu, Sep 9, 2010 at 15:59, Alexander Michael wrote:
> Clever and concise (and I expect that it works), but isn't this less
> efficient? Sorting is O(n*log(n)), while the code I gave is O(n).
> Using argsort has the potential to use less memory, though.
No, the code you gave is also O(n*log(n)) b
Clever and concise (and I expect that it works), but isn't this less
efficient? Sorting is O(n*log(n)), while the code I gave is O(n).
Using argsort has the potential to use less memory, though.
On Tuesday, September 7, 2010, Zachary Pincus wrote:
>> indices = argsort(a1)
>> ranks = zeros_like(indices)
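Completing Zachary's snippet the way I read it, the usual trick is to invert the argsort permutation, roughly like this (the sample array is mine):

import numpy as np

a1 = np.array([30, 10, 20, 40])        # hypothetical data

indices = np.argsort(a1)               # permutation that sorts a1
ranks = np.zeros_like(indices)
ranks[indices] = np.arange(len(a1))    # invert the argsort to get ranks

# ranks is now array([2, 0, 1, 3]): the rank of each element of a1.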
Thu, 09 Sep 2010 18:18:29 +0200, Sturla Molden wrote:
[clip]
> I hope the SciPy dev team can be persuaded to include a wrapper for
> DTRTRS in the future. It is after all extremely useful for Mahalanobis
> distances, and thus for any use of linear models in statistics.
I don't see any reason why not.
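For readers following along, the triangular-solve route being discussed looks roughly like this; scipy.linalg.solve_triangular (which, as far as I know, sits on top of the LAPACK *TRTRS routines) plays the role of the proposed wrapper, and the data are invented:

import numpy as np
from scipy.linalg import cholesky, solve_triangular

X = np.random.randn(100, 5)             # hypothetical observations (rows)
m = X.mean(axis=0)
S = np.cov(X, rowvar=False)             # covariance estimate

# S = L L^T, so (x - m)^T S^-1 (x - m) = ||L^-1 (x - m)||^2, and
# L^-1 (x - m) is exactly a triangular solve (what DTRTRS computes).
L = cholesky(S, lower=True)
z = solve_triangular(L, (X - m).T, lower=True)
sqmahal = (z ** 2).sum(axis=0)          # squared Mahalanobis distances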
> Yes, this is what I am computing. I am computing the pdf of a very high-
> dimensional multivariate normal. Is there a specialized method to compute
> this?
If you use cho_solve and cho_factor from scipy.linalg, you can proceed
like this:
cx = X - m
sqmahal = (cx*cho_solve(cho_factor(S),
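The snippet above is cut off; completing it along the same lines (my reconstruction, not necessarily the author's exact code):

import numpy as np
from scipy.linalg import cho_factor, cho_solve

X = np.random.randn(100, 5)             # hypothetical observations (rows)
m = X.mean(axis=0)
S = np.cov(X, rowvar=False)             # covariance estimate

cx = X - m
# Squared Mahalanobis distance of each row, (x - m)^T S^-1 (x - m),
# via a Cholesky factorization instead of an explicit inverse.
sqmahal = (cx * cho_solve(cho_factor(S), cx.T).T).sum(axis=1)

# The multivariate normal pdf also needs the log-determinant, which the
# Cholesky factor gives cheaply.
c, lower = cho_factor(S)
logdet = 2.0 * np.log(np.diag(c)).sum()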
Hi all,
NumPy currently makes extensive use of the DeprecationWarning
class to alert users when some feature is going to be withdrawn.
However, as of Python 2.7, the DeprecationWarning is silent by
default, see:
http://docs.python.org/library/warnings.html#updating-code-for-new-versions-of-python
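For anyone bitten by this, a small sketch of how a user or test suite can turn the warnings back on (the warning text is just an example):

import warnings

# Python 2.7+ ignores DeprecationWarning by default; re-enable it so
# that NumPy's deprecation notices are actually seen.
warnings.simplefilter("always", DeprecationWarning)

warnings.warn("this feature is deprecated", DeprecationWarning)

Running the interpreter with -Wd (short for -W default) has the same effect from the command line.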
On Tue, Sep 7, 2010 at 14:12, Charles Doutriaux wrote:
> Hi,
>
> I'm using distutils to build extensions written in C.
>
> I noticed that lately (it seems to be Python 2.7 related) whenever I
> touch one C file, ALL the C files are rebuilt.
> Since I have a lot of C code, it takes a lot of time for
Original Message
Subject: [Numpy-discussion] distutils
Date: Tue, 7 Sep 2010 12:12:58 -0700
From: Charles Doutriaux
Reply-To: Discussion of Numerical Python
To: Discussion of Numerical Python
Hi,
I'm using distutils to build extensions written in C.
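For context, the kind of project being described might look roughly like the following hypothetical setup.py (not Charles's actual file; module and file names are made up). With plain distutils, each Extension should normally be rebuilt only when its own sources change:

# Hypothetical setup.py with a couple of C extensions, only to
# illustrate the build being discussed.
from distutils.core import setup, Extension

extensions = [
    Extension("mypkg.foo", sources=["src/foo.c"]),
    Extension("mypkg.bar", sources=["src/bar.c"]),
]

setup(name="mypkg", version="0.1", ext_modules=extensions)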
On Thu, Sep 9, 2010 at 05:05, Chris Ball wrote:
> Robert Kern writes:
>>
>> On Wed, Sep 8, 2010 at 14:42, Chris Ball wrote:
>> > Robert Kern writes:
>> >
> ...
>> a = numpy.array([1,2,3,4,5])
>> a.clip(2,None)
>> > array([2, 2, 2, 2, 2], dtype=object)
Robert Kern writes:
>
> On Wed, Sep 8, 2010 at 14:42, Chris Ball wrote:
> > Robert Kern writes:
> >
...
> a = numpy.array([1,2,3,4,5])
> a.clip(2,None)
> > array([2, 2, 2, 2, 2], dtype=object)
> >
> > I'm not sure why the returned array has a dtype of object.
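One way to sidestep the issue in the meantime is to pass only the bound you need instead of None, which keeps the comparison purely numeric and preserves the integer dtype (my example values):

import numpy as np

a = np.array([1, 2, 3, 4, 5])

# Clip from below only; no None is involved, so no object-dtype fallback.
print(a.clip(min=2))                          # [2 2 3 4 5]

# Or give an explicit upper bound, e.g. the dtype's maximum value.
print(np.clip(a, 2, np.iinfo(a.dtype).max))   # [2 2 3 4 5]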