Anne Archibald wrote:
>
> 2009/11/29 Dr. Phillip M. Feldman :
>
>> All of the statistical packages that I am currently using and have used
>> in the past (Matlab, Minitab, R, S-plus) calculate standard deviation
>> using the sqrt(1/(n-1)) normalization, which gives a result that is
>> unbiased …
On Sun, Dec 6, 2009 at 11:36 AM, Sturla Molden wrote:
> Colin J. Williams skrev:
>> When one has a smallish sample size, what gives the best estimate of
>> the variance?
> What do you mean by "best estimate"?
>
> Unbiased? Smallest standard error?
>
>> In the widely used Analysis of Variance (ANOVA) …
Colin J. Williams skrev:
> When one has a smallish sample size, what gives the best estimate of
> the variance?
What do you mean by "best estimate"?
Unbiased? Smallest standard error?
> In the widely used Analysis of Variance (ANOVA), the degrees of freedom
> are reduced for each mean estimate …
On Sun, Dec 6, 2009 at 9:21 AM, wrote:
> On Sun, Dec 6, 2009 at 11:01 AM, Colin J. Williams wrote:
>
> What's the best estimate? That's the main question.
>
> Estimators differ in their (sample or posterior) distribution,
> especially bias and variance.
> The Stein estimator dominates OLS in …
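[Editorial aside, not code from the thread: the bias/variance distinction raised above can be made concrete with a small simulation. For normal samples of size n = 5, the unbiased divisor n-1 gives the largest mean squared error (theory: 0.5), divisor n trades a little bias for a smaller MSE (0.36), and divisor n+1 is smaller still (1/3). The sketch below checks this by Monte Carlo.]

```python
import numpy as np

rng = np.random.default_rng(0)
n, reps = 5, 200_000

# reps independent samples of size n from N(0, 1); the true variance is 1.
samples = rng.standard_normal((reps, n))
ss = ((samples - samples.mean(axis=1, keepdims=True)) ** 2).sum(axis=1)

# Compare variance estimators ss/c for divisors c = n-1, n, n+1.
results = {}
for divisor in (n - 1, n, n + 1):
    est = ss / divisor
    results[divisor] = (est.mean() - 1.0, ((est - 1.0) ** 2).mean())

for divisor, (bias, mse) in results.items():
    print(f"divisor {divisor}: bias {bias:+.3f}, MSE {mse:.3f}")
```

The n-1 estimator is (nearly) unbiased but has the largest MSE of the three, which is the trade-off the thread keeps circling.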
On Sun, Dec 6, 2009 at 11:01 AM, Colin J. Williams wrote:
>
> On 04-Dec-09 10:54 AM, Bruce Southey wrote:
>> On 12/04/2009 06:18 AM, yogesh karpate wrote:
>>> @ Pauli and @ Colin:
>>> Sorry for the late reply. I was busy with some other assignments.
>>> # As far as …
On 04-Dec-09 10:54 AM, Bruce Southey wrote:
> On 12/04/2009 06:18 AM, yogesh karpate wrote:
>> @ Pauli and @ Colin:
>> Sorry for the late reply. I was busy with some other assignments.
>> # As far as normalization by (n) is concerned, it's a common
>> assumption …
Colin J. Williams skrev:
> suggested that 1 (one) would be a better default, but Robert Kern told
> us that it won't happen.
>
I don't even see the need for this keyword argument, as you can always
multiply the variance by n/(n-1) to get what you want.
Also, normalization by n gives the …
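[Editorial aside: Sturla's rescaling claim is easy to verify. Rescaling NumPy's default divisor-n variance by n/(n-1) reproduces the divisor-(n-1) result exactly; the sample data below is arbitrary.]

```python
import numpy as np

x = np.array([2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0])
n = x.size

biased = x.var()          # divisor n (ddof=0, NumPy's default) -> 4.0
unbiased = x.var(ddof=1)  # divisor n - 1 -> 32/7, about 4.571

# Rescaling the default by n/(n-1) reproduces the n-1 result.
print(np.isclose(biased * n / (n - 1), unbiased))  # True
```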
On 04-Dec-09 07:18 AM, yogesh karpate wrote:
> @ Pauli and @ Colin:
> Sorry for the late reply. I was busy with some other assignments.
> # As far as normalization by (n) is concerned, it's a common
> assumption that the population is normally distributed and …
On 04-Dec-09 05:21 AM, Pauli Virtanen wrote:
> On Fri, 2009-12-04 at 11:19 +0100, Chris Colbert wrote:
>
>> Why can't the divisor constant just be made an optional kwarg that
>> defaults to zero?
>>
> It already is an optional kwarg that defaults to zero.
>
> Cheers,
>
I suggested …
This is getting OT, as I'm not making any comment on numpy's
implementation, but...
yogesh karpate wrote:
> # As far as normalization by (n) is concerned, it's a common assumption
> that the population is normally distributed and the population size is
> fairly large enough to fit the normal distribution …
On 12/04/2009 06:18 AM, yogesh karpate wrote:
> @ Pauli and @ Colin:
> Sorry for the late reply. I was busy with some other assignments.
> # As far as normalization by (n) is concerned, it's a common
> assumption that the population is normally distributed and the population …
@ Pauli and @ Colin:
Sorry for the late reply. I was busy with some other assignments.
# As far as normalization by (n) is concerned, it's a common assumption
that the population is normally distributed and the population size is
fairly large enough to fit the normal distribution …
Thu, 03 Dec 2009 11:05:07 +0530, yogesh karpate wrote:
> The thing is that the normalization by (n-1) is done for the no. of
> samples >20 or 23 (not sure about this number, but sure that it isn't
> greater than 25), and below that we use normalization by n.
> Regards
> ~ymk
> The th…
On Fri, 2009-12-04 at 11:19 +0100, Chris Colbert wrote:
> Why can't the divisor constant just be made an optional kwarg that
> defaults to zero?
It already is an optional kwarg that defaults to zero.
Cheers,
--
Pauli Virtanen
NumPy-Discussion mailing list
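[Editorial aside: the kwarg Pauli refers to is `ddof` ("delta degrees of freedom") on `numpy.var` and `numpy.std`; the divisor is n - ddof, and ddof defaults to 0. A minimal demonstration:]

```python
import numpy as np

x = np.arange(10, dtype=float)  # mean 4.5, sum of squared deviations 82.5

# The divisor is n - ddof; ddof defaults to 0.
print(np.var(x))          # 82.5 / 10 = 8.25
print(np.var(x, ddof=1))  # 82.5 / 9, about 9.167
```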
Why can't the divisor constant just be made an optional kwarg that
defaults to zero?
It won't break any existing code, and will let everybody that wants the
other behavior have it.
On Thu, Dec 3, 2009 at 1:49 PM, Colin J. Williams wrote:
> Yogesh,
>
> Could you explain the rationale for this choice …
Yogesh,
Could you explain the rationale for this choice please?
Colin W.
On 03-Dec-09 00:35 AM, yogesh karpate wrote:
> The thing is that the normalization by (n-1) is done for the no. of
> samples >20 or 23 (not sure about this number, but sure that it isn't
> greater than 25) …
The thing is that the normalization by (n-1) is done for the no. of
samples >20 or 23 (not sure about this number, but sure that it isn't
greater than 25), and below that we use normalization by n.
Regards
~ymk
On Wed, Dec 2, 2009 at 13:25, Colin J. Williams wrote:
> The conventional approach, based on the notion of expected values, is
> given here:
> http://en.wikipedia.org/wiki/Variance#Distribution_of_the_sample_variance
>
> I would suggest that numpy should stick with that until the approach
> advocated …
On 29-Nov-09 20:15 PM, Robin wrote:
> On Mon, Nov 30, 2009 at 12:30 AM, Colin J. Williams wrote:
>> On 29-Nov-09 17:13 PM, Dr. Phillip M. Feldman wrote:
>>> All of the statistical packages that I am currently using and have used in
>>> the past (Matlab, Minitab, R, S-plus) calculate …
Colin J. Williams skrev:
> Where the distribution of a variate is not known a priori, I believe
> it can be shown that the n-1 divisor provides the best estimate of
> the variance.
>
Have you ever been shooting with a rifle?
What would you rather do:
- Hit 9 or 10, with a bias to the …
On 29-Nov-09, at 8:15 PM, Robin wrote:
> There have been previous discussions on this (but I can't find them
> now), and I believe the current default was chosen deliberately. I
> think it is the view of the numpy developers that the n divisor has
> more desirable properties in most cases than the …
On Mon, Nov 30, 2009 at 12:30 AM, Colin J. Williams wrote:
> On 29-Nov-09 17:13 PM, Dr. Phillip M. Feldman wrote:
>> All of the statistical packages that I am currently using and have used in
>> the past (Matlab, Minitab, R, S-plus) calculate standard deviation using the
>> sqrt(1/(n-1)) normalization …
On 29-Nov-09 17:13 PM, Dr. Phillip M. Feldman wrote:
> All of the statistical packages that I am currently using and have used in
> the past (Matlab, Minitab, R, S-plus) calculate standard deviation using the
> sqrt(1/(n-1)) normalization, which gives a result that is unbiased when
> sampling from …
All of the statistical packages that I am currently using and have used in
the past (Matlab, Minitab, R, S-plus) calculate standard deviation using the
sqrt(1/(n-1)) normalization, which gives a result that is unbiased when
sampling from a normally-distributed population. NumPy uses the sqrt(1/n) …
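[Editorial aside: the convention gap Feldman describes is easy to see directly. Python's stdlib `statistics.stdev` follows the n-1 convention like the packages he lists, while `np.std` defaults to the divisor-n convention; the sample data below is arbitrary.]

```python
import statistics

import numpy as np

x = [2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0]

print(np.std(x))            # sqrt(1/n) convention -> 2.0
print(np.std(x, ddof=1))    # sqrt(1/(n-1)) convention -> about 2.138
print(statistics.stdev(x))  # stdlib uses n - 1, matching ddof=1
```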
2009/11/29 Dr. Phillip M. Feldman :
> All of the statistical packages that I am currently using and have used in
> the past (Matlab, Minitab, R, S-plus) calculate standard deviation using the
> sqrt(1/(n-1)) normalization, which gives a result that is unbiased when
> sampling from a normally-distributed …
26 matches