Also see fortune(254) (in the fortunes package) and others about R-squared in general.

On Thu, Jul 11, 2013 at 3:35 PM, Greg Snow <538...@gmail.com> wrote:

> One way to get standardized beta coefficients is to center and scale all
> of the x variables (subtract the mean, then divide by the standard
> deviation), then fit the regression on the standardized x's.  You could do
> the same thing with a robust regression (though the resulting values may
> not be meaningful if there are outliers in the x variables).
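A minimal sketch of that recipe, assuming the robustbase package is installed; the data frame and variable names here (mydat, x1..x3, y) are invented for illustration:

```r
library(robustbase)  # provides lmrob()

## Illustrative data; replace with your own data frame
set.seed(1)
mydat <- data.frame(x1 = rnorm(30), x2 = rnorm(30), x3 = rnorm(30))
mydat$y <- 1 + 2 * mydat$x1 - mydat$x2 + rnorm(30)

## Center and scale the predictors only (not the response)
mydat.z <- mydat
mydat.z[c("x1", "x2", "x3")] <- scale(mydat[c("x1", "x2", "x3")])

## The slopes from this fit are the standardized betas for the x's
fit.z <- lmrob(y ~ x1 + x2 + x3, data = mydat.z)
coef(fit.z)
```

If you scale y as well (or divide each slope by sd(y)), you get fully standardized coefficients.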
>
> You can calculate the sum of squares residual by squaring the residuals
> (the differences between observed and predicted values) and summing them.
>  You can calculate the sum of squares regression by summing the squared
> distances between the predicted values and a measure of center; for OLS
> the measure of center is just the mean, while for a robust regression you
> may want to use the estimated intercept from an intercept-only model fit
> with the same robust function.  You can then find the sum of squares total
> by summing the squared differences between the observed values and the
> measure of center (if you do not use the mean, then don't expect SSE +
> SSR = SST).  Using these values (and degrees of freedom) you can compute
> things that you could call R-squared, adjusted R-squared, and an F ratio.
>  Though in the robust model I would be very surprised if the F ratio (even
> if you can figure out the correct degrees of freedom) followed an F
> distribution or non-central F distribution, and the R-squared values are
> probably even less meaningful than they are for OLS (and since some argue
> that R-squared for OLS is pretty meaningless to begin with, that is saying
> something).
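One possible translation of that recipe into code (a sketch only: robustbase assumed, data invented, and the intercept-only lmrob fit used as the robust measure of center):

```r
library(robustbase)

## Invented example data with a few outliers
set.seed(1)
mydat <- data.frame(x = rnorm(40))
mydat$y <- 1 + 2 * mydat$x + rnorm(40)
mydat$y[1:3] <- mydat$y[1:3] + 10

fit  <- lmrob(y ~ x, data = mydat)
fit0 <- lmrob(y ~ 1, data = mydat)  # intercept-only fit: robust center
center <- coef(fit0)[1]

sse <- sum(residuals(fit)^2)          # sum of squares residual
ssr <- sum((fitted(fit) - center)^2)  # sum of squares regression
sst <- sum((mydat$y - center)^2)      # sum of squares total
## note: SSE + SSR need not equal SST with a robust center

## Quantities one *could* call R-squared, adjusted R-squared, F
n <- nrow(mydat)
p <- length(coef(fit)) - 1  # number of slopes
r2      <- 1 - sse / sst
adj.r2  <- 1 - (sse / (n - p - 1)) / (sst / (n - 1))
f.ratio <- (ssr / p) / (sse / (n - p - 1))
```

As cautioned above, these are descriptive quantities only; there is no guarantee the F ratio follows an F distribution here.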
>
> It would be better to decide what question(s) you are really trying to
> answer, then find the method that will answer those question(s) (possibly
> a bootstrap or permutation approach, or something more appropriate),
> rather than trying to force a square peg into a hole that you have not
> even checked is round, square, or something else.
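As one example of the bootstrap route, here is a case-resampling sketch for a robust slope (robustbase assumed; data invented; real code should also handle resamples where lmrob fails to converge):

```r
library(robustbase)

set.seed(1)
mydat <- data.frame(x = rnorm(40))
mydat$y <- 1 + 2 * mydat$x + rnorm(40)

## Resample cases with replacement and refit the robust regression
B <- 200
boot.slopes <- replicate(B, {
  idx <- sample(nrow(mydat), replace = TRUE)
  coef(lmrob(y ~ x, data = mydat[idx, ]))["x"]
})

## Percentile confidence interval for the slope
quantile(boot.slopes, c(0.025, 0.975))
```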
>
>
> On Tue, Jul 9, 2013 at 2:20 AM, D. Alain <dialva...@yahoo.de> wrote:
>
>> Dear R-List,
>>
>> due to outliers in my data I wanted to carry out a robust regression.
>> According to APA standards, reporting OLS regression results should
>> include
>>
>> 1. unstandardized beta coefficients
>> 2. standardized beta coefficients
>> 3. SE
>> 4. t values
>> 5. r squared
>> 6. r squared adjusted
>> 7. F (df.num/df.den)
>>
>> Now I tried the robust version using lmrob (package="robustbase")
>>
>> lmrob.fit <- lmrob(y ~ x1 + x2 + x3, data = mydat)
>>
>> I got
>> 1. unstandardized beta coef
>> 3. SE
>> 4. t values
>>
>> What about the remaining ones?
>> 2. standardized beta coef
>> 5. r squared
>> 6. r squared adjusted
>> 7. F (df.num/df.den)
>>
>> I have read in an R-help thread (
>> http://tolstoy.newcastle.edu.au/R/e5/help/08/11/7271.html) that R2 is
>> only valid in the context of least-squares methods. Is there no
>> equivalent I could report for non-least-squares methods? Then why does
>> the lmrob output not include standardized beta coefs and an F statistic?
>> How can I compute both of them?
>>
>> Then I realized that ltsReg (package="robustbase") actually reports
>> almost everything I would need, but I could not find "standardized beta
>> coefficients" (does anyone know how I could compute these coefs?).
>> However, the authors of the package "strongly recommend using lmrob()
>> instead of ltsReg". Is this due to inefficiency, or are the coefs biased?
>>
>> Finally I found lmRob (package="robust"), which does report at least a
>> multiple R2, but that statistic is apparently biased and needs
>> correction, as I found in a thread by Renaud & Victoria-Feser
>> https://stat.ethz.ch/pipermail/r-sig-robust/2010/000290.html
>>
>> where the authors recommend correcting R2 for bias (Renaud, O. &
>> Victoria-Feser, M.-P. (2010). A robust coefficient of determination for
>> regression. Journal of Statistical Planning and Inference, 140, 1852-1862.
>> http://dx.doi.org/10.1016/j.jspi.2010.01.008). Does that mean that
>> "multiple r squared" can be reported even though it is not a least-squares
>> method, but should be corrected for bias? And what does that mean for the
>> rest of the lmRob output (e.g. t values)?
>>
>> I must confess I am somewhat confused, and I would be very thankful for
>> any clarification on this matter. Thank you in advance, and my apologies
>> if the question reveals a serious gap in my knowledge.
>>
>> Best wishes.
>>
>> Alain
>>
>>
>>
>> ______________________________________________
>> R-help@r-project.org mailing list
>> https://stat.ethz.ch/mailman/listinfo/r-help
>> PLEASE do read the posting guide
>> http://www.R-project.org/posting-guide.html
>> and provide commented, minimal, self-contained, reproducible code.
>>
>>
>
>
> --
> Gregory (Greg) L. Snow Ph.D.
> 538...@gmail.com
>



-- 
Gregory (Greg) L. Snow Ph.D.
538...@gmail.com


