>>>>> Michael <comtech....@gmail.com>
>>>>>     on Mon, 12 Mar 2012 13:19:19 -0500 writes:
 > The problem is: by default shouldn't it use "Huber's"?
 > And it should be a convex problem, no?
 > So when I do rlm(y ~ x), which is a single-beta fitting problem,
 > shouldn't it always converge?

"In theory, theory and practice are the same. In practice, they are not."
   -- Albert Einstein
      [according to http://www.goodreads.com/quotes/show/66864 ]

Theory says that convergence happens in an infinite number of iterations,
but then theory also says convergence means that the coefficients don't
change any more ;-)

[... etc.]

 > --------------------
 > Psi functions are supplied for the Huber, Hampel and Tukey bisquare
 > proposals as psi.huber, psi.hampel and psi.bisquare. Huber's corresponds
 > to a convex optimization problem and gives a unique solution (up to
 > collinearity). The other two will have multiple local minima, and a good
 > starting point is desirable.

which also mentions "up to collinearity" (theory).
"Practice" would add "near-collinearity" and many other practical
borderline issues that can happen.

As maintainer of the robustbase package, I'm slightly intrigued by your
example.

==> 1) Can you provide it reproducibly?
       dput(data), "cut & paste" into your e-mail, if small;
       available as mydata.rda after save(., file="mydata.rda"),
       for download, otherwise.

    2) What's the result of using lmrob() {package 'robustbase'}
       instead of rlm() {package 'MASS'}?

Best regards,
Martin Maechler, ETH Zurich


 > On Fri, Mar 9, 2012 at 1:21 PM, Berend Hasselman <b...@xs4all.nl> wrote:
 >>
 >> On 09-03-2012, at 20:00, Michael wrote:
 >>
 >> > Hi all,
 >> >
 >> > In using "rlm" I've got a bunch of warnings... "failed to converge in
 >> > 20 steps", etc.
 >> >
 >> > My question is:
 >> >
 >> > what are the results then after the failure?
 >>
 >> They haven't converged, so they are inaccurate.  Maybe your model is
 >> badly formulated or ill conditioned.
 >>
 >> > Will "rlm" automatically downgrade back to "lm" upon failure?
 >>
 >> The help says nothing about that, so most likely no.
 >>
 >> Why don't you try and raise maxit?  Use maxit = 40 in the call of rlm,
 >> and see what happens.
 >>
 >> Berend
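
A minimal sketch of the suggestions above (checking whether the fit
converged, raising maxit as Berend proposed, and the psi choice quoted
from the help page), using simulated data only, since the original data
set was never posted:

  library(MASS)

  set.seed(1)
  x <- rnorm(100)
  y <- 2 * x + rnorm(100)
  y[1:5] <- y[1:5] + 20                  # a few gross outliers

  fit <- rlm(y ~ x)                      # default: psi = psi.huber, maxit = 20
  fit$converged                          # FALSE if it stopped at maxit
  fit2 <- rlm(y ~ x, maxit = 40)         # Berend's suggestion: allow more iterations
  fit3 <- rlm(y ~ x, psi = psi.bisquare) # a redescending psi; needs a good start
  coef(fit2)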
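
And a sketch of what the two numbered requests above amount to in code;
'mydata', 'y', and 'x' are placeholder names here, not objects from the
original post:

  ## 1) make the example reproducible for the list
  dput(mydata)                           # small data: paste the output into the e-mail
  save(mydata, file = "mydata.rda")      # larger data: offer the .rda file for download

  ## 2) try lmrob() from robustbase instead of rlm() from MASS
  library(robustbase)
  fm <- lmrob(y ~ x, data = mydata)      # MM-estimator; reports its own convergence status
  summary(fm)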