Please post on the r-sig-mixed-models list, where you are more likely to
find the requisite expertise.
However, FWIW, I think the reviewer's request is complete nonsense (naïve
cross validation requires iid sampling). But the mixed models experts are
the authorities on such judgments (and may tell

See ?cv.glm under the heading "Value". The help files tell you what comes
out.
On Fri, May 28, 2010 at 10:19 PM, azam jaafari wrote:
> Hi
>
>
> Finally, I did leave-one-out cross validation in R for prediction error of
> logistic regression by cv.glm. But I don't know what the produced
> dat
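For what it's worth, the "Value" section of ?cv.glm documents a returned list whose delta component holds the cross-validated prediction error. A minimal sketch of LOO CV for a logistic regression with boot::cv.glm; the data and cost function below are illustrative assumptions, not taken from the thread:

```r
library(boot)

# Illustrative data: binary response driven by one predictor
set.seed(1)
dat <- data.frame(x = rnorm(50))
dat$y <- rbinom(50, 1, plogis(dat$x))

fit <- glm(y ~ x, family = binomial, data = dat)

# Misclassification cost; cv.glm's default cost is average squared error
cost <- function(y, pihat) mean(abs(y - pihat) > 0.5)

# Omitting K makes K = n, i.e. leave-one-out
cv <- cv.glm(dat, fit, cost = cost)
cv$delta[1]  # raw cross-validated estimate of prediction error
```

delta has two components: the raw cross-validation estimate and an adjusted version that compensates for not using exact LOO when K < n.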
You don't seem to be making any corrections or updating your code.
There remains a syntax error in the last line of cvhfunc because of
mismatched parens.
On Jun 20, 2009, at 1:04 PM, muddz wrote:
Hi David,
Thanks and I apologize for the lack of clarity.
##n is defined as the length of xdat
n<-length(xdat)
#I defined 'k' as the Gaussian kernel function
k<-function(v) {1/sqrt(2*pi)*exp(-v^2/2)} # Gaussian kernel
#I believe ypred in my case was the leave-one-out estimator (I think it's
the
On Jun 19, 2009, at 7:45 PM, muddz wrote:
Hi Uwe,
My apologies.
Please can I be guided on what I am doing wrong in the code? I started my
code as such:
# ypred is my leave one out estimator of x
Estimator of x? Really?
cvhfunc<-function(y,x,h){
ypred<-0
for (i in 1:n){
for (j in 1:n){
if (j!=i){
ypred<-ypred+(y[i]*k((x[j]-x[i])/h))/k((x[j]-x[i])/h)
}
}}
ypred
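For reference, here is a sketch of what the code above appears to be attempting: the leave-one-out Nadaraya-Watson estimate at x[i] should be a kernel-weighted average of the *other* responses, with the weights summed in the denominator, rather than each term being divided by its own weight (and the function body needs its closing brace). This is my reconstruction under those assumptions, not the poster's code:

```r
# Gaussian kernel
k <- function(v) exp(-v^2 / 2) / sqrt(2 * pi)

# Leave-one-out Nadaraya-Watson estimates: for each i, predict y[i]
# from all other points, weighted by kernel distance in x
loo_nw <- function(y, x, h) {
  n <- length(x)
  ypred <- numeric(n)
  for (i in 1:n) {
    w <- k((x[-i] - x[i]) / h)          # weights from the other n-1 points
    ypred[i] <- sum(w * y[-i]) / sum(w) # weighted average, not term-by-term division
  }
  ypred
}

# Cross-validation score for bandwidth h: mean squared LOO error
cvh <- function(y, x, h) mean((y - loo_nw(y, x, h))^2)

# Illustrative data
set.seed(1)
x <- runif(40)
y <- sin(2 * pi * x) + rnorm(40, sd = 0.2)
cvh(y, x, 0.1)
```

Minimising cvh over h is the usual way this score is used to pick a bandwidth.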
See the posting guide:
If you provide commented, minimal, self-contained, reproducible code
some people may be willing to help on the list.
Best,
Uwe Ligges
muddz wrote:
Hi All,
I have been trying to get this LOO cross-validation method to work in R for
the past 3 weeks but have had no luck
Alex Roy wrote:
Dear Frank,
Thanks for your comments. But in my situation, I do
not have any future data and I want to calculate Mean Square Error for
prediction on future data. So, is it not a good idea to go for LOO?
thanks
Alex
With resampling you should be able t
On Tue, Feb 24, 2009 at 7:15 PM, Frank E Harrell Jr <
f.har
Hi Alex,
Give a look at:
http://search.r-project.org/cgi-bin/namazu.cgi?query=leave+one+out&max=20&result=normal&sort=score&idxname=Rhelp02a&idxname=functions&idxname=docs
Cheers
miltinho astronauta
brazil
On Tue, Feb 24, 2009 at 3:07 PM, Alex Roy wrote:
> Dear R user,
>
Alex Roy wrote:
Dear R user,
I am working with LOO. Could anyone who is working
with leave-one-out cross-validation (LOO) send me the code?
Thanks in advance
Alex
I don't think that LOO adequately penalizes for model uncertainty. I
recommend the bootstrap or 50
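The bootstrap recommendation above can be sketched as an optimism-corrected error estimate: fit on each bootstrap resample, measure how much the apparent error understates the error on the full data, and add that average optimism back. The data, model, and B = 200 replicates below are illustrative assumptions:

```r
# Optimism-corrected bootstrap estimate of a linear model's MSE (a sketch)
set.seed(1)
n <- 50
x <- rnorm(n)
y <- 1 + 2 * x + rnorm(n)
dat <- data.frame(x, y)

mse <- function(fit, d) mean((d$y - predict(fit, newdata = d))^2)

fit <- lm(y ~ x, data = dat)
apparent <- mse(fit, dat)  # error on the same data the model was fit to

B <- 200
optimism <- replicate(B, {
  idx <- sample(n, replace = TRUE)
  bfit <- lm(y ~ x, data = dat[idx, ])
  # error on the original data minus apparent error on the resample
  mse(bfit, dat) - mse(bfit, dat[idx, ])
})

corrected <- apparent + mean(optimism)
corrected
```

The corrected figure is typically a little larger than the apparent MSE, reflecting the model's optimism about its own training data.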
Please look at ?lm.influence -- this does all the work for you.
On Thu, 10 Jan 2008, Anu Swatantran wrote:
> Hi
>
> I am trying to validate my regression results using the leave one out cross
> validation method. Is any script available in R to use this method for a
> linear regression equation?
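The ?lm.influence suggestion above works because for a linear model the leave-one-out residuals need no refitting: they equal e_i / (1 - h_i), where e_i are the ordinary residuals and h_i the hat (leverage) values. A sketch with illustrative data, checked against brute-force refits:

```r
# Illustrative data and model
set.seed(1)
x <- rnorm(30)
y <- 1 + 2 * x + rnorm(30)
fit <- lm(y ~ x)

# LOO (predicted) residuals from leverages -- no refitting required
h <- lm.influence(fit)$hat          # leverages (equivalently: hatvalues(fit))
loo_resid <- residuals(fit) / (1 - h)
press <- sum(loo_resid^2)           # PRESS statistic

# Check against brute-force leave-one-out refits
brute <- sapply(1:30, function(i) {
  f <- lm(y ~ x, subset = -i)
  y[i] - predict(f, newdata = data.frame(x = x[i]))
})
all.equal(unname(loo_resid), unname(brute))  # TRUE
```

sum(loo_resid^2) / 30 is then the LOO cross-validated MSE, obtained from a single fit.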