Hello,
The jackknife is used as a bias reduction technique, and since linear
regression estimates are unbiased I don't see why you should use it.
Rui Barradas
On 15-05-2014 19:21, varin sacha wrote:
Thanks Bert for your suggestion, which is working.
To answer your question: some econometricians say that when using bootstrap
techniques on a linear regression model with a small sample size N, one of
the most interesting applications is the prediction intervals, which are
better ...
Please note that this can (and should) be considerably sped up by
taking advantage of the fact that lm() will work on a matrix of
responses. Also, some improvement in speed can usually be obtained by
generating all the samples at once rather than generating each sample
inside the loop, something ...
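[Bert's own example code was cut off here. The sketch below is not his
original code, only one way his two suggestions could be combined on the
made-up data from Rui's example further down: a residual (fixed-X)
bootstrap in which all the resampled residuals are drawn in a single call,
and lm() refits all B bootstrap responses at once because it accepts a
matrix on the left-hand side. The object names fit0, Rstar, Ystar and fits
are mine.]

# sketch only -- not code from the original thread
set.seed(8873)
n <- 22
dat <- data.frame(x1 = rnorm(n), x2 = rnorm(n))
dat$y <- dat$x1 + dat$x2 + rnorm(n)

fit0 <- lm(y ~ x1 + x2, data = dat)      # fit once on the original data

B <- 500
# draw all n*B resampled residuals at once, instead of sampling inside a loop
Rstar <- matrix(sample(residuals(fit0), n * B, replace = TRUE), nrow = n)
Ystar <- fitted(fit0) + Rstar            # n x B matrix of bootstrap responses

# lm() accepts a matrix response: one call refits all B bootstrap samples
fits <- lm(Ystar ~ x1 + x2, data = dat)
boot.coef <- coef(fits)                  # 3 x B matrix of bootstrap coefficients

Note that this relies on keeping the design fixed; with case (row)
resampling the predictors change in every bootstrap sample, so the
matrix-response trick does not apply directly.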
Hi Rui,
Many thanks for your response.
I have tried to adapt your code to my problem, but there is still a mistake
LinearModel.1 <- lm(GDP.per.head ~ Competitivness.score + Quality.score,
data=Dataset)
B <- 500 # number of bootstrap samples
result <- array(dim = c(22, 3, B))
for(i in 1:B)
{
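    ## [The rest of the loop was cut off here. The lines below are only a
    ##  guess at how the body might be completed, not the original code:
    ##  resample the rows of Dataset, refit the model, and store the
    ##  per-observation prediction intervals, which matches result's
    ##  22 x 3 x B dimensions (predict() with interval = "prediction"
    ##  returns the columns fit, lwr and upr). The names idx and fit are mine.]
    idx <- sample(nrow(Dataset), replace = TRUE)       # case (row) resampling
    fit <- lm(GDP.per.head ~ Competitivness.score + Quality.score,
              data = Dataset[idx, ])
    result[, , i] <- predict(fit, newdata = Dataset,
                             interval = "prediction")  # fit, lwr, upr
}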
Hello,
Try to follow the example below and see if you can adapt it to your
needs. Since you don't provide us with a dataset example, I start by
making some up.

# make up some data
n <- 22
set.seed(8873)
dat <- data.frame(x1 = rnorm(n), x2 = rnorm(n))
dat$y <- dat$x1 + dat$x2 + rnorm(n)   # x1 and x2 live in dat, not the workspace

B <- 100 # number of bootstrap samples
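## [The rest of the example was cut off here. Below is a sketch of how it
##  might continue, not the original code: a case-resampling bootstrap that
##  stores the prediction intervals for the n observations, plus one simple
##  way to summarise the B intervals. The names result, idx and fit are mine.]
result <- array(dim = c(n, 3, B))        # fit, lwr, upr for each observation
for(i in 1:B) {
    idx <- sample(n, replace = TRUE)     # resample the rows of dat
    fit <- lm(y ~ x1 + x2, data = dat[idx, ])
    result[, , i] <- predict(fit, newdata = dat, interval = "prediction")
}
# e.g. the pointwise bootstrap median of fit, lwr and upr (one of several
# possible summaries)
apply(result, c(1, 2), median)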
Dear experts,

I have done a multiple linear regression on a small sample size (n=22).
I have computed the prediction intervals (not the confidence intervals).

Now I am trying to bootstrap the prediction intervals.

I didn't find any package doing that.
So I decided to create my own R ...