Hello,
I'm studying bootstrapping.
I wrote code that draws a sample of heights and then computes the
sample mean of those heights.
I want to compute the variance of the sample mean by bootstrapping.
I am comparing this with the "real" variance of the sample mean and with
an "estimated" variance
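A minimal sketch of that comparison (all data simulated; names and sizes are illustrative):

```r
# Bootstrap estimate of Var(sample mean), compared with the usual s^2/n
set.seed(42)
heights <- rnorm(50, mean = 170, sd = 10)   # one observed sample
B <- 2000
boot_means <- replicate(B, mean(sample(heights, replace = TRUE)))
var(boot_means)                  # bootstrap variance of the sample mean
var(heights) / length(heights)   # "estimated" variance, s^2/n
```

The "real" variance, sigma^2/n = 100/50 = 2, is only available here because the data are simulated.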
Hello~
I'm fitting a random effects model using coxme and I was wondering how
the variance components are estimated. Are the fixed and random effects
estimated iteratively using the Fisher scoring method?
I referred to the coxme manual, but it didn't specify how the parameters are
estimated
(https://c
I think your imprecise use of statistical methods is getting you into trouble.
A literal interpretation of your question would lead to var(my.data$fluo), but
whether that number would be meaningful would depend on what you did with it (I
doubt much good would come from using it directly). Unfort
Dear all,
how can I calculate the global variance of repeated measurements? Can
I simply use the var() function, or should I use more sophisticated
tools such as aov()? And in the latter case, how can I extract the
variance value?
I am providing an example.
Thank you.
best regards
luigi
>>>
samp <-
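One hedged sketch of the aov() route, on hypothetical repeated measurements (6 subjects, 4 replicates each): a one-way random-effects ANOVA splits the variance into between- and within-subject components, whose sum is one reasonable "global" variance of a single measurement.

```r
# Hypothetical repeated measurements: 6 subjects, 4 replicates each
set.seed(1)
samp <- data.frame(subject = factor(rep(1:6, each = 4)),
                   value   = rnorm(24, mean = rep(rnorm(6), each = 4)))
tab <- summary(aov(value ~ subject, data = samp))[[1]]
msb <- tab[["Mean Sq"]][1]                 # between-subject mean square
msw <- tab[["Mean Sq"]][2]                 # within-subject (residual) mean square
sigma2_between <- max((msb - msw) / 4, 0)  # component estimate, 4 reps/subject
sigma2_within  <- msw
sigma2_between + sigma2_within             # "global" variance of one measurement
```

Plain var(samp$value) is not wrong, but it mixes the two components with weights set by the design, which may or may not be what is wanted.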
Hi all,
I am trying to calculate the variance-covariance matrix for parameter Beta
under the null (Ho) using the "prop.odds" function in the timereg package.
In other words, I am looking for Var(Beta under the null).
For the Cox PH model, I used the "vcov" function and did the following:
The difference is that survreg is using a maximum likelihood estimate (MLE) of the
variance and that lm is using the unbiased (MVUE) estimate of variance. For simple linear
regression, the former divides by "n" and the latter by "n-p". The difference in your
variances is exactly n/(n-p) = 10/8
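That ratio is easy to check on hypothetical data (with a single argument, Surv(y) treats every observation as an event, i.e. no censoring):

```r
library(survival)
set.seed(1)
n <- 10
x <- rnorm(n)
y <- 20 + 3 * x + rnorm(n)
sfit <- survreg(Surv(y) ~ x, dist = "gaussian")  # MLE: divides by n
lfit <- lm(y ~ x)                                # unbiased: divides by n - p
summary(lfit)$sigma^2 / sfit$scale^2             # n/(n - p) = 10/8 = 1.25
```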
I would like help understanding why a survival regression with no censored
data points does not give the same variance estimates as a linear model
(see code below).
I think it must be something to do with the fact that the variance is an
actual parameter in the survival version via the log(scale),
ll open a
> separate thread
> in the case.
>
> Thanks.
>
> ---
>
> Giorgio
>
> Genoa, Italy
>
> From: Tsjerk Wassenaar [mailto:tsje...@gmail.com]
> Sent: Sunday, 10 May 2015 22:31
> To: Giorgio Garziano
> Cc: r-help@r-project.org
> Subject: Re: [R] Va
: Re: [R] Variance-covariance matrix
Hi Giorgio,
This is for a multivariate time series. x1 is variable 1 of the observation
vector x, x2, variable 2, etc. If you need x(i) and x(i+1), etc, then you're
looking for the autocovariance/autocorrelation matrix, which is a quite
different thing
nce: “Time series and its applications – with R examples”,
> Springer,
>
> §7.8 "Principal Components", pp. 468-469
>
>
>
> Cheers,
>
>
>
> Giorgio
>
>
>
>
>
> *From:* Tsjerk Wassenaar [mailto:tsje...@gmail.com]
> *Sent:* domenica 10 mag
-project.org
Subject: Re: [R] Variance-covariance matrix
Hi Giorgio,
For a univariate time series? Seriously?
data <- rnorm(10,2,1)
as.matrix(var(data))
Cheers,
Tsjerk
On Sun, May 10, 2015 at 9:54 PM, Giorgio Garziano
mailto:giorgio.garzi...@ericsson.com>> wrote:
Hi,
Actually as
ata.center)
>
> --
> Giorgio Garziano
>
>
> -Original Message-
> From: David Winsemius [mailto:dwinsem...@comcast.net]
> Sent: Sunday, 10 May 2015 21:27
> To: Giorgio Garziano
> Cc: r-help@r-project.org
> Subject: Re: [R] Variance-covariance matrix
>
<- (1/(n-1)) * data.center %*% t(data.center)
--
Giorgio Garziano
-Original Message-
From: David Winsemius [mailto:dwinsem...@comcast.net]
Sent: Sunday, 10 May 2015 21:27
To: Giorgio Garziano
Cc: r-help@r-project.org
Subject: Re: [R] Variance-covariance matrix
On May 10, 2015, at 4:27
On May 10, 2015, at 4:27 AM, Giorgio Garziano wrote:
> Hi,
>
> I am looking for a R package providing with variance-covariance matrix
> computation of univariate time series.
>
> Please, any suggestions ?
If you mean the auto-correlation function, then the stats package (loaded by
default at
Hi,
I am looking for an R package that computes the variance-covariance
matrix of a univariate time series.
Please, any suggestions?
Regards,
Giorgio
exas A&M University
> > College Station, TX 77840-4352
> >
> >
> > -Original Message-
> > From: R-help [mailto:r-help-boun...@r-project.org] On Behalf Of Karl Fetter
> > Sent: Monday, February 9, 2015 3:33 PM
> > To: r-help@r-project.org
>
Karl Fetter
> Sent: Monday, February 9, 2015 3:33 PM
> To: r-help@r-project.org
> Subject: [R] Variance is different in R vs. Excel?
>
> Hello everyone, I have a simple question. when I use the var() function in
> R to find a variance, it differs greatly from the variance found i
A&M University
College Station, TX 77840-4352
-Original Message-
From: R-help [mailto:r-help-boun...@r-project.org] On Behalf Of Karl Fetter
Sent: Monday, February 9, 2015 3:33 PM
To: r-help@r-project.org
Subject: [R] Variance is different in R vs. Excel?
Hello everyone, I have a simple q
Hello everyone, I have a simple question. When I use the var() function in
R to find a variance, it differs greatly from the variance found in Excel
using the =VAR.S function. Any explanations on what those two functions are
actually doing?
Here is the data and the results:
dat<-matrix(c(402,908,
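A likely culprit, sketched with made-up numbers: var() applied to a matrix returns a covariance matrix of its columns, while Excel's =VAR.S treats the selected cells as one sample. Both use the n - 1 divisor, so flattening the matrix reconciles them:

```r
dat <- matrix(c(402, 908, 513, 677), nrow = 2)  # hypothetical data
var(dat)      # 2 x 2 covariance matrix of the columns
var(c(dat))   # one sample variance over all cells, comparable to =VAR.S
```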
>>>> On 04/11/14 16:13, PIKAL Petr wrote:
>>>>>> Hi
>>>>>>
>>>>>>> -Original Message-
>>>>>>> From: r-help-boun...@r-project.org [mailto:r-help-bounces@r-
>>>>>>> project.org] On
Hi
>>>>>
>>>>>> -Original Message-
>>>>>> From: r-help-boun...@r-project.org [mailto:r-help-bounces@r-
>>>>>> project.org] On Behalf Of CJ Davies
>>>>>> Sent: Tuesday, November 04, 2014 2:50 PM
>>>>
-
>>>>> From: r-help-boun...@r-project.org [mailto:r-help-bounces@r-
>>>>> project.org] On Behalf Of CJ Davies
>>>>> Sent: Tuesday, November 04, 2014 2:50 PM
>>>>> To: Jim Lemon; r-help@r-project.org
>>>>> Subject: Re:
PM
To: Jim Lemon; r-help@r-project.org
Subject: Re: [R] Variance of multiple non-contiguous time periods?
On 04/11/14 09:11, Jim Lemon wrote:
On Mon, 3 Nov 2014 12:45:03 PM CJ Davies wrote:
...
On 30/10/14 21:33, Jim Lemon wrote:
If I understand, you mean to calculate deviations for each
day, November 04, 2014 2:50 PM
>>> To: Jim Lemon; r-help@r-project.org
>>> Subject: Re: [R] Variance of multiple non-contiguous time periods?
>>>
>>> On 04/11/14 09:11, Jim Lemon wrote:
>>>> On Mon, 3 Nov 2014 12:45:03 PM CJ Davies wrote:
>>>
On 04/11/14 16:13, PIKAL Petr wrote:
Hi
-Original Message-
From: r-help-boun...@r-project.org [mailto:r-help-bounces@r-
project.org] On Behalf Of CJ Davies
Sent: Tuesday, November 04, 2014 2:50 PM
To: Jim Lemon; r-help@r-project.org
Subject: Re: [R] Variance of multiple non-contiguous
Hi
> -Original Message-
> From: r-help-boun...@r-project.org [mailto:r-help-bounces@r-
> project.org] On Behalf Of CJ Davies
> Sent: Tuesday, November 04, 2014 2:50 PM
> To: Jim Lemon; r-help@r-project.org
> Subject: Re: [R] Variance of multiple non-contiguous time per
On 04/11/14 09:11, Jim Lemon wrote:
On Mon, 3 Nov 2014 12:45:03 PM CJ Davies wrote:
...
On 30/10/14 21:33, Jim Lemon wrote:
If I understand, you mean to calculate deviations for each individual
'chunk' of each transition & then aggregate the results? This is what
I'd been thinking about, but is
On Mon, 3 Nov 2014 12:45:03 PM CJ Davies wrote:
> ...
> On 30/10/14 21:33, Jim Lemon wrote:
> If I understand, you mean to calculate deviations for each individual
> 'chunk' of each transition & then aggregate the results? This is what
> I'd been thinking about, but is there a sensible manner withi
On 30/10/14 21:33, Jim Lemon wrote:
> On Fri, 31 Oct 2014 07:19:01 AM Jim Lemon wrote:
>> On Wed, 29 Oct 2014 05:12:19 PM CJ Davies wrote:
>>> I am trying to show that the red line ('yaw') in the upper of the two
>>> plots here;
>>>
>>> http://i.imgur.com/N4Xxb4f.png
>>>
>>> varies more within the
On Fri, 31 Oct 2014 07:19:01 AM Jim Lemon wrote:
> On Wed, 29 Oct 2014 05:12:19 PM CJ Davies wrote:
> > I am trying to show that the red line ('yaw') in the upper of the two
> > plots here;
> >
> > http://i.imgur.com/N4Xxb4f.png
> >
> > varies more within the pink sections ('transition 1') than i
On Wed, 29 Oct 2014 05:12:19 PM CJ Davies wrote:
> I am trying to show that the red line ('yaw') in the upper of the two
> plots here;
>
> http://i.imgur.com/N4Xxb4f.png
>
> varies more within the pink sections ('transition 1') than in the light
> blue sections ('real').
>
> I tried to use var.t
I am trying to show that the red line ('yaw') in the upper of the two
plots here;
http://i.imgur.com/N4Xxb4f.png
varies more within the pink sections ('transition 1') than in the light
blue sections ('real').
I tried to use var.test() however this runs into a problem because
although the re
Dear contributors,
I know that this has been widely discussed, but even after having read many
discussions on this matter, I'm still not sure I'm understanding it
properly.
So I have a dataset of studies reporting prevalence in several settings;
here is an example:
data<-data.frame(id_study=c("U
Dear R-helpers...
I've been trying to run a variance analysis to compare means between various
lines in various treatments.
I have 10 genotypes (GEN), tested in 2 environments (ENV), and in each
environment there are 3 repetitions (REP). Several traits were recorded (yield,
flowering, plant height).
Hi all,
I've been attempting to fit a logistic glmm using glmmPQL in order to
estimate variance components for a score test, where the model is of the
form logit(mu) = X*a+ Z1*b1 + Z2*b2. Z1 and Z2 are actually reduced rank
square root matrices of the assumed covariance structure (up to a constan
Hi,
I have simulated one possible path of a variance gamma process by the
following code:
vektor <- numeric(23)
S0 <- 20
theta <- 0.01   # drift of the subordinated Brownian motion
v <- 5          # variance rate of the gamma subordinator
sigma <- 0.1
vektor[1] <- S0
for (i in 2:23) {
  randomgamma <- rgamma(1, shape = 1/v, scale = v)  # gamma time increment
  randomnormal <- rnorm(1, mean = 0, sd = 1)
  vektor[i] <- vektor[i-1] + theta*randomgamma + sigma*sqrt(randomgamma)*randomnormal
}
You've stumbled across the answer to your question --
while lm() supports y ~ X formulas without a data= argument
and y ~ X1+X2+X3 formulas with one, you can't depend on
all contributed functions to do the same.
As John pointed out, the advantage of car::vif over other
implementations is that it cor
Dear Martin,
> -Original Message-
> From: r-help-boun...@r-project.org [mailto:r-help-boun...@r-project.org]
> On Behalf Of Martin H. Schmidt
> Sent: Thursday, September 20, 2012 8:52 AM
> To: r-help@r-project.org
> Subject: [R] Variance Inflation Factor VIC() wit
Hi everyone,
Running the vif() function from the car package like
> reg2 <- lm(CARsPur~Delay_max10+LawChange+MarketTrend_20d+MultiTrade,
data=data.frame(VarVecPur))
> vif(reg2)
Delay_max10 LawChange MarketTrend_20d MultiTrade
Hi All,
I am analyzing a set of data collected by two-stage cluster sampling. My
model is
y_ij = mu + T_i + e_ij
where T_i is the ith treatment and e_ij is random error for the ijth
individual. I have MSE_within and MSE_between, which lead to MSE_T for the
model.
Suppose I have balanced data whe
-Original Message-
> From: R. Michael Weylandt [mailto:michael.weyla...@gmail.com]
> Sent: Wednesday, July 11, 2012 4:04 PM
> To: Hui Du
> Cc: Jorge I Velez; R-help
> Subject: Re: [R] Variance Inflation factor
>
> You're rather out of date with your version of R -- if y
ginal Message-
From: R. Michael Weylandt [mailto:michael.weyla...@gmail.com]
Sent: Wednesday, July 11, 2012 4:04 PM
To: Hui Du
Cc: Jorge I Velez; R-help
Subject: Re: [R] Variance Inflation factor
You're rather out of date with your version of R -- if you want to use
the CRAN binaries p
n_US.UTF-8 LC_IDENTIFICATION=C
>
> attached base packages:
> [1] stats graphics grDevices utils datasets methods base
>
> HXD
>
> From: Jorge I Velez [mailto:jorgeivanve...@gmail.com]
> Sent: Wednesday, July 11, 2012 3:31 PM
> To: Hui Du
> Cc: R-help
> Subject: Re:
vailable
> >
> >
> > HXD
> >
> >
> > *From:* Jorge I Velez [mailto:jorgeivanve...@gmail.com]
> > *Sent:* Wednesday, July 11, 2012 3:19 PM
> > *To:* Hui Du
> > *Cc:* R-help
> > *Subject:* Re:
PM
To: Hui Du
Cc: R-help
Subject: Re: [R] Variance Inflation factor
Could you please include your sessionInfo() ?
Thank you,
Jorge.-
On Wed, Jul 11, 2012 at 6:27 PM, Hui Du
mailto:hui...@dataventures.com>> wrote:
Thanks. But in UNIX side, I got the same error
In getDependencies(pkgs, dep
r' is not available
>
>
> HXD
>
>
> *From:* Jorge I Velez [mailto:jorgeivanve...@gmail.com]
> *Sent:* Wednesday, July 11, 2012 3:19 PM
> *To:* Hui Du
> *Cc:* R-help
> *Subject:* Re: [R] Variance Inflation factor
>
>
> S
Thanks. But on the UNIX side, I got the same error
In getDependencies(pkgs, dependencies, available, lib) :
package 'car' is not available
HXD
From: Jorge I Velez [mailto:jorgeivanve...@gmail.com]
Sent: Wednesday, July 11, 2012 3:19 PM
To: Hui Du
Cc: R-help
Subject: Re: [R] Variance
See the examples at
# install.packages('car')
require(car)
?vif
HTH,
Jorge.-
On Wed, Jul 11, 2012 at 6:10 PM, Hui Du <> wrote:
> Hi All,
>
>
> I need to calculate VIF (variance inflation factor) for my linear
> regression model. I found there was a function named vif in 'HH' package.
> I have
Hi All,
I need to calculate VIF (variance inflation factor) for my linear regression
model. I found there was a function named vif in 'HH' package. I have two
questions:
1) I was able to install that package in my R under windows. But while
trying to install that package in UNIX, I got
On Fri, Jun 22, 2012 at 5:13 AM, Mohan Radhakrishnan wrote:
> Hi,
>
>
>
> Is there a way to calculate variance directly by specifying
> confidence interval using R ? I am specifically asking because I wanted
> to investigate how this could be useful for project schedule variance
> calculation
Hi,
Is there a way to calculate variance directly by specifying
confidence interval using R ? I am specifically asking because I wanted
to investigate how this could be useful for project schedule variance
calculation.
Moreover I am interested in using R for monte carlo simulation as
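If the interval is a normal-based confidence interval for a mean, the variance can be backed out from its half-width; a sketch under that assumption (all numbers hypothetical):

```r
ci <- c(9.2, 10.8)  # hypothetical 95% CI for a mean
n  <- 25            # sample size behind the interval
half <- diff(ci) / 2
z <- qnorm(0.975)
s2 <- (half * sqrt(n) / z)^2  # since half-width = z * s / sqrt(n)
s2                            # implied sample variance
```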
On 22 Feb 2012, at 14:01, Terry Therneau wrote:
> --- begin included message ---
> I have a left truncated, right censored cox model:
>
> coxph(Surv(start, stop, censor) ~ x + y, mydata)
>
> I would like to know how much of the observed variance (as a number
> between 0 and 1) is explained by ea
--- begin included message ---
I have a left truncated, right censored cox model:
coxph(Surv(start, stop, censor) ~ x + y, mydata)
I would like to know how much of the observed variance (as a number
between 0 and 1) is explained by each variable. How could I do that?
Adding terms sequentially an
Hi All,
I have a left truncated, right censored cox model:
coxph(Surv(start, stop, censor) ~ x + y, mydata)
I would like to know how much of the observed variance (as a number between 0
and 1) is explained by each variable. How could I do that?
Adding terms sequentially and then using anova(
Dear Prof. Wood,
I read your methods of extracting the variance explained by each
predictor in different places. My question is: using the method you
suggested, the sum of the deviance explained by all terms is not equal to
the deviance explained by the full model. Could you tell me what caused
Hi,
Searching on http://www.rseek.org for "variance ratio test" turns up the
vrtest package, as does searching for Lo and Mackinlay,
suggesting that's a good place to start.
Sarah
On Wed, Oct 5, 2011 at 2:48 PM, rauf ibrahim wrote:
> Hello,
> I am looking for a code in R for the variance ratio
Hello,
I am looking for a code in R for the variance ratio test statistic (the
Lo and Mackinlay version or any other versions).
Does anybody have such a code they can share or know a library in which
I can find this function?
Basically I have a number of time series which I need to check for
per
Oh silly me--and I've been staring at that for a good hour. Thank you and
I'll keep your advice in mind.
On Thu, Apr 28, 2011 at 6:24 PM, Andrew Robinson <
a.robin...@ms.unimelb.edu.au> wrote:
> A couple of points here
>
> First, note that q doesn't increment in the code below. So, you're
>
A couple of points here
First, note that q doesn't increment in the code below. So, you're
getting the same variance each time.
Second, note that (t$Rec1==input3 & t$Rec2==input4) evaluates to F/T
or 0/1, and it's not clear from your code if that is what you intend.
Finally, it's much easi
I'm trying to find the variance of various outputs in a matrix:
for(l in 2:vl){
for(o in 1:(l-1)){
# Make sure the inputs are for the matrix "m"
input3=rownames(v)[o]
input4=colnames(v)[l]
r=t[(t$Rec1==input3 & t$Rec2==input4),output]
if(length(r)==0){
r=t[(t$Rec1==i
Hi:
I didn't see anything on first blush from the mod1 or summary(mod1) objects,
but it's not too hard to compute:
> names(mod1)
 [1] "coefficients"      "icoef"             "var"
 [4] "var2"              "loglik"            "iter"
 [7] "linear.predictors" "frail"             "fvar"
[10] "df"
I have the following questions about the variance of the random effects in the
survreg() function in the survival package:
1) How can I extract the variance of the random effects after fitting a model?
For example:
set.seed(1007)
x <- runif(100)
m <- rnorm(10, mean = 1, sd =2)
mu <- rep(m, rep(
Picking up an ancient thread (from Oct 2007), I have a somewhat more complex
problem than given in Simon Wood's example below. My full model has more than
two smooths as well as factor variables as in this simplified example:
b <- gam(y~fv1+s(x1)+s(x2)+s(x3))
Judging from Simon's example, my g
hi everybody,
i know it a quite complicate subject but someone might have the
solution.
I am doing a delta model coupling a binomial glm and a lognormal one.
Using the Laurent correction I can predict mean values and I would
like to know if you know how to predict the variance?
Do you know the t
Dear all
I run a regression model with three predictors. When I try the Variance
Inflation Factors command from Rcmdr menue, I get the message
vif(LinearModel.4)
ERROR: attempt to set an attribute on NULL
and get no results. I know that there is high multicolinearity, but why does it
not work
Hello all and thanks in advance for any advice.
I would like to calculate the variance inflation factor for a linear model
(lm) with 4 explanatory variables. I would then like to use this to
calculate QAIC. I have used the function vif() in the car package and I get
values for each variable howe
Fantastic! it's solved! Thank you very much Bill!
Barbara
--- On Wed, 7/28/10, bill.venab...@csiro.au wrote:
> From: bill.venab...@csiro.au
> Subject: RE: [R] Variance-covariance matrix from GLM
> To: bojuanz...@yahoo.com, r-help@r-project.org
> Date: Wednesday, July
?vcov ### now in the stats package
You would use
V <- vcov(my.glm)
-Original Message-
From: r-help-boun...@r-project.org [mailto:r-help-boun...@r-project.org] On
Behalf Of Bojuan Zhao
Sent: Thursday, 29 July 2010 9:52 AM
To: r-help@r-project.org
Subject: [R] Variance-covaria
Hello,
Is there a way to obtain the variance-covariance matrix of the estimated
parameters from GLM?
my.glm<-glm(mat ~X,family = binomial, data =myDATA)
out1<-predict(my.glm,se.fit = TRUE)
std<-out1$se.fit
se.fit is for getting the standard errors of the estimated parameters (\betas).
Is the
Thank you for response.
For question 2,
Since I need to know the expectation of Y for new observations, let's say
X*.
So I need to know the expectation and also the variance of log (Y|X*).
I know 'fitted(lin)' will give me the E[log(Y|X*)]. But I do not know how to
get var[log(Y|X*)] or say sd[
Hi:
On Wed, Jul 21, 2010 at 2:29 PM, Yi wrote:
> Hi, folks,
>
> Here are the codes:
>
> ##
> y=1:10
> x=c(1:9,1)
> lin=lm(log(y)~x) ### log(y) is following Normal distribution
> x=5:14
> prediction=predict(lin,newdata=x) ##prediction=predict(lin)
> ###
>
predict() need
Sorry, for the second question I stated it the wrong way. My aim is the mean
and sd of each new observation.
#
mean=fitted(prediction)
##
But I do not know how to get sd for each new observation.
Any tips?
Thanks
Yi
On Wed, Jul 21, 2010 at 2:29 PM, Yi wrote:
> Hi, folks,
>
> Here a
Hi, folks,
Here are the codes:
##
y=1:10
x=c(1:9,1)
lin=lm(log(y)~x) ### log(y) is following Normal distribution
x=5:14
prediction=predict(lin,newdata=x) ##prediction=predict(lin)
###
1. The codes do not work, and give the error message: Error in
eval(predvars, data, en
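The error comes from passing a plain vector as newdata: predict() expects a data.frame whose column names match the variables in the formula. A corrected sketch, which also returns per-observation standard errors of the fitted mean via se.fit:

```r
y <- 1:10
x <- c(1:9, 1)
lin <- lm(log(y) ~ x)
pred <- predict(lin, newdata = data.frame(x = 5:14), se.fit = TRUE)
pred$fit     # E[log(y) | x*] at each new x
pred$se.fit  # standard error of each fitted mean
```

For the sd of a new observation (rather than of its mean), the residual variance must be added; predict(..., interval = "prediction") handles this for interval estimates.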
How does mlest generate the estimate for sigmahat, the variance-covariance
matrix? It produces different values than when using cov(data.frame).
--
View this message in context:
http://r.789695.n4.nabble.com/variance-covariance-matrix-of-mlest-in-library-mvnmle-tp2232127p2232127.html
Sent from
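A plausible explanation (an assumption, not checked against mvnmle's source): mlest() returns the maximum likelihood estimate of the covariance matrix, which with complete data uses divisor n, whereas cov() uses n - 1:

```r
library(mvnmle)
set.seed(1)
d <- data.frame(x = rnorm(30), y = rnorm(30))
n <- nrow(d)
mlest(d)$sigmahat     # ML estimate, divisor n (up to optimizer tolerance)
cov(d) * (n - 1) / n  # rescaled cov() should closely match sigmahat
```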
I thought that function var() and first element of function acf(object,
type=c('covariance')) should give me the same results. But they differ. Can
someone share an explanation.
--
View this message in context:
http://r.789695.n4.nabble.com/Variance-vs-covariance-tp2173501p2173501.html
Sent fro
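The mismatch is the divisor: acf(type = "covariance") divides by n, while var() divides by n - 1. A quick check:

```r
set.seed(1)
x <- rnorm(20)
n <- length(x)
a0 <- acf(x, lag.max = 0, type = "covariance", plot = FALSE)$acf[1]
c(a0, var(x) * (n - 1) / n)  # the two agree once rescaled
```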
On Mon, Mar 8, 2010 at 3:44 PM, casperyc wrote:
>
> Hi Rolf Turner ,
>
> God, it directed to the wrong page.
>
> I firstly find the formula in wiki, than tried to verify the answer in R,
> now, given that 143/12 ((n^2-1)/12 ) is the correct answer for a discrete
> uniform random variable,
> I am s
Hi Rolf Turner,
God, it directed to the wrong page.
I first found the formula on the wiki, then tried to verify the answer in R.
Now, given that 143/12 ((n^2-1)/12) is the correct answer for a discrete
uniform random variable,
I am still not sure what R is calculating there.
Why does it give me 13?
On 9/03/2010, at 12:13 PM, casperyc wrote:
>
> Hi all,
>
> I am REALLY confused with the variance right now.
You need to learn the difference
(a) between the sample variance (an *estimate* of the population variance)
and the population variance,
and
Hi all,
I am REALLY confused with the variance right now.
for a discrete uniform distribution on [1,12]
the mean is (1+12)/2=6.5
which is ok.
y=1:12
mean(y)
then var(y)
gives me 13
1- on http://en.wikipedia.org/wiki/Uniform_distribution_%28discrete%29 wiki
the variance is (12^2-1)/12=
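Both numbers are right; they answer different questions. var() computes the sample variance (divisor n - 1), while the wiki's (n^2 - 1)/12 is the population variance (divisor n):

```r
y <- 1:12
var(y)                 # 143/11 = 13, sample variance (divisor n - 1)
mean((y - mean(y))^2)  # 143/12, population variance, i.e. (12^2 - 1)/12
```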
Dear R-help suscribers,
I am doing a meta-analysis of sea urchin growth data in R. I am
fitting a non-linear growth function using nlme(). Most of my
observations are means, and I want to give them weights according to
the number of individuals that were used to obtain those means (as I
do not h
hello all -
i was searching for theoretical articles/vector equations regarding variance
inflation factor (or generalization) for the linear mixed effects model
(repeated measures data)
sincerely,
tom
--
View this message in context:
http://n4.nabble.com/variance-inflation-factor-for-linear-mixe
Hello
I am a new user of R software. I benefit from using vrtest-package. However,
the codes provided by the aforementioned package, for example, calculate the
test statistics for Lo and Mackinlay (1988) under the assumptions of
homoscedasticity and heteroscedasticity without computing the value
Simon,That produced exactly what I was looking for. Thanks so much for the
humble help.
KC
On Mon, Jul 13, 2009 at 9:10 AM, Simon Wood wrote:
> You can get some idea by doing something like the following, which compares
> the r^2 for models b and b2, i.e. with and without s(x2). It keeps the
It appears you are conflating beta coefficients (individual covariate
effect measures) with overall model fit measures. Beta coefficients
are not directly comparable to R-squared measures in ordinary least
squares analyses, so why would they be so in gam models?
I cannot tell whether you ac
You can get some idea by doing something like the following, which compares
the r^2 for models b and b2, i.e. with and without s(x2). It keeps the
smoothing parameters fixed for the comparison. (s(x,fx=TRUE) removes
penalization altogether btw, which is not what was wanted).
dat <- gamSim(1,n
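A runnable version of that comparison might look like the following (data simulated with gamSim(); the essential step is refitting without s(x2) while holding the remaining smoothing parameters fixed via sp=):

```r
library(mgcv)
set.seed(2)
dat <- gamSim(1, n = 400)
b  <- gam(y ~ s(x0) + s(x1) + s(x2), data = dat)
b2 <- gam(y ~ s(x0) + s(x1), data = dat, sp = b$sp[1:2])  # same sp, drop s(x2)
summary(b)$r.sq - summary(b2)$r.sq  # share of r^2 attributable to s(x2)
```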
Many thanks for the advice David. I would really like to figure out, though,
how to get the contribution of each factor to the Rsq - something like a
Beta coefficient for GAM. Ideas?
KC
On Sun, Jul 12, 2009 at 5:41 PM, David Winsemius wrote:
>
> On Jul 12, 2009, at 5:06 PM, Kayce Anderson wrote
On Jul 12, 2009, at 5:06 PM, Kayce Anderson wrote:
Hi,
I am using mgcv:gam and have developed a model with 5 smoothed
predictors
and one factor.
gam1 <- gam(log.sp ~ s(Spr.precip, bs = "ts") + s(Win.precip, bs = "ts") +
    s(Spr.Tmin, bs = "ts") + s(P.sum.Tmin, bs = "ts") + s(Win.Tmax, bs = "ts") +
    factor(site), data = dat3)
Hi,
I am using mgcv:gam and have developed a model with 5 smoothed predictors
and one factor.
gam1 <- gam(log.sp ~ s(Spr.precip, bs = "ts") + s(Win.precip, bs = "ts") +
    s(Spr.Tmin, bs = "ts") + s(P.sum.Tmin, bs = "ts") + s(Win.Tmax, bs = "ts") +
    factor(site), data = dat3)
The total deviance explained = 70.4%.
On Tue, Jun 2, 2009 at 3:34 PM, Thomas Lumley wrote:
> The answers differ by a factor of 19/20, ie, (n-1)/n, so it is presumably
> the choice of denominator for the variance that differs.
>
Same issue is present in ccf():
cov() != ccf(lag.max=0, type="covariance").
Liviu
The answers differ by a factor of 19/20, ie, (n-1)/n, so it is presumably
the choice of denominator for the variance that differs.
-thomas
On Tue, 2 Jun 2009, Liviu Andronic wrote:
Dear all,
Does this make any sense:
var() = cov() != acf(lag.max=0, type="covariance")?
I have daily
Dear all,
Does this make any sense:
var() = cov() != acf(lag.max=0, type="covariance")?
I have daily data of IBM for May 2005, and I'm using the logarithmic return:
> ibm200505$LRAdj.Close
[1] NA 0.0203152 0.0005508 -0.0148397 -0.0025182 0.0092025
-0.0013889
[8] 0.0098196 -0.0103757
(this post suggests a patch to the sources, so i allow myself to divert
it to r-devel)
Bert Gunter wrote:
> x a numeric vector, matrix or data frame.
> y NULL (default) or a vector, matrix or data frame with compatible
> dimensions to x. The default is equivalent to y = x (but more efficient).
-project.org
Subject: Re: [R] variance/mean
rkevinbur...@charter.net wrote:
> At the risk of appearing ignorant why is the folowing true?
>
> o <- cbind(rep(1,3),rep(2,3),rep(3,3))
> var(o)
>      [,1] [,2] [,3]
> [1,]    0    0    0
> [2,]    0    0    0
> [3,]    0    0    0
Wacek Kusnierczyk wrote:
>
> when you apply var to a single matrix, it will compute covariances
> between its columns rather than the overall variance:
>
> set.seed(0)
> x = matrix(rnorm(4), 2, 2)
>
> var(x)
> #[,1] [,2]
> # [1,] 1.2629543 1.329799
>
rkevinbur...@charter.net wrote:
> At the risk of appearing ignorant why is the folowing true?
>
> o <- cbind(rep(1,3),rep(2,3),rep(3,3))
> var(o)
>      [,1] [,2] [,3]
> [1,]    0    0    0
> [2,]    0    0    0
> [3,]    0    0    0
>
> and
>
> mean(o)
> [1] 2
>
> How do I get mean to return an ar
On 22-Mar-09 08:17:29, rkevinbur...@charter.net wrote:
> At the risk of appearing ignorant why is the folowing true?
>
> o <- cbind(rep(1,3),rep(2,3),rep(3,3))
> var(o)
>      [,1] [,2] [,3]
> [1,]    0    0    0
> [2,]    0    0    0
> [3,]    0    0    0
>
> and
>
> mean(o)
> [1] 2
>
> How do
At the risk of appearing ignorant why is the folowing true?
o <- cbind(rep(1,3),rep(2,3),rep(3,3))
var(o)
     [,1] [,2] [,3]
[1,]    0    0    0
[2,]    0    0    0
[3,]    0    0    0
and
mean(o)
[1] 2
How do I get mean to return an array similar to var? I would expect in the
above example a
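For a per-column result analogous to var()'s treatment of the columns, colMeans() (or apply()) gives the expected array:

```r
o <- cbind(rep(1, 3), rep(2, 3), rep(3, 3))
var(o)             # 3 x 3 covariance matrix of the columns (all zeros here)
colMeans(o)        # 1 2 3
apply(o, 2, mean)  # same result
```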
I have the following script, how can I implement to achieve that calculate
the VIF.
Thanks.
U1.7km<-c(15:24)
R<-c(1.2,0.2,3.6,2.5,4.8,6.3,2.3,4.1,7.2,6.1)
Hm<-c(1:10)
mod<-nls(R~a*(U1.7km^b)*(Hm^c), start=list(a=2.031, b=0.800, c=-0.255),
trace=T)
summary(mod)
coef(mod)
coef(summary(mod))
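vif() is defined for linear models, so it does not apply to this nls fit directly. One hedged workaround (an assumption about what is wanted): since R ~ a*U^b*H^c is linear in logs, collinearity can be assessed on the log-linearized design, using the textbook definition VIF_j = 1/(1 - R^2_j), where R^2_j comes from regressing predictor j on the others:

```r
U1.7km <- 15:24
Hm <- 1:10
r2 <- summary(lm(log(U1.7km) ~ log(Hm)))$r.squared
1 / (1 - r2)  # VIF; with two predictors, both share this value
```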
1 - 100 of 133 matches