Hi all,
I am having a problem estimating a SETAR model: I always get an error message.
Here is the code:
## there are 4175 observations in the series (a).
> a[1:10,1]
 [1] 1.496498 1.496602 1.496636 1.496515 1.496515 1.496463 1.496429 1.496549 1.496480
[10] 1.496498
> library("tsDyn")
> selectSETAR
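(The call above was cut off in the preview. As a rough sketch of how selectSETAR is typically invoked on a univariate series -- argument names from memory of the tsDyn package, so check ?selectSETAR before relying on them:)

```r
library(tsDyn)

# sketch: grid-search candidate SETAR specifications on a sample series
x <- log10(lynx)
sel <- selectSETAR(x, m = 2)  # m = embedding dimension (autoregressive order)
sel                           # table of candidate thresholds/delays ranked by criterion
```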
Hi,
Thanks a lot for answer. It is what I mean.
But the code does not seem to work (
On Jul 19, 2012, at 8:52 AM, Petr Savicky [via R] wrote:
> On Wed, Jul 18, 2012 at 06:02:27PM -0700, bilelsan wrote:
> > Leave the Taylor expansion aside, how is it possible to compute with [R]:
> > f(e) =
On 12-07-22 5:33 PM, arun wrote:
Hi Duncan,
That was my original suggestion. His reply suggests that it is not what he
wanted.
I didn't see your reply. Maybe you sent it privately? In any case, I
think it is up to Sverre to give an example of what he wants, since your
suggestion, Weidon
Hi Valentin,
If the contamination is mainly in the response direction, M-estimator
provides good estimates for parameters and rlm can be used.
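(As a minimal sketch of that advice, with hypothetical data contaminated only in the response direction:)

```r
library(MASS)  # rlm() lives here

# hypothetical data with a few response-direction outliers
set.seed(42)
x <- 1:30
y <- 2 + 0.5 * x + rnorm(30)
y[c(5, 20)] <- y[c(5, 20)] + 15   # contaminate the response

fit <- rlm(y ~ x)   # Huber M-estimator by default
coef(fit)           # slope/intercept largely unaffected by the outliers
```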
Rohan
-Original Message-
From: r-sig-robust-boun...@r-project.org
[mailto:r-sig-robust-boun...@r-project.org] On Behalf Of Valentin
Todorov
Sent:
Hi Phil,
I think you want:
merge(listA, listB, by = "NACE")
which will give you:
  NACE Name aaa bbb ccc
1    1    a   a   a   c
2    1    a   a   a   c
3    1    a   a   a   c
4    2    b   a   a   c
5    2    b   a   a   c
6    3    c   a   a   c
If you want to get rid of the Name column, th
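(The message was cut off there; one common way to drop a column by name, using the listA/listB objects from this thread, is:)

```r
merged <- merge(listA, listB, by = "NACE")
merged$Name <- NULL   # drop the Name column in place
# or, equivalently, keep everything except it:
merged <- merged[, setdiff(names(merged), "Name")]
```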
On Sun, 22 Jul 2012 22:05:14 -0700 Bert Gunter
wrote:
> >
> > On Sun, Jul 22, 2012 at 8:26 PM, Ranjan Maitra
> > wrote:
> >>
> >>> Just reset the levels of z$sigma (and also redefine sigmaExpr):
> >>>
> >>>z$sigma <- factor(z$sigma,
> >>>levels = c(5,10,20,30,50)) # new l
There's a typo below. It's Deepayan Sarkar.
-- Bert
On Sun, Jul 22, 2012 at 9:55 PM, Bert Gunter wrote:
> inline.
>
> -- Bert
>
> On Sun, Jul 22, 2012 at 8:26 PM, Ranjan Maitra
> wrote:
>>
>>> Just reset the levels of z$sigma (and also redefine sigmaExpr):
>>>
>>>z$sigma <- factor(z$sigma,
!!! Well, that strikes me as a fair bit of chutzpah. Some generous
soul may well respond, but why not do your own debugging using R's
debugging tools. It will serve you well in the long run to put in the
effort now to learn them. Debugging is a major part of any
programming.
?trace
?debug
?browser
Can someone verify for me if the for loop below is really calculating the
nonzero min for each row of a matrix? I have a bug somewhere in this
section of code. My first guess is that it is in how I am finding the nonzero min of each
row of my matrix. The overall idea is to make sure I am investing all of my
m
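(For comparison with the truncated loop above, a loop-free way to get each row's nonzero minimum -- assuming "nonzero" means entries equal to 0 should be skipped:)

```r
m <- matrix(c(0, 3, 5,
              2, 0, 0,
              4, 1, 7), nrow = 3, byrow = TRUE)

# per-row minimum over the nonzero entries only
nzmin <- apply(m, 1, function(r) min(r[r != 0]))
nzmin  # 3 2 1
```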
> Just reset the levels of z$sigma (and also redefine sigmaExpr):
>
>z$sigma <- factor(z$sigma,
>levels = c(5,10,20,30,50)) # new levels order
>
>sigmaExprList <- lapply(as.numeric(levels(z$sigma)),
>function(s) bquote(sigma == .(s)))
>
On 2012-07-22 19:09, Ranjan Maitra wrote:
On Sun, 22 Jul 2012 18:58:39 -0700 Peter Ehlers
wrote:
On 2012-07-22 18:03, Ranjan Maitra wrote:
[I had to dig back to see what your Q2 was. It's good to keep context.]
Try this:
p <- bwplot(Error~Method | sigma + INU, data = z,
s
Hello,
image(1:100, 1:100, x)
Regards,
Pascal
On 23/07/12 11:28, li li wrote:
Dear all,
I have a question regarding changing the xlim and ylim in the function
image().
For example, in the following code, how can I have a heatmap with
xlim=ylim=c(0, 100)
instead of (0,1).
Thank
Dear all,
I have a question regarding changing the xlim and ylim in the function
image().
For example, in the following code, how can I have a heatmap with
xlim=ylim=c(0, 100)
instead of (0,1).
Thank you very much.
x <- matrix(rnorm(1e4, 0, 1), 100, 100)
image(x)
Hannah
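(Pascal's suggestion, spelled out against Hannah's example: supplying explicit coordinate vectors whose lengths match the matrix dimensions makes the axes run over 1..100 instead of (0,1):)

```r
x <- matrix(rnorm(1e4), 100, 100)
# x/y coordinate vectors set the axis ranges of the heatmap
image(1:100, 1:100, x, xlab = "x", ylab = "y")
```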
On Sun, 22 Jul 2012 18:58:39 -0700 Peter Ehlers
wrote:
> On 2012-07-22 18:03, Ranjan Maitra wrote:
> >
> [I had to dig back to see what your Q2 was. It's good to keep context.]
>
> Try this:
>
> p <- bwplot(Error~Method | sigma + INU, data = z,
> scale
On 2012-07-22 18:03, Ranjan Maitra wrote:
[I had to dig back to see what your Q2 was. It's good to keep context.]
Try this:
p <- bwplot(Error~Method | sigma + INU, data = z,
scales = list(rot=90), horiz = FALSE,
layout = c(5,3), col = "red")
require(latticeExtra
Dear R help,
Does no one have an idea of where I might find information that could help
me with this problem? I apologize for re-posting - I have half a suspicion
that my original message did not make it through.
I hope you all had a good weekend and look forward to your reply,
MO
On Fri, Jul
> >> [I had to dig back to see what your Q2 was. It's good to keep context.]
> >>
> >> Try this:
> >>
> >>p <- bwplot(Error~Method | sigma + INU, data = z,
> >> scales = list(rot=90), horiz = FALSE,
> >> layout = c(5,3), col = "red")
> >>
> >>require(latticeExtra)
> >>
Dear Henrik,
On Mon, 23 Jul 2012 00:56:16 +0200
Henrik Singmann wrote:
> Dear John,
>
> indeed, you are very right. Including the covariate as is doesn't make any
> sense. The only correct way would be to center it on the mean beforehand. So
> actually the examples in my first and second ma
On 2012-07-22 15:58, Ranjan Maitra wrote:
On Sun, 22 Jul 2012 15:04:36 -0700 Peter Ehlers
wrote:
On 2012-07-22 09:02, Ranjan Maitra wrote:
Dear friends,
Many thanks to Jim (Holtman) and David (Carlson) for their quick
responses: Q1 is now solved. There are two almost equivalent ways for
doin
On Sun, 22 Jul 2012 15:04:36 -0700 Peter Ehlers
wrote:
> On 2012-07-22 09:02, Ranjan Maitra wrote:
> > Dear friends,
> >
> > Many thanks to Jim (Holtman) and David (Carlson) for their quick
> > responses: Q1 is now solved. There are two almost equivalent ways for
> > doing this. They follow:
> >
Dear John,
indeed, you are very right. Including the covariate as is doesn't make any
sense. The only correct way would be to center it on the mean beforehand. So
actually the examples in my first and second mail are bogus (I add a corrected
example at the end) and the reported test do not m
On 2012-07-22 13:09, Henrik Singmann wrote:
Hi Mary,
I think the good old t-test is what you want:
Maybe, but calculating p-values with absolutely no consideration
of assumptions is pure folly. It may well be that Mary has some
assumptions in mind, but the way the question was posed does not
i
On 2012-07-22 09:02, Ranjan Maitra wrote:
Dear friends,
Many thanks to Jim (Holtman) and David (Carlson) for their quick
responses: Q1 is now solved. There are two almost equivalent ways for
doing this. They follow:
library(lattice)
z <- rbind(cbind(z, 0), cbind(z, 20), cbind(z, 40))
z <- cbind
Dear Henrik,
The within-subjects contrasts are constructed by Anova() to be orthogonal in
the row-basis of the design, so you should be able to safely ignore the effects
in which (for some reason that escapes me) you are uninterested. This would
also be true (except for the estimated error) for
On 12-07-22 3:37 PM, Mary Kindall wrote:
I have a value
a=300
observation (x) = sample(1:50)
How to find a p-value from this. I need to show that "a" is different from
mean(x).
Thanks
This question doesn't really make sense. sample(1:50) gives you the
same values as 1:50 does, just in a different order.
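(Duncan's point can be checked directly: permuting 1:50 changes the order but not the values, so the mean is unchanged:)

```r
set.seed(1)
x <- sample(1:50)
mean(x) == mean(1:50)   # TRUE: a permutation has the same mean
identical(sort(x), 1:50)   # TRUE: same values, different order
```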
On 12-07-22 12:27 PM, Sverre Stausland wrote:
reorder() is probably the best way to order the levels in a vector
without manually specifying the order. But reorder() orders by default
in an increasing order: "The levels are ordered such that the values
returned by ‘FUN’ are in increasing order."
Hi,
Try this:
dat1<-read.table(text="
NACE aaa bbb ccc
1 a a c
1 a a c
1 a a c
2 a a c
2 a a c
3 a a c
4 a a c
4 a a c
4 a a c
",sep="",header=TRUE)
dat2<-read.table(text="
Name NACE
a 1
b 2
c 3
",sep="",header=TRUE)
dat3<-merge(dat1,dat2)
dat3<-dat3[,1:4]
dat3
  NACE aaa bbb ccc
1    1   a   a   c
2    1   a   a   c
3    1   a   a   c
4    2   a   a   c
5    2   a   a   c
6    3   a   a   c
Hi,
Probably ?pnorm
x1<-mean(x)
x1
[1] 25.5
> pnorm(25.5,mean=300)
[1] 0
A.K.
- Original Message -
From: Mary Kindall
To: r-help@r-project.org
Cc:
Sent: Sunday, July 22, 2012 3:37 PM
Subject: [R] pvalue calculate
I have a value
a=300
observation (x) = sample(1:50)
How to find
By looking at your output, it didn't change the order of the levels.
(This is symptomatic of how difficult it is to change levels in R in
any automatic way.)
On Sun, Jul 22, 2012 at 7:31 PM, arun wrote:
> Hi,
>
> Not sure if this helps or not.
>
> with(InsectSprays, spray[order(count)])
> [1]
Not quite. It still orders the values in increasing order; you've
just changed the values here. I'm using reorder() to prepare for
plotting the values, so I can't change the values.
On Sun, Jul 22, 2012 at 6:51 PM, arun wrote:
> Hi,
>
> I hope this is what you are looking for.
>
>
> bymean1<-
Hi everybody,
I am currently quite inexperienced with R.
I try to create a function that simply take a value in a dataframe, look for
this value in another dataframe and copy all the row that have this value
This example is a simplified version of what I am doing but it's enough to
help me
listA
On 2012-07-17 05:13, R. Michael Weylandt wrote:
On Mon, Jul 16, 2012 at 3:39 PM, Oxenstierna wrote:
lapply(thing, function(x) x[['p.value']]) --works very well, thank you.
Not to be a chore, but I'm interested in comparing the results of
wilcox.test--and the methodology we've employed so fa
Hi Mary,
I think the good old t-test is what you want:
x <- sample(1:50)
t.test(x, mu = 300)
gives:
One Sample t-test
data: x
t = -133.2, df = 49, p-value < 2.2e-16
alternative hypothesis: true mean is not equal to 300
95 percent confidence interval:
21.36 29.64
sample
Dear John,
thanks for your response. But if I simply ignore the unwanted effects, the
estimates of the main effects for the within-subjects factors are distorted
(for the rationale, see below). Or doesn't this hold for between-within interactions?
Or put another way: Do you think this approach is the c
I have a value
a=300
observation (x) = sample(1:50)
How to find a p-value from this. I need to show that "a" is different from
mean(x).
Thanks
--
-
Mary Kindall
Yorktown Heights, NY
USA
Best to ask questions about Bioconductor packages on the bioconductor mailing
list.
---
Jeff Newmiller
Sverre,
have you tried putting a minus (-) in front of the variable by which you
order the other?
weidong
On Sun, Jul 22, 2012 at 12:27 PM, Sverre Stausland
wrote:
> reorder() is probably the best way to order the levels in a vector
> without manually specifying the order. But reorder() orders by d
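(weidong's minus-sign trick, written out on the InsectSprays data used elsewhere in this thread -- reorder() sorts levels so FUN(X) is increasing, and negating X flips that:)

```r
inc <- with(InsectSprays, reorder(spray, count, FUN = mean))
dec <- with(InsectSprays, reorder(spray, -count, FUN = mean))
levels(inc)  # levels in increasing order of mean count
levels(dec)  # the same levels in the reverse order
```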
Hello!
I am interested in creating contingency tables, namely one that would
let me find the frequency and proportion of patients with specific
risk factors (dyslipidemia, diabetes, obesity, smoking, hypertension).
There are 3 dimensions I would like to divide the population into:
sex, family hist
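(A hedged sketch of such a table with xtabs(); all variable names here are hypothetical, since the data weren't shown:)

```r
# hypothetical patient data with one risk factor shown
set.seed(7)
patients <- data.frame(
  sex      = sample(c("M", "F"), 200, replace = TRUE),
  famhist  = sample(c("yes", "no"), 200, replace = TRUE),
  diabetes = sample(c("yes", "no"), 200, replace = TRUE)
)

tab <- xtabs(~ sex + famhist + diabetes, data = patients)  # 3-way counts
ftable(tab)                      # flat display of the 3-way table
prop.table(tab, margin = 1:2)    # proportions within each sex x famhist cell
```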
reorder() is probably the best way to order the levels in a vector
without manually specifying the order. But reorder() orders by default
in an increasing order: "The levels are ordered such that the values
returned by ‘FUN’ are in increasing order."
Is there a way to do what reorder() does, but o
Dear friends,
Many thanks to Jim (Holtman) and David (Carlson) for their quick
responses: Q1 is now solved. There are two almost equivalent ways for
doing this. They follow:
library(lattice)
z <- rbind(cbind(z, 0), cbind(z, 20), cbind(z, 40))
z <- cbind(z, rnorm(n = nrow(z)))
z <- as.data.frame(z
On Jul 22, 2012, at 14:45 , Rui Barradas wrote:
> Hello,
>
> See if this is it.
>
>
> returns <- rnorm(10)
> dummy <- ifelse(returns < 0, -1, 0)
Sara had "1 if results are negative", so lose the minus. It is easier just to
say
dummy <- as.numeric(returns < 0)
-pd
>
>
> Hope this helps
Hi,
I was using Gviz package to create a boxplot. I understand that Gviz uses
"panel.bwplot" to create the boxplot.
Is there any way that I can remove the dashed line surrounding each pair of
boxplots?
Here is some sample code:
#
library(Gviz)
thisdata <- matrix(sample(1:100,60),n
Dear Henrik,
As you discovered, entering the covariate age additively into the
between-subject model doesn't prevent Anova() from reporting tests for the
interactions between age and the within-subjects factors. I'm not sure why you
would want to do so, but you could simply ignore these tests.
Sara:
Are you sure?? I am wholly unfamiliar with garch, but in general, R
does not need dummy variables at all. You make your covariate a factor
with appropriate contrasts and then write an appropriate model
formula, in this case, with an interaction with your series.
I could be wrong in this cas
Check and see if you have reshape loaded as well. I had a somewhat similar
problem (R2.13 ?) and realised that reshape was masking reshape2
John Kane
Kingston ON Canada
> -Original Message-
> From: dwarnol...@suddenlink.net
> Sent: Sat, 21 Jul 2012 16:06:11 -0700 (PDT)
> To: r-help@r-p
Hello,
See if this is it.
returns <- rnorm(10)
dummy <- ifelse(returns < 0, -1, 0)
Hope this helps
Rui Barradas
On 22-07-2012 08:53, saraberta wrote:
Hi,
I need a little help! I must create a dummy variable to insert as an external
regressor in the variance equation of a GARCH model; this
Hi,
I need a little help! I must create a dummy variable to insert as an external
regressor in the variance equation of a GARCH model; this dummy refers
to the sign of the returns of an asset, so it has to be 1 when returns
are negative and 0 when they are positive, and in my model the dummy
Hi Greg, David, and Tal,
Thank you very much for the information.
I found this in SPSS 17.0 missing value manual:
EM Method
This method assumes a distribution for the partially missing data and bases
inferences
on the likelihood under that distribution. Each iteration consists of an E step
an
Hi,
Try this:
test1<-read.table(text="
Fkh2 0.141 0.242 0.342
Swi5 0.224 0.342 0.334
Sic1 0.652 0.682 0.182
",sep="",header=FALSE)
test1
# V1 V2 V3 V4
#1 Fkh2 0.141 0.242 0.342
#2 Swi5 0.224 0.342 0.334
#3 Sic1 0.652 0.682 0.182
geneLabel<-test1[,1]
expValues<-as.matrix(t
How to adapt this piece of code but for: - gamma distribution - 3 parameter
log normal
More specifically, where can I find the specification of the parameter
(lmom) for pelgam() and pelln3()?
Lmom package info just gives: pelgam(lmom), lelln3(lmom), where lmom is a
numeric vector containing the L-
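(If I remember the lmom package's interface correctly -- worth checking against ?pelgam and ?pelln3 -- lmom is simply the vector of sample L-moments, usually obtained from samlmu():)

```r
library(lmom)

set.seed(1)
x <- rgamma(500, shape = 2, rate = 1)

lmr <- samlmu(x)   # sample L-moments: l_1, l_2, t_3, t_4
pelgam(lmr)        # gamma parameters (alpha, beta) fitted from them
pelln3(lmr)        # 3-parameter log-normal parameters (zeta, mu, sigma)
```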
Hello,
Try using the vector geneLabel. You don't need to convert to list.
rownames(expValues) <- geneLabel # if this doesn't work
rownames(expValues) <- as.vector(geneLabel)
But I really don't see a problem with the first. Have you tried it?
Hope this helps,
Rui Barradas
On 22-07-2012 03:04,