On Sun, Jul 4, 2010 at 9:10 PM, happy naren wrote:
> Hi,
> i am currently working on a problem where i need to plot latitude and
> longitude data on a respective county of UK. After this i want to plot
> altitude data to make a 3d surface on which then i have to plot my
> corresponding data.
Hav
Hi experts,
currently developing some code that checks a large number of strings
for the existence of substrings and patterns (detecting substrings
within URLs). I wonder if there is information about how well
particular string operations and comparisons perform in R. Are
there recommenda
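For plain substring tests, `grepl()` with `fixed = TRUE` skips the regular-expression machinery and is usually the faster option; a minimal benchmark sketch (the strings and sizes are made up):

```r
# Fixed substring search vs. regex search over the same URLs.
urls <- rep(c("http://example.org/index.html",
              "https://cran.r-project.org/package=foo"), 5000)

t_fixed <- system.time(hits_fixed <- grepl("cran", urls, fixed = TRUE))
t_regex <- system.time(hits_regex <- grepl("cran", urls))

identical(hits_fixed, hits_regex)  # same answer, different cost
```

`regexpr()` additionally reports match positions if you need more than a yes/no answer.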
Hi,
is there such a thing as a profiler for R that informs about a) how
much processing time is used by particular functions and commands and
b) how much memory is used for creating how many objects (or types of
data structures)? In a way I am looking for something similar to the
Java profiler (wh
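Base R does ship a sampling profiler, `Rprof()`, with `summaryRprof()` for time per function; `Rprofmem()` covers allocations if R was built with memory profiling. A minimal sketch (the file name is arbitrary):

```r
# Start the sampling profiler, run a workload, stop, summarise.
Rprof("prof.out")
x <- replicate(50, sum(sort(runif(5e5))))   # stand-in workload
Rprof(NULL)

head(summaryRprof("prof.out")$by.total)     # time spent per function
# Rprofmem("mem.out") records allocations, provided R was compiled
# with --enable-memory-profiling; see also tracemem() and gc().
```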
Hello everyone,
using the VGAM package and the following code
library(VGAM)
bp1 <- vglm(cbind(daten$anzahl_b, daten$deckung_b) ~ ., binom2.rho,
data=daten1)
summary(bp1)
coef(bp1, matrix=TRUE)
produced this error message:
Error in object$coefficients : $ operator not defined for this S4 class
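The error arises because vglm() returns an S4 object, on which `$` is not defined; a self-contained toy illustration (the class below is made up, not VGAM's):

```r
# A toy S4 class reproducing the "$ operator not defined" situation.
setClass("Fit", slots = c(coefficients = "numeric"))
fit <- new("Fit", coefficients = c(a = 1, b = 2))

# fit$coefficients fails with the same message; use slot access
# or an accessor function instead:
fit@coefficients
# For vglm objects specifically, coef(bp1) and coef(bp1, matrix = TRUE)
# are the supported accessors.
```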
Please allow me to expand on Prof. Venables' reply. Here is a comparison of
various approaches to solving (or inverting) a linear system:
> set.seed(123)
> n <- 1000
>
> amat <- matrix(rnorm(n*n), n, n)
>
> amat <- amat %*% t(amat) # a symmetric positive-definite matrix
>
> x <- 1:n # corre
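The truncated comparison can be sketched as follows (a hedged reconstruction, not Ravi's original figures; timings vary by machine):

```r
set.seed(123)
n <- 500
amat <- matrix(rnorm(n * n), n, n)
amat <- amat %*% t(amat)              # symmetric positive-definite
x <- 1:n                              # known solution
b <- amat %*% x                       # right-hand side

t1 <- system.time(s1 <- solve(amat, b))              # LU-based solve
t2 <- system.time(s2 <- chol2inv(chol(amat)) %*% b)  # invert via Cholesky
max(abs(s1 - x))   # both errors should be small
max(abs(s2 - x))
```

Solving the system directly (`solve(amat, b)`) is generally preferable to forming an explicit inverse.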
On Sun, Jul 4, 2010 at 12:59 PM, Ben Wilkinson wrote:
> I have put together a chart of 1,000 monthly data series using parallel and
> I really like the way it displays the data. Is there a way to achieve
> something similar in terms of display using the actual scale (consistently
> across all the
On Jul 4, 2010, at 7:12 PM, Martin Dózsa wrote:
Hi all,
I have the following problem: I need to run a large number of simulations,
therefore I use many computers. After the computations I need to make some
operations with the obtained results (such as weighted average, sum, etc.).
My q
ginv() is slower than solve(). This is the price you pay for more generality.
-Original Message-
From: r-help-boun...@r-project.org [mailto:r-help-boun...@r-project.org] On
Behalf Of song song
Sent: Monday, 5 July 2010 10:21 AM
To: r-help@r-project.org
Subject: [R] if using ginv function
Since ginv() can handle both singular and non-singular matrices, is there
any other difference between it and solve()?
If I use ginv() only, will there be any problem?
thanks
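One hedged sketch of the difference: for an invertible matrix the two agree numerically, but only ginv() (MASS's Moore-Penrose pseudoinverse) also handles singular systems:

```r
library(MASS)
set.seed(1)

a <- matrix(rnorm(9), 3, 3)        # almost surely non-singular
all.equal(ginv(a), solve(a))       # agree for invertible matrices

s <- matrix(c(1, 2, 2, 4), 2, 2)   # rank 1, singular
# solve(s) fails ("system is exactly singular"); ginv(s) returns the
# Moore-Penrose pseudoinverse, i.e. a least-squares substitute for
# the non-existent exact inverse:
ginv(s)
```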
Hi all,
I have the following problem: I need to run a large number of simulations,
therefore I use many computers. After the computations I need to make some
operations with the obtained results (such as weighted average, sum, etc.).
My question is, how is it possible to combine the output of sev
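One common pattern, assuming each machine can write a file you later collect (the file names here are illustrative): save each result with saveRDS() and combine with lapply()/readRDS():

```r
# Stand-in for 3 machines each saving its own result file.
dir <- tempdir()
for (i in 1:3) {
  res <- list(sum = i * 10, n = i * 100)
  saveRDS(res, file.path(dir, sprintf("run%d.rds", i)))
}

# On the collecting machine: read all result files and combine.
files <- list.files(dir, pattern = "^run.*\\.rds$", full.names = TRUE)
runs  <- lapply(files, readRDS)

total_n  <- sum(sapply(runs, `[[`, "n"))
weighted <- sum(sapply(runs, function(r) r$sum * r$n)) / total_n
weighted  # weighted average across machines
```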
It's a bit like gravy. You have to make it, it doesn't "just come" when you
cook the joint.
Here is a mock-up of your situation:
> dat <- data.frame(ByMonth = gl(6, 10, labels = paste("2010-0", 1:6, sep="")),
+ r1 = rnorm(60), v = rnorm(60), r = rnorm(60))
> head(dat)
ByMonth r1
You can look at the package source in the `cubature_1.0.tar.gz' file (which you
can get from CRAN), especially the `cubature.c' file.
Ravi.
Ravi Varadhan, Ph.D.
Assistant Professor,
Division of Geriatric Medicine and Gerontol
When regressing by month, how do I get the coefficients out into a new data
set?
I'm looking for
[ month, a, b, c ]
from the Pastor-Stambaugh model I'm using which is:
r[i+1] = a + b * r[i] + c * v[i] + e
the model I'm using wants to create a new dataseries based on the
coeffic
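Using the mock-up data shape from the earlier reply, one hedged sketch that fits the model per month and collects [ month, a, b, c ] into a data frame:

```r
set.seed(1)
dat <- data.frame(ByMonth = gl(6, 10, labels = paste("2010-0", 1:6, sep = "")),
                  r1 = rnorm(60), v = rnorm(60), r = rnorm(60))

# Fit r1 ~ r + v within each month and keep the coefficients.
fits <- lapply(split(dat, dat$ByMonth),
               function(d) coef(lm(r1 ~ r + v, data = d)))
out <- data.frame(month = names(fits), do.call(rbind, fits))
names(out)[2:4] <- c("a", "b", "c")
out   # one row per month: intercept a, slopes b and c
```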
On 2010-07-04 5:29, Mark Carter wrote:
I'm not very good at statistics, but I know enough to be dangerous. I'm
completely new to R, having just discovered it yesterday. Now that the
introductions are out of the way ...
I have a table with three columns, only two of which are relevant to the
d
Hi,
I am currently working on a problem where I need to plot latitude and
longitude data on the respective county of the UK. After this I want to plot
altitude data to make a 3D surface, on which I then have to plot my
corresponding data.
Please Help...
thanks in advance
Narender
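A minimal base-R sketch with made-up coordinates (real scattered lat/long/altitude data would first need gridding, e.g. with akima::interp(), since persp() wants a regular grid):

```r
# Stand-in longitude/latitude grid and a synthetic altitude surface.
lon <- seq(-2, 2, length.out = 30)
lat <- seq(50, 54, length.out = 30)
alt <- outer(lon, lat, function(x, y) 100 + 50 * sin(x) * cos(y / 10))

persp(lon, lat, alt, theta = 30, phi = 25,
      xlab = "Longitude", ylab = "Latitude", zlab = "Altitude")
```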
Hi Mark,
Try this to get you started:
table(roe1 > median(roe1), roe0 > median(roe0))
Hadley
On Sun, Jul 4, 2010 at 6:29 AM, Mark Carter wrote:
> I'm not very good at statistics, but I know enough to be dangerous. I'm
> completely new to R, having just discovered it yesterday. Now that the
>
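A reproducible sketch of that suggestion with made-up data (roe0/roe1 here are simulated, not Mark's values):

```r
set.seed(42)
roe0 <- rnorm(100)
roe1 <- roe0 + rnorm(100)   # loosely related, for illustration

# 2x2 table of above/below-median counts for the two columns.
tab <- table(roe1 > median(roe1), roe0 > median(roe0))
tab
chisq.test(tab)             # tests association between the two splits
```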
I have put together a chart of 1,000 monthly data series using parallel and
I really like the way it displays the data. Is there a way to achieve
something similar in terms of display using the actual scale (consistently
across all the data) as opposed to min/max?
Thanks
I'm not very good at statistics, but I know enough to be dangerous. I'm
completely new to R, having just discovered it yesterday. Now that the
introductions are out of the way ...
I have a table with three columns, only two of which are relevant to the
discussion: roe0 and roe1. Plotting roe0 a
On 07/03/2010 10:59 PM, Brad McNeney wrote:
> The following call to curl fetches me the information I want (as html)
> from a webserver:
>
> curl -F list_fi...@snptxt -F html_output=on
> http://integrin.ucd.ie/cgi-bin/rs2cm.cgi
>
> The file snptxt is a plain-text file in my working directory with
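An untested sketch of the same request with the RCurl package. Note the form field name was masked in the archived command ("list_fi...@snptxt"), so `list_file` below is a guess; substitute the real field name:

```r
# Multipart POST mirroring `curl -F field=@file -F html_output=on URL`.
library(RCurl)

html <- postForm("http://integrin.ucd.ie/cgi-bin/rs2cm.cgi",
                 list_file   = fileUpload("snptxt"),  # guessed field name
                 html_output = "on")
cat(html)
```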
On Sat, 2010-07-03 at 17:27 -0700, Roger Deangelis wrote:
> Hi Bernado,
>
> In many financial applications, if you convert the dollars and cents to
> pennies (i.e. $1.10 to 110) and divide by 100 at the very end, you can
> maintain higher precision. This applies primarily to sums. This is similar
Good day R-listers,
I'm running a panel data regression and after performing the Hausman test
the conclusion was that my model is a fixed-effects one. The problem is
located in my explanatory variables, which display weak variation, and
as is well known the fixed-effects model gives weak results in a
On Jul 4, 2010, at 7:37 AM, David Winsemius wrote:
On Jul 3, 2010, at 11:33 PM, Changbin Du wrote:
Hi, dear community,
I am using linear discriminant analysis to build a model and make new
predictions:
dim(train) #training data
[1] 1272 22
dim(valid) # validation data
[1] 140 22
On Jul 3, 2010, at 11:33 PM, Changbin Du wrote:
Hi, dear community,
I am using linear discriminant analysis to build a model and make new
predictions:
dim(train) #training data
[1] 1272 22
dim(valid) # validation data
[1] 140 22
lda.fit <- lda(out ~ ., data=train, na.action="n
On 2010-07-03 21:33, Changbin Du wrote:
Hi, dear community,
I am using linear discriminant analysis to build a model and make new
predictions:
dim(train) #training data
[1] 1272 22
dim(valid) # validation data
[1] 140 22
lda.fit<- lda(out ~ ., data=train, na.action="na.omit", CV=TR
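A self-contained sketch of the same lda() workflow, with iris standing in for the original train/valid data (which is not shown). Note that with CV = TRUE, lda() returns cross-validated classes directly and the fit cannot be passed to predict():

```r
library(MASS)
set.seed(1)
idx   <- sample(nrow(iris), 100)
train <- iris[idx, ]
valid <- iris[-idx, ]

# Fit on the training split, then predict the held-out rows.
lda.fit <- lda(Species ~ ., data = train, na.action = na.omit)
pred    <- predict(lda.fit, newdata = valid)
mean(pred$class == valid$Species)   # out-of-sample accuracy
```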
Thank your for your help!
Yes..I'm using arima from stats package...
I'm using other commands from tseries package.
Sabrina
Hello Angelo,
You can either supply the arm-level outcomes and corresponding sampling
variances directly (via the yi and vi arguments) or supply the necessary
information so that the escalc() or rma() functions can calculate an
appropriate arm-level outcome (such as the log odds). See the docum
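A hedged sketch of both routes with made-up arm-level counts (xi events out of ni per arm; "PLO" is metafor's log-odds measure):

```r
library(metafor)
dat <- data.frame(xi = c(10, 8, 15), ni = c(100, 100, 120))

# Route 1: let escalc() compute the log odds (yi) and variances (vi).
d1  <- escalc(measure = "PLO", xi = xi, ni = ni, data = dat)
res <- rma(yi, vi, data = d1)

# Route 2: supply yi and vi directly.
res2 <- rma(yi = d1$yi, vi = d1$vi)
```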
Hi Bernado,
In many financial applications, if you convert the dollars and cents to
pennies (i.e. $1.10 to 110) and divide by 100 at the very end, you can
maintain higher precision. This applies primarily to sums. This is similar
to keeping track of the decimal fractions which have exact represen
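The point can be seen in a self-contained two-line illustration:

```r
# 0.1 has no exact binary representation, so repeated addition drifts,
# while integer pennies stay exact until the final division.
x <- rep(0.10, 3)
sum(x) == 0.3                   # FALSE: floating-point drift

pennies <- rep(10L, 3)          # 10 cents each, as integers
sum(pennies) / 100 == 0.3       # TRUE: one rounding at the end
```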
Hello,
You can find a list of some R books here
http://www.r-project.org/doc/bib/R-books.html. I found "Software for
Data Analysis" particularly helpful for learning how R works and about
some of its concepts.
Here is the reference and a link to it at Amazon
John M. Chambers. Software for Data An