Hello,
lm() is designed to work with data.frames, not with matrices. You can
change your code to something like
dat <- data.frame(price, pred1 = c(5,6,3,4,5), pred2 = c(2,1,8,5,6))
fit <- lm(price ~ pred1 + pred2, data = dat)
and then use the fitted model to do predictions. You don't have to ...
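For the prediction step, a minimal sketch using fit and the column names
from the code above:

predict(fit, newdata = data.frame(pred1 = 3, pred2 = 5))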
Hi,
I'd do it like this, making use of data frames and the data argument to lm.
traindata <- data.frame(price=price, predictor1=predictor1,
predictor2=predictor2)
testdata <- data.frame(predictor1=3, predictor2=5)
predict(lm(price ~ ., data=traindata), testdata)
Note that you don't have to specify each predictor by name: the dot in the
formula stands for all columns of traindata other than the response.
Solved! Here is the solution in case it helps others:
The easiest way to get past the issue of matching up variable names from a
matrix of covariates to newdata data.frame column names is to put your
input data into a data.frame as well. Try this
price = c(10,18,18,11,17)
dat = data.frame(price, pred1 = c(5,6,3,4,5), pred2 = c(2,1,8,5,6))
fit = lm(price ~ pred1 + pred2, data = dat)
predict(fit, newdata = data.frame(pred1 = 3, pred2 = 5))
I want to perform a multiple regression in R and make predictions based on
the trained model. Below is an example code I am using:
price = c(10,18,18,11,17)
predictors = cbind(c(5,6,3,4,5),c(2,1,8,5,6))
predict(lm(price ~ predictors), data.frame(predictors=matrix(c(3,5),nrow=1)))
So, based on the trained model, how do I predict the price for a new
observation with predictor values 3 and 5?
On Tue, Sep 3, 2013 at 2:51 AM, Christoph Scherber
wrote:
> Dear all,
>
> I've played around with the "airquality" dataset, trying to solve the
> matrix equations of a simple multiple regression by hand; however, my
> matrix multiplications don't lead to the estimates returned by coef().
Hi Christoph,
ginv() computes the Moore-Penrose generalized inverse by way of a
singular value decomposition. Part of the calculation involves taking
the reciprocal of the non-zero singular values. In practice, "non-zero"
really means "not within some precision tolerance of zero". Numerical
precision can bite you here.
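A minimal constructed illustration of that tolerance (my example, not from
the thread):

library(MASS)
A <- diag(c(1, 1e-2, 1e-17))
solve(A)  # takes 1/1e-17 = 1e+17 at face value
ginv(A)   # 1e-17 is within tolerance of zero, so it is treated as 0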
Dear all,
But why are there such huge differences between solve() and ginv()?
(See the code below.)
##
m1 = lm(Ozone ~ Solar.R * Wind, airquality)
# remove NA's:
airquality2 = airquality[complete.cases(airquality$Ozone) &
                         complete.cases(airquality$Solar.R) &
                         complete.cases(airquality$Wind), ]
# create the model matrix and the response:
X = model.matrix(~ Solar.R * Wind, airquality2)
Y = airquality2$Ozone
Hi Christoph,
Use this matrix expression instead:
solve(crossprod(X)) %*% t(X) %*% Y
Note that:
all.equal(crossprod(X), t(X) %*% X)
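A quick check, assuming X, Y, and m1 as built in Christoph's code:

beta <- solve(crossprod(X)) %*% t(X) %*% Y
all.equal(as.vector(beta), as.vector(coef(m1)))  # TRUE, up to tolerance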
Cheers,
Joshua
On Tue, Sep 3, 2013 at 2:51 AM, Christoph Scherber
wrote:
> Dear all,
>
> I've played around with the "airquality" dataset, trying to solve the ...
Dear all,
I've played around with the "airquality" dataset, trying to solve the
matrix equations of a simple multiple regression by hand; however, my
matrix multiplications don't lead to the estimates returned by coef().
What have I done wrong here?
##
m1=lm(Ozone~Solar.R*Wind,airquality)
# remove NA's: ...
On May 11, 2012, at 9:13 AM, Rosario Garcia Gil wrote:
Hello
Is it possible to set up an lm() model where none of the categories
of the categorical independent variable is used as the reference; I
mean, use the total mean instead?
Yes.
?contrasts
--
David Winsemius, MD
Heritage Lab
?contr.sum
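A small sketch of sum-to-zero contrasts with made-up data; the coefficients
become deviations from the grand mean of the group means rather than
differences from a reference level:

d <- data.frame(y = rnorm(12), g = gl(3, 4))
fit <- lm(y ~ g, data = d, contrasts = list(g = "contr.sum"))
coef(fit)  # (Intercept) is the mean of the three group means;
           # g1 and g2 are deviations of groups 1 and 2 from it,
           # and group 3's deviation is minus their sum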
On Fri, May 11, 2012 at 8:13 AM, Rosario Garcia Gil
wrote:
> Hello
>
> Is it possible to set up an lm() model where none of the categories of the
> categorical independent variable is used as the reference; I mean, use
> the total mean instead?
>
> Thanks
> /R
Hello
Is it possible to set up an lm() model where none of the categories of the
categorical independent variable is used as the reference; I mean, use the
total mean instead?
Thanks
/R
Thanks for clearing that up
This suggests that this is a dangerous office to be in because this is a
basic question. I am sure somebody in your office knows this. Anyway, the
baseline gives you the average value of the group that constitutes the
baseline when all other covariates are zero. Let's say you measure whether
men or ...
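A small constructed example of R's default treatment coding (hypothetical
numbers):

d <- data.frame(production = c(2, 3, 5, 6),
                site = factor(c("A", "A", "B", "B")))
coef(lm(production ~ site, data = d))
# (Intercept) = 2.5, the mean of the baseline level "A";
# siteB = 3, the difference between the "B" and "A" means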
Hi, I am having difficulty interpreting the multiple regression output. I
would like to know what it means when one of the factors is assigned as
the intercept.
In my data I am looking at the relationship between environmental
parameters and biological production.
One of my variables in the analysis ...
Thank you for your replies; you've answered my question and given me more
to think on. I guess it is unwise to draw any conclusions from the
standardised results, for these reasons.
James.
--On 22 August 2011 17:30 +0100 ted.hard...@wlandres.net wrote:
On 22-Aug-11 15:37:40, JC Matthews wrote:
On Tue, Aug 23, 2011 at 7:54 AM, JC Matthews wrote:
> Thank you for your replies; you've answered my question and given me more
> to think on. I guess it is unwise to draw any conclusions from the
> standardised results, for these reasons.
No, by all means try to draw conclusions! Isn't that the point ...
On 22-Aug-11 15:37:40, JC Matthews wrote:
> Hello,
>
> I have a statistical problem that I am using R for, but I am
> not making sense of the results. I am trying to use multiple
> regression to explore which variables (weather conditions)
> have the greatest effect on a local atmospheric variable.
Hi JC,
You have interactions in your model, which means that your model
specifies that the coefficients for hum, wind, and rain should vary
depending on the values of the other two (and on their own values,
actually, since you also have quadratic effects for each of these
variables in your model). ...
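A runnable sketch of why a main-effect coefficient is not "the" effect once
interactions are present; the data here are simulated, and only the variable
names come from the thread:

set.seed(42)
weather <- data.frame(hum = runif(100), wind = runif(100))
weather$y <- with(weather,
                  1 + 2*hum + 3*wind - 4*hum*wind + rnorm(100, sd = 0.1))
fit <- lm(y ~ hum * wind, data = weather)
coef(fit)["hum"]  # slope of hum only where wind == 0; in general
# d(yhat)/d(hum) = coef(fit)["hum"] + coef(fit)["hum:wind"] * wind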
Hello,
I have a statistical problem that I am using R for, but I am not making
sense of the results. I am trying to use multiple regression to explore
which variables (weather conditions) have the greatest effect on a local
atmospheric variable. The data is taken from a database that has 20391 ...
On 18.02.2011 11:02, Rosario Garcia Gil wrote:
Hello
I have a multiple linear regression with two cofactors, I would like to
represent a plane but I could not find any help which worked out.
Any suggestions.
One way is explained in:
library("scatterplot3d")
?scatterplot3d
Now see the examples on that help page.
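A sketch along the lines of that help page, using airquality as a stand-in
for the poster's two-covariate model:

library(scatterplot3d)
aq  <- na.omit(airquality[, c("Ozone", "Solar.R", "Wind")])
fit <- lm(Ozone ~ Solar.R + Wind, data = aq)
s3d <- scatterplot3d(aq$Solar.R, aq$Wind, aq$Ozone, angle = 55)
s3d$plane3d(fit)  # overlay the fitted regression plane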
Hello
I have a multiple linear regression with two cofactors; I would like to
represent a plane, but I could not find any help which worked out.
Any suggestions?
Regards and thanks in advance.
Rosario
Hi all,
This is more a request for ideas than for actual R code. Those of you who
are geologists or work with boreholes will understand what I would like to
achieve. I'll try to explain as well as I can... so here it goes.
I have around 1000 geomechanical (geological) borehole logs in a ...
Joel Fürstenberg-Hägg said:
"Multiple linear regression [...] I would like to check every possible
combination of factors, evalute the results based for instance on their p
values, and then choose the best regression model."
By "every possible combination of factors", I assume you mean that for
Hi all,
I'm doing multiple linear regression on a data set. However, it takes a lot
of time, as I would like to check every possible combination of factors,
evaluate the results based, for instance, on their p values, and then
choose the best regression model.
So, I wonder if anyone might have ...
Dear all,
I have a problem when trying to fit a model to a dataset. The model I am
fitting is y ~ var1*var2*var3*var4*var5, with var3 and var4 having many 0's.
When I construct the model and simplify it, I am left with a model with 8
explanatory variables (or interactions of them) and the intercept, ...
On Mon, 29 Jun 2009, John Hunter wrote:
But my question was more numerical: in particular, the R^2 of the
model should be equal to the square of the correlation between the fit
values and the actual values.
No.
It is with the intercept and is not w/o
it, as my code example shows. Am I correct?
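A minimal illustration of the two definitions summary.lm() uses (constructed
data):

set.seed(1)
x <- 1:20
y <- 5 + 2 * x + rnorm(20)
f1 <- lm(y ~ x)      # with intercept
f0 <- lm(y ~ x - 1)  # without intercept
summary(f1)$r.squared; cor(fitted(f1), y)^2  # equal: total SS is centred
summary(f0)$r.squared; cor(fitted(f0), y)^2  # differ: total SS is sum(y^2)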
On Sun, Jun 28, 2009 at 3:38 AM, Dieter
Menne wrote:
> It seems odd to me that dropping the intercept
> would cause the R^2 and F stats to rise so dramatically, and the p
> value to consequently drop so much. In my implementation, I get the
> same beta1 and beta2, and the R^2 I compute using the ...
I am writing some software to do multiple regression and am using R to
benchmark the results. The results are squaring up nicely for the
"with-intercept" case but not for the "no-intercept" case. I am not
sure what R is doing to get the statistics for the 0-intercept case.
For example, I would expect ...
Linda,
At 4:36 PM -0700 8/15/08, Linda Zientek wrote:
Hello,
In SPSS, a multiple regression can be conducted by inputting the
means, standard deviations, sample size, and correlation matrix
without actually using the raw dataset. Is it possible to do the
same in R?
Thanks in advance for your assistance.
On Fri, 15 Aug 2008, Linda Zientek wrote:
Hello,
In SPSS, a multiple regression can be conducted by inputting the means,
standard deviations, sample size, and correlation matrix without
actually using the raw dataset. Is it possible to do the same in R?
Yes, it is possible, up to a point (...)
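A sketch of the basic computation, with made-up summary statistics for two
predictors x1, x2 and outcome y (the slopes come from the correlations;
the means and SDs put them back on the raw scale):

R <- matrix(c(1.0, 0.3, 0.5,
              0.3, 1.0, 0.4,
              0.5, 0.4, 1.0), 3, 3,
            dimnames = list(c("x1","x2","y"), c("x1","x2","y")))
sds   <- c(x1 = 2, x2 = 4, y = 10)
means <- c(x1 = 5, x2 = 8, y = 50)
b_std <- solve(R[1:2, 1:2], R[1:2, "y"])   # standardized slopes
b     <- b_std * sds["y"] / sds[1:2]       # unstandardized slopes
b0    <- means["y"] - sum(b * means[1:2])  # intercept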
Hello,
In SPSS, a multiple regression can be conducted by inputting the means,
standard deviations, sample size, and correlation matrix without actually using
the raw dataset. Is it possible to do the same in R?
Thanks in advance for your assistance.
Linda
Hi,
I have only recently started to use R and I was wondering how to perform a
multiple regression on a set of data where there is no intercept and a
condition that none of the coefficients can be less than zero.
I would appreciate it if anybody could help!
Thanks
Ultan
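One ready-made option is the nnls package, which fits least squares under
the constraint that all coefficients are non-negative; there is no intercept
unless you add a column of ones. A sketch with simulated data (my
suggestion; the thread's replies are not shown here):

library(nnls)
set.seed(1)
X <- matrix(runif(50 * 3), 50, 3)               # predictors, no intercept column
y <- drop(X %*% c(1, 0, 2)) + rnorm(50, sd = 0.1)
fit <- nnls(X, y)
fit$x  # estimated coefficients, all constrained to be >= 0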