Dear group,
If I have an objective function and some constraints in linear form,
is there a way, e.g. a library in R, that helps me solve such a model and find the
unknown variables in it?
thanks in advance
Ragia
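[A minimal sketch of one option, using the lpSolve package from CRAN (install.packages("lpSolve")); the toy objective and constraints below are made up for illustration:]

```r
# maximize 3x + 2y  subject to  x + y <= 4,  x + 3y <= 6,  x, y >= 0
library(lpSolve)
res <- lp(direction    = "max",
          objective.in = c(3, 2),
          const.mat    = rbind(c(1, 1),
                               c(1, 3)),
          const.dir    = c("<=", "<="),
          const.rhs    = c(4, 6))
res$solution  # optimal values of the unknown variables
res$objval    # optimal objective value
```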
__
It is perhaps worth mentioning that the OP's desire to do the
conversion to numeric to calculate won-lost percentages is completely
unnecessary and indicates that he/she could benefit by spending some
additional time learning R. See, e.g. ?tapply, ?table, ?prop.table,
and friends.
Cheers,
Bert
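[A sketch with hypothetical win/loss data of what Bert means: the percentages come straight from the factor via table() and prop.table(), no conversion to numeric needed.]

```r
results <- data.frame(team    = c("A", "A", "A", "B", "B", "B"),
                      outcome = factor(c("won", "lost", "won",
                                         "lost", "lost", "won")))
tab <- table(results$team, results$outcome)
prop.table(tab, margin = 1)  # row-wise won-lost proportions per team
```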
I used your code but deleted sep="\t" since there were no tabs in your email
and added the fill= argument I mentioned before.
David
Original message
From: Ashta
Date: 11/14/2015 6:40 PM (GMT-06:00)
To: David L Carlson
Cc: R help
Subject: Re: [R] Ranking
Thank you David,
You need to read about S3 classes, and either make your custom function behave
the way that function needs to behave or use a different function name for your
custom function.
I think this is an example of the old saying that if it hurts when you slam
your head against the wall, then don't do that.
Think about it.
I shall assume that you are familiar with S3 methods. What do you
think would happen when xyplot code calls is.numeric() on a factor
object expecting it to call the is.numeric primitive but, instead,
finding a factor method defined, calls that? Note that your factor
method has no
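[A minimal sketch of the dispatch hazard described above: is.numeric() is an internal generic, so a function named is.numeric.factor is picked up as an S3 method for factor objects, and any code relying on is.numeric() is misled.]

```r
f <- factor(1:3)
before <- is.numeric(f)                 # FALSE with base R alone
is.numeric.factor <- function(x) TRUE   # user-defined S3 "method"
after <- is.numeric(f)                  # TRUE: dispatch now finds the method,
                                        # so e.g. lattice code misbehaves
rm(is.numeric.factor)                   # clean up
c(before = before, after = after)
```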
Hi,
Pretty much everything is in the title of the post. An example is below.
library(lattice)
data <-
data.frame(x=rep(1:10,8),y=rnorm(80),trt=factor(rep(1:4,each=20)),groups=rep(1:8,each=10))
xyplot <- xyplot(y~x|trt,data,groups=groups)
is.numeric.factor <- function(x){
  print('hello world')
}
In econometrics it was common to start an optimization with Nelder-Mead and
then switch to one of the other algorithms to finish the optimization. As
John Nash states, NM gets one close; switching then speeds the final
solution.
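[A minimal sketch of this two-stage strategy, on the standard Rosenbrock test function:]

```r
rosen <- function(p) (1 - p[1])^2 + 100 * (p[2] - p[1]^2)^2
s1 <- optim(c(-1.2, 1), rosen, method = "Nelder-Mead")  # gets close
s2 <- optim(s1$par, rosen, method = "BFGS")             # polishes the solution
s2$par  # near the true optimum c(1, 1)
```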
John
John C Frain
3 Aranleigh Park
Rathfarnham
Dublin 14
Ireland
www.
In respect of Bert Gunter's signature quote:
"Data is not information. Information is not knowledge. And knowledge
is certainly not wisdom."
The other day my wife saw a grocery truck with the following remark
emblazoned on its side:
"Knowledge is being aware that a tomato is a fruit. Wisdom
and just to add to John's comments, since he's too modest: in my
experience, the algorithm in the Rvmmin package (written by John) shows
great improvement compared to the L-BFGS-B algorithm, so I don't use
L-BFGS-B anymore. L-BFGS-B often has a dangerous convergence issue in
that it can claim
Agreed on the default algorithm issue. That is important for users to
know, and I'm happy to underline it. Also that CG (which is based on one
of my codes) should be deprecated. BFGS (also based on one of my codes
from long ago) does much better than I would ever have expected.
Over the years I've
Hi John,
My main point is not about Nelder-Mead per se. It is *primarily* about the
Nelder-Mead implementation in optim().
The users of optim() should be cautioned regarding the default algorithm and
that they should consider alternatives such as "BFGS" in optim(), or other
implementations o
Hi,
While I agree with the comments about paying attention to parameter scaling, a
major issue here is that the default optimization algorithm, Nelder-Mead, is
not very good. It is unfortunate that the optim implementation chose this as
the "default" algorithm. I have several instances wher
Dear Andrew,
Thank you for your reply. It's an R question. The weeks are coded as 1-53
for each year and I would like to control for weeks and years as time fixed
effects.
Will this be an issue if I estimate this type of regression using the LFE
package?
felm(outcome ~ temperature + precipitation | c
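[The call above is cut off; a sketch with simulated data of the kind of two-way fixed-effects call lfe::felm() supports, using the variable names from the question:]

```r
library(lfe)
set.seed(1)
d <- data.frame(outcome       = rnorm(500),
                temperature   = rnorm(500),
                precipitation = rnorm(500),
                week = factor(sample(1:53, 500, replace = TRUE)),
                year = factor(sample(2010:2014, 500, replace = TRUE)))
# week and year enter after "|" as fixed effects and are projected out
fit <- felm(outcome ~ temperature + precipitation | week + year, data = d)
summary(fit)
```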
Not contradicting Ravi's message, but I wouldn't say Nelder-Mead is
"bad" per se. It's issues are that it assumes the parameters are all on
the same scale, and the termination (not convergence) test can't use
gradients, so it tends to get "near" the optimum very quickly -- say
only 10% of the compu
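[A sketch of the scaling point (toy quadratic, made up for illustration): when parameters live on wildly different scales, control = list(parscale = ...) puts them on a comparable footing for Nelder-Mead.]

```r
f <- function(p) (p[1] - 1)^2 + (p[2] / 1e6 - 1)^2  # optimum at c(1, 1e6)
bad  <- optim(c(0, 0), f, method = "Nelder-Mead")   # typically stalls
good <- optim(c(0, 0), f, method = "Nelder-Mead",
              control = list(parscale = c(1, 1e6))) # rescaled search
good$par  # close to c(1, 1e6)
```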
Thanks a lot Jim and Boris for replying.
Sent from my iPhone
> On Nov 9, 2015, at 1:13 AM, jim holtman wrote:
>
> You need to take a close look at the function incomb that you are creating.
> I see what appears to be a constant value
> ("*(gamma((1/beta)+1))*((alpha)^(-(1/beta)))") being com
Thanks a lot, Ravi.
Indeed, you understood the point of my email best.
I am perfectly aware that most of the optimization algorithms find
local rather than global minima and therefore the choice of the
initial parameters plays (at least in principle) a role.
Nevertheless, my optimization problem is
I don't see why not, but I also don't see why you need to take my word
for it when you can compare the output of felm against the output of lm,
with dummy variables for all the factors. If that many dummies is
computationally tough, just work with a subset.
On 11/15/2015 08:37 AM, Miluji Sb wr
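[A sketch of the suggested check, with simulated data: the slope from lfe::felm() with a projected-out factor should match lm() with explicit dummies.]

```r
library(lfe)
set.seed(42)
d <- data.frame(y = rnorm(200), x = rnorm(200),
                f = factor(sample(letters[1:6], 200, replace = TRUE)))
b_felm <- coef(felm(y ~ x | f, data = d))["x"]  # f projected out
b_lm   <- coef(lm(y ~ x + f, data = d))["x"]    # f as dummy variables
all.equal(unname(b_felm), unname(b_lm))         # the slopes agree
```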