First of all, thanks for your answer!

About the optimization problem: I'm quite careful about the constraints on the
parameters. Papers in this area usually do this kind of estimation by only
restricting the weight of the (bivariate) mixture to lie between 0 and 1, but
that can lead to some strange results, such as a negative variance. That kind
of box constraint seems easy to implement, since the L-BFGS-B method of optim()
accepts lower and upper bounds on the parameters.
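
To make this concrete, here is a minimal sketch of what I have in mind, using a
toy two-component Gaussian mixture rather than my actual jump-diffusion
likelihood (the function and data below are just placeholders):

negloglik <- function(par, x) {
  # par = (w, mu1, sig1, mu2, sig2); x = sample data
  w    <- par[1]; mu1 <- par[2]; sig1 <- par[3]
  mu2  <- par[4]; sig2 <- par[5]
  dens <- w * dnorm(x, mu1, sig1) + (1 - w) * dnorm(x, mu2, sig2)
  -sum(log(dens))
}

set.seed(1)
x <- c(rnorm(300, 0, 1), rnorm(100, 2, 0.5))   # toy data, not my real series

fit <- optim(par    = c(0.5, 0, 1, 1, 1),
             fn     = negloglik,
             x      = x,
             method = "L-BFGS-B",
             lower  = c(0.001, -Inf, 1e-6, -Inf, 1e-6),  # w in (0,1), sd's > 0
             upper  = c(0.999,  Inf,  Inf,  Inf,  Inf))
fit$par

The bounds keep the weight inside (0, 1) and the standard deviations strictly
positive, which is the box constraint I described above.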

Another acceptable restriction is to set one variance as a multiple of the
other, i.e. sig1 = k*sig2, with k different from 0 and from infinity. The
problem is that I can't figure out how to implement this kind of constraint in
R; the best I have come up with is sketched below.
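
The only idea I have (again just a sketch, on the same toy mixture as above) is
to reparameterise with k = exp(logk), so that sig1 = k*sig2 holds by
construction and k can never reach 0 or infinity for any finite logk:

negloglik_k <- function(par, x) {
  # par = (w, mu1, mu2, logk, sig2)
  w    <- par[1]; mu1 <- par[2]; mu2 <- par[3]
  k    <- exp(par[4])        # k > 0 by construction
  sig2 <- par[5]
  sig1 <- k * sig2           # the restriction sig1 = k * sig2
  dens <- w * dnorm(x, mu1, sig1) + (1 - w) * dnorm(x, mu2, sig2)
  -sum(log(dens))
}

fit_k <- optim(par    = c(0.5, 0, 1, 0, 1),
               fn     = negloglik_k,
               x      = x,                # toy data from the sketch above
               method = "L-BFGS-B",
               lower  = c(0.001, -Inf, -Inf, -Inf, 1e-6),
               upper  = c(0.999,  Inf,  Inf,  Inf,  Inf))
exp(fit_k$par[4])   # implied k

I am not sure this is the standard way to handle a ratio restriction, but it
avoids having to bound k explicitly.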

Since I have 6 parameters to estimate, while the usual bivariate Gaussian
mixture problem has only 5, I guess packages designed for finite mixtures could
be used to reduce the dimension of the optimization: by first estimating the
variance of the first Gaussian (sig1) via flexmix or mixtools, I could generate
a (probably linear) constraint for my problem, since sig1 in my estimation is a
combination of two jump-diffusion parameters. Is it easy to declare linear
constraints in R optimization? My rough attempt is sketched below.
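
From reading ?constrOptim, it seems that constrOptim() handles linear
inequality constraints of the form ui %*% theta - ci >= 0. Below is a sketch of
how I imagine using it, reusing negloglik() and the toy data x from the first
sketch; sig1_hat is just a placeholder for whatever value I would obtain from
flexmix or mixtools, not something either package actually returns under that
name:

sig1_hat <- 1.2   # placeholder for an externally estimated bound on sig1

# Rows of ui / entries of ci encode: w >= 0.001, w <= 0.999,
# sig1 > 0, sig2 > 0, and sig1 <= sig1_hat.
ui <- rbind(c( 1, 0,  0, 0, 0),
            c(-1, 0,  0, 0, 0),
            c( 0, 0,  1, 0, 0),
            c( 0, 0,  0, 0, 1),
            c( 0, 0, -1, 0, 0))
ci <- c(0.001, -0.999, 1e-6, 1e-6, -sig1_hat)

fit_c <- constrOptim(theta = c(0.5, 0, 1, 1, 1),  # must be strictly feasible
                     f     = negloglik,
                     grad  = NULL,                # Nelder-Mead is used when grad is NULL
                     ui    = ui,
                     ci    = ci,
                     x     = x)
fit_c$par

As far as I can tell constrOptim() only accepts inequality constraints, so an
exact linear equality would have to be substituted into the likelihood, much as
in the reparameterisation above. Does this look like a reasonable approach?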

Anyway, I really appreciate your help and attention.

Thank you,

JC

2010/1/4 Ravi Varadhan [via R] <ml-node+998666-298506...@n4.nabble.com>:

> "should write a function that uses the parameters and the sample data as
> input and outputs the likelihood. Is it correct?"
>
> Yes, that is correct. Take a look at the optim() function.  ?optim
>
> What type of convergence problems did you experience with Matlab?  I am not
> sure if using R can overcome fundamental modeling and computational issues,
> such as over-specification of the model for the data at hand.  But maybe
> you need to impose constraints on the parameters if you are fitting a
> Gaussian mixture.
>
> Another option is to use packages that are specially designed to model
> finite mixtures such as "flexmix" or "mixtools".
>
> Ravi.
>
> ---------------------------------------------------------------------------
> Ravi Varadhan, Ph.D.
> Assistant Professor, The Center on Aging and Health
> Division of Geriatric Medicine and Gerontology
> Johns Hopkins University
> Ph: (410) 502-2619
> Fax: (410) 614-9625
> Email: [hidden email]
> Webpage: http://www.jhsph.edu/agingandhealth/People/Faculty_personal_pages/Varadhan.html
> ---------------------------------------------------------------------------
>
>
> -----Original Message-----
> From: [hidden email] [mailto:[hidden email]] On Behalf Of jckval
> Sent: Monday, January 04, 2010 5:53 PM
> To: [hidden email]
> Subject: [R] MLE optimization
>
>
> Folks,
>
> I'm kind of a newbie in R, but I have some background in Matlab and VBA
> programming. Last month I was implementing a Maximum Likelihood Estimation
> in Matlab, but the algorithms didn't converge, so my academic advisor
> suggested using R. My problem is to estimate the parameters of a
> mean-reverting jump diffusion. I've succeeded in deriving the likelihood
> function (which looks like a Gaussian mixture), and it is implemented in R.
> My main doubts are about the inputs this function should take and the
> outputs it should produce; for instance, in Matlab this function takes the
> parameters as input and outputs the likelihood using the sample data
> (imported within the function).
>
> In order to make R's optimizers work, it seems I should write a function
> that uses the parameters and the sample data as input and outputs the
> likelihood. Is it correct?
> Could someone reply with example code that exemplifies the type of function
> I should write and the syntax to optimize it?
> Alternatively, could anyone suggest a good MLE tutorial and package?
>
> Thank you,
>
> JC


______________________________________________
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.
