On 10/17/12 8:36 PM, Becksfort, Jared wrote:
> I see. I am planning to submit the EM fit for multivariate normal mixture
> models in the next couple of weeks (MATH-817). A Gibbs sampling DP fit may
> be a bit further out. I am not opposed to allowing the number of components
> to change, but I also like the simplicity of this class. Whatever
The issue is that with a fixed number of components, you need to do
multiple runs to find the best-fit number of components. Gibbs sampling
against a Dirichlet process can get you to the same answer at about the
same cost as a single run of EM with a fixed number of components.
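To make the repeated-run cost concrete, here is a rough sketch of selecting the number of components by running EM once per candidate count and scoring each fit with BIC. It assumes an API along the lines of the MultivariateNormalMixtureExpectationMaximization and MixtureMultivariateNormalDistribution classes that later shipped in commons-math3 (constructor taking the data, a static estimate(data, k) helper for an initial mixture, fit(), getFittedModel(), density()); the patch actually under review may differ.

    import org.apache.commons.math3.distribution.MixtureMultivariateNormalDistribution;
    import org.apache.commons.math3.distribution.fitting.MultivariateNormalMixtureExpectationMaximization;

    public final class SelectNumComponents {

        // Runs one full EM fit per candidate component count and keeps the model
        // with the lowest BIC. This is the repeated-run cost a DP/Gibbs fit avoids.
        public static MixtureMultivariateNormalDistribution bestByBic(double[][] data, int maxK) {
            final int n = data.length;
            final int dim = data[0].length;
            MixtureMultivariateNormalDistribution best = null;
            double bestBic = Double.POSITIVE_INFINITY;
            for (int k = 2; k <= maxK; k++) {          // assumes the initial-guess helper needs k >= 2
                MultivariateNormalMixtureExpectationMaximization em =
                        new MultivariateNormalMixtureExpectationMaximization(data);
                em.fit(MultivariateNormalMixtureExpectationMaximization.estimate(data, k));
                MixtureMultivariateNormalDistribution model = em.getFittedModel();
                double logLik = 0.0;
                for (double[] x : data) {              // total log-likelihood under this fit
                    logLik += Math.log(model.density(x));
                }
                // Free parameters: (k - 1) weights, k mean vectors, k symmetric covariance matrices.
                double p = (k - 1) + k * dim + k * dim * (dim + 1) / 2.0;
                double bic = -2.0 * logLik + p * Math.log(n);
                if (bic < bestBic) {
                    bestBic = bic;
                    best = model;
                }
            }
            return best;
        }
    }

A DP mixture sampled with Gibbs explores the number of components as part of a single run, which is where the cost comparison above comes from.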
On Wed, Oct 17, 2012 at
Ted,
I am not sure I understand the problem with the fixed number of components. My
understanding is that CM prefers immutable objects. Adding a component to an
object would require reweighting in addition to modifying the component list.
A new mixture model could be instantiated using the getters.
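A sketch of what that re-instantiation could look like while keeping everything immutable, assuming the shapes of the released commons-math3 classes (getComponents() returning weight/component pairs, MultivariateNormalDistribution exposing getMeans() and getCovariances(), and a weights/means/covariances constructor on the mixture); names and signatures here may not match the patch exactly.

    import java.util.List;
    import org.apache.commons.math3.distribution.MixtureMultivariateNormalDistribution;
    import org.apache.commons.math3.distribution.MultivariateNormalDistribution;
    import org.apache.commons.math3.util.Pair;

    public final class MixtureRebuild {

        // Returns a new mixture with one extra component, leaving the original untouched.
        // Existing weights are scaled down so that the weights still sum to one.
        public static MixtureMultivariateNormalDistribution withExtraComponent(
                MixtureMultivariateNormalDistribution mixture,
                MultivariateNormalDistribution extra,
                double extraWeight) {
            List<Pair<Double, MultivariateNormalDistribution>> old = mixture.getComponents();
            int k = old.size();
            double[] weights = new double[k + 1];
            double[][] means = new double[k + 1][];
            double[][][] covariances = new double[k + 1][][];
            for (int i = 0; i < k; i++) {
                // Re-weight the existing components to make room for the new one.
                weights[i] = old.get(i).getFirst() * (1.0 - extraWeight);
                MultivariateNormalDistribution c = old.get(i).getSecond();
                means[i] = c.getMeans();
                covariances[i] = c.getCovariances().getData();
            }
            weights[k] = extraWeight;
            means[k] = extra.getMeans();
            covariances[k] = extra.getCovariances().getData();
            // The original instance is never modified; a fresh immutable mixture is returned.
            return new MixtureMultivariateNormalDistribution(weights, means, covariances);
        }
    }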
Seems fine.
I think that the restriction to a fixed number of mixture components is a
bit limiting. So is the restriction to a uniform set of components. Both
limitations can be eased without huge difficulty.
Avoiding the fixed number of components can be done by using some variant
of a Dirichlet process.
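To illustrate why a Dirichlet process sidesteps the fixed-k choice: under its Chinese-restaurant-process view, each observation either joins an existing component (with probability proportional to that component's size) or opens a new one (with probability proportional to a concentration parameter alpha), so the number of components grows out of the data rather than being picked up front. Below is a minimal, self-contained simulation of just that assignment step; it is plain Java for illustration, not the commons-math API and not a full Gibbs sampler.

    import java.util.ArrayList;
    import java.util.List;
    import java.util.Random;

    public final class CrpDemo {

        // Draws cluster assignments for n items from a Chinese restaurant process
        // with concentration alpha. The number of clusters is not fixed in advance.
        public static List<Integer> sampleAssignments(int n, double alpha, Random rng) {
            List<Integer> counts = new ArrayList<>();      // size of each cluster so far
            List<Integer> assignments = new ArrayList<>();
            for (int i = 0; i < n; i++) {
                double u = rng.nextDouble() * (i + alpha);
                int cluster = counts.size();               // default: open a new cluster
                double acc = 0.0;
                for (int c = 0; c < counts.size(); c++) {
                    acc += counts.get(c);
                    if (u < acc) { cluster = c; break; }   // join an existing cluster
                }
                if (cluster == counts.size()) {
                    counts.add(1);                         // new cluster created
                } else {
                    counts.set(cluster, counts.get(cluster) + 1);
                }
                assignments.add(cluster);
            }
            return assignments;
        }

        public static void main(String[] args) {
            List<Integer> z = sampleAssignments(1000, 1.0, new Random(42));
            long k = z.stream().distinct().count();
            System.out.println("clusters used: " + k);     // typically on the order of alpha * ln(n)
        }
    }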
Hello.
Any objection to committing the code as proposed on the report page?
https://issues.apache.org/jira/browse/MATH-816
Regards,
Gilles