Jason Rennie wrote:
> I hung-out in the machine learning community appx. 1999-2007 and
> thought the Salakhutdinov work was extremely refreshing to see after
> listening to no end of papers applying EM to whatever was the hot
> topic at the time. :)
Isn't that true of any general framework that enjoys some popularity? :)

> I've certainly seen/heard about various fixes to EM, but I haven't
> seen convincing reason(s) to prefer it over proper gradient
> descent/hill climbing algorithms (besides its presentability and ease
> of implementation).

I think there are cases where gradient methods are not applicable (latent
models where the complete data Y cannot be split into observed and hidden
variables (O, H)), although I am not sure that is a very common case in
machine learning.

cheers,

David
_______________________________________________
Numpy-discussion mailing list
Numpy-discussion@scipy.org
http://mail.scipy.org/mailman/listinfo/numpy-discussion
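
Since the exchange turns on what EM's observed/hidden split buys you over plain gradient ascent, here is a minimal NumPy sketch of EM for a two-component 1-D Gaussian mixture — the textbook case where the complete data does split into observed samples O and hidden component labels H. All variable names, initial values, and the synthetic data are illustrative, not from the thread:

```python
import numpy as np

# Synthetic data: observed O = samples x; hidden H = which component drew each.
rng = np.random.default_rng(0)
x = np.concatenate([rng.normal(-2.0, 1.0, 200), rng.normal(3.0, 1.0, 300)])

pi = 0.5                      # mixing weight of component 0 (initial guess)
mu = np.array([-1.0, 1.0])    # component means (initial guess)
sigma = np.array([1.0, 1.0])  # component std deviations (initial guess)

def loglik(x, pi, mu, sigma):
    """Observed-data log-likelihood, with the hidden labels summed out."""
    dens = np.exp(-(x[:, None] - mu) ** 2 / (2 * sigma ** 2)) \
           / (sigma * np.sqrt(2 * np.pi))
    return np.log((np.array([pi, 1 - pi]) * dens).sum(axis=1)).sum()

ll_init = loglik(x, pi, mu, sigma)
for _ in range(50):
    # E-step: posterior responsibilities r[n, k] = P(H_n = k | O_n = x_n)
    dens = np.exp(-(x[:, None] - mu) ** 2 / (2 * sigma ** 2)) / sigma
    w = np.array([pi, 1 - pi]) * dens
    r = w / w.sum(axis=1, keepdims=True)
    # M-step: closed-form parameter updates given the responsibilities
    nk = r.sum(axis=0)
    mu = (r * x[:, None]).sum(axis=0) / nk
    sigma = np.sqrt((r * (x[:, None] - mu) ** 2).sum(axis=0) / nk)
    pi = nk[0] / len(x)
ll_final = loglik(x, pi, mu, sigma)
```

Each E/M cycle is guaranteed not to decrease the observed-data log-likelihood, which is the presentability/ease-of-implementation appeal mentioned above; a gradient method would instead climb the same marginal likelihood directly, without ever materializing the responsibilities.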