More of a general query, but I'm looking to see if others have successfully used 
something like the foreach package (or other parallel-style functions) with 
functions that minimize likelihoods or other objective functions (e.g., 
optim or nlminb).

I have had great success with embarrassingly parallel problems, and my R 
packages have benefited greatly. However, optimization doesn't fit as nicely 
into that framework, since the values at iteration t depend on the values found 
at iteration t-1. So I'm assuming the cost of splitting and combining might be 
more expensive in this context than simply doing the minimization on a 
single core.
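One angle I've considered: even though a single optimizer run is inherently sequential, a multi-start search over random starting values is itself embarrassingly parallel, so foreach can farm out one optim() call per start. A minimal sketch (the Rosenbrock function here is just a toy stand-in for a real likelihood, and the cluster size is arbitrary):

```r
library(foreach)
library(doParallel)

cl <- makeCluster(2)
registerDoParallel(cl)

## toy objective (Rosenbrock); replace with your own likelihood
fr <- function(x) 100 * (x[2] - x[1]^2)^2 + (1 - x[1])^2

## random starting values -- the embarrassingly parallel dimension
set.seed(1)
starts <- replicate(8, runif(2, -2, 2), simplify = FALSE)

## one sequential optim() run per start, runs distributed across workers
fits <- foreach(s = starts, .combine = rbind) %dopar% {
  fit <- optim(s, fr, method = "BFGS")
  c(fit$par, value = fit$value, conv = fit$convergence)
}

stopCluster(cl)

## keep the best local optimum found across all starts
best <- fits[which.min(fits[, "value"]), ]
```

This doesn't speed up a single minimization, but it buys robustness against local optima at essentially no extra wall-clock cost.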

If others have experience with this, or can point to R-specific resources that 
implement it and that I could study, I would appreciate seeing how it might be 
done.
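One package-based route I'm aware of (worth verifying against the current CRAN documentation) is optimParallel, which keeps the iteration sequence intact but parallelizes the finite-difference gradient evaluations inside optim()'s L-BFGS-B method, so the per-iteration function evaluations run concurrently. A sketch with a toy quadratic objective:

```r
library(optimParallel)  # CRAN package; also loads 'parallel'

cl <- makeCluster(2)
setDefaultCluster(cl)   # optimParallel uses the default cluster

## toy objective; replace with an expensive likelihood to see real gains
fr <- function(x) sum((x - 1:4)^2)

## same interface as optim(); method is L-BFGS-B under the hood
fit <- optimParallel(par = rep(0, 4), fn = fr)

stopCluster(cl)
```

The payoff is largest when each evaluation of fn is expensive, since the speedup comes from running the ~2p+1 evaluations per iteration at once rather than from changing the optimizer itself.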

Regards
Harold

______________________________________________
R-help@r-project.org mailing list -- To UNSUBSCRIBE and more, see
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.
