Wijffels, Jan <jan.wijffels <at> thomascook.be> writes:
>
> Hi
>
> I'm quite new to optimization algorithms and I could use some advice or
> pointers. I'm using ?optim (method L-BFGS-B) to optimize a function over
> a 60-dimensional parameter space. The function itself takes about 1 to 6
> minutes to compute. It finds an optimum after 6 to 24 hours, depending
> on the problem I need to solve. For me, speed is an issue and I would
> like to reduce this to 1 hour.
>
> Hence my question: are there optimization algorithms that are
> particularly useful for settings where the evaluation of the function
> takes a lot of time to compute?
Others may have more specific advice, but here is some general information:

 * The more information from the function you can use, the better. In this
   respect the derivative-based methods (BFGS, L-BFGS-B) will be more
   efficient than Nelder-Mead. If there is any way you can get analytic
   derivatives of your function, that will help a lot (a short sketch of
   this is given at the end of this message).

 * Your best bet is probably to look at your objective function and see if
   you can speed it up. Can you vectorize or code the time-consuming bits
   of it in C/FORTRAN?

 * For large-scale optimization, you can potentially get a very large
   improvement in speed by using automatic differentiation, which exists
   in R only in a rudimentary form. Depending on the resources you have
   available, I would seriously consider AD Model Builder from Otter
   Research.

 * I haven't tried it, but you might take a quick look at
   http://www.milbo.users.sonic.net/ra/index.html, particularly if your
   code involves non-vectorizable loops.

good luck,
  Ben Bolker
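
As a minimal illustration of the gradient point (not part of the original
exchange): deriv() is the rudimentary symbolic differentiation built into
R, and for a simple closed-form objective it can generate the analytic
gradient that optim()'s L-BFGS-B method will then use instead of finite
differences. The two-parameter Rosenbrock-style function below is only a
stand-in for the poster's slow 60-dimensional objective, which would need
hand-coded or AD-generated derivatives instead.

  ## sketch only: expression plus its gradient with respect to x1 and x2
  obj_deriv <- deriv(~ 100 * (x2 - x1^2)^2 + (1 - x1)^2,
                     namevec = c("x1", "x2"),
                     function.arg = c("x1", "x2"))

  ## objective value
  fn <- function(par) 100 * (par[2] - par[1]^2)^2 + (1 - par[1])^2

  ## analytic gradient, extracted from the deriv() result
  gr <- function(par) as.vector(attr(obj_deriv(par[1], par[2]), "gradient"))

  ## L-BFGS-B now calls gr instead of approximating the gradient
  ## numerically, which would otherwise cost one extra fn call per parameter
  optim(par = c(-1.2, 1), fn = fn, gr = gr, method = "L-BFGS-B")

With 60 parameters, each finite-difference gradient costs on the order of
60 extra function evaluations, so an analytic gr pays off quickly when a
single evaluation takes minutes.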