Sorry for the double post - I accidentally hit the "send" button...

We want to build a global optimizer for extremely expensive, noisy objective functions based on physical measurements.
Our approach is based on optimizing kriging response surface models
(as seen in www.cs.ubc.ca/~hutter/earg/stack/taxonomy-global_optimization.pdf).

In each iteration, we
        - model the response surface based on the measurements done so far, using "Krig" from "fields",
        - find a parameter set that maximizes the probability of improvement on this model, using "constrOptim" from multiple starting points (a sketch of this step follows below), and
        - take a physical measurement at that parameter set.
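
A minimal sketch of that acquisition step (illustrative only: it assumes we are minimizing, and "kfit", "ybest", "ui" and "ci" are placeholders for the current Krig fit, the best measurement so far, and the linear constraint matrices that constrOptim expects):

##sketch code:
prob.improve <- function(xnew, kfit, ybest) {
  mu <- predict(kfit, rbind(xnew))     #kriging mean at xnew
  se <- predict.se(kfit, rbind(xnew))  #kriging standard error at xnew
  pnorm((ybest - mu) / se)             #probability of improving on ybest
}
#maximized by minimizing its negative from each starting point, e.g.:
#constrOptim(start, function(x) -prob.improve(x, kfit, ybest),
#            grad=NULL, ui=ui, ci=ci)
##end sketch code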

My problem:
Unfortunately, the optimizations are extremely slow (~30 s per iteration of the algorithm), despite not making an excessive number of objective function evaluations (~1000 per iteration).
I suspect this is because the many single-point calls to predict.Krig and predict.se.Krig made by "optim" are slow compared to a single call with many x values:

##example code:
library(fields)

#a nasty objective function with sharp local structure
fun <- function(x) sin(x*9) / (x^2 + 0.02)

#fit a kriging model to a handful of measurements
x <- matrix(seq(-1, 1, 0.5), ncol=1)
y <- fun(x)
kfit <- Krig(x, y, cov.function=Exp.cov, p=2, theta=0.2)

#this is fast: one call with many prediction points
xp <- matrix(seq(-1, 1, 0.001), ncol=1)
yp <- predict.se(kfit, xp)

#this is _very_ slow: many calls with one prediction point each,
#as happens inside an optimizer's objective function
for (i in 1:nrow(xp)) {
  yp[i] <- predict.se(kfit, xp[i, , drop=FALSE])  #keep x a 1x1 matrix
}
##end example code
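
To quantify the per-call overhead, both variants can be wrapped in system.time() (timings will of course vary with machine and fields version):

##timing sketch:
system.time(predict.se(kfit, xp))          #one batched call
system.time(for (i in 1:nrow(xp))
  predict.se(kfit, xp[i, , drop=FALSE]))   #~2000 single-point calls
##end timing sketch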

My question:

How can I speed up the many single predictions so they can be used efficiently in an objective function during an optimization?

thanks!

Felix Bonowski
