On Sun, 13 Aug 2023, Hans W writes:
> While working on 'random walk' applications, I got interested in
> optimizing noisy objective functions. As an (artificial) example, the
> following is the Rosenbrock function, where Gaussian noise of standard
deviation `sd = 0.01` is added to the function value.
To provide another perspective, I'll give the citation of some work by Harry Joe and myself from over two decades ago.
@Article{,
  author  = {Joe, Harry and Nash, John C.},
  title   = {Numerical optimization and surface estimation with imprecise function evaluations},
  journal = {Statistics and Computing}
}
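The surface-estimation idea in that title can be illustrated with a rough one-dimensional sketch (this is only an illustration of the general approach, not the paper's actual algorithm): sample noisy evaluations around the region of interest, fit a quadratic response surface by least squares, and minimize the fitted surface instead of the noisy function itself.

```r
## Illustration only (NOT the Joe & Nash algorithm): fit a quadratic
## surface to noisy evaluations of f(x) = (x - 2)^2 and minimize the fit.
f_noisy <- function(x) (x - 2)^2 + rnorm(length(x), sd = 0.01)

set.seed(42)
xs  <- seq(0, 4, length.out = 50)
ys  <- f_noisy(xs)
fit <- lm(ys ~ xs + I(xs^2))            # least-squares quadratic surface
b   <- coef(fit)
xmin <- -b["xs"] / (2 * b["I(xs^2)"])   # vertex of the fitted parabola
xmin                                    # close to the true minimizer, 2
```

Because the regression averages over all 50 noisy evaluations, the estimated minimizer is far more stable than any single evaluation would suggest.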
Thanks, Ben.
I would *not* like to apply global optimization solvers, for reasons such as higher dimensionality and longer running times.
I was hoping for suggestions from the "Stochastic Programming" side.
And please, never suggest `optim` with method "SANN".
See the Optimization Task View on CRAN.
This is a huge topic.
Differential evolution (DEoptim package) would be one good starting
point; there is a simulated annealing method built into optim() (method
= "SANN") but it usually requires significant tuning.
Also genetic algorithms.
You could look at the NLopt list of algorithms.
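As a hedged sketch of the differential-evolution suggestion above (the DEoptim package is real; the bounds, population size, and iteration count here are illustrative choices, not recommendations):

```r
## Sketch: differential evolution on the noisy Rosenbrock function.
## Requires the DEoptim package.
library(DEoptim)

fn <- function(x)
  (1 - x[1])^2 + 100 * (x[2] - x[1]^2)^2 + rnorm(1, sd = 0.01)

set.seed(1)
res <- DEoptim(fn, lower = c(-2, -2), upper = c(2, 2),
               control = DEoptim.control(NP = 40, itermax = 200,
                                         trace = FALSE))
res$optim$bestmem   # typically near c(1, 1), limited by the noise floor
```

Note that with noise of sd 0.01 the flat Rosenbrock valley means the returned point can wander some distance along the valley even when the reported objective value is small.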
While working on 'random walk' applications, I got interested in
optimizing noisy objective functions. As an (artificial) example, the
following is the Rosenbrock function, where Gaussian noise of standard
deviation `sd = 0.01` is added to the function value.
fn <- function(x)
    (1 - x[1])^2 + 100*(x[2] - x[1]^2)^2 + rnorm(1, sd = 0.01)
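For self-containment, the block below restates the noisy Rosenbrock (a sketch of the intended definition) and shows the core difficulty: two evaluations at the same point disagree, so any solver that compares raw function values can be misled. Averaging replicates is one common mitigation.

```r
## Noisy Rosenbrock: deterministic part plus Gaussian noise, sd = 0.01.
fn <- function(x)
  (1 - x[1])^2 + 100 * (x[2] - x[1]^2)^2 + rnorm(1, sd = 0.01)

set.seed(1)
v1 <- fn(c(1, 1))
v2 <- fn(c(1, 1))
c(v1, v2)   # two different values at the optimum of the noiseless part

## Averaging m replicates shrinks the noise sd by a factor of sqrt(m),
## at the cost of m times as many function evaluations.
fbar <- function(x, m = 25) mean(replicate(m, fn(x)))
```

This trade-off (more replicates per point versus more points explored) is a central design question in stochastic optimization.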