doi: 10.1685/journal.caim.393

A multistart gradient-based algorithm with surrogate model for global optimization

Daniele Peri, Federica Tinti

Abstract


Gradient-based optimization algorithms are probably the most efficient option for the solution of a local optimization problem. These methods are intrinsically limited to the search for a local optimum of the objective function: if a global optimum is sought, local optimization algorithms can still be successful when the algorithm is initialized from a large number of different points in the design space (multistart algorithms). On the other hand, the cost of the exploration grows linearly with the number of starting points. Consequently, multistart local optimization is rarely adopted, mainly for two reasons: i) the large computational cost and ii) the absence of a guarantee about the success of the search (in fact, there is no general indication of the minimum number of starting points needed to guarantee that the global optimum is found).
In this paper, techniques for reducing the computational cost of the full process, together with techniques that maximize the efficiency of a parallel multistart search, are described and tested. Surrogate models are used extensively in order to drastically reduce the computational effort in practical applications, where a single objective function evaluation is expensive. Declustering techniques are also adopted in order to make the best use of the different local searches.
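
As an illustration of the general idea only (not the authors' algorithm), the sketch below combines the three ingredients mentioned in the abstract: a cheap surrogate of an expensive objective, multistart gradient-based local searches on that surrogate, and a simple distance-based declustering of the resulting candidates. The test function, the RBF surrogate, the number of samples and starting points, and the declustering radius are all illustrative assumptions.

# Minimal sketch of a surrogate-assisted multistart gradient search.
# Everything below (objective, surrogate type, sample counts, radius)
# is an assumption for illustration, not the method described in the paper.
import numpy as np
from scipy.interpolate import RBFInterpolator
from scipy.optimize import minimize

def objective(x):
    # Stand-in for an expensive objective function (Rosenbrock).
    return (1.0 - x[0])**2 + 100.0 * (x[1] - x[0]**2)**2

rng = np.random.default_rng(0)
bounds = [(-2.0, 2.0), (-2.0, 2.0)]
lo, hi = np.array(bounds).T

# 1) Sample the design space and build a cheap surrogate of the objective.
samples = rng.uniform(lo, hi, size=(60, 2))
values = np.array([objective(s) for s in samples])
surrogate = RBFInterpolator(samples, values)

# 2) Multistart gradient-based local searches on the surrogate
#    (each search is cheap because no true evaluations are needed).
starts = rng.uniform(lo, hi, size=(20, 2))
candidates = [minimize(lambda x: surrogate(x[None, :])[0], x0,
                       method='L-BFGS-B', bounds=bounds).x
              for x0 in starts]

# 3) Decluster: keep only well-separated candidate minima, then verify
#    the survivors with the true (expensive) objective.
kept = []
for c in candidates:
    if all(np.linalg.norm(c - k) > 0.1 for k in kept):
        kept.append(c)
best = min(kept, key=objective)
print("best point:", best, "f =", objective(best))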


This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivs 3.0 License.

Communications in Applied and Industrial Mathematics
ISSN: 2038-0909