TY - CHAP
T1 - The gradient subspace approximation and its application to bi-objective optimization problems
AU - Schütze, Oliver
AU - Uribe, Lourdes
AU - Lara, Adriana
N1 - Publisher Copyright:
© The Editor(s) (if applicable) and The Author(s), under exclusive license to Springer Nature Switzerland AG 2020.
PY - 2020
Y1 - 2020
N2 - Evolutionary algorithms are very popular and are frequently applied to many different optimization problems. Reasons for this success include that methods of this kind are global in nature, very robust, and require only minimal assumptions on the optimization problem. It is also known, however, that such methods require considerable resources to generate accurate approximations of the solution sets. As a remedy, researchers have used hybrid (or memetic) algorithms, i.e., evolutionary algorithms coupled with local search, for which mainly techniques from mathematical programming are utilized. Such hybrids typically yield satisfactory results; the problem remains, however, that the algorithms are relatively expensive, since the gradients have to be computed or approximated at each candidate solution designated for local search. In this chapter, we review the Gradient Subspace Approximation (GSA), which allows one to compute a descent direction in a best-fit manner from neighborhood information that is, e.g., already available in evolutionary algorithms. The computation of such directions hence comes for free in terms of additional function evaluations of the given problem, which opens the door to the realization of low-cost local search engines within evolutionary algorithms. Next, we show how GSA can be applied in the context of bi-objective optimization. Finally, to demonstrate the benefit of the method, we present some results on a hybrid that is based on the evolutionary algorithm NSGA-II.
AB - Evolutionary algorithms are very popular and are frequently applied to many different optimization problems. Reasons for this success include that methods of this kind are global in nature, very robust, and require only minimal assumptions on the optimization problem. It is also known, however, that such methods require considerable resources to generate accurate approximations of the solution sets. As a remedy, researchers have used hybrid (or memetic) algorithms, i.e., evolutionary algorithms coupled with local search, for which mainly techniques from mathematical programming are utilized. Such hybrids typically yield satisfactory results; the problem remains, however, that the algorithms are relatively expensive, since the gradients have to be computed or approximated at each candidate solution designated for local search. In this chapter, we review the Gradient Subspace Approximation (GSA), which allows one to compute a descent direction in a best-fit manner from neighborhood information that is, e.g., already available in evolutionary algorithms. The computation of such directions hence comes for free in terms of additional function evaluations of the given problem, which opens the door to the realization of low-cost local search engines within evolutionary algorithms. Next, we show how GSA can be applied in the context of bi-objective optimization. Finally, to demonstrate the benefit of the method, we present some results on a hybrid that is based on the evolutionary algorithm NSGA-II.
KW - Bi-objective optimization
KW - Descent directions
KW - Gradient Subspace Approximation
KW - Gradient-free optimization
UR - http://www.scopus.com/inward/record.url?scp=85090259696&partnerID=8YFLogxK
U2 - 10.1007/978-3-030-51264-4_15
DO - 10.1007/978-3-030-51264-4_15
M3 - Chapter
AN - SCOPUS:85090259696
T3 - Studies in Systems, Decision and Control
SP - 355
EP - 390
BT - Studies in Systems, Decision and Control
PB - Springer
ER -