TY - GEN
T1 - Toward a new family of hybrid evolutionary algorithms
AU - Uribe, Lourdes
AU - Schütze, Oliver
AU - Lara, Adriana
N1 - Publisher Copyright:
© Springer Nature Switzerland AG 2019.
PY - 2019
Y1 - 2019
N2 - Multi-objective optimization problems (MOPs) arise naturally in diverse knowledge areas. Multi-objective evolutionary algorithms (MOEAs) have been applied successfully to this type of optimization problem over the last two decades. However, MOEAs still require considerable resources to obtain acceptable Pareto set/front approximations. Moreover, when the search space is highly constrained, MOEAs may have trouble approximating the solution set. When dealing with constrained MOPs (CMOPs), MOEAs usually apply penalization methods. One way to overcome these difficulties is to hybridize MOEAs with local search operators. If the local search operator is based on classical mathematical programming, gradient information is used, which leads to a relatively high computational cost. In this work, we give an overview of our recently proposed constraint handling methods and their corresponding hybrid algorithms. These methods use specific mechanisms that handle the constraints more judiciously without increasing the cost. Neither method explicitly computes gradients; instead, both extract this information as effectively as possible from the current population of the MOEA. We conjecture that these techniques will allow for the fast and reliable treatment of CMOPs in the near future. Numerical results indicate that these ideas already yield competitive results in many cases.
AB - Multi-objective optimization problems (MOPs) arise naturally in diverse knowledge areas. Multi-objective evolutionary algorithms (MOEAs) have been applied successfully to this type of optimization problem over the last two decades. However, MOEAs still require considerable resources to obtain acceptable Pareto set/front approximations. Moreover, when the search space is highly constrained, MOEAs may have trouble approximating the solution set. When dealing with constrained MOPs (CMOPs), MOEAs usually apply penalization methods. One way to overcome these difficulties is to hybridize MOEAs with local search operators. If the local search operator is based on classical mathematical programming, gradient information is used, which leads to a relatively high computational cost. In this work, we give an overview of our recently proposed constraint handling methods and their corresponding hybrid algorithms. These methods use specific mechanisms that handle the constraints more judiciously without increasing the cost. Neither method explicitly computes gradients; instead, both extract this information as effectively as possible from the current population of the MOEA. We conjecture that these techniques will allow for the fast and reliable treatment of CMOPs in the near future. Numerical results indicate that these ideas already yield competitive results in many cases.
KW - Evolutionary computation
KW - Hybrid meta-heuristics
KW - Mathematical programming
KW - Multi-objective optimization
UR - http://www.scopus.com/inward/record.url?scp=85063056467&partnerID=8YFLogxK
U2 - 10.1007/978-3-030-12598-1_7
DO - 10.1007/978-3-030-12598-1_7
M3 - Conference contribution
SN - 9783030125974
T3 - Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
SP - 78
EP - 90
BT - Evolutionary Multi-Criterion Optimization - 10th International Conference, EMO 2019, Proceedings
A2 - Deb, Kalyanmoy
A2 - Goodman, Erik
A2 - Miettinen, Kaisa
A2 - Coello Coello, Carlos A.
A2 - Klamroth, Kathrin
A2 - Mostaghim, Sanaz
A2 - Reed, Patrick
PB - Springer Verlag
T2 - 10th International Conference on Evolutionary Multi-Criterion Optimization, EMO 2019
Y2 - 10 March 2019 through 13 March 2019
ER -