TY - JOUR
T1 - Stable learning laws design for long short-term memory identifier for uncertain discrete systems via control Lyapunov functions
AU - Guarneros-Sandoval, Alejandro
AU - Ballesteros, Mariana
AU - Salgado, Ivan
AU - Chairez, Isaac
N1 - Publisher Copyright:
© 2022 Elsevier B.V.
PY - 2022/6/28
Y1 - 2022/6/28
N2 - This study introduces a method for designing stable learning laws for Long Short-Term Memory (LSTM) networks acting as non-parametric identifiers of nonlinear systems with uncertain models. The strategy applies Lyapunov stability theory for discrete-time systems to prove that the origin is a practically stable equilibrium point of the identification error. The laws consider a general class of sigmoidal functions placed at the different gates of an LSTM structure (long and short memory). The design of the learning laws uses a matrix inequality framework to obtain the rate gains associated with the evolution of the weights. Numerical results show the designed learning laws for the non-parametric identifier based on an LSTM approximation tested on two classes of nonlinear systems: the first describes the ozone-based degradation of organic contaminants, and the second represents the dynamics of a Van der Pol oscillator. The LSTM identifier is compared against a classical Lyapunov-based recurrent neural network; this comparison demonstrates that the proposed algorithm approximates the trajectories of both systems with a smaller mean squared error, indicating the benefits obtained with these new learning laws.
KW - Controlled Lyapunov function
KW - Long short-term memory
KW - Lyapunov stability
KW - Non-parametric identifier
KW - Stable learning laws
UR - http://www.scopus.com/inward/record.url?scp=85128290017&partnerID=8YFLogxK
DO - 10.1016/j.neucom.2022.03.070
M3 - Article
AN - SCOPUS:85128290017
SN - 0925-2312
VL - 491
SP - 144
EP - 159
JO - Neurocomputing
JF - Neurocomputing
ER -