TY - JOUR
T1 - Equivalent neural network optimal coefficients using forgetting factor with sliding modes
AU - Aguilar Cruz, Karen Alicia
AU - Medel Juárez, José De Jesús
AU - Urbieta Parrazales, Romeo
N1 - Publisher Copyright:
© 2016 Karen Alicia Aguilar Cruz et al.
PY - 2016
Y1 - 2016
N2 - The Artificial Neural Network (ANN) concept is familiar in methods whose task is, for example, the identification or approximation of the outputs of complex systems that are difficult to model. In general, the objective is to determine online the adequate parameters to reach a better point-to-point convergence rate. This paper therefore presents the parameter estimation for an equivalent ANN (EANN), obtaining a recursive identification for a stochastic system, first with constant parameters and then with nonstationary output system conditions. In the latter estimation, the parameters also have stochastic properties, making the traditional approximation methods inadequate because they lose their convergence rate. To solve this problem, we propose a nonconstant exponential forgetting factor (NCEFF) with sliding modes, obtaining an exponentially decreasing convergence error at almost all points. Theoretical results of both identification stages are implemented in MATLAB® and compared, showing improvement when the new proposal for nonstationary output conditions is applied.
AB - The Artificial Neural Network (ANN) concept is familiar in methods whose task is, for example, the identification or approximation of the outputs of complex systems that are difficult to model. In general, the objective is to determine online the adequate parameters to reach a better point-to-point convergence rate. This paper therefore presents the parameter estimation for an equivalent ANN (EANN), obtaining a recursive identification for a stochastic system, first with constant parameters and then with nonstationary output system conditions. In the latter estimation, the parameters also have stochastic properties, making the traditional approximation methods inadequate because they lose their convergence rate. To solve this problem, we propose a nonconstant exponential forgetting factor (NCEFF) with sliding modes, obtaining an exponentially decreasing convergence error at almost all points. Theoretical results of both identification stages are implemented in MATLAB® and compared, showing improvement when the new proposal for nonstationary output conditions is applied.
UR - http://www.scopus.com/inward/record.url?scp=85008938937&partnerID=8YFLogxK
U2 - 10.1155/2016/4642052
DO - 10.1155/2016/4642052
M3 - Article
C2 - 28058045
AN - SCOPUS:85008938937
SN - 1687-5265
VL - 2016
JO - Computational Intelligence and Neuroscience
JF - Computational Intelligence and Neuroscience
M1 - 4642052
ER -