TY - JOUR
T1 - Smooth dendrite morphological neurons
AU - Gómez-Flores, Wilfrido
AU - Sossa, Humberto
N1 - Publisher Copyright:
© 2021 Elsevier Ltd
PY - 2021/4
Y1 - 2021/4
AB - A typical feature of hyperbox-based dendrite morphological neurons (DMN) is the generation of sharp and rough decision boundaries that inaccurately track the distribution shape of classes of patterns. This behavior arises because the minimum and maximum activation functions force the decision boundaries to match the faces of the hyperboxes. To improve the DMN response, we introduce a dendritic model that uses smooth maximum and minimum functions to soften the decision boundaries. The classification performance assessment is conducted on nine synthetic and 28 real-world datasets. Based on the experimental results, we demonstrate that the smooth activation functions improve the generalization capacity of DMN. The proposed approach is competitive with four machine learning techniques, namely, the Multilayer Perceptron, Radial Basis Function Network, Support Vector Machine, and Nearest Neighbor algorithm. Moreover, the computational complexity of DMN training is lower than that of the MLP and SVM classifiers.
KW - Dendrite processing
KW - Hyperbox-shaped dendrite
KW - Morphological neurons
KW - Neural networks
KW - Smooth activation functions
UR - http://www.scopus.com/inward/record.url?scp=85099217252&partnerID=8YFLogxK
U2 - 10.1016/j.neunet.2020.12.021
DO - 10.1016/j.neunet.2020.12.021
M3 - Article
C2 - 33445004
AN - SCOPUS:85099217252
SN - 0893-6080
VL - 136
SP - 40
EP - 53
JO - Neural Networks
JF - Neural Networks
ER -