TY - GEN
T1 - Towards dendrite spherical neurons for pattern classification
AU - Gómez-Flores, Wilfrido
AU - Sossa-Azuela, Juan Humberto
N1 - Publisher Copyright:
© Springer Nature Switzerland AG 2020.
PY - 2020
Y1 - 2020
N2 - This paper introduces the Dendrite Spherical Neuron (DSN) as an alternative to the Dendrite Ellipsoidal Neuron (DEN), in which hyperspheres, instead of hyperellipses, group the patterns from different classes. The rationale behind the DSN is to simplify the DEN architecture: in the DEN, each dendrite is parameterized by a centroid and a covariance matrix, whereas in the DSN the covariance matrix is replaced by a radius. This modification avoids singular covariance matrices, which can arise because the DEN must measure the Mahalanobis distance to classify patterns. DSN training consists of determining the dendrite centroids with the k-means algorithm, calculating the radius of each dendrite as the mean distance to its two nearest centroids, and finally learning the weights of a softmax function at the output of the neuron with Stochastic Gradient Descent. In addition, Simulated Annealing automatically determines the number of dendrites that maximizes the classification accuracy. The DSN is evaluated on synthetic and real-world datasets. The experimental results reveal that the DSN is competitive with Multilayer Perceptron (MLP) networks while using less complex architectures. Moreover, the DSN tends to outperform the Dendrite Morphological Neuron (DMN), which uses hyperboxes. These findings suggest that the DSN is a viable alternative to the MLP and DMN for pattern classification tasks.
KW - Dendrite Morphological Neuron
KW - Pattern classification
KW - Simulated Annealing
KW - Spherical dendrite
UR - http://www.scopus.com/inward/record.url?scp=85087278620&partnerID=8YFLogxK
U2 - 10.1007/978-3-030-49076-8_2
DO - 10.1007/978-3-030-49076-8_2
M3 - Conference contribution
SN - 9783030490751
T3 - Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
SP - 14
EP - 24
BT - Pattern Recognition - 12th Mexican Conference, MCPR 2020, Proceedings
A2 - Figueroa Mora, Karina Mariela
A2 - Anzurez Marín, Juan
A2 - Cerda, Jaime
A2 - Carrasco-Ochoa, Jesús Ariel
A2 - Martínez-Trinidad, José Francisco
A2 - Olvera-López, José Arturo
PB - Springer
T2 - 12th Mexican Conference on Pattern Recognition, MCPR 2020
Y2 - 24 June 2020 through 27 June 2020
ER -