TY - GEN
T1 - Forward-backward visual saliency propagation in Deep NNs vs internal attentional mechanisms
AU - Obeso, Abraham Montoya
AU - Benois-Pineau, Jenny
AU - Vazquez, Mireya Sarai Garcia
AU - Acosta, Alejandro Alvaro Ramirez
N1 - Publisher Copyright:
© 2019 IEEE.
PY - 2019/11
Y1 - 2019/11
N2 - Attention models in deep learning algorithms have gained popularity in recent years. In this work, we propose an attention mechanism based on visual saliency maps injected into a Deep Neural Network (DNN) to enhance regions in feature maps during forward-backward propagation in training, and during forward propagation only in testing. The key idea is to spatially capture features associated with prominent regions in images and propagate them to deeper layers. We first take the well-known AlexNet architecture as backbone, and then the ResNet architecture, to solve the task of identifying buildings of Mexican architecture. Our model, equipped with the "external" visual saliency-based attention mechanism, outperforms models armed with squeeze-and-excitation units and double-attention blocks.
AB - Attention models in deep learning algorithms have gained popularity in recent years. In this work, we propose an attention mechanism based on visual saliency maps injected into a Deep Neural Network (DNN) to enhance regions in feature maps during forward-backward propagation in training, and during forward propagation only in testing. The key idea is to spatially capture features associated with prominent regions in images and propagate them to deeper layers. We first take the well-known AlexNet architecture as backbone, and then the ResNet architecture, to solve the task of identifying buildings of Mexican architecture. Our model, equipped with the "external" visual saliency-based attention mechanism, outperforms models armed with squeeze-and-excitation units and double-attention blocks.
KW - Neural Networks
KW - Saliency Maps
KW - Visual Attention
UR - http://www.scopus.com/inward/record.url?scp=85077953240&partnerID=8YFLogxK
U2 - 10.1109/IPTA.2019.8936125
DO - 10.1109/IPTA.2019.8936125
M3 - Conference contribution
AN - SCOPUS:85077953240
T3 - 2019 9th International Conference on Image Processing Theory, Tools and Applications, IPTA 2019
BT - 2019 9th International Conference on Image Processing Theory, Tools and Applications, IPTA 2019
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 9th International Conference on Image Processing Theory, Tools and Applications, IPTA 2019
Y2 - 6 November 2019 through 9 November 2019
ER -