Dropping Activations in Convolutional Neural Networks with Visual Attention Maps

Abraham Montoya Obeso, Jenny Benois-Pineau, Mireya Sarai Garcia Vazquez, Alejandro A.Ramirez Acosta

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

2 Citations (Scopus)

Abstract

The introduction of visual attention models into data selection and feature selection in CNNs for image classification is an active and interesting research topic. In CNNs, the strategy of dropping activations after the feature-extraction layers has been shown to improve generalization on large-scale datasets and to avoid over-fitting. In the literature, dropout has been studied as a fully randomized way to drop activations during training. In this paper, we introduce a saliency-based dropping strategy to drop activations in our AlexNet-like architecture. Our experiments are conducted on the specific task of Mexican architectural recognition, with 67 categories. The results are promising: the proposed approach outperformed other models, reducing training time and reaching higher accuracy.
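The abstract contrasts fully randomized dropout with a saliency-guided alternative. As an illustration only (the paper's exact scheme is not given here), the sketch below assumes a per-unit keep probability that grows with a normalized visual saliency map, so activations in attended regions are dropped less often; the function name and the blending rule are hypothetical.

```python
import numpy as np

def saliency_dropout(activations, saliency, drop_rate=0.5, rng=None):
    """Hypothetical sketch: drop activations with probability modulated
    by a visual saliency map, instead of uniformly at random.

    Low-saliency units are dropped more often; units in salient
    (visually attended) regions are kept more often.
    """
    rng = np.random.default_rng(0) if rng is None else rng
    # Normalize the saliency map to [0, 1].
    s = (saliency - saliency.min()) / (np.ptp(saliency) + 1e-8)
    # Assumed blending rule: keep probability rises linearly with saliency,
    # from (1 - drop_rate) for the least salient units up to 1.0.
    keep_prob = (1.0 - drop_rate) + drop_rate * s
    mask = rng.random(activations.shape) < keep_prob
    # Inverted-dropout scaling keeps the expected activation unchanged.
    return activations * mask / keep_prob
```

With a uniform saliency map this degenerates to ordinary dropout with rate `drop_rate`, which makes the randomized baseline a special case of the saliency-guided variant.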

Original language: English
Title of host publication: 2019 International Conference on Content-Based Multimedia Indexing, CBMI 2019 - Proceedings
Publisher: IEEE Computer Society
ISBN (electronic): 9781728146737
DOI
State: Published - Sep. 2019
Event: 17th International Conference on Content-Based Multimedia Indexing, CBMI 2019 - Dublin, Ireland
Duration: 4 Sep. 2019 – 6 Sep. 2019

Publication series

Name: Proceedings - International Workshop on Content-Based Multimedia Indexing
Volume: 2019-September
ISSN (print): 1949-3991

Conference

Conference: 17th International Conference on Content-Based Multimedia Indexing, CBMI 2019
Country/Territory: Ireland
City: Dublin
Period: 4/09/19 – 6/09/19

