Chest x-ray classification using transfer learning on multi-GPU

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

1 Scopus citation

Abstract

Since the first quarter of this year, the spread of the SARS-CoV-2 virus has been a worldwide health priority. Medical testing consists of laboratory studies, PCR tests, CT, and PET scans, which are time-consuming, and some countries lack these resources. One diagnostic tool is X-ray imaging, which is among the fastest and lowest-cost resources physicians have to detect and distinguish among these different diseases. We propose an X-ray CAD (computer-aided diagnosis) system based on deep convolutional neural networks (DCNNs), using well-known architectures such as DenseNet-201, ResNet-50, and EfficientNet. These architectures are pre-trained on data from the ImageNet classification challenge, and transfer learning methods are used to fine-tune the classification stage. The system can visualize the learned recognition patterns by applying the Grad-CAM algorithm, aiming to help physicians find features hidden from perceptual vision. The proposed CAD system can differentiate between COVID-19, pneumonia, nodule, and normal lung X-ray images.

Original language: English
Title of host publication: Real-Time Image Processing and Deep Learning 2021
Editors: Nasser Kehtarnavaz, Matthias F. Carlsohn
Publisher: SPIE
ISBN (Electronic): 9781510643093
DOIs
State: Published - 2021
Event: Real-Time Image Processing and Deep Learning 2021 - Virtual, Online, United States
Duration: 12 Apr 2021 → 16 Apr 2021

Publication series

Name: Proceedings of SPIE - The International Society for Optical Engineering
Volume: 11736
ISSN (Print): 0277-786X
ISSN (Electronic): 1996-756X

Conference

Conference: Real-Time Image Processing and Deep Learning 2021
Country/Territory: United States
City: Virtual, Online
Period: 12/04/21 → 16/04/21

Keywords

  • CNN
  • COVID-19
  • Classification
  • Deep Learning
  • Multi-GPU
  • X-Ray

