Semi-Supervised Learning for Image Classification using Compact Networks in the Biomedical Context

  1. Inés, Adrián 1
  2. Díaz-Pinto, Andrés 2
  3. Domínguez, César 1
  4. Heras, Jónathan 1
  5. Mata, Eloy 1
  6. Pascual, Vico 1
  1. Universidad de La Rioja, Logroño, Spain (ROR: https://ror.org/0553yr311)
  2. King's College School, London, United Kingdom (ROR: https://ror.org/02bbqcn27)

Journal: [deposited in arXiv]

Year of publication: 2022

Type: Working paper

Institutional repository: open access

Abstract

Background and objectives: The development of mobile and on-the-edge applications that embed deep convolutional neural models has the potential to revolutionise biomedicine. However, most deep learning models require computational resources that are not available in smartphones or edge devices; an issue that can be faced by means of compact models. The problem with such models is that they are usually less accurate than bigger models. In this work, we study how this limitation can be addressed with the application of semi-supervised learning techniques.

Methods: We conduct several statistical analyses to compare the performance of deep compact architectures when trained using semi-supervised learning methods for tackling image classification tasks in the biomedical context. In particular, we explore three families of compact networks, and two families of semi-supervised learning techniques for 10 biomedical tasks.

Results: By combining semi-supervised learning methods with compact networks, it is possible to obtain a performance similar to that of standard-size networks. In general, the best results are obtained when combining data distillation with MixNet, and plain distillation with ResNet-18. Also, in general, NAS networks obtain better results than manually designed networks and quantized networks.

Conclusions: The work presented in this paper shows the benefits of applying semi-supervised methods to compact networks; this allows us to create compact models that are not only as accurate as standard-size models, but also faster and lighter. Finally, we have developed a library that simplifies the construction of compact models using semi-supervised learning methods.
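As an illustration of the plain-distillation setting mentioned in the abstract, the sketch below trains a compact student (ResNet-18) from the soft predictions of a larger teacher, so that unlabelled biomedical images can also contribute to training. This is only a minimal PyTorch sketch under assumed choices (teacher architecture, temperature, loss weighting); it is not the authors' library or their exact procedure.

```python
import torch
import torch.nn.functional as F
from torchvision import models

# Assumed setup: a standard-size teacher and a compact student.
# For a real biomedical task, the final classification layers would be
# replaced to match the number of classes in the dataset.
teacher = models.resnet50(weights=models.ResNet50_Weights.DEFAULT)
student = models.resnet18(weights=None)
teacher.eval()  # the teacher stays frozen during distillation


def distillation_loss(student_logits, teacher_logits, labels=None, T=2.0, alpha=0.5):
    """Soft-target (KL) distillation loss, optionally mixed with the hard-label loss.

    T and alpha are illustrative hyperparameters, not values from the paper.
    """
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)
    if labels is None:  # unlabelled images: only the teacher signal is available
        return soft
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1 - alpha) * hard


def train_step(images, labels, optimizer):
    """One optimisation step of the student on a (possibly unlabelled) batch."""
    with torch.no_grad():
        teacher_logits = teacher(images)
    student_logits = student(images)
    loss = distillation_loss(student_logits, teacher_logits, labels)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```

Data distillation, the other technique named in the results, follows the same pattern but aggregates the teacher's predictions over several augmented views of each unlabelled image before using them as soft targets.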