EvoPruneDeepTL: An evolutionary pruning model for transfer learning based deep neural networks

[PDF] Main Article (1.104Mb)
Identifiers
URI: https://hdl.handle.net/10481/86662
DOI: 10.1016/j.neunet.2022.10.011
Authors
Poyatos Amador, Javier; Molina Cabrera, Daniel; Martínez, Aritz D.; Del Ser, Javier; Herrera Triguero, Francisco
Subject
Deep Learning; Evolutionary Algorithms; Pruning; Feature Selection; Transfer Learning
Date
2023-01
Bibliographic reference
J. Poyatos, D. Molina, A. D. Martínez, J. Del Ser, F. Herrera. EvoPruneDeepTL: An evolutionary pruning model for transfer learning based deep neural networks. Neural Networks, 158 (2023) 59-82.
Abstract
In recent years, Deep Learning models have shown great performance in complex optimization problems. However, they generally require large training datasets, which is a limitation in most practical cases. Transfer learning allows importing the first layers of a pre-trained architecture and connecting them to fully-connected layers to adapt them to a new problem. Consequently, the configuration of these layers becomes crucial for the performance of the model, but optimizing it is usually a computationally demanding task. One strategy to optimize Deep Learning models is pruning. Pruning methods focus on reducing the complexity of the network, assuming an expected performance penalty once the model is pruned. However, pruning could also be used to improve performance, by using an optimization algorithm to identify and eventually remove unnecessary connections among neurons. This work proposes EvoPruneDeepTL, an evolutionary pruning model for Transfer Learning based Deep Neural Networks which replaces the last fully-connected layers with sparse layers optimized by a genetic algorithm. Depending on its solution encoding strategy, our proposed model can perform either optimized pruning or feature selection over the densely connected part of the neural network. We carry out different experiments with several datasets to assess the benefits of our proposal. Results show the contribution of EvoPruneDeepTL and feature selection to the overall computational efficiency of the network as a result of the optimization process: accuracy is improved while the number of active neurons in the final layers is reduced.
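To illustrate the binary solution encoding described in the abstract, the sketch below evolves a bit mask with a simple genetic algorithm (binary tournament selection, one-point crossover, bit-flip mutation, elitism) to perform feature selection. All names, the synthetic data, the nearest-centroid fitness proxy, and the GA parameters are assumptions for illustration only; the paper's actual method optimizes sparse fully-connected layers on top of a pre-trained feature extractor.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "extracted features": 20 dimensions, only the first 5 informative.
n, d, informative = 200, 20, 5
y = rng.integers(0, 2, n)
X = rng.normal(size=(n, d))
X[:, :informative] += y[:, None] * 2.0  # shift informative dims by class

def fitness(mask):
    """Proxy fitness: nearest-centroid accuracy using only unmasked features."""
    if mask.sum() == 0:
        return 0.0
    Xm = X[:, mask.astype(bool)]
    c0, c1 = Xm[y == 0].mean(0), Xm[y == 1].mean(0)
    pred = (np.linalg.norm(Xm - c1, axis=1) < np.linalg.norm(Xm - c0, axis=1)).astype(int)
    return float((pred == y).mean())

def tournament(pop, fit):
    """Binary tournament: return the fitter of two random individuals."""
    i, j = rng.integers(0, len(pop), 2)
    return pop[i] if fit[i] >= fit[j] else pop[j]

def evolve(pop_size=30, gens=40, p_mut=0.05):
    pop = rng.integers(0, 2, (pop_size, d))
    for _ in range(gens):
        fit = np.array([fitness(ind) for ind in pop])
        new = [pop[fit.argmax()].copy()]              # elitism: keep the best
        while len(new) < pop_size:
            a, b = tournament(pop, fit), tournament(pop, fit)
            cut = rng.integers(1, d)                  # one-point crossover
            child = np.concatenate([a[:cut], b[cut:]])
            flip = rng.random(d) < p_mut              # bit-flip mutation
            new.append(np.where(flip, 1 - child, child))
        pop = np.array(new)
    fit = np.array([fitness(ind) for ind in pop])
    return pop[fit.argmax()], fit.max()

best, acc = evolve()
print("selected features:", np.flatnonzero(best))
print("accuracy:", acc)
```

In EvoPruneDeepTL the same bit-string idea is applied at a different granularity: depending on the encoding, each gene activates or deactivates a connection (pruning) or an entire input neuron (feature selection) in the final dense layers.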
Collections
  • DCCIA - Artículos
