Show simple item record

dc.contributor.author: Aquino Brítez, Sergio
dc.contributor.author: García Sánchez, Pablo
dc.contributor.author: Ortiz, Andrés
dc.contributor.author: Aquino Brítez, Diego
dc.date.accessioned: 2025-02-27T07:22:50Z
dc.date.available: 2025-02-27T07:22:50Z
dc.date.issued: 2025-01-30
dc.identifier.citation: Aquino-Brítez, S.; García-Sánchez, P.; Ortiz, A.; Aquino-Brítez, D. Towards an Energy Consumption Index for Deep Learning Models: A Comparative Analysis of Architectures, GPUs, and Measurement Tools. Sensors 2025, 25, 846. https://doi.org/10.3390/s25030846
dc.identifier.uri: https://hdl.handle.net/10481/102752
dc.description.abstract: The growing global demand for computational resources, particularly in Artificial Intelligence (AI) applications, raises increasing concerns about energy consumption and its environmental impact. This study introduces a newly developed energy consumption index that evaluates the energy efficiency of Deep Learning (DL) models, providing a standardized and adaptable approach for various models. Convolutional neural networks, including both classical and modern architectures, serve as the primary case study to demonstrate the applicability of the index. Furthermore, the inclusion of the Swin Transformer, a state-of-the-art and modern non-convolutional model, highlights the adaptability of the framework to diverse architectural paradigms. This study analyzes the energy consumption during both training and inference of representative DL architectures, including AlexNet, ResNet18, VGG16, EfficientNet-B3, ConvNeXt-T, and Swin Transformer, trained on the Imagenette dataset using TITAN XP and GTX 1080 GPUs. Energy measurements are obtained using sensor-based tools, including OpenZmeter (v2) with integrated electrical sensors. Additionally, software-based tools such as CarbonTracker (v1.2.5) and CodeCarbon (v2.4.1) retrieve energy consumption data from computational component sensors. The results reveal significant differences in energy efficiency across architectures and GPUs, providing insights into the trade-offs between model performance and energy use. By offering a flexible framework for comparing energy efficiency across DL models, this study advances sustainability in AI systems, supporting accurate and standardized energy evaluations applicable to various computational settings.
dc.description.sponsorship: PID2023-147409NB-C21. MICIU/AEI/10.13039/501100011033 and by ERDF/EU. Ministerio Español de Ciencia e Innovación
dc.description.sponsorship: PID2020-115570GB-C22. MICIU/AEI/10.13039/501100011033 and by ERDF/EU. Ministerio Español de Ciencia e Innovación
dc.description.sponsorship: PID2022-137461NB-C32. MICIU/AEI/10.13039/501100011033 and by ERDF/EU. Ministerio Español de Ciencia e Innovación
dc.description.sponsorship: TIC251-G-FEDER. ERDF/EU
dc.description.sponsorship: C-ING-027-UGR23. ERDF/EU
dc.language.iso: eng
dc.publisher: MDPI
dc.rights: Attribution 4.0 International
dc.rights.uri: http://creativecommons.org/licenses/by/4.0/
dc.subject: green computing
dc.subject: energy efficiency
dc.subject: machine learning
dc.title: Towards an energy consumption index for deep learning models: a comparative analysis of architectures, GPUs, and measurement tools
dc.type: journal article
dc.rights.accessRights: open access
dc.identifier.doi: 10.3390/s25030846
dc.type.hasVersion: VoR
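
The abstract names CodeCarbon (v2.4.1) among the software-based measurement tools. As a minimal sketch of how such a tracker is typically wrapped around a workload, the Python snippet below uses CodeCarbon's EmissionsTracker start/stop API; the matrix-multiplication loop and the project_name value are illustrative placeholders, not the authors' training code or experimental setup.

    # Hedged sketch: software-based energy measurement with CodeCarbon.
    # The workload is a stand-in for DL training/inference, not the paper's
    # experiment. Where available, CodeCarbon samples CPU (RAPL), GPU (NVML),
    # and RAM power, and appends a summary row to emissions.csv.
    import numpy as np
    from codecarbon import EmissionsTracker

    tracker = EmissionsTracker(project_name="dl-energy-index-sketch")  # placeholder name
    tracker.start()
    try:
        # Placeholder compute workload: repeated dense matrix products.
        a = np.random.rand(2000, 2000)
        for _ in range(20):
            a = a @ a
            a /= a.max()  # keep values bounded; only the compute matters here
    finally:
        emissions_kg = tracker.stop()  # estimated emissions in kg CO2-eq

    print(f"Estimated emissions for the measured span: {emissions_kg:.6f} kg CO2-eq")

The per-component energy figures (in kWh) land in the emissions.csv row the tracker writes, which is the kind of per-run record an energy index like the one proposed here could consume.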


Files in this item

[PDF]

This item appears in the following collection(s)


Attribution 4.0 International
Except where otherwise noted, this item's license is described as Attribution 4.0 International