Show simple item record

dc.contributor.author: Baños Legrán, Oresti
dc.contributor.author: Calatroni, Alberto
dc.contributor.author: Damas Hermoso, Miguel
dc.contributor.author: Pomares Cintas, Héctor Emilio
dc.contributor.author: Roggen, Daniel
dc.contributor.author: Rojas Ruiz, Ignacio
dc.contributor.author: Villalonga Palliser, Claudia
dc.date.accessioned: 2026-01-28T11:18:30Z
dc.date.available: 2026-01-28T11:18:30Z
dc.date.issued: 2021-03-31
dc.identifier.citation: Baños, O., Calatroni, A., Damas, M. et al. Opportunistic Activity Recognition in IoT Sensor Ecosystems via Multimodal Transfer Learning. Neural Process Lett 53, 3169–3197 (2021). https://doi.org/10.1007/s11063-021-10468-z
dc.identifier.uri: https://hdl.handle.net/10481/110419
dc.description: This work has been partially supported by the Spanish Ministry of Science, Innovation and Universities (MICINN) Projects PGC2018-098813-B-C31 and RTI2018-101674-B-I00 together with the European Fund for Regional Development (FEDER).
dc.description.abstract: Recognizing human activities seamlessly and ubiquitously is now closer than ever, given the myriad of sensors readily deployed on and around users. However, training recognition systems remains both time- and resource-consuming, as datasets must be collected ad hoc for each specific sensor setup a person may encounter in daily life. This work presents an alternative approach based on transfer learning to opportunistically train new, unseen (target) sensor systems from existing (source) sensor systems. The approach uses system identification techniques to learn a mapping function that automatically translates the signals from the source sensor domain to the target sensor domain, and vice versa. This can be done for sensor signals of the same or of different modalities. Two transfer models are proposed to translate recognition systems based on either activity templates or activity models, depending on the characteristics of both the source and target sensor systems. The proposed transfer methods are evaluated in a human–computer interaction scenario, where the transfer is performed between wearable sensors placed at different body locations, and between wearable sensors and an ambient depth camera sensor. Results show that a good transfer is possible with just a few seconds of data, irrespective of the direction of the transfer and for both similar and cross sensor modalities.
dc.description.sponsorship: Spanish Ministry of Science, Innovation and Universities (MICINN) PGC2018-098813-B-C31 and RTI2018-101674-B-I00
dc.description.sponsorship: European Fund for Regional Development (FEDER)
dc.language.iso: eng
dc.publisher: Springer Nature
dc.rights: Creative Commons Attribution-NonCommercial-NoDerivs 3.0 License
dc.rights.uri: http://creativecommons.org/licenses/by-nc-nd/3.0/
dc.subject: Transfer learning
dc.subject: Multimodal sensors
dc.subject: Wearable sensors
dc.subject: Ambient sensors
dc.subject: Activity recognition
dc.subject: Human–computer interaction
dc.title: Opportunistic Activity Recognition in IoT Sensor Ecosystems via Multimodal Transfer Learning
dc.type: journal article
dc.rights.accessRights: open access
dc.identifier.doi: 10.1007/s11063-021-10468-z
dc.type.hasVersion: VoR
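The abstract describes learning a mapping function that translates signals from a source sensor domain to a target sensor domain via system identification. As a rough illustration only, the idea of fitting such a signal-to-signal map can be sketched with a ridge-regularized linear least-squares fit; the function names, the linear model, and the estimator below are assumptions for this sketch, not the estimator actually used in the paper.

```python
import numpy as np

def learn_mapping(source, target, ridge=1e-3):
    """Fit a linear map W (with bias) so that target ≈ [source, 1] @ W.
    Stand-in for the system-identification step; illustrative only."""
    X = np.hstack([source, np.ones((source.shape[0], 1))])  # append bias column
    # Ridge-regularized normal equations: W = (X^T X + λI)^{-1} X^T Y
    A = X.T @ X + ridge * np.eye(X.shape[1])
    return np.linalg.solve(A, X.T @ target)

def translate(source, W):
    """Translate source-domain signals into the target domain."""
    X = np.hstack([source, np.ones((source.shape[0], 1))])
    return X @ W

# Toy demo: synthetic 3-axis signals standing in for two wearable sensors,
# where the "target" sensor is a linear transform plus offset of the source.
rng = np.random.default_rng(0)
src = rng.standard_normal((200, 3))          # ~a few seconds of samples
tgt = src @ rng.standard_normal((3, 3)) + 0.5

W = learn_mapping(src, tgt)
pred = translate(src, W)
print(np.max(np.abs(pred - tgt)))  # reconstruction error, small for this toy case
```

With the mapping learned from a brief window of simultaneously observed data, signals from one domain can be translated into the other, which is the mechanism that lets activity templates or models trained on the source system be reused on the target system.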


File(s) in this item

[PDF]

This item appears in the following collection(s)


Creative Commons Attribution-NonCommercial-NoDerivs 3.0 License
Except where otherwise noted, this item's license is described as Creative Commons Attribution-NonCommercial-NoDerivs 3.0 License