Opportunistic Activity Recognition in IoT Sensor Ecosystems via Multimodal Transfer Learning
Metadata
Author
Baños Legrán, Oresti; Calatroni, Alberto; Damas Hermoso, Miguel; Pomares Cintas, Héctor Emilio; Roggen, Daniel; Rojas Ruiz, Ignacio; Villalonga Palliser, Claudia
Publisher
Springer Nature
Subject
Transfer learning; Multimodal sensors; Wearable sensors; Ambient sensors; Activity recognition; Human–computer interaction
Date
2021-03-31
Bibliographic reference
Baños, O., Calatroni, A., Damas, M. et al. Opportunistic Activity Recognition in IoT Sensor Ecosystems via Multimodal Transfer Learning. Neural Process Lett 53, 3169–3197 (2021). https://doi.org/10.1007/s11063-021-10468-z
Sponsor
Spanish Ministry of Science, Innovation and Universities (MICINN) PGC2018-098813-B-C31 and RTI2018-101674-B-I00; European Fund for Regional Development (FEDER)
Abstract
Recognizing human activities seamlessly and ubiquitously is now closer than ever, given the myriad of sensors readily deployed on and around users. However, training recognition systems remains both time- and resource-consuming, as datasets must be collected ad hoc for each specific sensor setup a person may encounter in daily life. This work presents an alternative approach based on transfer learning to opportunistically train new, unseen (target) sensor systems from existing (source) sensor systems. The approach uses system identification techniques to learn a mapping function that automatically translates signals from the source sensor domain to the target sensor domain, and vice versa. This can be done for sensor signals of the same modality or across modalities. Two transfer models are proposed to translate recognition systems based on either activity templates or activity models, depending on the characteristics of both the source and target sensor systems. The proposed transfer methods are evaluated in a human–computer interaction scenario, where the transfer is performed between wearable sensors placed at different body locations, and between wearable sensors and an ambient depth camera sensor. Results show that a good transfer is possible with just a few seconds of data, irrespective of the direction of the transfer and for both similar and cross sensor modalities.
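The core idea of the abstract, learning a mapping that translates signals from one sensor domain to another, can be illustrated with a toy sketch. This is not the paper's implementation: it assumes a purely linear relation between the two sensor streams and fits it by ordinary least squares, the simplest form of system identification; the simulated signals and the `true_map` matrix are invented for the demo.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated 3-axis source sensor stream (e.g., a wrist-worn accelerometer),
# 200 samples of co-occurring activity data.
source = rng.normal(size=(200, 3))

# Hypothetical target sensor observing the same motion through an unknown
# linear transform plus measurement noise (ground truth for this demo only).
true_map = np.array([[0.8, 0.1, 0.0],
                     [0.0, 1.2, 0.3],
                     [0.5, 0.0, 0.9]])
target = source @ true_map + 0.01 * rng.normal(size=(200, 3))

# Fit the mapping W such that source @ W ~= target (least squares).
# In the spirit of the abstract, only a short window of simultaneous
# source/target data is needed to estimate W.
W, *_ = np.linalg.lstsq(source, target, rcond=None)

# Translate source signals into the target sensor domain; a recognition
# system trained on the target sensor could then consume them directly.
translated = source @ W
error = np.abs(translated - target).mean()
print(f"mean absolute translation error: {error:.4f}")
```

The same fit can be run in the opposite direction (swapping `source` and `target`), which mirrors the abstract's claim that the transfer works irrespective of direction; nonlinear or cross-modality mappings would call for a richer model than a single matrix.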





