Show simple item record

dc.contributor.author: Baños Legrán, Oresti
dc.contributor.author: Damas Hermoso, Miguel
dc.contributor.author: Pomares Cintas, Héctor Emilio
dc.contributor.author: Rojas Ruiz, Ignacio
dc.date.accessioned: 2013-10-18T09:44:11Z
dc.date.available: 2013-10-18T09:44:11Z
dc.date.issued: 2012
dc.identifier.citation: Baños, O.; et al. On the Use of Sensor Fusion to Reduce the Impact of Rotational and Additive Noise in Human Activity Recognition. Sensors, 12(6): 8039-8054 (2012). [http://hdl.handle.net/10481/28450]
dc.identifier.issn: 1424-8220
dc.identifier.uri: http://hdl.handle.net/10481/28450
dc.description: This article belongs to the Special Issue Select papers from UCAmI 2011 - the 5th International Symposium on Ubiquitous Computing and Ambient Intelligence (UCAmI'11).
dc.description.abstract: The main objective of fusion mechanisms is to increase the reliability of individual systems through the use of collective knowledge. Moreover, fusion models are also intended to guarantee a certain level of robustness. This is particularly required for problems such as human activity recognition, where runtime changes in the sensor setup seriously disturb the reliability of the initially deployed systems. For commonly used recognition systems based on inertial sensors, these changes primarily take the form of sensor rotations, displacements, or faults related to the batteries or calibration. In this work we show the robustness capabilities of a sensor-weighted fusion model when dealing with such disturbances under different circumstances. Using the proposed method, an improvement of up to 60% is obtained when a minority of the sensors are artificially rotated or degraded, independently of the level of disturbance (noise) imposed. These robustness capabilities also hold for any number of sensors affected by a low to moderate noise level. The presented fusion mechanism compensates for the poor performance that would otherwise be obtained when just a single sensor is considered.
dc.description.sponsorship: This work was supported in part by the Spanish CICYT Project TIN2007-60587, Junta de Andalucía Projects P07-TIC-02768 and P07-TIC-02906, the CENIT project AmIVital of the “Centro para el Desarrollo Tecnológico Industrial” (CDTI, Spain), the FPU Spanish grant AP2009-2244, and the UGR Spanish grant “Iniciación a la Investigación 2010/2011”.
dc.language.iso: eng
dc.publisher: MDPI
dc.rights: Creative Commons Attribution-NonCommercial-NoDerivs 3.0 License
dc.rights.uri: http://creativecommons.org/licenses/by-nc-nd/3.0/
dc.subject: Activity recognition
dc.subject: Rotational noise
dc.subject: Additive noise
dc.subject: Metaclassifier
dc.subject: Accelerometer
dc.subject: Sensor fusion
dc.title: On the Use of Sensor Fusion to Reduce the Impact of Rotational and Additive Noise in Human Activity Recognition
dc.type: info:eu-repo/semantics/article
dc.rights.accessRights: info:eu-repo/semantics/openAccess
dc.identifier.doi: 10.3390/s120608039
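
The record contains no code, but the sensor-weighted fusion idea described in the abstract can be illustrated with a minimal sketch: each sensor's base classifier emits class posteriors for an activity window, and a weighted combination of those posteriors decides the final label. The probability values, the weights, and the idea of weighting by validation accuracy below are illustrative assumptions only and do not reproduce the authors' metaclassifier.

import numpy as np

# Hypothetical per-sensor class-probability outputs for one window of
# accelerometer data: rows = sensors, columns = activity classes.
sensor_probs = np.array([
    [0.70, 0.20, 0.10],   # sensor 1 (reliable)
    [0.60, 0.30, 0.10],   # sensor 2 (reliable)
    [0.20, 0.30, 0.50],   # sensor 3 (rotated or noisy)
])

# Hypothetical per-sensor weights, e.g. derived from each base
# classifier's validation accuracy; a degraded sensor gets less say.
weights = np.array([0.45, 0.45, 0.10])

def weighted_fusion(probs, w):
    """Fuse per-sensor class posteriors with a weighted sum and
    return the winning class index plus the fused distribution."""
    fused = w @ probs          # weighted combination per class
    fused /= fused.sum()       # renormalise to a probability distribution
    return int(np.argmax(fused)), fused

label, fused = weighted_fusion(sensor_probs, weights)
print("fused posterior:", np.round(fused, 3), "-> class", label)

In this toy setting the down-weighted noisy sensor cannot overturn the decision of the two reliable ones, which is the kind of robustness to rotational and additive noise the abstract reports.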


Files in this item

[PDF]

This item appears in the following Collection(s)


Except where otherwise noted, this item's license is described as Creative Commons Attribution-NonCommercial-NoDerivs 3.0 License.