Show simple item record

dc.contributor.author: Garcia-Gonzalez, D.
dc.contributor.author: Rivero Cebrián, Daniel
dc.contributor.author: Fernández Blanco, Enrique
dc.contributor.author: Luaces, M.R.
dc.date.accessioned: 2025-09-08T12:23:54Z
dc.date.available: 2025-09-08T12:23:54Z
dc.date.issued: 2023
dc.identifier.citation: Garcia-Gonzalez D, Rivero D, Fernandez-Blanco E, Luaces MR. Deep learning models for real-life human activity recognition from smartphone sensor data. Internet of Things (Netherlands). 2023;24.
dc.identifier.issn: 2542-6605
dc.identifier.other: https://portalcientifico.sergas.gal//documentos/651014e20058624993e2d58b
dc.identifier.uri: http://hdl.handle.net/20.500.11940/21312
dc.description.abstract: Nowadays, human activity recognition (HAR) is a remarkably hot topic within the scientific community. Given the low cost, ease of use and high accuracy of the sensors in wearable devices and smartphones, more and more researchers are opting to contribute to this area. However, until very recently, almost all work in this field was carried out under laboratory conditions, with very few similarities to our daily lives. This paper focuses on the new trend of taking the knowledge acquired so far into a real-life environment. To that end, a dataset already published following this philosophy was used, and this work aims to identify the different activities studied there. To perform this classification, the paper explores new designs and architectures inspired by the models that have yielded the best results in the literature. More specifically, different configurations of Convolutional Neural Networks (CNN) and Long Short-Term Memory (LSTM) networks were tested, but under real-life conditions instead of laboratory ones. It is worth mentioning that the hybrid models combining these techniques yielded the best results, with a peak accuracy of 94.80% on the dataset used.
dc.description.sponsorship: This research was partially funded by MCIN/AEI/10.13039/501100011033, Next GenerationEU/PRTR, FLATCITY-POC, Spain [grant number PDC2021-121239-C31]; MCIN/AEI/10.13039/501100011033 MAGIST, Spain [grant number PID2019-105221RB-C41]; Xunta de Galicia/FEDER-UE, Spain [grant numbers ED431G 2019/01, ED481A 2020/003, ED431C 2022/46, ED431C 2018/49 and ED431C 2021/53]. Funding for open access charge: Universidade da Coruna/CISUG.
dc.language: eng
dc.rights: Attribution 4.0 International (CC BY 4.0)
dc.rights.uri: http://creativecommons.org/licenses/by/4.0/
dc.title: Deep learning models for real-life human activity recognition from smartphone sensor data
dc.type: Article
dc.authorsophos: Garcia-Gonzalez, D.; Rivero, D.; Fernandez-Blanco, E.; Luaces, M.R.
dc.identifier.doi: 10.1016/j.iot.2023.100925
dc.identifier.sophos: 651014e20058624993e2d58b
dc.journal.title: Internet of Things (Netherlands)
dc.organization: Instituto de Investigación Biomédica de A Coruña (INIBIC)
dc.relation.projectID: MCIN/AEI [ED481A 2020/003, ED431C 2022/46, ED431C 2018/49, ED431C 2021/53]
dc.relation.projectID: Next GenerationEU/PRTR
dc.relation.projectID: FLATCITY-POC, Spain [PDC2021-121239-C31]
dc.relation.projectID: MCIN/AEI, Spain [PID2019-105221RB-C41]
dc.relation.projectID: Xunta de Galicia/FEDER-UE, Spain [ED481A 2020/003, ED431C 2022/46, ED431C 2018/49, ED431C 2021/53, ED431G 2019/01]
dc.relation.projectID: Universidade da Coruna/CISUG
dc.relation.publisherversion: https://doi.org/10.1016/j.iot.2023.100925
dc.rights.accessRights: openAccess
dc.subject.keyword: INIBIC
dc.typefides: Scientific Article (includes Original, Brief Original, Systematic Review and Meta-analysis)
dc.typesophos: Original Article
dc.volume.number: 24

