Abstract
Deep Learning models applied to sensor-based Human Activity Recognition tasks usually require vast amounts of annotated time-series data to extract robust features. However, annotating signals from wearable sensors is a tedious and often unintuitive process that requires specialized tools and predefined scenarios, making it expensive and time-consuming. This paper combines one of the most recent advances in Self-Supervised Learning (SSL), namely the SimCLR framework, with a powerful transformer-based encoder to introduce a Contrastive Self-supervised learning approach to Sensor-based Human Activity Recognition (CSSHAR) that learns feature representations from unlabeled sensory data. Extensive experiments conducted on three widely used public datasets show that the proposed method outperforms recent SSL models. Moreover, CSSHAR extracts more robust features than an identical supervised transformer, both when transferring knowledge from one dataset to another and when only very limited amounts of annotated data are available.
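The SimCLR framework mentioned in the abstract trains an encoder with the NT-Xent (normalized temperature-scaled cross-entropy) contrastive loss: two augmented views of the same sensor window form a positive pair, and all other windows in the batch act as negatives. Below is a minimal NumPy sketch of that loss, for illustration only; it is not the authors' code, and the function name, batch layout (views `i` and `i + N` paired), and temperature value are assumptions.

```python
import numpy as np

def nt_xent_loss(z, temperature=0.5):
    """NT-Xent loss over 2N embeddings, where rows i and i+N are the two
    augmented views of the same sensor window (a positive pair)."""
    z = z / np.linalg.norm(z, axis=1, keepdims=True)  # L2-normalize embeddings
    sim = z @ z.T / temperature                       # scaled cosine similarities
    n2 = sim.shape[0]
    np.fill_diagonal(sim, -np.inf)                    # exclude self-similarity
    pos = (np.arange(n2) + n2 // 2) % n2              # index of each row's positive
    # cross-entropy of each row's softmax against its positive partner
    logsumexp = np.log(np.exp(sim).sum(axis=1))
    return float((logsumexp - sim[np.arange(n2), pos]).mean())

# Toy batch: 2 windows x 2 views, 4-dim embeddings. When the positive pairs
# are identical and orthogonal to the negatives, the loss is small.
z_aligned = np.array([[1., 0., 0., 0.],
                      [0., 1., 0., 0.],
                      [1., 0., 0., 0.],
                      [0., 1., 0., 0.]])
print(nt_xent_loss(z_aligned))
```

In the paper's setting the embeddings `z` would come from the transformer-based encoder applied to augmented inertial-sensor windows; the loss pulls the two views of each window together while pushing apart all other windows in the batch.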
Original language | English |
---|---|
Title of host publication | 2021 IEEE International Joint Conference on Biometrics (IJCB) |
Publisher | IEEE |
Number of pages | 8 |
ISBN (Electronic) | 978-1-6654-3780-6 |
Publication status | Published - 20 Jul 2021 |
Event | 2021 IEEE International Joint Conference on Biometrics (IJCB) - Online, Shenzhen, China |
Duration | 4 Aug 2021 → 7 Aug 2021 |
Internet address | http://ijcb2021.iapr-tc4.org/ |
Conference
Conference | 2021 IEEE International Joint Conference on Biometrics (IJCB) |
---|---|
Abbreviated title | IJCB 2021 |
Country/Territory | China |
City | Shenzhen |
Period | 4/08/21 → 7/08/21 |
Internet address | http://ijcb2021.iapr-tc4.org/ |