Synthetic Sensor Data Generation for Activity Recognition
Ishwarya K.1, N. Sowmya2, A. Alice Nithya3, Nandhini G.4 and D. Devi5
1 Department of Computing Technologies, School of Computing, SRM Institute of Science and Technology, Kattankulathur – 603203, Tamil Nadu, India
2 Department of Electronics and Communication Engineering, SRM Institute of Science and Technology, Kattankulathur – 603203, Tamil Nadu, India
3 Department of Computational Intelligence, School of Computing, SRM Institute of Science and Technology, Kattankulathur – 603203, Tamil Nadu, India
4 Sri Sairam Engineering College, West Tambaram, Poonthandalam, Chennai – 602109, Tamil Nadu, India
5 Department of Computer Science and Engineering, Sathyabama Institute of Science and Technology, Jeppiaar Nagar, Chennai – 600119, Tamil Nadu, India
ishwaryk3@srmist.edu.in
sowmyan1@srmist.edu.in*
alicenia@srmist.edu.in
devi.cse@sathyabama.ac.in
nandhini.am@sairam.edu.in
DOI: 10.46793/BISEC25.326I
ABSTRACT: Human Activity Recognition (HAR) is essential for a variety of applications, including assisted living, fitness tracking, and healthcare monitoring. This paper presents a hybrid deep learning framework for sensor-based HAR using three inertial sensors built into smartphones: the accelerometer, gyroscope, and magnetometer. The raw sensor data undergoes a thorough preprocessing stage that includes mean/mode replacement for handling missing values and Kalman Filter based outlier detection. The cleaned data is then fed into a Modified One-Dimensional Convolutional Neural Network (1D-CNN) for efficient feature extraction, capturing local temporal relationships in the sensor signals. The extracted features are passed to a Long Short-Term Memory (LSTM) network, which learns long-range temporal patterns and identifies intricate human actions. The proposed method enables real-time, precise activity prediction, opening the door for context-aware applications in ubiquitous computing settings. Future work will examine transfer learning strategies to adapt the trained models to new activity recognition tasks and data-poor domains, and will optimise the models for low-power, resource-constrained environments, particularly wearable devices, to ensure prolonged use without excessive energy consumption.
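The Kalman Filter based outlier handling described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's actual configuration: it assumes a scalar random-walk state model, and the process noise `q`, measurement noise `r`, and 3-sigma gating threshold are hypothetical parameters chosen for the example.

```python
import numpy as np

def kalman_outlier_filter(signal, q=1e-3, r=0.5, gate=3.0):
    """Smooth a 1-D sensor stream with a random-walk Kalman filter and
    flag samples whose innovation exceeds `gate` standard deviations."""
    x, p = float(signal[0]), 1.0            # state estimate and its variance
    smoothed = np.empty(len(signal))
    outliers = np.zeros(len(signal), dtype=bool)
    smoothed[0] = x
    for t in range(1, len(signal)):
        p_pred = p + q                      # predict: variance grows by q
        s = p_pred + r                      # innovation variance
        innov = signal[t] - x
        if abs(innov) > gate * np.sqrt(s):  # gate test: reject as outlier
            outliers[t] = True              # keep the prediction, skip update
            smoothed[t] = x
            p = p_pred
            continue
        k = p_pred / s                      # Kalman gain
        x = x + k * innov
        p = (1.0 - k) * p_pred
        smoothed[t] = x
    return smoothed, outliers

# Example: a slow sinusoidal signal with one injected sensor glitch
rng = np.random.default_rng(0)
z = np.sin(np.linspace(0, 2 * np.pi, 200)) + 0.05 * rng.standard_normal(200)
z[100] += 5.0                               # simulated spike outlier
sm, out = kalman_outlier_filter(z)
```

In practice each sensor axis would be filtered independently, and flagged samples could then be replaced by the mean/mode imputation step mentioned in the abstract.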
KEYWORDS: Kalman Filter, Convolutional Neural Network, Long Short-Term Memory, Temporal Patterns.
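The 1D-CNN feature extractor feeding an LSTM classifier, as described in the abstract, can be sketched with a plain numpy forward pass. All dimensions here are illustrative assumptions (9 input channels for 3 sensors × 3 axes, a 128-sample window, 6 activity classes); the paper's Modified 1D-CNN and trained weights are not reproduced.

```python
import numpy as np

rng = np.random.default_rng(1)

def conv1d(x, w, b):
    """Valid 1-D convolution with ReLU: x (T, C_in), w (K, C_in, C_out)."""
    k, _, c_out = w.shape
    t_out = x.shape[0] - k + 1
    out = np.empty((t_out, c_out))
    for t in range(t_out):
        # Contract the kernel window over time and input channels
        out[t] = np.tensordot(x[t:t + k], w, axes=([0, 1], [0, 1])) + b
    return np.maximum(out, 0.0)

def lstm_last_hidden(x, wx, wh, b, h_dim):
    """Run one LSTM layer over x (T, D); return the final hidden state."""
    h = np.zeros(h_dim)
    c = np.zeros(h_dim)
    for t in range(x.shape[0]):
        gates = x[t] @ wx + h @ wh + b           # (4*h_dim,)
        i, f, g, o = np.split(gates, 4)
        i, f, o = 1/(1+np.exp(-i)), 1/(1+np.exp(-f)), 1/(1+np.exp(-o))
        c = f * c + i * np.tanh(g)               # cell state update
        h = o * np.tanh(c)                       # hidden state
    return h

# Hypothetical dimensions: window of 128 samples, 9 sensor channels
T, C_IN, C_OUT, K, H, N_CLASSES = 128, 9, 16, 5, 32, 6
x = rng.standard_normal((T, C_IN))               # one preprocessed window
w_conv = 0.1 * rng.standard_normal((K, C_IN, C_OUT))
b_conv = np.zeros(C_OUT)
wx = 0.1 * rng.standard_normal((C_OUT, 4 * H))
wh = 0.1 * rng.standard_normal((H, 4 * H))
b_lstm = np.zeros(4 * H)
w_out = 0.1 * rng.standard_normal((H, N_CLASSES))

feats = conv1d(x, w_conv, b_conv)                # local temporal features
h_last = lstm_last_hidden(feats, wx, wh, b_lstm, H)
logits = h_last @ w_out
probs = np.exp(logits) / np.exp(logits).sum()    # softmax over activities
```

The CNN captures short-range motion patterns within the window, while the LSTM aggregates them over time; only the final hidden state is used for classification in this sketch.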
REFERENCES:
- Wu, Jingwen, et al. “Detecting the quantitative hydrological response to changes in climate and human activities.” Science of the Total Environment 586 (2017): 328-337.
- Xu, Yang, et al. “Propagation from meteorological drought to hydrological drought under the impact of human activities: A case study in northern China.” Journal of Hydrology 579 (2019): 124147.
- Ni, Bingbing, et al. “Multilevel depth and image fusion for human activity detection.” IEEE Transactions on Cybernetics 43.5 (2013): 1383-1394.
- Ke, Shian-Ru, et al. “A review on video-based human activity recognition.” Computers 2.2 (2013): 88-131.
- Jiménez‐Bonilla, Alejandro, Manuel Díaz‐Azpiroz, and Miguel Rodríguez Rodríguez. “Tectonics may affect closed watersheds used to monitor climate change and human activity effects.” Terra Nova 35.1 (2023): 58-65.
- Ronao, Charissa Ann, and Sung-Bae Cho. “Human activity recognition with smartphone sensors using deep learning neural networks.” Expert Systems with Applications 59 (2016): 235-244.
- Csurka, Gabriela. “Domain adaptation for visual applications: A comprehensive survey.” arXiv preprint arXiv:1702.05374 (2017).
- Jain, Yashswi, et al. “PoseCVAE: Anomalous human activity detection.” 2020 25th International Conference on Pattern Recognition (ICPR). IEEE, 2021.
- Zhang, Ying, et al. “Vegetation dynamics and its driving forces from climate change and human activities in the Three-River Source Region, China from 1982 to 2012.” Science of the Total Environment 563 (2016): 210-220.
- Piergiovanni, A. J., et al. “Adversarial generative grammars for human activity prediction.” Computer Vision–ECCV 2020: 16th European Conference, Glasgow, UK, August 23–28, 2020, Proceedings, Part II 16. Springer International Publishing, 2020.
- Mukherjee, Debadyuti, et al. “EnsemConvNet: a deep learning approach for human activity recognition using smartphone sensors for healthcare applications.” Multimedia Tools and Applications 79 (2020): 31663-31690.
- Javed, Abdul Rehman, et al. “A smartphone sensors-based personalized human activity recognition system for sustainable smart cities.” Sustainable Cities and Society 71 (2021): 102970.
- Shoaib, Muhammad, et al. “Complex human activity recognition using smartphone and wrist-worn motion sensors.” Sensors 16.4 (2016): 426.
- Li, Haobo, et al. “Bi-LSTM network for multimodal continuous human activity recognition and fall detection.” IEEE Sensors Journal 20.3 (2019): 1191-1201.
- Hassan, Mahmoud O., and Yasser M. Hassan. “Effect of human activities on floristic composition and diversity of desert and urban vegetation in a new urbanized desert ecosystem.” Heliyon 5.8 (2019).
- Ignatov, Andrey, et al. “AI benchmark: Running deep neural networks on Android smartphones.” Proceedings of the European Conference on Computer Vision (ECCV) Workshops. 2018.
- Almaslukh, Bandar, Jalal Al Muhtadi, and Abdel Monim Artoli. “A robust convolutional neural network for online smartphone-based human activity recognition.” Journal of Intelligent & Fuzzy Systems 35.2 (2018): 1609-1620.
- Tsinganos, Kanaris, et al. “ERA-PLANET, a European Network for Observing Our Changing Planet.” Sustainability 9.6 (2017): 1040.
- Kautz, Thomas, et al. “Activity recognition in beach volleyball using a Deep Convolutional Neural Network: Leveraging the potential of Deep Learning in sports.” Data Mining and Knowledge Discovery 31 (2017): 1678-1705.
- Saunois, Marielle, et al. “The growing role of methane in anthropogenic climate change.” Environmental Research Letters 11.12 (2016): 120207.
- Ishwarya, K., and A. Alice Nithya. “Performance-enhanced real-time lifestyle tracking model based on human activity recognition (PERT-HAR) model through smartphones.” The Journal of Supercomputing 78.4 (2022): 5241-5268.
- Nithya, A. Alice, et al. “CNN based Identifying Human Activity using Smartphone Sensors.” 2022 International Conference on Edge Computing and Applications (ICECAA). IEEE, 2022.
- Kumar, Pranjal, Siddhartha Chauhan, and Lalit Kumar Awasthi. “Human activity recognition (HAR) using deep learning: Review, methodologies, progress and future research directions.” Archives of Computational Methods in Engineering 31.1 (2024): 179-219.
- Khan, Danyal, et al. “Advanced IoT-based human activity recognition and localization using deep polynomial neural network.” IEEE Access 12 (2024): 94337-94353.
- Yang, Xiaoshan, et al. “Cross-modal federated human activity recognition.” IEEE Transactions on Pattern Analysis and Machine Intelligence 46.8 (2024): 5345-5361.
SOURCE: Proceedings of the 16th International Conference on Business Information Security BISEC’2025