Multi-View Human Activity Recognition in Ambient Assisted Living Using Lightweight Deep Learning Models

Citation

Bari, Ahsanul and Abdul Karim, Hezerul and Farid, Fahmid Al and Asaduzzaman, Mina and Amirabdollahian, Farshid and Mansor, Sarina (2024) Multi-View Human Activity Recognition in Ambient Assisted Living Using Lightweight Deep Learning Models. In: 5th International Conference on Electrical, Communication and Computer Engineering, ICECCE 2024, 30 - 31 October 2024, Kuala Lumpur, Malaysia.

Abstract

Human Activity Recognition (HAR) is crucial for the development of intelligent assistive technologies in Ambient Assisted Living (AAL) environments. This paper proposes a method for Multi-View Human Activity Recognition (MV-HAR) using lightweight deep learning models, specifically MobileNet and Cyclone-CNN (CCNet), to achieve fast and accurate activity recognition. Using the Robot House Multi-View Human Activity Recognition (RHM-HAR) dataset, which contains four views (front, back, ceiling/omni, and mobile robot), our models effectively address challenges related to viewpoint variation and motion dynamics. The dataset includes 14 multi-view daily living action classes, providing a balanced set of synchronized human actions suitable for multi-domain neural network learning. MobileNet and CCNet are employed for their high recognition accuracy, computational efficiency, and real-time applicability in AAL scenarios. We propose a Mutual Information (MI)-based method to assess the redundancy and relevance of each viewpoint, ensuring that multi-view data are fused with minimum redundancy and maximum relevance. Benchmarking results demonstrate that multi-view combinations significantly improve recognition performance over single-view models, particularly for complex activities involving high levels of movement.
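
The MI-based view-selection step described in the abstract can be pictured concretely. The sketch below is a minimal, hypothetical illustration of minimum-redundancy/maximum-relevance (mRMR) scoring over the four RHM-HAR views: it assumes each view is summarized by discrete per-clip predictions from a single-view model and uses scikit-learn's mutual_info_score. The function name mrmr_view_ranking and the synthetic data are illustrative assumptions, not the authors' implementation.

    # Hypothetical mRMR-style view ranking (illustrative only, not the paper's code).
    import numpy as np
    from sklearn.metrics import mutual_info_score

    def mrmr_view_ranking(view_preds, labels):
        """Greedily rank views by MI relevance to the activity labels
        minus mean MI redundancy with already-selected views.

        view_preds: dict mapping view name -> 1-D array of discrete
                    per-clip predictions (or quantized features).
        labels:     1-D array of ground-truth activity labels.
        """
        # Relevance: MI between each view's predictions and the true labels.
        relevance = {v: mutual_info_score(labels, p) for v, p in view_preds.items()}
        selected, remaining = [], set(view_preds)
        while remaining:
            def score(v):
                if not selected:
                    return relevance[v]
                # Redundancy: mean MI with the views chosen so far.
                red = np.mean([mutual_info_score(view_preds[v], view_preds[s])
                               for s in selected])
                return relevance[v] - red
            best = max(remaining, key=score)
            selected.append(best)
            remaining.remove(best)
        return selected

    # Usage with the four RHM-HAR views and 14 classes (synthetic predictions):
    rng = np.random.default_rng(0)
    labels = rng.integers(0, 14, size=200)
    views = {v: rng.integers(0, 14, size=200)
             for v in ("front", "back", "omni", "robot")}
    print(mrmr_view_ranking(views, labels))

The returned ordering suggests which views to fuse first; under the mRMR criterion, a view is only added when the information it carries about the activity labels outweighs its overlap with views already selected.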

Item Type: Conference or Workshop Item (Paper)
Uncontrolled Keywords: Human activity recognition (HAR), lightweight deep learning
Subjects: Q Science > Q Science (General) > Q300-390 Cybernetics
Divisions: Faculty of Engineering (FOE)
Depositing User: Ms Rosnani Abd Wahab
Date Deposited: 20 Feb 2025 07:34
Last Modified: 20 Feb 2025 08:17
URI: http://shdl.mmu.edu.my/id/eprint/13527
