Citation
Tan, Chun Keat and Lim, Kian Ming and Lee, Chin Poo and Chang, Roy Kwang Yang and Lim, Jit Yan (2023) HGR-ResNet: Hand Gesture Recognition with Enhanced Residual Neural Network. In: 2023 11th International Conference on Information and Communication Technology (ICoICT), 23-24 August 2023, Melaka, Malaysia.
Abstract
Hand Gesture Recognition (HGR) has garnered increasing attention in recent years due to its potential to enhance human-computer interaction (HCI) and facilitate communication between individuals who are mute or deaf and the wider public. HGR can facilitate non-contact interaction between humans and machines, offering an effective interface for recognizing sign language used in everyday communication. This paper proposes a novel approach for static HGR using transfer learning of ResNet152 with early stopping, adaptive learning rate, and class weightage techniques, referred to as HGR-ResNet. Transfer learning enables the model to utilize previously acquired knowledge from pre-training on a large dataset, allowing it to learn from pre-extracted image features. Early stopping serves as a regularization technique, halting the training process before overfitting occurs. Adaptive learning rate adjusts the learning rate dynamically based on the model's error rate during training, promoting faster convergence and improved accuracy. Additionally, the class weightage technique is employed to address the issue of class imbalance in the data, ensuring fair representation and mitigating biases during the training process. To assess the effectiveness of the proposed model, we conduct a comparative analysis with multiple existing methods using three distinct datasets: the American Sign Language (ASL) dataset, ASL with digits dataset, and the National University of Singapore (NUS) hand gesture dataset. HGR-ResNet achieves remarkable results, with an average accuracy of 99.20% across all three datasets, and individual accuracies of 99.88% for the ASL dataset, 98.93% for the ASL with digits dataset, and 98.80% for the NUS hand gesture dataset.
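The class weightage technique described in the abstract is commonly realized as inverse-frequency weighting, where each class's contribution to the loss is scaled inversely to its sample count. The paper does not give its exact formula, so the sketch below assumes the widely used "balanced" heuristic (as in scikit-learn); the function name and toy labels are illustrative, not from the paper:

```python
import numpy as np

def inverse_frequency_weights(labels):
    """Weight each class inversely to its frequency so that
    under-represented gesture classes contribute equally to the loss.

    Uses the 'balanced' heuristic: w_c = n_samples / (n_classes * count_c).
    """
    classes, counts = np.unique(labels, return_counts=True)
    n_samples = len(labels)
    n_classes = len(classes)
    return {int(c): n_samples / (n_classes * cnt)
            for c, cnt in zip(classes, counts)}

# Example: an imbalanced toy label set (3 samples of class 0,
# 2 of class 1, 1 of class 2)
labels = np.array([0, 0, 0, 1, 1, 2])
weights = inverse_frequency_weights(labels)
# The majority class receives a weight below 1, the minority class above 1.
```

A dictionary of this shape can be passed directly to training APIs that accept per-class weights (e.g. the `class_weight` argument of Keras's `Model.fit`), which is one plausible way the authors could have combined it with early stopping and an adaptive learning-rate schedule.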
Item Type: Conference or Workshop Item (Paper)
Uncontrolled Keywords: Hand gesture recognition, ResNet, Sign language recognition, Human-computer interaction
Subjects: Q Science > QA Mathematics > QA71-90 Instruments and machines > QA75.5-76.95 Electronic computers. Computer science
Divisions: Faculty of Information Science and Technology (FIST)
Depositing User: Ms Nurul Iqtiani Ahmad
Date Deposited: 31 Oct 2023 08:37
Last Modified: 31 Oct 2023 08:37
URI: http://shdl.mmu.edu.my/id/eprint/11803