Hand Gesture Recognition with Deep Convolutional Neural Networks: A Comparative Study

Citation

Chong, You Li and Lee, Chin Poo and Lim, Kian Ming and Lim, Jit Yan (2023) Hand Gesture Recognition with Deep Convolutional Neural Networks: A Comparative Study. In: 2023 IEEE 11th Conference on Systems, Process & Control (ICSPC), 16 December 2023, Malacca, Malaysia.


Abstract

Hand gesture recognition is a growing field with applications in human-computer interaction, sign language interpretation, and virtual/augmented reality. The use of convolutional neural networks (CNNs) has become prevalent in this field because they can autonomously extract relevant features from image data, enabling precise and effective hand gesture recognition. This paper presents a comparison of popular pretrained CNN models for hand gesture recognition, evaluating their performance on three widely used datasets: the American Sign Language (ASL) dataset, the ASL with Digits dataset, and the NUS Hand Posture dataset. The models were fine-tuned and tested, and the analysis covered accuracy, training epochs, and training time. The pretrained CNN models compared include VGG16, ResNet50, InceptionV3, DenseNet201, MobileNetV2, Inception ResNetV2, Xception, and ResNet50V2. The findings of this research can provide valuable insights into choosing an appropriate pretrained CNN model for applications involving hand gesture recognition.
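The fine-tuning workflow the abstract describes can be sketched as follows. This is a minimal illustration, not the authors' exact setup: it assumes TensorFlow/Keras, picks MobileNetV2 as one of the compared backbones, and uses 10 output classes (the class count of the NUS Hand Posture dataset); input size, optimizer, and head architecture are assumptions.

```python
# Hedged sketch of fine-tuning a pretrained CNN for hand gesture
# recognition (assumes TensorFlow/Keras; hyperparameters are illustrative).
import tensorflow as tf
from tensorflow.keras import layers, models

NUM_CLASSES = 10  # e.g. the 10 postures in the NUS Hand Posture dataset

# Load an ImageNet-pretrained backbone without its classification head.
base = tf.keras.applications.MobileNetV2(
    input_shape=(224, 224, 3), include_top=False, weights="imagenet")
base.trainable = False  # freeze pretrained features for the first stage

# Attach a new classification head for the gesture classes.
model = models.Sequential([
    base,
    layers.GlobalAveragePooling2D(),
    layers.Dropout(0.2),
    layers.Dense(NUM_CLASSES, activation="softmax"),
])

model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
# model.fit(train_ds, validation_data=val_ds, epochs=...) would then
# fine-tune the head (and optionally unfreeze top layers of `base`).
```

Swapping `MobileNetV2` for `ResNet50`, `InceptionV3`, `DenseNet201`, `Xception`, etc. (all available under `tf.keras.applications`) reproduces the kind of model-by-model comparison the paper performs.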

Item Type: Conference or Workshop Item (Paper)
Uncontrolled Keywords: Convolutional Neural Network
Subjects: Q Science > QA Mathematics > QA71-90 Instruments and machines > QA75.5-76.95 Electronic computers. Computer science
Divisions: Faculty of Information Science and Technology (FIST)
Depositing User: Ms Nurul Iqtiani Ahmad
Date Deposited: 27 Mar 2024 02:07
Last Modified: 27 Mar 2024 02:07
URI: http://shdl.mmu.edu.my/id/eprint/12198
