Citation
Jim, Abdullah Al Jaid and Rafi, Ibrahim and Tiang, Jun Jiat and Biswas, Uzzal and Nahid, Abdullah-Al (2024) KUNet-An Optimized AI Based Bengali Sign Language Translator for Hearing Impaired and Non Verbal People. IEEE Access, 12. pp. 155052-155063. ISSN 2169-3536
Text
KUNet-An_Optimized_AI_Based_Bengali_Sign_Language_Translator_for_Hearing_Impaired_and_Non_Verbal_People.pdf - Published Version (restricted to repository staff only)
Abstract
Sign language is the most prevalent form of communication among people with speech and hearing disabilities. The most widely used types of sign language involve static or dynamic gestures made with one or both hands. Among the many sign languages, Bengali Sign Language (BdSL) is one of the most complicated to learn and comprehend because of its large alphabet, extensive vocabulary, and variation in expression techniques. Existing solutions require either learning BdSL or hiring an interpreter, and BdSL interpreter support is hard to come by and expensive (unless voluntary). People with these disabilities might find it more comfortable to converse with the general public through machine translation of sign language. Deep learning, a subset of machine learning that mimics the human brain, appears to be a viable approach, and computer vision in particular may hold the key to a solution for the hearing-impaired and non-verbal community. Therefore, we propose a novel model, KUNet ("Khulna University Network", a CNN-based model), a classification framework optimized by a genetic algorithm (GA), to classify BdSL. This model and the accompanying dataset contribute toward a BdSL machine translator. The GA-optimized KUNet achieved an accuracy of 99.11% on KU-BdSL. After training the model on KU-BdSL, we compared it with state-of-the-art studies and interpreted its black-box behavior using explainable AI (XAI). Additionally, our model outperformed several well-known models trained on the KU-BdSL dataset. This study will benefit the hearing-impaired and non-verbal community by allowing them to communicate effortlessly and minimizing their hardship.
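The abstract describes KUNet as a CNN-based classifier whose configuration is tuned with a genetic algorithm, but this record does not give the actual architecture, search space, or GA settings. The following is only a minimal, generic sketch of GA-driven hyperparameter search over a small Keras CNN; the class count, image size, layer choices, search space, and placeholder data are all assumptions for illustration, not the authors' method.

```python
# Hypothetical sketch: GA search over CNN hyperparameters (not the actual KUNet code).
import random
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers

NUM_CLASSES = 30           # placeholder: KU-BdSL class count is not stated in this record
INPUT_SHAPE = (64, 64, 3)  # placeholder image size

# Placeholder data; the real KU-BdSL train/validation splits would go here.
x_train = np.random.rand(256, *INPUT_SHAPE).astype("float32")
y_train = np.random.randint(0, NUM_CLASSES, 256)
x_val = np.random.rand(64, *INPUT_SHAPE).astype("float32")
y_val = np.random.randint(0, NUM_CLASSES, 64)

# Assumed search space; the hyperparameters KUNet's GA actually tunes are not given.
SEARCH_SPACE = {
    "filters": [16, 32, 64],
    "dense_units": [64, 128, 256],
    "learning_rate": [1e-2, 1e-3, 1e-4],
}

def random_genome():
    """One candidate: a random choice per hyperparameter."""
    return {k: random.choice(v) for k, v in SEARCH_SPACE.items()}

def build_model(genome):
    """Small CNN classifier parameterized by the genome."""
    model = tf.keras.Sequential([
        layers.Input(shape=INPUT_SHAPE),
        layers.Conv2D(genome["filters"], 3, activation="relu"),
        layers.MaxPooling2D(),
        layers.Conv2D(genome["filters"] * 2, 3, activation="relu"),
        layers.MaxPooling2D(),
        layers.Flatten(),
        layers.Dense(genome["dense_units"], activation="relu"),
        layers.Dense(NUM_CLASSES, activation="softmax"),
    ])
    model.compile(
        optimizer=tf.keras.optimizers.Adam(genome["learning_rate"]),
        loss="sparse_categorical_crossentropy",
        metrics=["accuracy"],
    )
    return model

def fitness(genome):
    """Fitness = validation accuracy after a short training run."""
    model = build_model(genome)
    model.fit(x_train, y_train, epochs=2, batch_size=32, verbose=0)
    _, acc = model.evaluate(x_val, y_val, verbose=0)
    return acc

def crossover(a, b):
    """Uniform crossover: each gene copied from either parent."""
    return {k: random.choice([a[k], b[k]]) for k in SEARCH_SPACE}

def mutate(genome, rate=0.2):
    """Resample each gene with a small probability."""
    return {k: (random.choice(v) if random.random() < rate else genome[k])
            for k, v in SEARCH_SPACE.items()}

def genetic_search(pop_size=6, generations=3):
    """Keep the fitter half each generation, refill with mutated offspring."""
    population = [random_genome() for _ in range(pop_size)]
    for _ in range(generations):
        ranked = sorted(population, key=fitness, reverse=True)
        parents = ranked[: pop_size // 2]
        children = [mutate(crossover(*random.sample(parents, 2)))
                    for _ in range(pop_size - len(parents))]
        population = parents + children
    return max(population, key=fitness)

if __name__ == "__main__":
    best = genetic_search()
    print("Best hyperparameters found:", best)
```

In practice the placeholder arrays would be replaced with the KU-BdSL splits, and the search space extended to whatever the published GA actually optimizes (e.g., depth, kernel sizes, or training settings); the paper itself should be consulted for those details.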
| Item Type: | Article |
|---|---|
| Uncontrolled Keywords: | Deep learning, machine learning |
| Subjects: | Q Science > Q Science (General) > Q300-390 Cybernetics |
| Divisions: | Faculty of Engineering (FOE) |
| Depositing User: | Ms Nurul Iqtiani Ahmad |
| Date Deposited: | 04 Nov 2024 01:44 |
| Last Modified: | 04 Nov 2024 01:44 |
| URI: | http://shdl.mmu.edu.my/id/eprint/13111 |