
Deep Learning-based Deaf&Mute Gesture Translation System

Journal: International Journal of Science and Research (IJSR) (Vol.9, No. 5)

Publication Date:

Authors :

Page : 288-292

Keywords : Human-Computer Interaction (HCI); Hand Gesture Recognition (HGR); American Sign Language (ASL); Image Segmentation; Convolutional Neural Network (CNN)

Source : Download | Find it from : Google Scholar


A Translation System (TS) enables Deaf individuals to communicate with hearing people. Since computers are an integral part of society, progress in human-computer interaction (HCI) has supported people with disabilities. The main purpose of the proposed system is to develop an intelligent translator between hearing people and Deaf or mute individuals, providing an effective and efficient communication path for people with speech impairments. The proposed system uses a Convolutional Neural Network (CNN), a deep learning algorithm, to extract discriminative hand features and classify American Sign Language (ASL) hand signs. This paper builds a system to interpret ASL and also provides a complete overview of deep learning-based methodologies for sign language recognition. The proposed solution was tested on samples from ASL data sets and achieved an overall accuracy of 96.68 %. The system proved appropriate and reliable for Deaf individuals. Furthermore, it offers an efficient and low-cost Hand Gesture Recognition (HGR) pipeline for a real-time video stream from a mobile device camera. A separate set of individual hand gestures is used for validation in this article. The system is designed to work with a front-facing camera, and the output is produced as text or audio.
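The abstract describes a CNN that extracts hand features and classifies ASL signs. The paper does not give its architecture, so the following is only a minimal sketch of the kind of forward pass such a classifier performs (convolution, ReLU, max-pooling, then a softmax over 26 letter classes); the image, filter weights, and layer sizes here are illustrative stand-ins, not the authors' trained model.

```python
import numpy as np

# Hypothetical CNN forward pass: conv -> ReLU -> max-pool -> flatten ->
# dense softmax over the 26 ASL letters. Weights are random placeholders;
# a real system would learn them from labeled ASL hand-sign images.
rng = np.random.default_rng(0)

def conv2d(img, kernels):
    """Valid 2-D convolution of an HxW grayscale image with F KxK kernels."""
    k = kernels.shape[-1]
    h, w = img.shape[0] - k + 1, img.shape[1] - k + 1
    out = np.empty((kernels.shape[0], h, w))
    for f, ker in enumerate(kernels):
        for i in range(h):
            for j in range(w):
                out[f, i, j] = np.sum(img[i:i + k, j:j + k] * ker)
    return out

def relu(x):
    return np.maximum(x, 0.0)

def max_pool(x, s=2):
    """Non-overlapping s x s max-pooling over each feature map."""
    c, h, w = x.shape
    x = x[:, :h - h % s, :w - w % s]
    return x.reshape(c, h // s, s, w // s, s).max(axis=(2, 4))

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

def classify(img, kernels, w_dense, b_dense):
    """Return class probabilities for one grayscale hand-sign image."""
    features = max_pool(relu(conv2d(img, kernels))).ravel()
    return softmax(features @ w_dense + b_dense)

# 28x28 stand-in hand-sign image, 4 filters of 3x3, 26 letter classes.
img = rng.random((28, 28))
kernels = rng.standard_normal((4, 3, 3)) * 0.1
feat_dim = 4 * 13 * 13                 # 4 maps of 13x13 after pooling
w_dense = rng.standard_normal((feat_dim, 26)) * 0.01
b_dense = np.zeros(26)
probs = classify(img, kernels, w_dense, b_dense)
```

A production version of this idea would stack several such conv/pool blocks and train the weights by backpropagation; here the point is only the shape of the computation that turns a camera frame into a letter prediction.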

Last modified: 2021-06-28 17:06:43