Abstract:

People with impaired speech and hearing use sign language as a form of communication. These gestures serve as a tool of non-verbal communication for expressing emotions and thoughts to others. Conversing with people who have a hearing disability remains a major challenge: deaf and mute people communicate through hand-gesture sign language, which hearing people often cannot interpret. There is therefore a need for systems that recognize the different signs and convey the information to hearing people. At present, because sign language is difficult for the general public to understand, trained sign language interpreters are required during medical and legal appointments and in educational and training sessions. To address this problem, we apply artificial intelligence technology to analyse the user's hand with finger detection. In the proposed system, we design a vision-based system for real-time environments. A deep learning algorithm, the Convolutional Neural Network (CNN), is then used to classify each sign and provide a label for the recognized gesture. The framework is further extended to capture the user's speech and convert it into text format using a Hidden Markov Model. Finally, the sign-based text recognition is visualized through Natural Language Processing in the form of avatars.
Keywords - Hand image acquisition, Binarization, Region of finger detection, Classification of finger gestures, Sign recognition.