Sign2Sound is dedicated to revolutionizing communication for non-verbal individuals by seamlessly translating sign language gestures into understandable speech in real-time. By bridging the gap between sign language users and those unfamiliar with it, Sign2Sound promotes inclusivity and accessibility, ultimately enriching quality of life for all.
An automated ASL interpreter: a robotic glove detects hand movements to decipher ASL letters, which are then displayed on an LCD.
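The glove pipeline (sensor readings → letter → display) can be sketched as follows. This is an illustrative sketch only: the sensor threshold, the finger-bend patterns, and the letter set are hypothetical placeholders, not values from the actual hardware.

```python
# Hypothetical glove classifier sketch: map five flex-sensor readings
# (thumb, index, middle, ring, pinky) to an ASL letter.

# Each letter is approximated by which fingers are bent (True = bent).
# These example patterns are simplified stand-ins for real calibration data.
LETTER_PATTERNS = {
    (True, True, True, True, True): "A",      # fist, all fingers bent
    (True, False, False, False, False): "B",  # four fingers extended
    (False, True, True, True, False): "Y",    # thumb and pinky extended
}

BEND_THRESHOLD = 512  # hypothetical midpoint of a 10-bit ADC (0-1023)

def classify(readings):
    """Return the ASL letter for five raw sensor readings, or None."""
    pattern = tuple(r > BEND_THRESHOLD for r in readings)
    return LETTER_PATTERNS.get(pattern)
```

On the device, the returned letter would then be written to the LCD; unknown patterns (`None`) would simply be skipped.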
A YOLOv5 model developed from scratch that conveys signs to a blind person and generates text from the signs made by a mute person. It is a prototype showcasing the possibility of developing an interpreter for mute and blind people.
This project focuses on sign language translation using wearable devices, helping people with hearing and speech difficulties in real-world scenarios 😄
The primary objective of this project is to build a real-time gesture recognition model. It can serve as a baseline for a sign language interpreter that automatically converts signs into written output, making communication easier for non-verbal people.
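One part of such a pipeline that is easy to show concretely is turning noisy per-frame predictions into written output. The sketch below is an assumption about how this stage could work (the `frames_to_text` helper and the `min_frames` debounce parameter are hypothetical, not from the project): a letter is emitted only after the recognizer reports it for several consecutive frames, which suppresses single-frame misclassifications.

```python
# Illustrative sketch: convert a stream of per-frame sign predictions
# (one label per video frame, None = no sign detected) into text.
def frames_to_text(frame_labels, min_frames=3):
    text = []
    run_label, run_len = None, 0
    for label in frame_labels:
        # Track how long the current label has been stable.
        if label == run_label:
            run_len += 1
        else:
            run_label, run_len = label, 1
        # Emit the letter exactly once, when its run first becomes stable.
        if run_len == min_frames and label is not None:
            text.append(label)
    return "".join(text)
```

For example, a stream like `H H H (gap) I I I` yields `"HI"`, while a single stray `B` frame in the middle of a run of `A`s is ignored.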