
Arduino Sign Language Recogniser

An apparatus for translating American Sign Language (ASL) using Machine Learning and Arduino Uno.

PURPOSE:

This apparatus is designed to bridge the communication gap between deaf and hard-of-hearing people and the rest of society. Their primary language of communication is American Sign Language (ASL). People who are not fluent in ASL generally need a human interpreter to communicate with its users. Our project aims to create a model that uses Convolutional Neural Networks to translate ASL into any other language, with the support of an Arduino Uno.

STRUCTURE:

PART 1 -- We integrate an Arduino-based hand motion detector to ease hand-to-eye coordination. The camera follows the hand and rotates to stay aligned with it, so the user does not have to remain stationary or perform gestures within a fixed boundary.
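The tracking behaviour can be sketched as a simple proportional controller: given the hand's horizontal position in the frame, nudge the camera's pan servo toward it until the hand is centred. The sketch below illustrates the idea in Python; the function name, gain, and frame width are illustrative assumptions, and the actual implementation lives in MotionSensorCamera.ino:

```python
def next_servo_angle(current_angle, hand_x, frame_width=640, gain=0.05):
    """One proportional step toward centring the hand in the frame.

    current_angle: current pan-servo angle in degrees (0-180).
    hand_x: detected hand centre's x coordinate in pixels.
    Returns the new angle, clamped to the servo's 0-180 range.
    """
    error = hand_x - frame_width / 2  # positive => hand is right of centre
    new_angle = current_angle + gain * error
    return max(0.0, min(180.0, new_angle))
```

Called once per frame, this keeps the hand near the centre without the user having to stand still: a hand at the centre leaves the angle unchanged, while a hand at the frame's edge swings the servo toward it by a bounded step.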

PART 2 -- The apparatus uses a Keras model, previously trained on a dataset we generated ourselves, to predict the class of the gesture captured by the webcam. The captured footage is preprocessed with OpenCV, and the predicted class is displayed as text in a GUI built with Tkinter.
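Once the model returns a probability vector for a frame, mapping it to a letter (and suppressing low-confidence frames so the GUI does not flicker through wrong guesses) is straightforward. A minimal sketch, assuming one class per ASL letter A-Z and a hypothetical confidence threshold; Model_predictor.py may organise this step differently:

```python
import string

# Assumption: the model outputs 26 classes, one per ASL letter A-Z.
CLASS_LABELS = list(string.ascii_uppercase)

def decode_prediction(probs, threshold=0.6):
    """Map a softmax output vector to an ASL letter.

    Returns the most probable label, or None when the model is not
    confident enough for the result to be shown in the GUI.
    """
    best = max(range(len(probs)), key=lambda i: probs[i])
    if probs[best] < threshold:
        return None
    return CLASS_LABELS[best]
```

In the live loop this would be applied to each frame's model output (e.g. the row returned by Keras's predict call) before updating the Tkinter text widget.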

FILE STRUCTURE:

  1. The trained model is available in myModeldataset1
  2. The Arduino code is available in MotionSensorCamera.ino
  3. The executable file is Model_predictor.py
