ASL-Translator

Break the language barrier through auto-generated closed captions, derived from hand sign detection using machine learning.

Key Features

Translate ASL in real time by generating captions as the user signs individual letters.

MVP (Minimum Viable Product)

  • Decide on a model type (e.g. R-CNN, Fast R-CNN, Faster R-CNN, YOLO, or others)
  • Decide on dataset for model training
  • Test the accuracy of the model
  • Translate live as the camera is pointed at a person performing sign language
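For the live-translation step, a per-frame classifier will flicker while the hand transitions between signs, so the raw predictions need smoothing before they become captions. Below is a minimal sketch of that idea; the function name, the `stable_frames` threshold, and the use of `None` for "no confident detection" are illustrative assumptions, not part of the project yet:

```python
def debounce_letters(frame_predictions, stable_frames=5):
    """Collapse a per-frame stream of predicted letters into caption text.

    A letter is appended only after the classifier reports it for
    `stable_frames` consecutive frames, which filters out flicker during
    hand transitions. `None` means "no confident detection" and resets
    the run, so releasing the hand lets the same letter be signed twice.
    """
    caption = []
    current, run = None, 0
    for letter in frame_predictions:
        if letter != current:
            current, run = letter, 0
        run += 1
        # Emit exactly once, the first time the run reaches the threshold,
        # so a long-held sign does not produce repeated letters.
        if letter is not None and run == stable_frames:
            caption.append(letter)
    return "".join(caption)
```

With `stable_frames=5`, a stream like `["H"]*6 + [None]*2 + ["I"]*5` yields `"HI"`, while a one-frame misclassification in the middle of a held sign is discarded.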

Additional Features - Stretch Goals

  • App reads the captions aloud to the user via text-to-speech
  • Accept a pre-recorded video as input and output a text transcript
  • App supports additional sign languages beyond ASL

Dependencies

Flutter

Flutter can be used for the basic front end of the project. The majority of development time will be spent on the backend and on training the model against the chosen dataset for accuracy.

Install by following the guidelines here

General documentation

AWS Rekognition

The model should be trained using AWS Rekognition.

AWS Documentation and API Reference

Other cloud platforms to train your model
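As a sketch of how per-frame inference against a trained Rekognition model could look: `detect_custom_labels` and the `CustomLabels`/`Confidence` response fields are real parts of the Rekognition Custom Labels API, but the helper names, the confidence threshold, and the need to pass in a project version ARN for a trained and started model are assumptions for illustration:

```python
def best_letter(response, min_confidence=80.0):
    """Return the highest-confidence label name from a DetectCustomLabels
    response dict, or None if nothing clears the confidence threshold."""
    labels = response.get("CustomLabels", [])
    confident = [l for l in labels if l.get("Confidence", 0.0) >= min_confidence]
    if not confident:
        return None
    return max(confident, key=lambda l: l["Confidence"])["Name"]


def detect_frame(client, frame_bytes, project_version_arn):
    """Classify one camera frame with Rekognition Custom Labels.

    `client` is expected to be boto3.client("rekognition"), and the model
    version behind `project_version_arn` must already be trained and
    started; both are supplied by the caller.
    """
    response = client.detect_custom_labels(
        ProjectVersionArn=project_version_arn,
        Image={"Bytes": frame_bytes},
    )
    return best_letter(response)
```

Keeping `best_letter` as a pure function makes the accuracy-testing MVP item easier: it can be unit-tested against canned response dicts without AWS credentials.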

Resources

Below are some resources to help overcome possible roadblocks during the project.

Possible data sets

Inspiration

Prototyping

Learning Resources

Look through all of these resources at the beginning of the semester!
