jopika/nwHacks2017-LeapLanguage
LeapLanguage

ASL Interpretation using Leap Motion

Inspiration

People with disabilities have long been limited in how they can communicate with the world. With the rapid growth in popularity and development of VR, we wanted to build a way for mute people to communicate in VR using their "natural" language.

What it does

The Leap Motion detects sign language gestures and converts them into command-line text. In a real deployment, this would let mute people sign in VR and communicate with people who don't understand sign language.

How we built it

We used the Leap Motion SDK and built the project in Python 2.7.
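The core idea can be sketched as a small pipeline: each Leap Motion frame reports which fingers are extended, and a lookup table maps that finger pattern to a letter. The table below is a hypothetical illustration, not the repository's actual sign mapping, and the Leap SDK hookup is shown only in comments since it requires the device and bindings to be installed.

```python
# Hypothetical finger-pattern table: (thumb, index, middle, ring, pinky).
# These pattern->letter pairs are illustrative approximations of ASL signs,
# not the mapping used in this repository's src/main.py.
SIGNS = {
    (True, False, False, False, False): "A",  # thumb only
    (False, True, True, True, True): "B",     # four fingers up, thumb tucked
    (False, True, False, False, False): "D",  # index finger only
}

def classify(extended):
    """Map a 5-tuple of per-finger extension flags to a letter, or None."""
    return SIGNS.get(tuple(extended))

# In the real app, the extension flags would come from the Leap SDK, roughly:
#   import Leap
#   controller = Leap.Controller()
#   frame = controller.frame()
#   hand = frame.hands[0]
#   extended = [finger.is_extended for finger in hand.fingers]
#   print(classify(extended))

print(classify((False, True, False, False, False)))  # prints "D"
```

A full implementation would also need to debounce across frames (only emit a letter once the same pattern has been held for several frames), since a single noisy frame would otherwise spam the console.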

Challenges we ran into

We ran into plenty of technical difficulties. The biggest was the intimidation factor of trying to understand the Leap Motion API in a short amount of time; this was our first time working with Leap Motion.

Accomplishments that we're proud of

Managing to produce a minimum viable product despite switching project ideas multiple times within 24 hours.

What we learned

We learned how to use the Leap Motion API, as well as Python.

What's next for LeapLanguage

Add a text-to-voice feature so that when mute people sign in VR, the ASL is interpreted as voice. This would have many applications; in conference calls, for example, mute people would be able to voice their opinions dynamically.

How to start using this:

  1. Clone the repo
  2. Install the Leap Motion SDK (the Orion version is strongly suggested)
  3. Install Python 2.7.13
  4. Navigate to the src folder
  5. Run main.py
  6. Place your hand over the Leap Motion device and make a sign
  7. View the result printed in the console
