
Gesture Based UI Project - Myo Armband | Unity


Written by Conor McGrath & Danielis Joniškis

Table of Contents

  1. Introduction
  2. Purpose of the application
  3. Gestures identified as appropriate for this application
  4. Hardware used in creating the application
  5. Architecture for the solution
  6. Testing
  7. Conclusions & Recommendations
  8. References

Brief

"Develop an application with a Natural User Interface. There are a number of options available to you and this is an opportunity to combine a lot of technology that you have worked with over the past four years...At the very least, this should be a local implementation of the application using gestures to interact with it...You can reproduce a classic game or system using a gesture-based interface."

Introduction

We decided to recreate the classic arcade game Space Invaders for this project. We chose this game because:

  • We both have experience creating this type of game using C# and Unity
  • The keyboard controls used throughout the game can be easily mapped to Myo gestures
  • It would allow us to focus on the functionality of the hardware component and, in turn, better determine its performance and integration with Unity and our game

One of the most important research areas in Human-Computer Interaction (HCI) is gesture recognition, as it provides a natural and intuitive way for people and machines to communicate. Gesture-based HCI applications range from computer games to virtual and augmented reality, and the approach is increasingly being explored in other fields.


Purpose of the Application

The purpose of this application is to demonstrate how the Myo Armband can be used to control an application: its proprietary electromyogram (EMG) muscle sensors detect the electrical activity of the user's muscles, and its highly sensitive motion sensors track movement, allowing gestures to drive a series of operations. The EMG sensors directly sense muscle activity and movement, reading muscle activity in a refined manner.

This project is an example of the integration of Unity game engine with Myo. It is a good demonstration of the use of Myo gestures to perform a series of actions.


Gestures identified as appropriate for this application

The Myo is an armband equipped with several electromyography (EMG) sensors that can recognize hand gestures and arm movement. Based on the electrical impulses generated by the muscles, its eight EMG sensors are responsible for recognizing each gesture.

gestures

Once we agreed to use the Myo armband for our application, we first researched which gestures the technology supports. We found that the Myo armband recognizes five preset gestures out of the box.

We also looked into how user-friendly the Myo armband would be for our application. As our lecturer kindly gave us a Myo armband to work with, we were able to see for ourselves how user-friendly it actually is. We found that the armband fits very comfortably: no matter what size arm the user has, it stretches and adjusts itself to fit.

For this project, Myo provides the following intuitive hand movements/gestures:

  • Fist: the user makes a fist to shoot projectiles.
  • Wave In: the user waves their wrist in (left) and the ship moves left.
  • Wave Out: the user waves their wrist out (right) and the ship moves right.
  • Fingers Spread: the user spreads out their fingers to pause and resume the game.

The menu can also be navigated using these gestures.
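In the Thalmic Myo Unity plugin, these poses surface through a `ThalmicMyo` component. A minimal dispatch sketch (our own illustrative class and method names, assuming the plugin's `pose` field and `Thalmic.Myo.Pose` enum) might look like:

```csharp
using UnityEngine;
using Pose = Thalmic.Myo.Pose;

// Illustrative sketch, not the repository's actual script: polls the
// armband's current pose once per frame and forwards it to game actions.
public class GestureController : MonoBehaviour
{
    public ThalmicMyo myo;               // dragged in from the Myo prefab
    private Pose _lastPose = Pose.Unknown;

    void Update()
    {
        // Only react when the pose changes, so holding a fist
        // does not fire an action every frame.
        if (myo.pose == _lastPose) return;
        _lastPose = myo.pose;

        switch (myo.pose)
        {
            case Pose.Fist:          Shoot();       break;
            case Pose.WaveIn:        MoveLeft();    break;
            case Pose.WaveOut:       MoveRight();   break;
            case Pose.FingersSpread: TogglePause(); break;
        }
    }

    void Shoot()       { /* spawn projectile */ }
    void MoveLeft()    { /* move ship left   */ }
    void MoveRight()   { /* move ship right  */ }
    void TogglePause() { /* pause / resume   */ }
}
```

Keeping all pose handling in one place like this makes it easy to swap the gesture mapping back to keyboard input while testing.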


Hardware used in creating the application

Before deciding what hardware to use for our project, we researched the different options readily available to us:

  • Leap Motion Controller
  • Myo Armband
  • Kinect

Below are some comparisons we drew up to determine which device would best fit the solution we were trying to implement.

| Leap Motion Controller | Myo Armband     | Microsoft Kinect     |
|------------------------|-----------------|----------------------|
| Finger tracking        | Fixed gestures  | Full-body tracking   |
| HMD mountable          | Wireless        | High latency         |
| 3D hand tracking       | 2D arm tracking | Useful at a distance |

All three are natural user interfaces (NUIs) and require no remote control, as they all use markerless tracking technologies.

Myo Armband

The hardware we chose for our application is the Myo Armband. We chose it because it is a top-of-the-range gesture-control armband, and one was available to us in the college, which made it even more attractive to use.

The Myo armband can be used in numerous applications due to its excellent technical features and ease of use. The EMG electrodes detect signals related to the muscle activity of the user's forearm, and the IMU detects forearm movements in 3D space. The acquired data are sent, via the Bluetooth Low Energy (BLE) module embedded in the armband, to other electronic devices (actuators, microcontrollers, and so on), which perform specific functions depending on the received data and their installed software.

List of hardware components:

  1. Myo Armband
  2. 8 electromyographic (EMG) electrodes
  3. 9-axes inertial measurement unit (IMU)
    • 3-axes gyroscope
    • 3-axes accelerometer
    • 3-axes magnetometer
  4. Myo sizing clips
  5. Bluetooth module
  6. Windows PC/laptop or Mac (Windows virtual machine)

Leap Motion Controller

An alternative hardware option for building this application would have been the Leap Motion. The Leap Motion Controller works alongside your mouse and keyboard, translating hand and finger movements into a rich array of 3D input.

The Leap Motion system recognizes and tracks hands, fingers, and finger-like tools. The device operates within 60 cm of the user with high precision and a high tracking frame rate, reporting discrete positions, motions, and gestures. It would also have been a good fit for our project, but the Myo armband was more practical for what we wanted to achieve.

Microsoft Kinect v2

The Kinect works as a 3D camera by capturing a stream of colored pixels with data about the depth of each pixel. Each pixel in the picture contains a value that represents the distance from the sensor to an object in that direction [10]. This hardware feature provides developers with the means for creating a touch-less and immersive user experience through voice, movement, and gesture control, although the sensor does not inherently perform any tracking or recognition operations, leaving all such processing to software.

The Kinect for Windows Software Development Kit (SDK) 2.0 enables developers to create applications that support gesture and voice recognition using Kinect sensor technology. The Kinect API is available in Unity Pro through a Unity package. The v2 cameras provide a wider field of view and higher-resolution frames.


Architecture for the solution

Flowchart

architecture

When deciding how to approach this project, we collectively chose to build it with Microsoft's Visual Studio IDE, C#, and the Myo armband, as we had worked with both in class and were familiar with coding in C#.

Player Movement

movement
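The movement flow above can be sketched as a Unity script. This is an illustrative sketch rather than the repository's actual code: `speed` and `xLimit` are assumed values, and the `ThalmicMyo`/`Pose` names come from the Thalmic Myo Unity plugin.

```csharp
using UnityEngine;
using Pose = Thalmic.Myo.Pose;

// Sketch: while WaveIn/WaveOut is held, the ship strafes at a fixed
// speed and is clamped to the playfield bounds.
public class PlayerMovement : MonoBehaviour
{
    public ThalmicMyo myo;
    public float speed = 5f;    // assumed units per second
    public float xLimit = 8f;   // assumed horizontal bound

    void Update()
    {
        float direction = 0f;
        if (myo.pose == Pose.WaveIn)  direction = -1f;  // move left
        if (myo.pose == Pose.WaveOut) direction =  1f;  // move right

        Vector3 pos = transform.position;
        pos.x = Mathf.Clamp(pos.x + direction * speed * Time.deltaTime,
                            -xLimit, xLimit);
        transform.position = pos;
    }
}
```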

Shooting Projectiles

shooting
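Shooting can be sketched the same way. In this hypothetical snippet (our own names: `projectilePrefab`, `fireCooldown`), making a fist spawns a projectile at the ship's position, with a short cooldown so one held fist does not fire a continuous stream.

```csharp
using UnityEngine;
using Pose = Thalmic.Myo.Pose;

// Sketch: a Fist pose instantiates a projectile prefab, rate-limited
// by a simple cooldown timer.
public class PlayerShooting : MonoBehaviour
{
    public ThalmicMyo myo;
    public GameObject projectilePrefab;  // assigned in the Inspector
    public float fireCooldown = 0.3f;    // assumed seconds between shots
    private float _nextFireTime;

    void Update()
    {
        if (myo.pose == Pose.Fist && Time.time >= _nextFireTime)
        {
            Instantiate(projectilePrefab, transform.position,
                        Quaternion.identity);
            _nextFireTime = Time.time + fireCooldown;
        }
    }
}
```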

In game pause menu

pause
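A pause toggle along these lines can be sketched by freezing `Time.timeScale`. This is again an illustrative sketch (`pauseMenuUI` is a hypothetical menu canvas, not necessarily what the repository uses); the pose change is edge-triggered so holding Fingers Spread toggles only once.

```csharp
using UnityEngine;
using Pose = Thalmic.Myo.Pose;

// Sketch: spreading the fingers toggles the pause menu by setting
// Time.timeScale to 0 (frozen) or 1 (running).
public class PauseMenu : MonoBehaviour
{
    public ThalmicMyo myo;
    public GameObject pauseMenuUI;       // hypothetical menu canvas
    private Pose _lastPose = Pose.Unknown;
    private bool _paused;

    void Update()
    {
        bool justSpread = myo.pose == Pose.FingersSpread
                          && _lastPose != Pose.FingersSpread;
        _lastPose = myo.pose;
        if (!justSpread) return;

        _paused = !_paused;
        Time.timeScale = _paused ? 0f : 1f;  // 0 freezes physics and animation
        pauseMenuUI.SetActive(_paused);
    }
}
```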


Testing

Introduction

To test the functionality of our game, we highlighted the critical aspects of running, interacting with, and completing it. We wanted to evaluate the integration and performance of our application, allowing us to better manage the project and determine which components or features needed to be retested or redesigned.

Scope

The main focus of our testing process was to identify and perform any necessary bug fixes to various components and features, such as:

  • Myo integration with Unity
  • Character controls (using keyboard)
  • Character controls (using gestures)
  • Enemy controls
  • Game rewards
  • Game progression
  • Game navigation

Test Plan

testplan


Conclusions & Recommendations

This technology can help minimize the reliance on traditional hardware and software for controlling robots and other devices. The Myo Armband can readily control a computer from across the room: playing games, browsing the web, controlling music and other entertainment, and even piloting a drone. Compared to the Kinect and Leap Motion, Myo's advantage is that it is not restricted to a specific site, and the interaction feels more natural. The electrodes inside the armband read the bioelectric activity of the muscles as the user flexes into a gesture, determine the wearer's intention, and send the processed result to the receiving equipment via low-power Bluetooth. We also feel that voice interaction such as Siri's is not ideal for the many young people accustomed to keyboard input and textual communication, which makes gesture control an attractive alternative.


References:
