EgoGestAR: an egocentric pointing-gesture dataset.
Codebase and Public EgoGestAR Dataset: https://github.com/varunj/EgoGestAR
Public Testing Dataset: https://github.com/varunj/EgoGestAR/tree/master/testvideo
- The left column shows the standard input gesture sequences presented to users before data collection.
- The right column depicts the variation across the collected data samples.
These gestures map to different use cases:
- Black block: swipe gestures (Left, Right, Up, Down) for list interaction.
- Green block: Rectangle and Circle gestures for Region of Interest (RoI) highlighting.
- Red block: Checkmark (Yes), Caret (No), X (Delete), and Star (Bookmark) gestures for evidence capture in, say, industrial AR applications.
The highlighted point in each gesture indicates its starting position.
- Contains 240 gesture videos captured in varying environments and lighting conditions and by a variety of users.
- 22 videos per class, plus 20 additional random hand-movement videos.
- File naming convention: testvideo_gesturename_serialnumber.mp4 (see the grouping sketch below).
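A minimal sketch that groups the test videos by gesture class, assuming the naming convention above and a local `testvideo/` checkout (the folder path is an assumption; adjust to your setup):

```python
# Group test videos by the gesture class encoded in the filename,
# e.g. "testvideo_circle_07.mp4" -> class "circle".
import glob
import os
from collections import defaultdict

videos_by_class = defaultdict(list)
for path in sorted(glob.glob(os.path.join("testvideo", "*.mp4"))):
    _, gesture_name, _ = os.path.basename(path).rsplit(".", 1)[0].split("_")
    videos_by_class[gesture_name].append(path)

for gesture_name, paths in videos_by_class.items():
    print(f"{gesture_name}: {len(paths)} videos")
```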
- Contains 500 gesture inputs used for testing: 50 gestures per class (a loading sketch follows this list).
- File naming convention: test_gesturename_serialnumber.txt
- Contains 500 gesture images corresponding to the inputs, plotted in black and white with a circular dot marking the start of the gesture.
- File naming convention: test_gesturename_serialnumber.png
- Lists the time taken (in seconds) to perform each test gesture.
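The on-disk layout of the .txt inputs is not spelled out here. A minimal loading sketch, assuming each file stores one whitespace-separated x y pair per line and a hypothetical `data/test/` path (inspect a file first):

```python
# Load one gesture trajectory as an (num_points, 2) array.
import numpy as np

def load_gesture(path):
    points = np.loadtxt(path)  # assumption: "x y" per line
    assert points.ndim == 2 and points.shape[1] == 2
    return points

gesture = load_gesture("data/test/test_circle_01.txt")  # hypothetical path
print(gesture.shape)
```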
- Contains 2000 gesture inputs used for training: 200 gestures per class.
- File naming convention: train_gesturename_serialnumber.txt
- Contains 2000 gesture images corresponding to the inputs, plotted in black and white with a circular dot marking the start of the gesture (a plotting sketch follows this list).
- File naming convention: train_gesturename_serialnumber.png
- Lists the time taken (in seconds) to perform each training gesture.
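A minimal sketch of how the black-and-white plots with a start dot could be reproduced, reusing the hypothetical `load_gesture()` helper above and an assumed `data/train/` path:

```python
# Plot a gesture trajectory in black on white, marking the first point.
import matplotlib.pyplot as plt

points = load_gesture("data/train/train_circle_001.txt")  # hypothetical path
plt.figure(figsize=(3, 3))
plt.plot(points[:, 0], points[:, 1], color="black", linewidth=2)
plt.scatter(points[0, 0], points[0, 1], s=60, color="black")  # start marker
plt.axis("off")
plt.gca().invert_yaxis()  # assumption: image-style coordinates (y grows downward)
plt.savefig("train_circle_001.png", dpi=100)
```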
- Run to generate ground-truth files for the gestures.
- Run to generate ground truth for custom gestures, along with the time taken to draw each gesture.
- Captured gestures are saved in train_.txt, separated by '----------'.
- Reads gestures from the specified folder.
- Generates a graphical representation of the gestures as 10 graphs, one per class, to aid interpretation.
- Accepts the input gesture at runtime.
- Produces the same outputs as <4generate_results> above.
- Reads the '----------'-separated gestures and stores each as a separate file (a splitting sketch follows).
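A minimal sketch of that splitting step, assuming the '----------' separator convention above; the input path and output prefix are hypothetical:

```python
# Split a raw capture file into one file per gesture.
def split_raw_gestures(raw_path, out_prefix):
    with open(raw_path) as f:
        chunks = [c.strip() for c in f.read().split("-" * 10) if c.strip()]
    for i, chunk in enumerate(chunks, start=1):
        with open(f"{out_prefix}_{i:03d}.txt", "w") as out:
            out.write(chunk + "\n")

split_raw_gestures("train_.txt", "train_circle")  # hypothetical gesture name
```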
- Outputs a histogram of the lengths of the raw captured gestures (see the sketch below).
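A sketch of the length histogram, reusing the hypothetical `load_gesture()` helper and the assumed per-gesture .txt layout from the earlier sketches:

```python
# Histogram of raw gesture lengths (number of captured points).
import glob
import matplotlib.pyplot as plt

lengths = [len(load_gesture(p)) for p in glob.glob("data/train/train_*.txt")]
plt.hist(lengths, bins=30)
plt.xlabel("raw gesture length (points)")
plt.ylabel("count")
plt.show()
```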
- Up-/down-samples raw gestures to a fixed length of 200 points (a resampling sketch follows).
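One straightforward way to resample a variable-length gesture to exactly 200 points is linear interpolation over the point index; the repository's scripts may use a different scheme:

```python
# Resample an (n, 2) trajectory to (target_len, 2) by linear interpolation.
import numpy as np

def resample(points, target_len=200):
    points = np.asarray(points, dtype=float)
    src = np.linspace(0.0, 1.0, num=len(points))
    dst = np.linspace(0.0, 1.0, num=target_len)
    x = np.interp(dst, src, points[:, 0])
    y = np.interp(dst, src, points[:, 1])
    return np.stack([x, y], axis=1)  # shape: (200, 2)
```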
- Requires Kivy (https://kivy.org/docs/installation/installation.html); a minimal capture sketch follows.
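For orientation, a minimal Kivy widget that records a pointer trajectory, the kind of capture the scripts above rely on. This is an illustrative stand-in, not the repository's actual capture code:

```python
# Record (x, y) points while the pointer is pressed and moving.
from kivy.app import App
from kivy.uix.widget import Widget


class CaptureWidget(Widget):
    def on_touch_down(self, touch):
        self.points = [touch.pos]  # start a new gesture
        return True

    def on_touch_move(self, touch):
        self.points.append(touch.pos)
        return True

    def on_touch_up(self, touch):
        self.points.append(touch.pos)
        print(f"captured {len(self.points)} points")
        return True


class CaptureApp(App):
    def build(self):
        return CaptureWidget()


if __name__ == "__main__":
    CaptureApp().run()
```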