An XNA project to control an AR.Drone with Kinect gestures and voice commands, while the video feed from the drone's cameras is streamed to the Oculus Rift (this can be disabled by setting drawOculus=false in the class OculusParrotKinect.cs).
Voice commands are defined in the class VoiceCommands.cs.
Supported commands are: take off, land, emergency, change camera, and activate/deactivate face recognition.
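A minimal sketch of how such a command grammar could be wired up with System.Speech (this is an assumption about the approach, not the actual contents of VoiceCommands.cs; the confidence threshold is an illustrative tuning value):

```csharp
// Hypothetical sketch, not the project's actual VoiceCommands.cs.
using System;
using System.Speech.Recognition;

class VoiceCommandSketch
{
    static void Main()
    {
        // The command phrases listed in the README.
        var commands = new Choices("take off", "land", "emergency",
                                   "change camera",
                                   "activate face recognition",
                                   "deactivate face recognition");
        var grammar = new Grammar(new GrammarBuilder(commands));

        using (var recognizer = new SpeechRecognitionEngine())
        {
            recognizer.LoadGrammar(grammar);
            recognizer.SetInputToDefaultAudioDevice();
            recognizer.SpeechRecognized += (s, e) =>
            {
                // Only act on confident matches to avoid spurious drone commands.
                if (e.Result.Confidence > 0.7f)
                    Console.WriteLine("Command: " + e.Result.Text);
            };
            recognizer.RecognizeAsync(RecognizeMode.Multiple);
            Console.ReadLine();
        }
    }
}
```

In the real project the audio input would come from the Kinect microphone array rather than the default audio device.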
With both arms extended forward -> move the drone forward
With both hands near the shoulders -> move the drone backward
With the left arm extended to the left and the right arm along the body -> move the drone to the left
With the left arm extended to the left and the right arm forward -> move the drone forward and to the left
With the left arm extended to the left and the right hand near the right shoulder -> move the drone backward and to the left
The rightward movements are simply the left ones mirrored.
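The arm poses above can be sketched as simple threshold checks on joint positions. This is a hypothetical illustration (the struct, method names, and distance thresholds are all assumptions); in the real project the positions would come from the Kinect SDK skeleton stream:

```csharp
// Hypothetical sketch of the arm-gesture mapping described above.
using System;

struct Vec3
{
    public float X, Y, Z;
    public Vec3(float x, float y, float z) { X = x; Y = y; Z = z; }
}

static class GestureSketch
{
    const float Extended = 0.35f; // metres; illustrative tuning value
    const float Near = 0.20f;     // "hand near shoulder" radius; also illustrative

    // Forward/backward: both hands in front of the shoulders -> forward (+1);
    // both hands pulled back near the shoulders -> backward (-1); else hover (0).
    public static int Pitch(Vec3 lHand, Vec3 rHand, Vec3 lShoulder, Vec3 rShoulder)
    {
        bool bothForward = lShoulder.Z - lHand.Z > Extended
                        && rShoulder.Z - rHand.Z > Extended;
        bool bothNear = Dist(lHand, lShoulder) < Near
                     && Dist(rHand, rShoulder) < Near;
        if (bothForward) return +1;
        if (bothNear) return -1;
        return 0;
    }

    // Left/right: an arm extended sideways rolls the drone that way; the right
    // side is just the left side mirrored, as noted above.
    public static int Roll(Vec3 lHand, Vec3 rHand, Vec3 lShoulder, Vec3 rShoulder)
    {
        if (lShoulder.X - lHand.X > Extended) return -1; // left
        if (rHand.X - rShoulder.X > Extended) return +1; // right
        return 0;
    }

    static float Dist(Vec3 a, Vec3 b)
    {
        float dx = a.X - b.X, dy = a.Y - b.Y, dz = a.Z - b.Z;
        return (float)Math.Sqrt(dx * dx + dy * dy + dz * dz);
    }
}
```

Evaluating Pitch and Roll independently also yields the combined gestures (e.g. forward + left) for free.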
For the AR.Drone library, I used the one from Ruslan Balanukhin. He created the AR.Drone project and the .NET FFmpeg wrapper.
For the Oculus implementation, I used the Sunburn StereoscopicRenderer plugin as a starting point; it's an implementation by the developer behind Holophone3D.
I learned a lot from these projects.
This is just a project I made for fun and as an excuse to mess around with some nice gadgets.
The code is a mess, and don't expect a fancy HUD or anything; it just works :).
So far I've focused only on getting the pieces to work together. I plan to organize and clean up the code in my spare time.
Here you can see a simple video of it working
- Fixed some bugs
- Added face detection activated by voice commands