A wearable gesture interface
Check out the TED video to get a good idea of what the project is about.
Detailed documentation on the SixthSense software and hardware is in progress.
There are currently some problems plaguing 64-bit systems that we're trying to fix. Until they are resolved, follow these steps to debug the code on a 64-bit system (in Visual Studio):
From the top menu, select "Build", then "Configuration Manager".
In the Platform column, click the dropdown for the project; it lists "Any CPU" plus "<New...>" (and "<Edit...>" if other platforms already exist). Select "<New...>"; a window will open to choose the platform.
Select x86, copying settings from the existing Debug configuration, then hit OK. The new x86 platform now appears in the list.
Click Debug; the code should now run as expected.
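The same effect can be had from the command line with MSBuild, without opening the Configuration Manager each time. A minimal sketch, assuming the solution file is named `SixthSense.sln` (adjust to the actual file name in your checkout); the x86 platform must already exist in the solution:

```shell
# Build the Debug configuration targeting the x86 platform,
# so the resulting process runs as 32-bit even on a 64-bit OS.
msbuild SixthSense.sln /p:Configuration=Debug /p:Platform=x86
```

Run this from a Visual Studio command prompt so `msbuild` is on the PATH.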
This is where most of the discussion takes place.
We use C# (tested on Windows, not Mono) with OpenCV (for .NET).
To get started, fix some of the indentation issues in the codebase; it's a good way to get a feel for how the code is laid out. We use spaces rather than tabs, with 4 spaces per indentation level, and we encourage people to fix any code that doesn't follow this.
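The indentation convention could be captured in an `.editorconfig` file at the repository root, so editors apply it automatically. This is a suggestion, not something the repository currently ships:

```ini
# Suggested .editorconfig for the project's convention:
# spaces instead of tabs, 4 spaces per indentation level.
root = true

[*.cs]
indent_style = space
indent_size = 4
```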
The current code only runs on Windows, under the CLR virtual machine (i.e. C#).
Mono (Linux, Mac OS X, etc.) - Poincare101 and Arup - https://github.com/Poincare/sixthsense (see mono branch)
If you're working on porting SixthSense to a different environment (*nix, Mac OS X, Android, etc.), please list it here with a link to the code.