[Show full webm preview](https://gfycat.com/UnsungBlueAmericanshorthair)

Showing the display of the main menu and a use case of the Cinema widget.
### Capacities
- Recognition of gestures: palm, thumbs up/down, slide up/down/right/left
- Time and outside temperature
- Widgets:
- DoodleJump: play the game (hard with the latency)
- News: show international news
### Debugging
The motion server can't recognize gestures in a new environment right away: lighting, hand color, etc. affect the process. That's why, by launching
the `test.py` file and tweaking the HSV min/max values and the other config options, you can set it up for your home. To begin the tracking, make an open palm as shown in the picture below.
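
For reference, here is a minimal sketch of the kind of HSV thresholding involved in this calibration. The variable names (`HSV_MIN`, `HSV_MAX`) and the default ranges are illustrative assumptions, not the project's actual settings:

```python
# Minimal HSV thresholding sketch (OpenCV + a webcam). Tune the bounds until
# only your hand shows up white in the mask window, then carry the values
# over to your tracking config. HSV_MIN/HSV_MAX names and values are examples only.
import cv2
import numpy as np

HSV_MIN = np.array([0, 30, 60], dtype=np.uint8)     # lower H, S, V bounds (example)
HSV_MAX = np.array([20, 150, 255], dtype=np.uint8)  # upper H, S, V bounds (example)

cap = cv2.VideoCapture(0)
while True:
    ok, frame = cap.read()
    if not ok:
        break
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, HSV_MIN, HSV_MAX)   # white where the pixel falls in the hand's color range
    cv2.imshow("frame", frame)
    cv2.imshow("mask", mask)
    if cv2.waitKey(1) & 0xFF == ord("q"):       # press q to quit
        break
cap.release()
cv2.destroyAllWindows()
```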
- Raspberry Pi
- Camera (I'm using the NoIR but any camera should do)
- LED monitor (preferably one that covers the whole surface of your two-way mirror)
### Building
- Web server:
  - `npm install`
  - `node server.js`
- `config.py` for the tracking settings
- To use the Pi Camera, set `piCamera` to true and install the picamera package: `pip install "picamera[array]"` (see the sketch below)
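
As a rough illustration only (the real `config.py` layout and the way the motion server reads it may differ), a `piCamera` flag could be consumed like this, falling back to a regular USB webcam when it is off:

```python
# Illustrative sketch of switching between the Pi Camera and a USB webcam
# based on a piCamera flag; how the actual motion server wires this up may differ.
import cv2

piCamera = True  # mirror the `piCamera` setting from config.py

def grab_frame():
    if piCamera:
        # picamera[array] delivers frames as numpy arrays that OpenCV can use directly
        from picamera import PiCamera
        from picamera.array import PiRGBArray
        with PiCamera() as camera:
            raw = PiRGBArray(camera)
            camera.capture(raw, format="bgr")
            return raw.array            # a single BGR frame
    # otherwise fall back to any webcam through OpenCV
    cap = cv2.VideoCapture(0)
    ok, frame = cap.read()
    cap.release()
    return frame if ok else None
```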
### Notes
The motion server was made in Python to learn the language, but it should have been written in C/C++ to gain execution speed and smoothness in the gesture recognition.
Also, the Python environment kinda sucks at the moment: setting it up for Python 3 + OpenCV 2 on Windows and then Linux was exhausting, I don't recommend it.