Create a demo that visualises the movement of a person #484
@jackie1050 showed me a "constellation" dress somebody had designed. She was using some kind of device which tracked movement to do some gesture interpretation and light up specific constellations on the dress. Really impressive. Do you have the link, Jackie?
I was thinking this is probably done with something like a MicroBit with an accelerometer.
Could we put something together to track the accelerometer values, generate movement deltas, and then stream them over WiFi or Bluetooth LE? (We'd need to look at the accuracy we get out of this, but I think it would be good enough for a neat demo.)
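A minimal sketch of the tracking side, assuming a micro:bit running MicroPython. It broadcasts per-tick deltas with the simple `radio` module; streaming over proper BLE or WiFi would need a different build (e.g. the MakeCode Bluetooth blocks or an ESP bridge):

```python
# Hypothetical micro:bit MicroPython sketch: sample the accelerometer,
# compute per-tick movement deltas and broadcast them as text.
from microbit import accelerometer, sleep
import radio

radio.on()
last = accelerometer.get_values()              # (x, y, z) in milli-g

while True:
    now = accelerometer.get_values()
    dx, dy, dz = (n - p for n, p in zip(now, last))
    radio.send("{},{},{}".format(dx, dy, dz))  # a second micro:bit could relay this over serial
    last = now
    sleep(50)                                  # ~20 Hz; tune against the accuracy we actually need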
We could then look at visualisation techniques. One option would be to have Tosca follow the motion of the hand to draw something out, Etch-A-Sketch style.
Another option would be to look at some of the neat OpenGL "particle" systems, e.g. a fire effect that moves around the screen, and show this on a large display.
This isn't exactly what I'm thinking of, but it's along the right lines. We could control this via hand movements.
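As a rough feel for the Etch-A-Sketch option, a sketch assuming pygame and some hypothetical `read_delta()` receiver wired up to the stream above; it integrates the deltas into a pen position and leaves a trail:

```python
# Rough Etch-A-Sketch-style visualiser: deltas in, trail out.
import pygame

def read_delta():
    """Placeholder: return (dx, dy) read from the streamed accelerometer deltas."""
    return 0, 0

pygame.init()
screen = pygame.display.set_mode((800, 600))
pos = [400, 300]

running = True
while running:
    for event in pygame.event.get():
        if event.type == pygame.QUIT:
            running = False
    dx, dy = read_delta()
    new_pos = [pos[0] + dx, pos[1] + dy]
    pygame.draw.line(screen, (0, 255, 0), pos, new_pos, 2)  # leave a trail behind the "pen"
    pos = new_pos
    pygame.display.flip()

pygame.quit()
```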
With that in place we could then get some dancers in and see what happens?
Thinking about it, the accelerometer(s) will probably be good for providing detail and movement deltas, but absolute position will quickly be lost over time.
We could make use of BLE beacons to triangulate absolute position (as they give a rough distance estimate based on received signal strength). That might fit well with @aubergine10's BLE sensor plans?
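The distance measure comes from comparing the received signal strength against the beacon's calibrated transmit power. A sketch using the standard log-distance path-loss model; the constants here are assumptions and would need calibrating per beacon and per room:

```python
# Rough distance-from-RSSI estimate (log-distance path-loss model).
def rssi_to_distance(rssi_dbm, tx_power_dbm=-59, path_loss_exponent=2.0):
    """Estimate distance in metres from received signal strength.

    tx_power_dbm is the calibrated RSSI at 1 m; path_loss_exponent is ~2 in
    free space and higher indoors. Expect a lot of noise in practice.
    """
    return 10 ** ((tx_power_dbm - rssi_dbm) / (10 * path_loss_exponent))
```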
If we have multiple poles at known positions around an area, we can use these to triangulate an absolute position (which will actually be something like a circle of probability from the multiple range estimates) and merge it with the accelerometer data, which provides the detail over short periods of time.
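A sketch of how that merge might work, assuming numpy, 2D positions, and at least three poles at known coordinates: least-squares trilateration for the absolute fix, then a simple complementary filter that trusts the accelerometer deltas short-term and the beacon fix long-term (a Kalman filter would be the fancier version):

```python
import numpy as np

def trilaterate(beacons, distances):
    """beacons: list of (x, y) pole positions in metres; distances: matching
    per-beacon distance estimates. Returns a least-squares (x, y) fix."""
    (x1, y1), d1 = beacons[0], distances[0]
    A, b = [], []
    for (xi, yi), di in zip(beacons[1:], distances[1:]):
        A.append([2 * (xi - x1), 2 * (yi - y1)])
        b.append(d1**2 - di**2 + xi**2 - x1**2 + yi**2 - y1**2)
    fix, *_ = np.linalg.lstsq(np.array(A), np.array(b), rcond=None)
    return fix  # centre of the "circle of probability"

def fuse(prev_pos, accel_delta, beacon_fix, alpha=0.95):
    """Complementary filter: dead-reckon with the smooth accelerometer deltas,
    pull gently towards the absolute beacon fix (alpha is a guess to tune)."""
    dead_reckoned = prev_pos + accel_delta
    return alpha * dead_reckoned + (1 - alpha) * beacon_fix
```

e.g. `trilaterate([(0, 0), (5, 0), (0, 5)], [2.2, 3.4, 4.1])` gives a fix somewhere near the first pole, with all positions as numpy-friendly arrays.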
With a camera at the front we should then be able to locate people within the space and overlay our "augmented" graphics on top of the live video (or just replace the live video).
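A sketch of the overlay end, assuming OpenCV, a webcam at index 0, and some calibration step (here a placeholder `room_to_pixels()`) that maps the fused room position onto the camera image:

```python
# Draw an "augmented" marker on the live video at the tracked position.
import cv2

def room_to_pixels(pos):
    """Placeholder calibration/homography step: map (x, y) metres to (u, v) pixels."""
    return int(pos[0] * 100), int(pos[1] * 100)

cap = cv2.VideoCapture(0)
while True:
    ok, frame = cap.read()
    if not ok:
        break
    u, v = room_to_pixels((2.0, 1.5))              # would come from the fused tracker
    cv2.circle(frame, (u, v), 20, (0, 0, 255), 3)  # overlay marker on the live feed
    cv2.imshow("augmented", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break
cap.release()
cv2.destroyAllWindows()
```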
Perhaps @goatchurchprime would be interested in coming up with a clever algorithm for combining the absolute position (at some level of confidence) with the movement deltas...
Operating on the same basis as 'this piece of equipment is broken* and hasn't been touched for months and months, so we're getting rid,' I'm closing issues that, while really good ideas, seem to have gone nowhere further than conception and aren't integral to the running of DoES. They're obviously still findable using the search function on GitHub, and I'll create a new 'Gone but not Forgotten' tag for them :)
*(Obviously they're not broken, but you get the idea).