Discussion


The smartphone, with its proximity and fit to the body, has been an inspiration for quite a few interesting games and design approaches (Bounden, Don’t trip, Snek). For instance, in Pippin’s discussion of his design process for Snek, the tension between the usability/accessibility of movement-based interaction and its deliberate awkwardness was key [1]. Interacting with phones is a socially perceived and habit-centric endeavour, and navigating that seems to be a major element when designing these experiences.

An important motivation behind this tool was the idea of exploring the friction between recognition and performance, especially as it connects to sensors that routinely observe and shape everyday use of technology. I hope it provides a somewhat easier (or at least a bit less code-centric) way of experimenting with recording and performing gestures from sensors, so that creators can focus on how to build from this model and iterate more quickly at a more abstract or embodied level.

The tool also opens up access to phone sensors in ways that engage with the question of how to map sensor values to actions, through the proposed gesture model. By being open-ended about the sensor inputs being used, it frames gestures as changes over time that are perceived and re-enacted, not necessarily as physical movement or embodiment: they might be about temperature or light, for instance. They might have names and prescribed meanings. The basic flow of perceiving, preserving, and performing is the playful interest here.

During the design process, movement-based game design guidelines were a productive inspiration. They helped me think about gestures, in particular through the ideas of embracing ambiguity, mapping movements in imaginative ways, and performing movement socially [3]. How do changes over time, framed as gestures, engage with ambiguity, with mapping, and with social performance? The movement-based guidelines do pose relevant considerations for designing and thinking about gestures more broadly, even if they were conceived from a human-body perspective while the gesture proposal here focuses on a wider array of sensing.

Gesture-based input as present in Sensing Gestures is ambiguous in its inaccuracy of recognition, and the noise in its classification results could be explored further as a playful tension between the game and the players. It’s an interaction that is less based on a constant feedback loop of minute adjustments [4,5] and more about the difficulty of establishing and maintaining a channel. The idea of input tools as opening up ways of considering the social and historical continuities and ruptures in play [2] is fertile ground for experimentation in this broadly-defined gesture-based game design. What if the multi-input-to-single-output ideas and the logics of tool-assisted speedruns were included by designers as gestures within games? What if training sets were recorded from moments of gameplay and became memories to be re-encountered in surprising ways?

Gesture as a technology is closely analogous to tracking and matching systems. It was difficult to play with it in Red Dirt without connecting the experience to captchas and predictive tests, and to their more tense cousins like facial recognition and advertising profiles. Feature-mapping and classification are invoked, but the intentional start/end of a gesture emerges as a promising interaction flow, one that springs from the technological constraints of the specific algorithms used. One way to re-think this through play would be to weave both the recognition of gestures and the recording of training sets into playful actions. Maybe as elements in a multiplayer social exchange? Hopefully the design of Sensing Gestures can help bring these experiments forward.

Process

Here are some notes and extracts from the design logs. As the Sensing Gestures tool was designed together with the Red Dirt game, I tried to group here the notes that I believe engage more with tool functionality questions and design decisions.

Developing the OSC prototyping flow

The idea of using OSC prototyping is discussed in more detail over here. Below is an excerpt of the first tests and the thinking behind this decision.

[2019-02-28] OSC as prototyping protocol / tool

My idea is to use a flow like this:

  1. The sensor information is read from the Android device and sent via OSC using the free Sensor2OSC app.
  2. On the Unity side, I’ll add the OscJack library to receive these messages and parse them as input (see the sketch after this list).
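
As a rough illustration of step 2, here is a minimal sketch of what an OscJack receiver component could look like. This is a hedged example, not the project’s actual code: the UDP port (9000) and the `/light` address are assumptions that would have to match whatever Sensor2OSC is configured to send.

```csharp
using OscJack;
using UnityEngine;

// Minimal sketch of an OSC-receiving component built on OscJack.
// The port and the "/light" address are placeholders: they must
// match the Sensor2OSC configuration on the phone.
public class SensorOscReceiver : MonoBehaviour
{
    OscServer _server;

    public float LightValue { get; private set; }

    void Start()
    {
        _server = new OscServer(9000); // listen on UDP port 9000
        _server.MessageDispatcher.AddCallback(
            "/light", // hypothetical address for the light sensor
            // OscJack invokes callbacks on the receiver thread,
            // so keep the handler minimal.
            (string address, OscDataHandle data) =>
                LightValue = data.GetElementAsFloat(0));
    }

    void OnDestroy() => _server?.Dispose();
}
```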

With this setup, I’d be able to bypass two steps that took a lot of time and fiddling when creating the shellphone experiment: 1) running the game with the Unity Remote app, which has quite poor performance and limited access to sensors, and 2) having to re-build the phone app every time I wanted to run a test or get the current values of a set of sensors. Skipping these two steps will make it much quicker to figure out the numbers and sensitivity of the sensors, and how they could be used expressively.

Ok, in about 5 minutes I have a sample running that shows the light sensor value on screen, with minimal setup. I’ll now try to make something that lets me visualize the magnetic field values. I implemented a simplified 3-bar visualization, which I then remixed to work with negative values. After that, I used the value of one of the axes to create a proximity detector for a houseplant with a magnet on its vase.
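
The proximity detection itself can be as simple as thresholding the magnitude of the magnetic field vector. A small sketch of that idea, with made-up numbers (the cutoff would have to be tuned against the ambient field, roughly 25–65 microtesla at the Earth’s surface):

```csharp
using UnityEngine;

// Sketch of magnet-based proximity detection: treat the magnitude of
// the magnetic field vector as a rough distance cue. The threshold is
// a hypothetical value to be tuned per device and environment.
public class MagnetProximity : MonoBehaviour
{
    public Vector3 magneticField;   // fed by the OSC receiver, in microtesla
    public float threshold = 150f;  // assumed cutoff for "magnet nearby"

    public bool MagnetNearby => magneticField.magnitude > threshold;
}
```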

Plant friend!

Test with OSC relay of sensor values: I believe this shows that the OSC-based flow works well. I was able to try out different ideas for using the magnetic field values in quick iterations. This is definitely a working setup to consider further.

Gesture recognition issues

A constant effort throughout the development process was to balance work on gesture recognition accuracy with the mapping of its results to game-related actions. An important issue became how to process, average, or interpolate the sensor data, which led to the development of a few components (AverageFilter, InterpolateFilter) that can be placed as intermediate steps between the sensors and the training set recorder / gesture recognizer. While this initial set of components was useful, more work would be needed to conceptualize this process in more visual and flexible ways.
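
The repository’s actual filter components may be organized differently, but a minimal sketch of the underlying idea, a fixed-window moving average sitting between a sensor source and its consumers, could look like this (the Push/Filtered API is an assumption for illustration):

```csharp
using System.Collections.Generic;
using UnityEngine;

// Sketch of a moving-average filter in the spirit of the AverageFilter
// component described above: it sits between a sensor reader and the
// training set recorder / gesture recognizer, smoothing sensor noise.
// The window size and the push/pull API are assumptions, not the
// project's actual interface.
public class MovingAverageFilter : MonoBehaviour
{
    public int windowSize = 10;

    readonly Queue<float> _window = new Queue<float>();
    float _sum;

    public float Filtered { get; private set; }

    public void Push(float sample)
    {
        _window.Enqueue(sample);
        _sum += sample;
        if (_window.Count > windowSize)
            _sum -= _window.Dequeue();
        Filtered = _sum / _window.Count;
    }
}
```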

[2019-04-05] I spent a lot of time trying to improve the gesture recognition based on the magnetic field data. I didn’t have much success. Even pre-processing the data in different ways, and making sure I could visualize it effectively (with the Monitor Components interface), I was unable to get a better success rate from the gesture recognizer: it could not consistently identify a gesture, and it started confusing gestures as soon as I added more than two to the training set.

I believe I have basically one course of action left to try: running the game with stronger magnets, possibly arranged so that one of the poles points directly at the phone. I’ll try it out next time I’m at Concordia.

Moving on from OSC prototyping

In order to experiment with sensor readings on a standalone phone app (instead of communicating with a PC game host), I’m going to implement the unity-android-sensors library into the project. (…) This worked pretty well; to do it, I mostly had to add a few components and change how they are wired together. There weren’t many changes to the already-existing components.
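
One way to picture that rewiring (purely illustrative; the actual components may be structured differently) is a shared sensor-source interface that both the OSC receiver and an on-device reader implement, so the filters and the gesture recognizer downstream don’t care where values come from:

```csharp
using UnityEngine;

// Illustrative sketch of how swapping OSC input for on-device sensors
// can amount to "changing how components are wired": both sources
// expose the same interface, so downstream filters and the recognizer
// stay untouched. All names here are hypothetical.
public interface ISensorSource
{
    Vector3 MagneticField { get; }
}

// PC-hosted prototyping path: values arrive over OSC (see earlier sketch).
public class OscSensorSource : MonoBehaviour, ISensorSource
{
    public Vector3 MagneticField { get; set; } // set by OSC callbacks
}

// Standalone path: values read directly on the device, e.g. via the
// unity-android-sensors plugin or Unity's built-in compass access.
public class NativeSensorSource : MonoBehaviour, ISensorSource
{
    public Vector3 MagneticField { get; private set; }

    void Start() => Input.compass.enabled = true;

    void Update()
    {
        // Unity exposes raw magnetometer data (in microtesla) here.
        MagneticField = Input.compass.rawVector;
    }
}
```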

References

  1. Pippin Barr. 2013. Snek Snekked. Retrieved May 3, 2019 from https://www.pippinbarr.com/2013/06/14/snek-snekked/

  2. Stephanie Boluk and Patrick LeMieux. 2012. Hundred Thousand Billion Fingers: Seriality and Critical Game Practices. Leonardo Electronic Almanac 17, 2. Retrieved January 21, 2019 from http://journals.gold.ac.uk/index.php/lea/article/view/174

  3. Florian Mueller and Katherine Isbister. 2014. Movement-based Game Guidelines. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (CHI ’14), 2191–2200. https://doi.org/10.1145/2556288.2557163

  4. Penelope Sweetser and Peta Wyeth. 2005. GameFlow: a model for evaluating player enjoyment in games. Computers in Entertainment 3, 3: 3. https://doi.org/10.1145/1077246.1077253

  5. Steve Swink. 2009. Game feel: a game designer’s guide to virtual sensation. Morgan Kaufmann Publishers/Elsevier, Amsterdam; Boston.
