
OSC-based prototyping and native sensors


Why two workflows?

“Prototypes stimulate reflections, and designers use them to frame, refine, and discover possibilities in a design space.” [1] How can a tool enable this kind of manifestation, helping designers quickly and purposefully engage with an idea? In the first phase of the how-not-to-use-a-phone project, I experimented with different tools and environments for accessing smartphone sensors, from Java programming to browser-based cross-platform solutions and game engines. Each of these tools presented difficulties: latency, complicated setup, lack of documentation, poor performance, difficulty connecting multimodal information, and poor support for context-switching between code and visual tools. I ended up going back to the tool I’m most comfortable with (the Unity game engine) to speed up the process and because it meets several of these requirements well.

However, while designing Shellphone, I found that prototyping with phone sensors was slow even with a good sensor plugin that provides access to them beyond Unity’s own APIs. From my design logs, I identified two major issues when doing so:

[2019-02-28] 1) running the game with the Unity Remote app, which has quite poor performance and limited access to sensors, and 2) having to rebuild the phone app every time you want to run a test or get the current values of a set of sensors.

This back-and-forth slowed down each step in crafting a prototype, in particular the tuning and mapping of sensor values to different outputs and game logic. A common result was forgetting or overlooking the information that motivated that particular iteration of the prototype; there is a certain frame or rhythm to prototyping that can be difficult to restart once disturbed. It also made it impossible to use visualization tools (like the Monitor Components plugin) to fine-tune and explore different combinations and treatments of input values.

Why OSC?

Open Sound Control (OSC) is a messaging protocol for controlling sound applications (synthesizers, for instance) distributed across devices or applications. It was created as an alternative to MIDI, and it supports some interesting features: in particular, it can “serve” values at specific addresses over a network. There is also a whole ecosystem of tools and plugins for sending and receiving OSC messages, including Sensors2OSC and OscJack. By combining these two tools, I was able to send sensor data from an Android phone to the Unity editor over wifi, which makes it possible to quickly test things within the PC environment.
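
To make the address-based messaging concrete, here is a minimal sketch of the sending side using OscJack’s scripting API. The IP address, port, and OSC address are illustrative placeholders; Sensors2OSC’s actual address scheme may differ.

```csharp
using OscJack;

public static class OscSendSketch
{
    public static void SendExample()
    {
        // Open a UDP client pointed at the receiving machine (placeholder IP and port).
        var client = new OscClient("192.168.0.10", 9000);

        // Send one float value to an OSC address; receivers pick up messages
        // by matching this address string.
        client.Send("/example/accelerometer/x", 0.25f);
    }
}
```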

Sensors2OSC is a free app, distributed via the F-Droid website, which lets you enter the IP address of the target device and then stream data from the phone’s sensors via OSC using a visual interface. Here is a screenshot of it working:

[Sensors2OSC screenshot]

OscJack is a Unity plugin that lets you specify in the Inspector a port to receive messages on and an OSC address, and then choose which method should be called with the incoming values.
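
The same wiring can also be done in code. Below is a minimal sketch of the receiving side based on OscJack’s scripting API; the port, the address, and the idea of caching the value in a public field are assumptions for illustration, not the project’s actual setup.

```csharp
using UnityEngine;
using OscJack;

// Illustrative receiver: listens on a UDP port and stores the latest value
// sent to a given OSC address (e.g. by Sensors2OSC running on the phone).
public class OscSensorReceiver : MonoBehaviour
{
    public int port = 9000;                              // placeholder port
    public string address = "/example/accelerometer/x";  // placeholder address

    public float latestValue;

    OscServer _server;

    void OnEnable()
    {
        _server = new OscServer(port);
        _server.MessageDispatcher.AddCallback(address, (string addr, OscDataHandle data) =>
        {
            // Read the first element of the incoming message as a float.
            latestValue = data.GetElementAsFloat(0);
        });
    }

    void OnDisable()
    {
        _server.Dispose();
    }
}
```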

Native sensors

After the tuning and in-editor prototyping are done and it is time to test the app on the phone, there is no need to use OSC, so the Sensing Gestures tool goes back to using the components from the Unity Android Sensor plugin. Then it is possible to use the sensor values directly on the same device / application. I structured Sensing Gestures so that this transition could be a one-time setup, helping designers focus on other aspects of the creation process.

An interesting thing to notice is that mmeiburg’s plugin is based on Unity’s ScriptableObject functionality, which means the values read from the phone’s sensors are stored in assets that are accessible across the project. This is an interesting decision that opens up many entry points for this data, and it was definitely one of the key ideas behind making gestures flexible in terms of which sensors get connected to the recognition algorithms.
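
To illustrate the pattern (this is a hypothetical sketch, not the plugin’s actual code), a ScriptableObject can act as a shared container for a sensor reading: any source, whether an OSC receiver during editor prototyping or a native sensor reader on the device, writes into the same asset, and any gameplay or recognition script reads from it.

```csharp
using UnityEngine;

// Hypothetical shared container for a single sensor reading.
// An OSC-based receiver (editor prototyping) and a native sensor reader
// (on-device build) could both write into the same asset, so downstream
// scripts never need to know where the value came from.
[CreateAssetMenu(menuName = "Sensing/Sensor Value")]
public class SensorValue : ScriptableObject
{
    public Vector3 value;   // latest reading, e.g. accelerometer axes
}
```

A gesture-recognition component could then reference one or more such assets in the Inspector, which is what makes it easy to swap which sensors feed a given gesture.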


  1. Youn-Kyung Lim, Erik Stolterman, and Josh Tenenberg. 2008. The anatomy of prototypes: Prototypes as filters, prototypes as manifestations of design ideas. ACM Transactions on Computer-Human Interaction 15, 2: 1–27. https://doi.org/10.1145/1375761.1375762