If an AR system can be thought of as one that combines real and virtual processes, is interactive in real time, and is registered in three dimensions, why do the majority of AR applications rely primarily on visual displays of information? I propose a practice-led compositional approach for developing multisensory AR experiences, arguing that, as a medium combining real and virtual multisensory processes, AR must be explored with a multisensory approach.
This project uses the open-source Project North Star HMD from Leap Motion, alongside bone-conduction headphones, to deliver a spatialised audio-visual experience built in Unity, called polaris~. This repository started as a fork of the Software Companion for Project North Star, hence the other repository contributors and the long list of commits; the experience itself, including all audio-visual, artistic, and musical content, was added afterwards.
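For context, spatialised audio in Unity is typically achieved by marking an AudioSource as fully 3D so the engine attenuates and pans it relative to the head-tracked AudioListener. The sketch below is illustrative only and not taken from the polaris~ codebase; the component name (PolarisEmitter) and its settings are hypothetical.

```csharp
using UnityEngine;

// Hypothetical illustration: attach to a virtual object in the AR scene so that
// its sound is spatialised relative to the head-tracked AudioListener.
[RequireComponent(typeof(AudioSource))]
public class PolarisEmitter : MonoBehaviour
{
    [SerializeField] private AudioClip clip;   // sound played by this virtual object

    void Start()
    {
        var source = GetComponent<AudioSource>();
        source.clip = clip;
        source.loop = true;
        source.spatialBlend = 1f;              // 1 = fully 3D (spatialised), 0 = 2D
        source.rolloffMode = AudioRolloffMode.Logarithmic;
        source.minDistance = 0.25f;            // metres before attenuation begins
        source.maxDistance = 10f;
        source.Play();
    }
}
```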
- Listening Mirrors: an audio AR interactive installation by my PhD supervisors
- Laetitia Sonami: pioneer in early glove-based interactive music systems
- Atau Tanaka: interactive gestural synthesis using muscle sensors
- Keijiro Takahashi: specifically, their work on audio-reactivity in Unity.
- Tekh:2 has created XR instruments using granular synthesis in Unity.
- Amy Brandon creates amazing musical AR performances.
- Noah Zerkin (CombineReality) for their help in understanding some of the specific workings of the North Star headset.
- Damien Rompapas (BEERLabs / ThinkDigital) for explaining and debugging the Software Companion with me.
- Bryan Chris Brown (CombineReality) for moderating the very friendly Discord server and for their considerable explanations of the benefits of working with the North Star headset.
- Project North Star is the 3D-printable AR headset by Leap Motion that has been open-source since 2018.
- Software Companion for Project North Star is developed by Damien Rompapas at BEERLabs / ThinkDigital. If you use polaris~ in an academic context, please cite their paper.
- LibPdIntegration is developed by Niall Moody at Abertay University, with assistance from Yann Seznec. It is licensed under the MIT License; a minimal usage sketch follows this list.
- Automatonism is developed by Johan Eriksson.
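Since polaris~ drives Pure Data patches (including Automatonism modules) from Unity via LibPdIntegration, a rough illustration of that bridge is shown below. This is a sketch under assumptions: it presumes LibPdIntegration's LibPdInstance component exposes SendFloat/SendBang-style methods, and the receiver names (volume, trigger) are hypothetical; consult the library's own documentation for the actual API.

```csharp
using UnityEngine;

// Hypothetical sketch: forward Unity-side parameters to a running Pure Data patch.
// Assumes LibPdIntegration's LibPdInstance component provides SendFloat/SendBang;
// verify method names against the version of the library you install.
public class PatchController : MonoBehaviour
{
    [SerializeField] private LibPdInstance pdPatch; // patch loaded by LibPdIntegration

    // Map a normalised distance (e.g. hand to virtual object) to a [r volume] in the patch.
    public void SetVolume(float normalisedDistance)
    {
        pdPatch.SendFloat("volume", Mathf.Clamp01(normalisedDistance));
    }

    // Fire a [r trigger] receiver, e.g. when a virtual object is touched.
    public void Trigger()
    {
        pdPatch.SendBang("trigger");
    }
}
```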
To cite polaris~ itself:

Bilbow, S. (2022). Evaluating polaris~ - An Audiovisual Augmented Reality Experience Built on Open-Source Hardware and Software. NIME 2022. https://doi.org/10.21428/92fbeb44.8abb9ce6

or, with BibTeX: