
Sightless VR

Way-finding through virtual spaces using haptics and VR tracking.
Completed in December 2019

License: MIT

Built with: Unity, C#, C++, Rhino 3D, Grasshopper, Arduino

Table of Contents

  • About The Project
  • The Site
  • The Device
  • Photos of Testing and Prototyping
  • Experiment 1
  • Experiment 2
  • Results
  • Resources Used

About The Project

Technological innovations in virtual environments accessible via head-mounted displays (HMDs), haptic controllers, gloves, and wearables are offering new possibilities for spatial design and mobility. Commercial virtual reality (VR) hardware has historically been conspicuously cephalo-centric, and the HMD is emblematic of the common picture of current VR: the wearer, as if blindfolded, stops seeing the “real” world around them and is transported into an individual, virtual, immersive “bubble.” From the outside, the bubble seems impermeable, closed, and inaccessible. From the inside, it feels limitless, expansive, and ubiquitous. Contemporary conceptions of VR thus picture the technology as a closed immersive bubble separated from “real” physical space. This conception, however, is not entirely accurate. VR takes place. It is constructed through, and can be broken by, physical space. Using VR systems, partitions and obstacles existing in the digital realm (and bound to physical space) can be manifested through haptic pulses, rendering them physical to the user. I argue that this ghostly architecture represents a shift in the way architecture can interact with end-users.

Aspiring to critique this closed-world attitude toward VR, this research takes the form of a critical design project (situated on the fifth floor of the Macdonald-Harrington building on the McGill University campus) that translates virtual obstacles into architectures in the physical world, making the virtual environment part of the physical environment. Superimposing every floor of the Macdonald-Harrington building one on top of another generates a site with non-arbitrary partition placement. The partitions, walls, and doorways found on lower floors are superimposed on the fifth-floor space and converted into virtual equivalents equipped with haptic feedback.

Ultimately, this thesis project is about developing a system of relationships between the user, the virtual environment, and the physical environment as a way to generate questions pertaining to the concept of ghostly architecture and the interplay between virtual and physical space. Through the use of haptic feedback displays, I generated situations that challenge notions of wayfinding and spatial awareness. By skewing and distorting perceptions of space through virtual augmentation, I hope to frame questions surrounding VR (specifically sightless VR) and its uses and limits within physical reality.

More Concepts

Bursting the Bubble

Contemporary conceptions of VR portray the technology as a hermetic system in which a user becomes immersed in the simulation, when in actuality virtual environments are intrinsically dependent on the physical world. VR exists within architecture. It relies on the tangible metrics and spatial properties of the physical boundaries in which it is contained. As such, VR is architectural by nature. VR takes place. It is constructed through, and can be broken by, physical space. Current-generation VR systems such as the HTC Vive use optical tracking to enable room-scale geo-localization of a user within space. The physical boundary in which VR technology is contained is just as much a part of the process of world-making as the hardware itself.

To prevent the bubble from bursting, VR systems often encourage the removal of any potential obstacle which might interfere with the simulation. The outside world is treated as undesirable or antagonistic to the VR illusion.

Vive Setup Screen

Here, for example, we see the setup screen for the HTC Vive VR system, which asks users to move all objects out of the tracking space.

Ghostly Architecture

Superimposed, or ghostly, architecture can bring change to the mode of operation commonly associated with VR and its bubble-centric ideology. By incorporating various layers of ghostly architectural elements, spaces become extended and augmented. Part of this project involves imagining the implications of the ghost in an architectural setting. Ghosts, in this case, refer to architectural elements in the form of virtual objects superimposed on physical spaces.

Is the ghostly presence complementary or opposed to the physical world? How do the boundaries of these virtual elements shape how a user moves within space? VR has the power to make tangible what one imagines or remembers, as well as to bring forth an unexpected reality. In a way, it validates what one thinks and can manifest different versions of existence. The virtual-world/physical-world duality is bound by rules, and playing with these rules can make for unexpected results.

TL;DR

I created a haptic device that allows users to navigate a virtual room overlaid onto a physical room without the need for a head-mounted display.

haptic device

(back to top)

The Site

The focus of this critical design project was to create a large-scale experiment on the fifth floor of the Macdonald-Harrington building on the McGill University campus that challenges conventional uses of VR systems and generates questions pertaining to the use of haptic feedback in movement and space-making. I started by superimposing every floor one on top of another to generate a sort of reverse axonometric projection. Instead of exploding the building into discrete levels, I collapsed and flattened it into one plan. Partitions, walls, and doorways found on lower floors are now superimposed on the fifth-floor space and converted into virtual equivalents equipped with haptic feedback.

project site

Architectural elements, real and virtual, now co-exist on a single floor. For the purposes of simplification, I decided to focus on making tangible the basic architectural elements of the lower floors, such as walls, windows, doors, and columns, through haptic feedback. The user experiences this feedback through a haptic device mounted on the wrist and hand.

virtual walls

The following step was to create a tool for navigating through my ghostly installation. The placement, intensity, frequency, and duration of the haptic events were evaluated in order to create a tool that enhances the experience. Ultimately, a user is expected to navigate the fifth-floor exhibition room without an HMD. Using only a custom haptic device, they sense the presence of architectural elements found on lower storeys. Although the partitions are virtual and invisible to the user, the sensations produced by way of haptic feedback make them visceral and real.
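To make that tuning space concrete, here is a hypothetical C# parameterization of a single haptic event. None of these names come from the repo; they simply name the four variables that were evaluated.

```csharp
// Hypothetical sketch only: a data structure naming the haptic-event
// parameters evaluated in this project (placement, intensity,
// frequency, duration). Field names are assumptions, not repo code.
using UnityEngine;

[System.Serializable]
public struct HapticEvent
{
    public Vector3 placement;   // where on the wrist/hand the pulse is felt
    [Range(0f, 1f)]
    public float intensity;     // motor drive strength, normalized 0..1
    public float frequencyHz;   // pulse repetition rate in hertz
    public float durationMs;    // length of the event in milliseconds
}
```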

wall vibration

glitch wall aesthetics

(back to top)

The Device

This project involved developing a haptic device capable of receiving wireless signals from the simulated world and producing a haptic pulse in response to contact with a virtual partition. The artefact had to be:

  • portable and comfortably worn on the wrist and hand,
  • wireless (i.e., untethered from the computer),
  • able to vibrate so that proper feedback can occur,
  • light and flexible, to accommodate various hand/finger sizes and the finger’s normal range of motion.
sketch

sketch2

System Boundary

The communication between the simulated world and the user is handled by a pair of Arduino Uno microcontrollers.

schematics_2

The following system diagrams summarize the flow of information from the tracked user to the vibrating coin motor built into a flexible finger brace. The chain begins on the virtual reality side: a tracking device sends geospatial information to the sensors, which in turn compute the data. I am using Unity to generate the virtual partitions.

schematics_1

The computer also controls the transmitting Arduino board. The transmitting board’s job is to wait for the tracked user to collide with a virtual partition and then send a signal to the receiving board using nRF24L01+ wireless radio modules.

schematics_4
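As a rough illustration of this role, a minimal transmitter sketch using the common RF24 Arduino library might look like the following. The CE/CSN pins, pipe address, baud rate, and one-byte message are assumptions for illustration, not the project’s actual wiring or protocol.

```cpp
// Transmitting Arduino (sketch under stated assumptions):
// relay a collision byte from Unity (over USB serial) to the
// receiving board via an nRF24L01+ module and the RF24 library.
#include <SPI.h>
#include <nRF24L01.h>
#include <RF24.h>

RF24 radio(9, 10);                 // CE, CSN pins (assumed wiring)
const byte address[6] = "00001";   // pipe address (assumption)

void setup() {
  Serial.begin(9600);              // serial link from Unity
  radio.begin();
  radio.openWritingPipe(address);
  radio.setPALevel(RF24_PA_LOW);
  radio.stopListening();           // transmitter role
}

void loop() {
  // Wait for Unity to report a collision, then relay it over the air.
  if (Serial.available() > 0) {
    char c = Serial.read();
    if (c == '1') {
      radio.write(&c, sizeof(c));
    }
  }
}
```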

The receiving board, being completely untethered from the computer, is powered by an external battery pack. This allows the user to move freely in space without the burden of a cable. Once the signal is received, the Arduino activates the vibrating coin motor, and the vibration is sensed by the user through the flexible finger brace.

schematics_5
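The receiving side can be sketched in the same hedged way: an RF24 listener that pulses the motor pin when the collision byte arrives. The motor pin, driver wiring, pulse length, and address are again assumptions.

```cpp
// Receiving Arduino (sketch under stated assumptions):
// listen for the collision byte and pulse the coin motor,
// driven via a transistor on MOTOR_PIN (assumed wiring).
#include <SPI.h>
#include <nRF24L01.h>
#include <RF24.h>

RF24 radio(9, 10);                 // CE, CSN pins (assumed wiring)
const byte address[6] = "00001";   // must match the transmitter
const int MOTOR_PIN = 3;           // coin-motor driver pin (assumption)

void setup() {
  pinMode(MOTOR_PIN, OUTPUT);
  radio.begin();
  radio.openReadingPipe(0, address);
  radio.setPALevel(RF24_PA_LOW);
  radio.startListening();          // receiver role
}

void loop() {
  if (radio.available()) {
    char c;
    radio.read(&c, sizeof(c));
    if (c == '1') {
      digitalWrite(MOTOR_PIN, HIGH);  // fire the haptic pulse
      delay(150);                     // pulse length is an assumption
      digitalWrite(MOTOR_PIN, LOW);
    }
  }
}
```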

The entire experiment relies on the Unity engine to process the user’s geospatial data and generate the virtual walls. Unity and the two Arduinos communicate with each other via custom code I wrote for each device: C# on the Unity side and C++ sketches on the boards.

schematics_3
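To give a concrete sense of the Unity side, here is a minimal sketch of what the collision-to-serial logic might look like in C#. The script name, the "VirtualPartition" tag, the COM port, and the one-byte protocol are assumptions, not the repo’s actual code; note that Unity’s SerialPort class requires the .NET 4.x API compatibility level.

```csharp
// Unity-side sketch (assumptions noted above): when the tracked
// wrist collider enters a virtual wall, send a byte to the
// transmitting Arduino over USB serial.
using System.IO.Ports;
using UnityEngine;

public class PartitionContact : MonoBehaviour
{
    private SerialPort port;   // link to the transmitting Arduino

    void Start()
    {
        port = new SerialPort("COM3", 9600);   // port name is an assumption
        port.Open();
    }

    // Fires when this (trigger) collider meets a tagged virtual wall.
    void OnTriggerEnter(Collider other)
    {
        if (other.CompareTag("VirtualPartition"))
        {
            port.Write("1");   // request a haptic pulse
        }
    }

    void OnApplicationQuit()
    {
        if (port != null && port.IsOpen) port.Close();
    }
}
```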

(back to top)

Photos of Testing and Prototyping

Prototyping

testing

first-prototype

first-prototype2

Arduinos

arduino1

arduino2

arduino3

Finger Brace 3D Printing

brace1

brace2

(back to top)

Experiment 1

Using the first prototype, I filmed three users trying out the device. As part of the first test, once a user sensed a haptic pulse, they placed a marker on the floor. As the experiment unfolds, the plan of the ghostly architecture begins to reveal itself to us. The users here are the ones making the simulation visible; they act as renderers, or even appendages, of the simulated environment. The simulation requires the users in order to run, and the users require it in order to proceed.

Experiment 1: User 1

Video

Haptic_experiment_1

Experiment 1: User 2

Video

Haptic_experiment_2

Experiment 1: User 3

Video

Haptic_experiment_3

(back to top)

Experiment 2

In order to capture the results of these experiments as static images, I decided to use long-exposure photography, which involves using a long-duration shutter speed to blur moving elements and obscure stationary ones. Long-exposure photography captures one thing that conventional photography does not: time.

I added two LEDs to the prototype: a continuously lit red LED and a blinking blue LED. With the lights turned off, a camera set to long-exposure mode picks up these two light sources and creates organic light trails. I integrated the blinking LED in order to get a sense of velocity and acceleration, not just movement: the closer the blue trails are to one another, the slower the movement at that moment; the more they are spaced out, the faster the movement.

blinking_LEDs
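The LED logic itself is simple enough to sketch as a standalone Arduino loop: the red channel stays lit for the whole exposure while the blue channel blinks at a fixed period, so the spacing of the blue dots in the photograph encodes speed. Pin numbers and the 100 ms half-period are assumptions.

```cpp
// Standalone sketch of the LED logic (pins and timing are assumptions).
const int RED_PIN = 5;    // continuously lit: traces the full path
const int BLUE_PIN = 6;   // blinking: acts as a time ruler

void setup() {
  pinMode(RED_PIN, OUTPUT);
  pinMode(BLUE_PIN, OUTPUT);
  digitalWrite(RED_PIN, HIGH);   // red stays on for the exposure
}

void loop() {
  // A fixed blink period means: dots close together = slow movement,
  // dots far apart = fast movement.
  digitalWrite(BLUE_PIN, HIGH);
  delay(100);
  digitalWrite(BLUE_PIN, LOW);
  delay(100);
}
```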

Given the physical constraints of shooting long-exposure photography, I was limited to 25-second frames. Since the experiment lasted anywhere from two to four minutes, I simply merged all the frames onto one another in post-production to get the final light patterns.

light_patterns_time

The camera was set up in a top-view orientation and photographed the users performing the experiments. Once the lights were turned off, the LEDs mounted on the artefact were picked up by the camera set in long-exposure mode.

top-view

My Video

(back to top)

Results

Below are two experiments performed in the same location by two separate users, with all frames overlaid on one another. We can note a few things here. First, user 1 performed the experiment in two minutes, while user 2 took almost four. This is reflected in the intensity of the red trails: slower movement means more light reaching the camera’s sensor. User 2 also spent more time interacting with the virtual partitions, which is why the layout of the ghostly plan appears more clearly. The layout of the virtual partitions is shown in white.

The tracking data provided by the HTC Vive Tracker puck installed on the wrist was also recorded during the experiment and mapped out in 3D using a custom Grasshopper script. The points were then assigned coordinates and could be processed further to create stylized digital artefacts.
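As an illustration of how such a script might rebuild the recording, here is a hedged sketch of a Grasshopper C# scripting-component body that parses logged samples into RhinoCommon points and splits them by whether a vibration was felt (later rendered yellow and red, as described below). The input name csvLines and the x,y,z,vibration field order are assumptions, not the repo’s actual data format.

```csharp
// Body of a Grasshopper C# scripting component (RhinoCommon).
// Assumed input: csvLines (List<string>), one "x,y,z,vibration"
// record per tracked sample. Outputs A and B are point lists.
using System.Collections.Generic;
using Rhino.Geometry;

var contact = new List<Point3d>(); // samples where a pulse was felt
var free = new List<Point3d>();    // samples with no vibration

foreach (string line in csvLines)
{
    string[] f = line.Split(',');
    var p = new Point3d(
        double.Parse(f[0]), double.Parse(f[1]), double.Parse(f[2]));
    if (f[3].Trim() == "1") contact.Add(p);
    else free.Add(p);
}

A = contact; // rendered yellow: contact with virtual partitions
B = free;    // rendered red: free movement
```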

point_cloud_grasshopper

point_cloud

The above gif shows an axonometric 3D view of the tracked data. The artefact measures roughly 3 m by 3 m. The virtual walls sensed by the user are represented as yellow planes, while the ground plane is shown in purple. The points in yellow mark moments when the user felt a vibration upon contact with the virtual partitions; the points in red mark moments when no vibration was felt.

Ultimately, this project was about making visible the system of relationships between the user, the virtual environment, and the physical environment, as a way to generate questions pertaining to the concept of ghostly architecture and the interplay between virtual and physical space.

Resources Used
