
Workplace-Anxiety

Code to control a fleet of Arduino-based machines that react to the presence of people detected by an Xbox Kinect V2 sensor.

IDEATION

In this project I use physical computing devices such as Arduino boards, integrated circuits and servos to build a fleet of machines that will inhabit traditional work spaces. Machines are taking over from humans in jobs that were once thought impossible to automate. This creep of digital automation takes over jobs without physically manifesting itself in the workplace. The machines are a reminder that automation is still happening.

The machines are presented in a social state, as a fleet, appearing to converse. This changes how they are perceived; displaying the machines in this way introduces them to a part of the workplace that is normally an escape for humans. The machines converse in their own language, impenetrable to humans, which excludes the human from a social environment that was previously exclusively theirs. I imagined the machines taking over in a similar fashion to a hostile takeover of a company. The machines will cease their conversations when they detect a human nearby and turn to face the interloper with accusatory, piercing red eyes. This last part was always important to me, as it creates a sudden unease in the audience.

When conversing, the machines' diminutive size and squeaky 'voices' could make them seem cute, especially on their own. I attempted to counter this with a somewhat industrial design, using metal bolts to hold the machines and their frames together. Bare materials play a part too: exposed circuitry and wood charred by the laser cutter. Early on I printed the head assemblies in coloured 3D printer filament, but later added a black fascia, attached with bolts, to mitigate this. The machines really become sinister when they are part of a fleet. One machine tracking you with its eyes as you move around it is not particularly effective, but seven machines acting independently are overwhelming.

All along I had a scenario in my head: conversations stifled by the arrival of their subject. That sense of exclusion was the foundation of the project, because it is exactly how I see the takeover by machines manifesting itself.

TECHNICAL

This project has taught me a huge amount about physical computing. I have tested, ruled out and selected various solutions and techniques for almost every part of the machines and their build process. I learned lessons ranging from how to build a serial communication protocol to the disadvantages of ordering parts from China.

I started the project by building a simple demo in Processing that took input from an Xbox Kinect sensor and drew a top-down plan of the space in front of it, with each person marked as a point derived from the X and Z values of the pelvic joint of the Kinect skeleton. The sketch also featured a pointed shape that rotated to always point at the person the Kinect was tracking. This small piece of code became the foundation for most of the project, but it was only the beginning.
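The pointing behaviour itself is just trigonometry. A minimal C++ sketch of the idea (the function and variable names are mine for illustration, not taken from the Processing demo) might look like this:

```cpp
#include <cmath>
#include <cstdio>

// Heading, in degrees, from a machine at (machineX, machineZ) to a tracked
// person at (personX, personZ), viewed from above as in the top-down plan.
float headingToPerson(float machineX, float machineZ,
                      float personX, float personZ) {
    float dx = personX - machineX;
    float dz = personZ - machineZ;
    return atan2f(dz, dx) * 180.0f / 3.1415926f;  // 0 degrees along +X, anticlockwise
}

int main() {
    // Example: a person 1.5 m to the right of and 2.0 m in front of the sensor.
    std::printf("heading: %.1f degrees\n", headingToPerson(0.0f, 0.0f, 1.5f, 2.0f));
    return 0;
}
```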

The next step was to communicate the coordinates of the person in the room to each of the machines, as I always knew I wanted a fleet. My original idea was to have the machines communicate using sound, but I quickly gave up on this as I could foresee difficulty building a protocol that would stop the machines interfering with one another. I did, however, know that the machines needed to be wireless in order to appear to be acting autonomously.

My research first brought me to a set of generic 433 MHz transmitter and receiver boards. While low cost, these boards were riddled with flaws from the outset, as the Arduino library that controls them conflicts with the one that controls servos. This, coupled with interference issues and a general lack of support, forced me to move on. My next option was Wi-Fi communication using ESP8266 boards, which in theory offered numerous benefits. My research suggested the boards could be programmed using the standard Arduino language and Integrated Development Environment (IDE), and that they could be meshed together to offer a very low-latency, very reliable system. In reality, however, they were difficult to work with: they require a separate programmer, and they do not conform to standard pin spacing conventions, meaning every board would need ribbon cables soldered to it. I finally settled on the HC-12 board for communication, as it is widely supported and communicates over serial. The SoftwareSerial library essentially allowed me to write serial commands just as I had when communicating from Processing to an Arduino over USB, and have them sent to as many other Arduinos as I liked.
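A minimal sketch of that arrangement, assuming the HC-12 is wired to two spare digital pins (the pin numbers here are placeholders rather than my final layout):

```cpp
#include <SoftwareSerial.h>

// HC-12 radio module on two spare digital pins (placeholder numbers).
SoftwareSerial hc12(2, 7);  // RX, TX

void setup() {
  Serial.begin(9600);  // hardware serial, e.g. from Processing over USB
  hc12.begin(9600);    // HC-12 default baud rate
}

void loop() {
  // Forward bytes both ways so the radio behaves like a transparent serial link:
  // anything written to one end arrives at every HC-12 on the same channel.
  while (Serial.available()) {
    hc12.write(Serial.read());
  }
  while (hc12.available()) {
    Serial.write(hc12.read());
  }
}
```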

The next problem came in writing a protocol that was resistant to the odd interruption; I needed to build in error checking. This was completely new territory for me, as it was the first time I had programmed wireless communication without the backbone of a library such as WebSocket with built-in error checking and correction. I ended up validating each message by making every update a fixed length. For example, rather than sending just '25' for an X position of 25, I would send '025'. This worked because interference manifested itself far more often as dropped characters than as corrupted ones. Any time fewer than three digits were received for a value, the whole message was discarded.
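To illustrate the discard-on-wrong-length idea, here is a sketch that assumes a made-up payload of exactly six digits (three for X, three for Z); the real message format differed, but the checking principle is the same:

```cpp
#include <SoftwareSerial.h>

SoftwareSerial hc12(2, 7);  // RX, TX (placeholder pins)

// Hypothetical payload "xxxzzz\n": a zero-padded X position followed by a
// zero-padded Z position, three digits each.
const unsigned int PAYLOAD_LEN = 6;

bool parseMessage(const String &msg, int &x, int &z) {
  if (msg.length() != PAYLOAD_LEN) return false;  // dropped character: reject the whole message
  for (unsigned int i = 0; i < PAYLOAD_LEN; i++) {
    if (!isDigit(msg[i])) return false;           // corrupted character: reject
  }
  x = msg.substring(0, 3).toInt();
  z = msg.substring(3, 6).toInt();
  return true;
}

void setup() {
  hc12.begin(9600);
}

void loop() {
  if (hc12.available()) {
    String msg = hc12.readStringUntil('\n');
    int x, z;
    if (parseMessage(msg, x, z)) {
      // a complete update arrived: move the head and set the eyes here
    }
    // otherwise the message is silently dropped and we wait for the next one
  }
}
```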

Another challenge I faced was that not all of the pins on the Arduino behave the same, and despite some being marked as supporting Pulse Width Modulation (PWM), there seemed to be little connection between those markings and which pins worked for which task. After much trial and error I arrived at a pin layout that allowed all the devices connected to each Arduino to function properly: two pins for the HC-12, one pin for the servo and three PWM pins for the red, green and blue channels of the eye LEDs.
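The layout below is a plausible arrangement rather than an exact record of my boards; it is worth noting that on a standard Uno the Servo library disables PWM on pins 9 and 10, which is probably part of why the PWM markings alone were not a reliable guide:

```cpp
#include <SoftwareSerial.h>
#include <Servo.h>

// One plausible pin layout (not an exact record of the finished boards):
// two digital pins for the HC-12, one for the servo signal, and three
// PWM-capable pins for the red, green and blue channels of the eyes.
// Pins 3, 5 and 6 keep their PWM even with the Servo library in use.
const int HC12_RX_PIN   = 2;
const int HC12_TX_PIN   = 7;
const int SERVO_PIN     = 8;
const int LED_RED_PIN   = 3;
const int LED_GREEN_PIN = 5;
const int LED_BLUE_PIN  = 6;

SoftwareSerial hc12(HC12_RX_PIN, HC12_TX_PIN);
Servo neck;

void setup() {
  hc12.begin(9600);
  neck.attach(SERVO_PIN);
  pinMode(LED_RED_PIN, OUTPUT);
  pinMode(LED_GREEN_PIN, OUTPUT);
  pinMode(LED_BLUE_PIN, OUTPUT);
}

void loop() {
  // Example behaviour: face straight ahead and glow red.
  neck.write(90);                 // servo angle in degrees
  analogWrite(LED_RED_PIN, 255);  // full-brightness red
  analogWrite(LED_GREEN_PIN, 0);
  analogWrite(LED_BLUE_PIN, 0);
}
```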

My choice of LEDs was a design decision I didn't take lightly. I knew I wanted LEDs as large and bright as possible, but I couldn't decide between cloudy and clear plastic for the lens, which have vastly different effects. The cloudy LEDs produce diffuse light and almost appear to simply glow with the selected colour. I devised a test whereby I programmed an Arduino to strobe an array of LEDs, which I would then stare directly into, simulating eye contact with the machine. I very quickly determined that I could look at the cloudy LEDs without any ill effects; the clear LEDs, however, were piercing, the curved plastic focusing the light into a point at the front of the LED. My choice was clear.
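The test rig was little more than a blink sketch scaled up to a row of LEDs; something along these lines, with assumed pin numbers:

```cpp
// Strobe a row of test LEDs so the cloudy and clear types can be compared
// side by side. The pin numbers are placeholders for wherever the LEDs are wired.
const int LED_PINS[] = {3, 5, 6, 9, 10, 11};
const int NUM_LEDS = sizeof(LED_PINS) / sizeof(LED_PINS[0]);

void setup() {
  for (int i = 0; i < NUM_LEDS; i++) {
    pinMode(LED_PINS[i], OUTPUT);
  }
}

void loop() {
  for (int i = 0; i < NUM_LEDS; i++) digitalWrite(LED_PINS[i], HIGH);
  delay(100);  // on for 100 ms
  for (int i = 0; i < NUM_LEDS; i++) digitalWrite(LED_PINS[i], LOW);
  delay(100);  // off for 100 ms
}
```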

For power I used 18650-type rechargeable lithium batteries connected to the Vin pin of the Arduino through a step-up voltage regulator that boosts the potential difference from 3.7 V to 5 V. This proved to be a surprisingly inexpensive solution; power had been one area where I expected to spend more money on the project.

CONSTRUCTION

Once I had completed a circuit diagram in Fritzing, I began to solder together boards with female pins for the Arduino and HC-12 and male pins for the servo and LED wires, as I wanted to be able to salvage the expensive parts at a later date. My first few attempts were a mess, and while functional did not use the space on the board efficiently. As my soldering improved, so did my planning of where to route wires and solder connections, so as to use as little wire as possible with minimal overlap. Over the iterations this caused the design of the board to become standardised.

The head elements were 3D printed to fit the LED boards I had created. Measurements of the servo mount and the LED board were taken with calipers, and the design was created in Fusion 3D. Three mounts were printed before one was a good fit for both the LED assembly and the servo.

Next came the base. I had thought of 3D printing this too, but it turned out to be much easier to use the laser cutter. I made a prototype in cardboard before cutting the real thing in wood. It took five cuts of ten minutes each for all the parts to be finished.

The design for both the head assembly and the base was utilitarian. I decided this was achievable given my level of proficiency with Fusion 3D, and it also reflects the stark, functional nature of the code the machines represent. A later decision was to add a fascia to the front of the head assembly. This served two purposes: printed in black, it masks the coloured material of the head assembly itself, and it provides the screw holes through which the LED assembly is permanently attached.
