This text is about the conservation of software-based artworks, but it's important to say that I'm not a conservator — I'm an artist who has been exhibiting software for about twenty years, depending on where we start counting. What I know about conservation is practical knowledge hard-won through direct experience, combined with ideas gleaned from experts' texts on the subject and with opinions and technical information gained through discussions over the years with conservators, galleries, and institutions.
It feels obvious to write that every medium, including software, has its advantages and disadvantages in relation to conservation. However, I feel there is such little understanding around software-based art in 2016 that many conservators imagine only the disadvantages. In researching the field to write this document, I was heartened to be reminded of the ambitious work of pioneering curators and conservators (see the Bibliography) and to learn more about emerging institutional initiatives.
This document is focused on my own work, which occupies one narrow category within the larger concerns of conservation, but it also introduces some of the general issues on the topic as I understand them in order to put my own software-based works in context. For a larger view of media art conservation from the point of view of an artist, see the excellent "Best practices for conservation of media art from an artist’s perspective" by Rafael Lozano-Hemmer (@antimodular).
Software-based works are among the most archival in the visual arts, but knowledge in this domain is rarified and scattered. Software-based works are fragile in relation to the technologies required to support them, but they are robust because they aren't embedded into a specific material form like a drawing or sculpture. As introduced by Jon Ippolito in his co-authored book Re-collection: Art, New Media, and Social Memory, works in software can be emulated, migrated, or reinterpreted to avoid the issues that cause more essentially physical forms of art to degrade over time. He defines the most common conservation technique as storage: "Storage captures matter and puts it in a box, on a shelf, under glass, in a climate-controlled vault deep in a mountain. There, stored culture waits in a form of suspended animation..." Storage is a poor technique for software-based work, but the other options of emulation, migration, and reinterpretation are powerful alternatives.
Emulation is a technique in which one computer impersonates another — for example, a MacBook Pro in 2016 behaving like a Commodore 64 from 1982. With emulation, a program written in 1982 for a Commodore 64 can look and feel nearly identical to how it did then. Because new machines are far more powerful than the older ones, the older software can be simulated entirely within the new machine with high fidelity. This is a technique the Museum of Modern Art uses to display some of the software-based works in its collection.
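To make the idea concrete, here is a toy sketch of how an emulator works: the host machine reads the guest machine's instructions one at a time and reproduces their effects in software. The three-instruction "guest" machine below is entirely hypothetical and vastly simpler than a real Commodore 64 emulator, but the interpretive loop is the same principle.

```java
// Toy emulator: a modern "host" interprets the instruction set of an
// imaginary three-instruction "guest" machine. A real emulator does the
// same thing at far greater fidelity: CPU opcodes, video, sound, and
// timing are all simulated in software on the new hardware.
public class ToyEmulator {
    // Guest opcodes: LOAD value, ADD value, HALT.
    static final int LOAD = 0, ADD = 1, HALT = 2;

    // Interpret a guest program and return the guest accumulator.
    public static int run(int[] program) {
        int acc = 0; // the guest machine's accumulator register
        int pc = 0;  // the guest machine's program counter
        while (true) {
            int op = program[pc++];
            switch (op) {
                case LOAD: acc = program[pc++]; break;
                case ADD:  acc += program[pc++]; break;
                case HALT: return acc;
            }
        }
    }

    public static void main(String[] args) {
        // A "1982" guest program: LOAD 40, ADD 2, HALT.
        int[] program = { LOAD, 40, ADD, 2, HALT };
        System.out.println(run(program)); // prints 42
    }
}
```

The guest program never changes; only the machine interpreting it does, which is why well-emulated software can survive its original hardware.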
Migration, by contrast, updates the work itself: its code is moved from an obsolete platform or language to a current one. Reinterpretation is the most extreme, but also the most powerful, conservation technique. In the words of Ippolito, "A reinterpretation sacrifices basic aspects of the work's appearance in order to retain the original spirit. Rare for the fine arts, reinterpretation is common in dance and theater..." While Ippolito gives porting code from one language to another as an example of reinterpretation, I see this as migration; perhaps it is the extent of the changes caused by the language shift that places a port in one category rather than the other. I imagine reinterpretation as a possible necessity when the original source code isn't available but a documentation video exists as a reference. Reinterpretation might also be the action an artist prefers in order to keep a work contemporary, rather than holding to older hardware and software paradigms.
To make the larger point, Ippolito compares the work of Eva Hesse to Sol LeWitt. It's an engaging and thoughtful comparison that is better to read in full, but the conclusion is that while some of Hesse's work has materially deteriorated to the point where it is no longer as it was intended to be, LeWitt's drawings are continuously painted over and drawn again, with each new instance as original as the last. The essence of LeWitt's work is a flexible set of instructions; the work is defined as information. In contrast, Hesse's work was a specific object or installation; the work was defined as a precise organization of materials. A specific LeWitt drawing isn't durable, but the system for creating new drawings from the instructions makes the total work extremely robust. This is just one comparison and it's a step or two removed from the conservation of software-based works, but it's a provocative analysis.
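The LeWitt comparison can be made literal in code. When a work is defined as instructions rather than as an object, any machine that can follow the instructions can produce a fresh, equally "original" instance. The sketch below is a hypothetical illustration of my own invention, not an actual LeWitt piece: it encodes a trivial wall-drawing rule as a function, and every execution re-creates the work from the same information.

```java
import java.util.ArrayList;
import java.util.List;

// A work defined as information: a rule that generates drawing
// instructions, in the spirit of (but not copied from) a LeWitt
// wall drawing. Any one rendering is disposable; the rule is the work.
public class InstructionWork {
    // The rule: n evenly spaced vertical lines across a wall of the
    // given width and height, returned as human-readable instructions.
    public static List<String> verticalLines(int n, int width, int height) {
        List<String> instructions = new ArrayList<>();
        for (int i = 0; i < n; i++) {
            int x = (i + 1) * width / (n + 1);
            instructions.add("line from (" + x + ", 0) to (" + x + ", " + height + ")");
        }
        return instructions;
    }

    public static void main(String[] args) {
        // Each run regenerates the work for any wall size; the
        // instructions, not the marks on the wall, are what persist.
        for (String step : verticalLines(3, 400, 300)) {
            System.out.println(step);
        }
    }
}
```

This is the same property that makes source code a durable description of a software-based work: the information survives even when a particular rendering does not.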
When examining a software-based work within the frames of emulation, migration, and reinterpretation, how are decisions made? I think it's best for the direction to be set by the artist. For example, some works require fixed dimensions and lend themselves more to emulation, while others are flexible with resolution, making migration the better choice. For other works, such as my own Process series, the point is for the work to be interpreted in unique ways, so reinterpretation is ideal. If the artist doesn't state a preference and none is implied by the documentation, the choice falls to the conservator.
Free software, where the word free refers to freedom, is the foundation of software-based art conservation. The often-used term open source means something similar, but the contentious differences between the two are a topic for another text. The acronyms FOSS (free, open source software) and FLOSS (free, libre, open source software) are frequently used as a compromise between the two points of view, and I'll use FOSS here to refer to the idea the terms hold in common: that the source code for the software is available. The points of view that distinguish the two terms are important, but pragmatically, both free and open source software, as well as open standards, are essential for maintaining a work over a long period of time.
I've heard horrifying anecdotes from colleagues who have used proprietary software in their work. If a software product is discontinued or a company disappears completely, aging software tools can become impossible to run as the technologies surrounding them move forward. Proprietary (or closed source) software is in conflict with conservation — it's a black box that cannot be opened. With proprietary software, the artist and the collector have no agency in decisions about the conservation of an artwork.
Open source software projects are also abandoned, but the open source community is more robust and flexible than the communities around failed or discontinued proprietary software. When a large group of people is using and contributing to a FOSS project, it can have a long lifespan and, as experience shows, it can bounce back from changes in leadership. With an open source project, it's not always practical to keep it going, but it's always possible. In contrast, it's typically impossible to keep closed-source software going after it's discontinued by its developers.
Since 2003, the majority of my software has been written with Processing, a widely used FOSS programming platform for the visual arts. (Prior to that, it was written in C++.) As the co-creator of Processing with Ben Fry, I am involved with decisions about the future of the language and we are committed to maintaining it. We founded the Processing Foundation, a 501(c)(3) organization, in 2012 to confirm this commitment and to build an infrastructure for its preservation. Software written with Processing is essentially cross-platform: it can be compiled to run on Mac, Windows, and Linux operating systems.
The complete Processing language reference is documented online and is included with each downloaded version of the software. Additionally, it is documented in print in more than a dozen books. I'm the co-author of the comprehensive book Processing: A Programming Handbook for Visual Artists and Designers, published by MIT Press first in 2007 and with a second edition in 2014. The Processing language reference and these publications explain the language components of the source code for my software-based work. These resources will allow the source code to be legible far into the future and they are a guide to migrating the work to future platforms.
When a work is acquired, the complete source code for Processing is included along with the source code for the work. The source code for Processing is public and open source. Processing does rely on Java, which is admittedly volatile and outside of our control. However, Java is one of the most widely used programming languages in academia and industry so there are many reasons to believe the language will continue to be used and there will be expertise surrounding it for decades to come.
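As a simplified, hypothetical illustration of why documented language semantics matter for migration: because a Processing call such as line(x1, y1, x2, y2) is publicly specified, it can be re-expressed call by call in another well-specified API and the result verified. The sketch below re-creates a single Processing-style drawing call with the standard Java 2D API (the sketch being "migrated" is my own trivial example, not an actual acquired work).

```java
import java.awt.Color;
import java.awt.Graphics2D;
import java.awt.image.BufferedImage;

// Migration sketch: the documented behavior of Processing-style
// stroke(gray) and line(x1, y1, x2, y2) calls, re-expressed with the
// standard Java 2D API. Because both APIs are publicly specified, the
// translation can be checked one call at a time.
public class MigratedSketch {
    // Draw one white horizontal line on a black canvas, as a Processing
    // sketch containing "stroke(255); line(0, 5, 9, 5);" would.
    public static BufferedImage render() {
        BufferedImage canvas = new BufferedImage(10, 10, BufferedImage.TYPE_INT_RGB);
        Graphics2D g = canvas.createGraphics();
        g.setColor(Color.WHITE); // stroke(255)
        g.drawLine(0, 5, 9, 5);  // line(0, 5, 9, 5)
        g.dispose();
        return canvas;
    }

    public static void main(String[] args) {
        BufferedImage img = render();
        // A pixel on the line is white; a pixel off the line stays black.
        System.out.println(Integer.toHexString(img.getRGB(5, 5) & 0xFFFFFF));
        System.out.println(Integer.toHexString(img.getRGB(5, 0) & 0xFFFFFF));
    }
}
```

A real migration is far more involved, but the principle is the same: a public reference for each call turns migration from guesswork into a verifiable translation.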
## The Source Code is not "The Work"

## Installing Software in a Gallery
If a software-based work is created for a web browser, a phone, or a gaming platform, it doesn't have physical installation specifications for a gallery space; it's viewed through those devices. The majority of my work, however, requires a careful physical installation that includes a screen or projector, a computer, and wires for power and video signals. The physical installation in the space must be meticulous and is therefore challenging. For example, for the physical installation of Love Los Angeles, my contribution to the collaborative Textile Room, the software involves a detailed projection mapping onto the built structure. Mounting the projectors, obscuring the wiring, and then calibrating the software to match the architecture is exacting and difficult work. Most installations of my work are less elaborate. A work like KTTV, for example, is displayed on a horizontal screen, so the installation is simpler, but the details are extremely important. The position of the screen, the make and model of the screen, the light in the room, the location of the computer (hidden from sight), and the computer specifications are attributes of the work that must follow the installation guidelines for each piece. These guidelines specify dimensions, equipment details, and other installation notes such as wall color and light levels.
My approach to working with software is to maximize the advantages of working with the medium and to minimize the disadvantages. What does this mean? In essence, it's using widely available hardware to display and run the work, using open-source software tools and formats as much as possible, and being meticulous in documenting these decisions. The priority is to keep the technology as minimal and simple as possible. This is in contrast to making custom electronics, using rarified hardware, and working with closed-source and proprietary software.
In the end, the only thing obstructing the preservation of a work of software-based art is whether anyone cares enough about a particular piece to conserve it. The techniques mentioned here define a range of considerations and options for ensuring a work can be viewed in perpetuity.
## Bibliography

- Engel, Deena and Glenn Wharton. "Reading between the lines: Source code documentation as a conservation strategy for software-based art." Studies in Conservation, Vol. 59, No. 6, 2014. Presents two case studies from the Museum of Modern Art, one created with Processing.
- Fino-Radin, Ben. Digital Preservation Practices and the Rhizome Artbase. 2011. http://media.rhizome.org/artbase/documents/Digital-Preservation-Practices-and-the-Rhizome-ArtBase.pdf Text written by Fino-Radin while he was Digital Conservator at Rhizome at The New Museum.
- Fino-Radin, Ben. "MoMA’s Digital Art Vault." April 15, 2015. Post on the MoMA/PS1 INSIDE/OUT blog.
- Graham, Beryl, ed. New Collecting: Exhibiting and Audiences after New Media Art. Ashgate, 2014.
- Hoffmann, Allison K. Software-based Art: Challenges and Strategies for Museum Collections. MA Thesis, University of Washington, 2013. https://digital.lib.washington.edu/researchworks/bitstream/handle/1773/23522/Hoffmann_washington_0250O_12044.pdf
- Lewis, Kate. "What Does a Media Conservator Do?" March 24, 2015. Post on the MoMA/PS1 INSIDE/OUT blog.
- McNeill, John. "Art & Media—Software-Based Art Scenario." http://pericles-project.eu/bresources/post/software-based%20art,%20scenario. Blog post about software-based art acquisitions since 2003 at the Tate.
- Rinehart, Richard and Jon Ippolito. Re-Collection: Art, New Media, and Social Memory. The MIT Press, 2014.
- Interviews from The Smithsonian Interview Project: Questions on Technical Standards in the Care of Time-Based and Digital Art, Ten Insights from Artists and Experts in the Field. July 2014. Produced by the Smithsonian’s Time Based Media and Digital Art Working Group and the Smithsonian Office of Policy and Analysis. http://www.si.edu/content/tbma/documents/SI_TBMA_10_Insights.pdf
- Videos from TechFocus III: Caring for Software-based Art. Solomon R. Guggenheim Museum, New York. September 25-26, 2015. http://resources.conservation-us.org/techfocus/techfocus-iii-caring-for-computer-based-art-software-tw/
- Videos from Technology Experiments In Art: Conserving Software-Based Artworks. National Portrait Gallery and Smithsonian American Art Museum, Donald W. Reynolds Center, Washington, DC. January 17, 2014. http://www.si.edu/tbma/symposiums
- Rhizome's Digital Preservation and Digital Conservation Program http://rhizome.org/
- Time-Based Media Art at the Smithsonian Institution http://www.si.edu/tbma/
- Matters in Media Art, a collaboration between the New Art Trust (NAT), the Museum of Modern Art (MoMA), the San Francisco Museum of Modern Art (SFMOMA) and Tate http://www.tate.org.uk/about/projects/matters-media-art
- Rose Goldsen Archive of New Media Art at Cornell University http://goldsen.library.cornell.edu/
- Library of Congress Digital Preservation http://blogs.loc.gov/digitalpreservation/
- American Institute for Conservation Electronic Media Group (AIC EMG) http://cool.conservation-us.org/coolaic/sg/emg/library/index.html