Meta-Module #4: Annotating and Crowdsourcing Digital Projects
This "meta-module" offers an overview of digital tools and resources for annotation and crowdsourcing, as well as issues related to collaborative writing and transcription in the context of digital scholarship projects.
Estimated Completion Time = 3 hours
- Increased understanding of the pedagogical applications for annotation and close reading
- Improved knowledge of issues related to digital project crowdsourcing
- Increased familiarity with tools and platforms for collaborative writing, annotation, and transcription
Two key outcomes for digital scholarship projects are their use within the classroom and their engagement of various public audiences, such as communities who have an interest in or ownership of a project's materials. On the one hand, many digital tools have direct pedagogical applications, particularly in higher education. These include platforms that allow students and instructors to engage in asynchronous conversation about objects of study through collaborative annotation, commenting, and note-taking. On the other hand, many projects have crowdsourcing components that require community participation for the creation and evaluation of project materials or data.
This module introduces a wide variety of digital tools and platforms for text- and image-based annotation and transcription. While few digital scholarship projects incorporate both elements, annotation and transcription can be thought of as two sides of the same coin: both promote close, communal engagement with born-digital or digitized media, with the goal of collaboratively marking up texts or images for additional study or linking the materials to existing collections or datasets.
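The collaborative markup that annotation tools such as Hypothes.is produce is commonly expressed in the W3C Web Annotation Data Model, in which a "body" (the annotator's comment) is anchored to a "target" (a selection in the source document). Below is a minimal sketch of such an annotation in Python; the target URL, quoted passage, and creator name are invented for illustration, not taken from any project discussed in this module.

```python
import json

def make_annotation(target_url, exact_text, comment, creator):
    """Build a minimal annotation following the W3C Web Annotation
    Data Model. All argument values passed in below are hypothetical."""
    return {
        "@context": "http://www.w3.org/ns/anno.jsonld",
        "type": "Annotation",
        "creator": creator,
        # The body carries the annotator's commentary.
        "body": {
            "type": "TextualBody",
            "value": comment,
            "format": "text/plain",
        },
        # The target pins the comment to a quoted passage in the source.
        "target": {
            "source": target_url,
            "selector": {
                "type": "TextQuoteSelector",
                "exact": exact_text,
            },
        },
    }

anno = make_annotation(
    "https://example.org/victorian-text",   # hypothetical digitized text
    "It was a dark and stormy night",
    "A classic sensational opening.",
    "student01",
)
print(json.dumps(anno, indent=2))
```

Because annotations in this format are plain JSON-LD documents, they can be exported from one platform and linked to other collections or datasets, which is one reason interoperability questions recur throughout this module.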
Completing this module does not require extended interaction with every tool and platform introduced in the readings and activities; rather, the module is designed to give librarians a thorough overview of the platforms currently available, along with their general affordances, applications, and drawbacks.
Complete SSRC Module 4: Annotating Digital Collections
While ThingLink offers a free trial version for educators, the free and open-source web publishing platform Scalar has built-in image annotation functionality. Read about Scalar's annotation capabilities here.
- Bailey, Julia. 2008. "First steps in qualitative data analysis: transcribing." Family Practice 25 (2): 127–131. https://doi.org/10.1093/fampra/cmn003 ◊ Estimated Read Time = 12 minutes
- Kennedy, Meegan. 2016. "Open Annotation and Close Reading the Victorian Text: Using Hypothes.is with Students." Journal of Victorian Culture 21 (4): 550–558. https://doi.org/10.1080/13555502.2016.1233905 ◊ Estimated Read Time = 20 minutes
- Wieck, Lindsey Passenger. February 18, 2019. "Annotating Readings with Hypothesis." Pedagogy Playground. http://pedagogyplayground.com/methods-toolbox/annotating-readings-with-hypothesis/ ◊ Estimated Read Time = 7 minutes
Review the SSRC Resource Sheet: Crowdsourcing Tools
For each of the three tools introduced - Crowdcrafting, Scripto, and Scribe - explore at least one example project built using the tool and consider the following questions:
- Is this platform still functional? If not, can you tell what happened to projects built with the tool? (Click here for a hint.)
- What web publishing platforms and media formats is this tool compatible with?
- Does the tool have particular infrastructure requirements?
- What is the transcription workflow?
- What kinds of projects have used this tool? What are the affordances of this tool for particular types of research questions or disciplines (e.g., might it have different utility for the humanities, STEM fields, or social sciences)?
Explore and evaluate Transcribe Bentham
As you have done for the "Project Lens" activities in some of the SSRC modules, take a few minutes to browse through the Transcribe Bentham blog and transcription interface, considering the following questions:
- What is the purpose of this project? What are the goals? Who created this project and who is it for? How do you find out?
"Meta" Questions to Consider
- Do you see any potential ethical or technical issues that might impact the use of annotation tools in the classroom, related to student privacy or platform interoperability? How might you advise instructors on the use of annotation tools with these potential issues in mind?
- Are some annotation or transcription tool providers more transparent than others about user data collection and privacy?
- Based on your exploration of digital projects with text or handwriting transcription components, as well as the theoretical framework for audio transcription discussed in the Bailey article, do you think text and audio transcription projects face similar challenges, or are there issues unique to each kind of transcription? What challenges might a scholar face in developing an audio-transcription crowdsourcing project?
- Take a few minutes to articulate what you will take away from the readings, activities, and resources covered in this module. What is one concept that you feel you now understand better? One topic that was completely new to you? One question you would like to explore further?
General web resources
- Annotation Tools: A Resource for College Instructors
- Paul Schacht, "Annotation", keyword in Digital Pedagogy in the Humanities: Concepts, Models, and Experiments
- Transcription, on Tinker
- American Historical Association's list of Crowd Transcription Projects
- Smithsonian Digital Volunteers: Transcription Center
- Bossewitch, Jonah, and Michael D. Preston. 2011. "Teaching and Learning with Video Annotations," in Learning Through Digital Media: Experiments in Technology and Pedagogy, edited by R. Trebor Scholz. New York: Institute for Distributed Creativity, 175–183. https://doi.org/10.7916/D8TB1H9J
- Causer, Tim, Justin Tonra, and Valerie Wallace. 2012. "Transcription maximized; expense minimized? Crowdsourcing and editing The Collected Works of Jeremy Bentham." Literary and Linguistic Computing 27 (2): 119–137. https://doi.org/10.1093/llc/fqs004
- Terras, Melissa. 2016. "Crowdsourcing in the Digital Humanities," in A New Companion to Digital Humanities, edited by Susan Schreibman, Ray Siemens, and John Unsworth. Malden, MA: Wiley-Blackwell, 420–439. https://hcommons.org/deposits/item/hc:15065/
- Tracy, Daniel G. 2016. "Assessing Digital Humanities Tools: Use of Scalar at a Research University." portal: Libraries and the Academy 16 (1): 163–189. https://doi.org/10.1353/pla.2016.0004
Hint: As of April 1, 2019, Scifabric discontinued support for Crowdcrafting and appears to have discontinued its hosting services as well, meaning legacy projects can no longer be viewed online. An archived version of the example project discussed in the SSRC Resource Sheet, Crime, Sex, and Violence, is available via the Internet Archive's Wayback Machine, although the transcription interface supported by Crowdcrafting no longer functions. Explore the capabilities and projects on Zooniverse.org as an alternative platform.