What is technology that does not serve us, but is concerned for us?
Much of the technology we encounter in our daily lives takes the form of a service, and of servitude. Our desires are taken as a given, and technology attempts to fulfill them.
What is technology that debates and disputes our desires, rather than fulfilling them? What is intimate software, created by us, only for us, that debates with our ethical selves?
In this session, let's experiment together with creating ethicsware: software in dialogue with our ethical selves. Using checklists and chatbots as a starting point, we’ll code with Python and use messaging APIs to create ethicsware experiments. We'll talk about ELIZA, a psychotherapist chatbot created in 1966, the influence of classist labor practices onto technology, and the ethics of emotional labor.
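ELIZA-style chatbots work by pattern-matching a statement, "reflecting" first-person words into second-person ones, and slotting the result into a question template. Here is a minimal sketch in Python of that technique; the rules and word lists are hypothetical stand-ins for illustration, not Weizenbaum's original script or anything we will necessarily build in class:

```python
import re

# Hypothetical first-/second-person swaps (illustrative, not ELIZA's real table).
REFLECTIONS = {
    "i": "you", "me": "you", "my": "your",
    "am": "are", "you": "I", "your": "my",
}

# A few example rules: (pattern, reply template). The last rule is a catch-all.
RULES = [
    (re.compile(r"i feel (.*)", re.I), "Why do you feel {0}?"),
    (re.compile(r"i want (.*)", re.I), "What would it mean to have {0}?"),
    (re.compile(r"(.*)"), "Tell me more."),
]

def reflect(fragment: str) -> str:
    """Swap first- and second-person words so the reply mirrors the speaker."""
    return " ".join(REFLECTIONS.get(w, w) for w in fragment.lower().split())

def respond(statement: str) -> str:
    """Return the first matching rule's template, filled with the reflected match."""
    for pattern, template in RULES:
        match = pattern.match(statement.strip())
        if match:
            return template.format(*(reflect(g) for g in match.groups()))
    return "Tell me more."

print(respond("I feel ignored by my own code"))
# → Why do you feel ignored by your own code?
```

The interesting part, for our purposes, is how little machinery produces the feeling of being heard: the program understands nothing, yet the mirrored phrasing invites self-examination.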
What does it mean to create technology that operates on our ethics? What is technology that does not serve us, but is concerned for us? Technologies of care and concern, over service and fulfillment? Let’s find out together!
Our class will be split between discussion and coding. The hope is to dream and speculate on ways in which we can alter our sense of ethics and self through technologies. Some of these technologies will be paper checklists, conversations, or chatbots.
Here are some of the technologies we will use:
- Anaconda (and Jupyter Notebook)
- Pencil / pen
- Structured discussion
- Active listening
- Supportive observations by others
Exercises to prepare for class
Please think of a time in the past when you realized you were doing something in a way you wanted to change. How did you realize it? How soon after it happened? Did other people help you notice it?
Or, in other words, what were the practices of self-examination that you underwent?
In class, we'll share how we came to these realizations (without sharing the specifics of our actions), as a way to start some discussion.
Please try one of these implicit bias tests!