Ethical Issues Surrounding Artificial Intelligence Systems and Big Data
- Instructors: Su Lin Blodgett, Abe Handler, Katie Keith
- Semester: Fall 2018
- Course Description
- Schedule and readings
This course will consider some large questions surrounding ethics in artificial intelligence systems. These questions include:
Do computers make decisions in a way that is more fair and less biased than people?
What are the political, legal, social, economic and technological forces that govern the digital world? Which forces are in opposition? Which forces are aligned?
What role does the government currently play in directing AI technology in the United States? What role could the government play? What are the advantages and disadvantages of different approaches?
Readings and videos listed under each week are to be done before that week's class.
Week 1: Inspiration
Week 2: Machine Learning Foundations
Machine Learning and Human Bias. Google Video.
AI can be sexist and racist — it’s time to make it fair. Zou and Schiebinger. Nature.
Introduction, Weapons of Math Destruction. Cathy O'Neil. Check Moodle for the week's reading.
Week 3: Ethical Intuitions
Advances in AI are used to spot signs of sexuality. The Economist.
Why Stanford Researchers Tried to Create a ‘Gaydar’ Machine. New York Times.
Challenge: Authors' Note. Kosinski and Wang.
Week 4: Ethical Foundations
Social Philosophy: Marx, Rawls and Nozick. Donald Palmer. Does the Center Hold?
Week 5: Fairness
Machine Bias. Angwin, Larson, Mattu and Kirchner. ProPublica.
A computer program used for bail and sentencing decisions was labeled biased against blacks. It’s actually not that clear. Corbett-Davies, Pierson, Feller, and Goel. The Washington Post.
Challenge: Even Imperfect Algorithms Can Improve the Criminal Justice System. Corbett-Davies, Goel, and Gonzalez-Bailon. The New York Times.
Week 6: Fairness
Semantics derived automatically from language corpora contain human-like biases. Caliskan, Bryson, and Narayanan. Science.
How Vector Space Mathematics Reveals the Hidden Sexism in Language. MIT Technology Review.
Challenge: Man is to Computer Programmer as Woman is to Homemaker? Debiasing Word Embeddings. Bolukbasi, Chang, Zou, Saligrama, and Kalai. NeurIPS 2016.
Week 7: Fairness
'Three black teenagers' Google search sparks outrage. Guynn. USA Today.
Racism is Poisoning Online Ad Delivery, Says Harvard Professor. MIT Technology Review.
Facebook Lets Advertisers Exclude Users by Race. Angwin and Parris. ProPublica.
Week 8: Privacy
Amazon’s facial recognition matched 28 members of Congress to criminal mugshots. Levin. The Guardian.
Want to Predict the Future of Surveillance? Ask Poor Communities. Eubanks. The American Prospect.
Challenge: Face Off: Law Enforcement Use Of Face Recognition Technology. Lynch. Electronic Frontier Foundation.
Week 9: Privacy
Amazon Echo and the Hot Tub Murder. Dotan and Albergotti. The Information.
Challenge: Simple Demographics Often Identify People Uniquely. Sweeney. Carnegie Mellon University, Data Privacy Working Paper 3.
Preview of reading (in class): TapPrints: Your Finger Taps Have Fingerprints. Miluzzo, Varshavsky, Balakrishnan, Choudhury. MobiSys.
Week 10: Automated Decision-Making and Interpretability
The Dark Secret at the Heart of AI. Knight. MIT Technology Review.
When Is It Important for an Algorithm to Explain Itself? Hume. Harvard Business Review.
Week 11: Accountability and Regulation
The next big battle over internet freedom is here. Stewart. Vox.
What the government could actually do about Facebook. Stewart. Vox.
Week 12: Diversity
The Tech Industry’s Gender-Discrimination Problem. Kolhatkar. The New Yorker.
The Real Reason Women Quit Tech (and How to Address It). Thomas. Medium.
Additional readings that touch on issues other than gender:
Why Is Science So Straight? Suri. The New York Times.
STEM is losing male LGBQ undergrads. Langin. Science.
When disability tech is just a marketing exercise. Eveleth. The Outline.
Why the tech industry needs people with disabilities — and vice versa. Bedford. The Ground Truth Project.
Week 13: AI for Social Good and Reflection
Your grade will be based equally on the following:
- 50% Online reading responses. Each week you will write a response to that week's readings on Moodle. You must submit your response before class to receive credit for the week.
- 50% Participation during class discussions. This class will have weekly in-class discussions. Different students may participate in different ways: for instance, by speaking in large groups, speaking in small groups, or listening carefully to others.
The following guidelines will help create a comfortable and productive learning environment throughout the semester.
You can expect us:
- To start and end class on time
- To reply to emails within 24 hours on weekdays and 48 hours on weekends
- To assign readings and class activities that will foster engaging discussion
We can expect you:
- To come to class on time
- To come with an open mind, since we will be discussing complex issues
- To assume the best of others' intentions in discussion
- To be respectful of others' viewpoints
- To back up comments, as much as possible, with evidence, argumentation, or recognition of inherent tradeoffs