
Calendar

The class schedule is detailed below and is subject to change. We will communicate any changes in class.

All readings are required, unless they are tagged with Optional.

Reading responses are also required, unless explicitly noted.

Reading responses are due by 6pm the day before class.

Milestones are due before 9am on the day of class.

Jan 4: Overview and Groundwork

No readings or responses

Jan 9: Ethical Foundations and Tools

Quinn, M. J. (2017). Ethics for the information age. Pearson. (Sections 2.1-2.11)

No reading responses

Jan 11: Ethics Codes

ACM Code of Ethics and Professional Conduct (skim)

“Be Careful What You Code For”, Danah Boyd (2016)

Washington, A. L., & Kuo, R. (2020, January). Whose side are ethics codes on? Power, responsibility and the social good. In Proceedings of the 2020 Conference on Fairness, Accountability, and Transparency (pp. 230-240).

Optional “Are we having an ethical crisis in computing?” by Moshe Y. Vardi, Communications of the ACM, 62(1), 7-7, 2019.

Optional Bietti, E. (2020, January). From ethics washing to ethics bashing: a view on tech ethics from within moral philosophy. In Proceedings of the 2020 Conference on Fairness, Accountability, and Transparency (pp. 210-219).

Jan 16: Idea Fair

Milestone Idea Fair

No readings or responses, but get started on those for Thursday!

Jan 18: Data Politics

Winner, L. (1980). Do artifacts have politics? Daedalus. (15 pages)

Green, B. (2020). Data science as political action: grounding data science in a politics of justice. Available at SSRN 3658431. (~20 pages)

Parvin, N., & Pollock, A. (2020). Unintended by Design: On the Political Uses of “Unintended Consequences”. Engaging Science, Technology, and Society, 6, 320-327. (7 pages)

Optional Rogaway, Phillip. “The moral character of cryptographic work.” Cryptology ePrint Archive (2015).

Jan 23: Environment

Milestone Project Proposal

“Anatomy of an AI System” by Kate Crawford et al., 2018 (14 pages)

Schwartz, R., Dodge, J., Smith, N. A., & Etzioni, O. (2019). Green AI. arXiv preprint arXiv:1907.10597. (9 pages)

Optional Borning, A., Friedman, B., & Logler, N. (2020). The ‘invisible’ materiality of information technology. Communications of the ACM, 63(6), 57-64.

Optional Rolnick, D., Donti, P. L., Kaack, L. H., Kochanski, K., Lacoste, A., Sankaran, K., … & Luccioni, A. (2019). Tackling climate change with machine learning. arXiv preprint arXiv:1906.05433.

Optional “Open letter to Jeff Bezos and the Amazon Board of Directors” by Amazon Employees for Climate Justice, 2019

Optional “The Cloud is Not the Territory” by Ingrid Burrington, 2014

Jan 25: Feminism and Power

D’Ignazio, C., & Klein, L. F. (2020). Data feminism. MIT Press. (Read Introduction and The Power Chapter)

Os Keyes, Josephine Hoy, and Margaret Drouhard. 2019. Human-Computer Insurrection: Notes on an Anarchist HCI. In Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems (CHI ‘19). Association for Computing Machinery, New York, NY, USA, Paper 339, 1–13.

Optional “Technically Female: Women, Machines, and Hyperemployment” by Helen Hester, 2016 (10 pages)

Jan 30: Postcolonial Computing and Technological Solutionism

Mohamed, Shakir, Marie-Therese Png, and William Isaac. Decolonial AI: Decolonial theory as sociotechnical foresight in artificial intelligence. Philosophy & Technology 33 (2020): 659-684.

Kentaro Toyama (2015). Geek Heresy: Rescuing Social Change from the Cult of Technology, Chapter 2. Public Affairs.

Listen to this podcast! (no reading response required) The Wubi Effect – a Radiolab episode on how Chinese characters didn’t fit on a keyboard

Optional Irani, L., Vertesi, J., Dourish, P., Philip, K., & Grinter, R. E. (2010, April). Postcolonial computing: a lens on design and development. In Proceedings of the SIGCHI conference on human factors in computing systems (pp. 1311-1320).

Optional Goodman, R., Tip, L., & Cavanagh, K. (2021). There’s an app for that: Context, assumptions, possibilities and potential pitfalls in the use of digital technologies to address refugee mental health. Journal of Refugee Studies, 34(2), 2252-2274.

Optional Hong Shen, Cori Faklaris, Haojian Jin, Laura Dabbish, and Jason I. Hong. 2020. ‘I Can’t Even Buy Apples If I Don’t Use Mobile Pay?’: When Mobile Payments Become Infrastructural in China. Proc. ACM Hum.-Comput. Interact. 4, CSCW2.

Feb 1: Race

Milestone Methods Section (or similar)

Benjamin, R. (2019). Race After Technology: Abolitionist Tools for the New Jim Code. Polity. (read pages 1-17)

Christina Harrington and Tawanna R Dillahunt. 2021. Eliciting Tech Futures Among Black Young Adults: A Case Study of Remote Speculative Co-Design. In Proceedings of the 2021 CHI Conference on Human Factors in Computing Systems (CHI ‘21). Association for Computing Machinery, New York, NY, USA, Article 397, 1–15.

Field, A., Blodgett, S. L., Waseem, Z., & Tsvetkov, Y. (2021). A Survey of Race, Racism, and Anti-Racism in NLP. arXiv preprint arXiv:2106.11410.

Optional Hankerson, D., Marshall, A. R., Booker, J., El Mimouni, H., Walker, I., & Rode, J. A. (2016, May). Does technology have race? In Proceedings of the 2016 CHI Conference Extended Abstracts on Human Factors in Computing Systems (pp. 473-486).

Optional Ogbonnaya-Ogburu, I. F., Smith, A. D., To, A., & Toyama, K. (2020, April). Critical Race Theory for HCI. In Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems (pp. 1-16).

Optional Schlesinger, A., O’Hara, K. P., & Taylor, A. S. (2018, April). Let’s talk about race: Identity, chatbots, and AI. In Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems (pp. 1-14).

Feb 6: Exclusive and Evil by Design

“Can you make an AI that isn’t Ableist?” by Shari Trewin, MIT Technology Review

Gray, C. M., Kou, Y., Battles, B., Hoggatt, J., & Toombs, A. L. (2018, April). The dark (patterns) side of UX design. In Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems (pp. 1-14).

Optional Gray, C. M., Chivukula, S. S., & Lee, A. (2020, July). What Kind of Work Do “Asshole Designers” Create? Describing Properties of Ethical Concern on Reddit. In Proceedings of the 2020 ACM Designing Interactive Systems Conference (pp. 61-73).

Optional Dark Patterns case study in The ACM Ethics Code.

Feb 8: Governance, Mis-/Disinformation, Platforms, or Publisher?

Guest Speaker Bryan Semaan

Gillespie, T. (2018). Custodians of the Internet: Platforms, content moderation, and the hidden decisions that shape social media. Yale University Press. (read Chapter 1, pp. 1-24)

“Everything You Need to Know About Section 230”, The Verge, 2020.

Das, Dipto, Carsten Østerlund, and Bryan Semaan. “‘Jol’ or ‘Pani’?: How Does Governance Shape a Platform’s Identity?” Proceedings of the ACM on Human-Computer Interaction 5, no. CSCW2 (2021): 1-25.

Optional Bruckman, Amy. “Should you believe Wikipedia?” Chapter from the forthcoming Should You Believe Wikipedia? from Cambridge University Press. (17 pages)

Optional “Burnout, splinter factions and deleted posts: Unpaid online moderators struggle to manage divided communities” by Heather Kelly, The Washington Post, 2020

Optional “Blue Feed, Red Feed” by Jon Keegan, 2016

Optional “Why Facebook can’t fix itself” by Andrew Marantz, The New Yorker, 2020

Feb 13: Privacy

Acquisti, A., Brandimarte, L., & Loewenstein, G. (2015). Privacy and human behavior in the age of information. Science, 347(6221), 509–514.

“It’s Not Privacy, and It’s Not Fair” by Cynthia Dwork et al., 2013

“Scroogled” by Cory Doctorow

Optional “Think You’re Discreet Online? Think Again” by Zeynep Tufekci, The New York Times, 2019

Feb 15: Project Fair Round 1

Milestone Project Fair Round 1

No readings or responses, but get started on those for Thursday!

Feb 20

No class, but you should meet with your group to incorporate feedback from Project Fair Round 1

Feb 22: Data Collection and Crowdsourcing

Jo, E. S., & Gebru, T. (2020, January). Lessons from archives: Strategies for collecting sociocultural data in machine learning. In Proceedings of the 2020 Conference on Fairness, Accountability, and Transparency (pp. 306-316).

Noopur Raval and Paul Dourish. 2016. Standing Out from the Crowd: Emotional Labor, Body Labor, and Temporal Labor in Ridesharing. In Proceedings of the 19th ACM Conference on Computer-Supported Cooperative Work & Social Computing (CSCW ‘16). Association for Computing Machinery, New York, NY, USA, 97–107.

Optional Kittur, A., Nickerson, J. V., Bernstein, M., Gerber, E., Shaw, A., Zimmerman, J., … & Horton, J. (2013, February). The future of crowd work. In CSCW 2013.

Optional Barbosa, N. M., & Chen, M. (2019, May). Rehumanized crowdsourcing: a labeling framework addressing bias and ethics in machine learning. In Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems (pp. 1-12).

Feb 27: Classification Bias

Guest Speaker Christina Harrington

Buolamwini, J., & Gebru, T. (2018, January). Gender shades: Intersectional accuracy disparities in commercial gender classification. In Conference on fairness, accountability and transparency (pp. 77-91).

“Do algorithms reveal sexual orientation or just expose our stereotypes?” by Blaise Aguera y Arcas et al., 2018

Crawford, K. (2019). Regulate facial-recognition technology. Nature, 572(7771), 565-565. (1 page)

Optional “Excavating AI: The Politics of Training Sets for Machine Learning” by Kate Crawford et al., 2019 (14 pages)

Feb 29: Accountability in AI

Diakopoulos, Nicholas, Sorelle Friedler, Marcelo Arenas, Solon Barocas, Michael Hay, Bill Howe, Hosagrahar Visvesvaraya Jagadish et al. Principles for accountable algorithms and a social impact statement for algorithms. FAT/ML (2017).

Madaio, M. A., Stark, L., Wortman Vaughan, J., & Wallach, H. (2020, April). Co-designing checklists to understand organizational challenges and opportunities around fairness in AI. In Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems (pp. 1-14).

Elish, Madeleine Clare. “Moral crumple zones: Cautionary tales in human-robot interaction.” Engaging Science, Technology, and Society 5 (2019): 40-60.

Optional “Computer says no: why making AIs fair, accountable and transparent is crucial” by Ian Sample, The Guardian, 2017

Optional Solon Barocas, Anhong Guo, Ece Kamar, Jacquelyn Krones, Meredith Ringel Morris, Jennifer Wortman Vaughan, W. Duncan Wadsworth, and Hanna Wallach. 2021. Designing Disaggregated Evaluations of AI Systems: Choices, Considerations, and Tradeoffs. In Proceedings of the 2021 AAAI/ACM Conference on AI, Ethics, and Society (AIES ‘21). Association for Computing Machinery, New York, NY, USA, 368–378.

March 5: Roles and Responsibilities

Abebe, R., Barocas, S., Kleinberg, J., Levy, K., Raghavan, M., & Robinson, D. G. (2020, January). Roles for computing in social change. In Proceedings of the 2020 Conference on Fairness, Accountability, and Transparency (pp. 252-260).

Bruckman, A. (2020). ‘Have you thought about…’ talking about ethical implications of research. Communications of the ACM, 63(9), 38-40.

Optional “Why Stanford Researchers Tried to Create a ‘Gaydar’ Machine”, by Heather Murphy, The New York Times, 2017

March 7: Project Fair Round 2

Milestone Project Fair Round 2

Poster session + Wrap up. No readings or responses

March 11: Final Paper

Milestone Final Paper

Final Paper due. No readings or responses