
Empowering Humans in the Age of Machines

UW CSE 599J, Winter 2026
M/W 3:00pm-4:20pm, ECE 003

Instructor: Yulia Tsvetkov

yuliats@cs.washington.edu

Office Hours: By Appointment

Teaching Assistant: Deniz Nazar

denizn@cs.washington.edu

Office Hours: Fridays 11:00am-12:00pm, Gates 377 https://washington.zoom.us/j/96163624236?jst=2

Teaching Assistant: Kabir Ahuja

kahuja@cs.washington.edu

Office Hours: Wednesdays 11:00am-12:00pm, Allen 220 https://washington.zoom.us/j/96933449762?jst=2

Announcements

January 12, 2026 — Once you have formed a team, please book a presentation slot for Phase 1 here. Due: January 16, 2026 11:59 PM PST.

January 12, 2026 — Team Sign-Up is now open. Please fill out this form to sign up your team of 1-3 students. Due: January 14, 2026 11:59 PM PST.

Summary

Homo sapiens conquered the world not through strength or intelligence alone, but through the stories we told: stories that bound us together, shaped our shared imagined futures, and ultimately expanded the horizons of what we thought was possible. Yet – from the Golem of Prague, to Mary Shelley’s Frankenstein, to HAL 9000 in 2001: A Space Odyssey, to Black Mirror episodes, to recent headlines about fully autonomous vehicles and sentient chatbots – stories have long misrepresented artificial beings as powerful, autonomous, and often perilous creations. Today, as AI becomes more ubiquitous, these narratives often obscure reality by casting AI systems as agentic and creative forces beyond our control. They mystify scientific innovation, erase human agency, and fuel public anxiety through sensationalist media narratives.

This graduate seminar explores how the stories we tell about artificial intelligence – through novels, films, journalism, and social media – shape how societies imagine technology and themselves. Through critical reading, discussion, and computational analysis, students will investigate how historical and contemporary narratives influence public perception, policy, and ethical debate about AI. They will examine how language and imagery construct ideas of agency and power, and how those representations can amplify hope or fear. Finally, students will experiment with computational social science and narrative analysis methods to analyze or reframe AI stories – developing creative, evidence-based approaches to understanding how technology and storytelling coevolve.

By the end of the course, students will be able to:

  • Critically analyze portrayals of AI in cultural and media contexts.
  • Identify and apply computational social science and narrative analysis methods to online media.
  • Design and execute an interdisciplinary project combining narrative theory and computational analysis.
  • Communicate findings in both scholarly and public-facing formats (e.g., presentations, posters, reports).

Schedule

The calendar is tentative and subject to change. More details will be added as the quarter progresses.

Weekly due dates:

  • By Monday 11:59pm: Slides for Wednesday's papers (presenters only)
  • By Saturday 11:59pm: Slides for Monday's papers (presenters only)
Week 2
  • 1/12/2026 (Introduction): Motivation, course overview, and requirements. [slides]
  • 1/14/2026: Cancelled.
Week 3
  • 1/19/2026: No class (Happy MLK Day!)
  • 1/21/2026 (Project Examples): Examples of research projects that fit the course's theme.

Resources

  • EdStem. Course communication will be via EdStem.

  • Full Course Handout Doc. The detailed syllabus and course handout doc are available here.

  • Google Drive. Course materials, including lectures, reading lists, etc., are in a Google Drive folder shared with all students. Email Kabir or Deniz if you cannot access the folder.

Course Structure

The course runs for 10 weeks and meets twice per week in person. It is organized into three conceptual phases. These phases are not rigidly equal in duration but guide the intellectual progression of the seminar.

Phase 1: Artistic and Narrative Inquiry (approximately Weeks 1–5)

Students explore artistic and cultural portrayals of AI across literature, film, television, journalism, and social media. Through close reading, viewing, and discussion, they identify how AI is framed (for example, as savior, threat, tool, trickster, or mirror) and how these portrayals shape perceptions of human agency, creativity, and responsibility.

Deliverables in Phase 1 (reading, analysis, presentation):

  • Selection of one or more narrative sources (e.g., novels, films, series, blog posts, social media threads).
  • A short, non-computational analysis of these sources (themes, tropes, narrative arcs, roles of humans and machines).
  • A preliminary set of research questions or hypotheses about what might be revealed through computational analysis.

Each student or team gives Presentation 1, summarizing their chosen sources, key interpretive insights, and initial hypotheses. Presentations begin in week 3; we will share a sign-up sheet for early planning.

Phase 2: Computational Methods and Analytical Framing (approximately Weeks 3–8)

Students survey and discuss computational social science and narrative analysis techniques that could be applied to their selected material. These may include, for example, text mining, topic modeling, sentiment or stance analysis, network analysis, image or video analysis, or other methods suited to their disciplinary background.
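As one concrete illustration of the kind of method surveyed in this phase, the minimal sketch below fits a small topic model over a few placeholder news excerpts using scikit-learn. The example corpus, the number of topics, and the number of top words shown are all stand-ins; teams would substitute their own sources and tune these choices to their research questions.

```python
# Minimal sketch of one possible Phase 2 approach: topic modeling a small
# collection of texts with scikit-learn's LDA implementation.
# The documents, number of topics, and top-word count are placeholders.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

# Hypothetical corpus: short excerpts about AI (replace with your own data).
documents = [
    "The chatbot seemed almost sentient, and researchers warned of risks.",
    "A new model assists doctors, augmenting rather than replacing them.",
    "Regulators debate autonomous vehicles after the latest incident.",
]

# Convert the texts to a document-term matrix, dropping English stopwords.
vectorizer = CountVectorizer(stop_words="english")
doc_term = vectorizer.fit_transform(documents)

# Fit a small LDA model; n_components (the number of topics) is an assumption.
lda = LatentDirichletAllocation(n_components=2, random_state=0)
lda.fit(doc_term)

# Print the top words for each topic to inspect recurring frames.
terms = vectorizer.get_feature_names_out()
for idx, topic in enumerate(lda.components_):
    top_words = [terms[i] for i in topic.argsort()[-5:][::-1]]
    print(f"Topic {idx}: {', '.join(top_words)}")
```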

The course will provide initial reading lists and examples of methods. Students are expected to:

  • Explore prior computational analysis work relevant to their own research domain (for example, NLP methods for text analysis, computer vision for images or video, or other domain-specific tools).
  • Select a computational strategy that can be realistically implemented within the quarter.
  • Refine their research questions or hypotheses based on methodological possibilities and constraints.

Teams are encouraged to use diverse methods so that students can learn from one another’s approaches. Different teams may analyze similar narrative sources using distinct techniques, or different sources using related techniques.

Each student or team gives Presentation 2, describing their chosen computational strategy, refined research questions, and planned analysis.

Phase 3: Synthesis and Project Work (approximately Weeks 6–10)

In the final phase, students implement their chosen methods on their selected narratives and synthesize qualitative and quantitative insights. The goal is to connect computational findings back to larger questions of how AI is imagined, how human agency is framed, and what social or political consequences narratives may have.

Phase 3 deliverables:

  • Implementation of one or more computational techniques on a chosen dataset or media collection.
  • A poster presentation that visually and concisely communicates the project’s questions, methods, findings, and implications.
  • A final project report written in the style of a research paper (introduction, background, methods, results, discussion, limitations, and future work).

Each student or team gives Presentation 3, focusing on methods, insights gained, and open questions. Presentation 3 will take the form of either a poster session or in-class presentations (to be decided later).

Grading

The final grade is composed of:

  • Phase 1 deliverables (Presentation 1 and associated work): 25%
  • Phase 2 deliverables (Presentation 2 and associated work): 25%
  • Phase 3 deliverables (Poster presentation and final project report): 40%
  • Participation and peer feedback: 10%

Assignments will be evaluated on:

  • Clarity and coherence of research questions.
  • Depth and originality of narrative analysis.
  • Appropriateness and rigor of computational methods.
  • Quality of synthesis between qualitative and quantitative perspectives.
  • Clarity and professionalism of written and oral communication.

Course Administration and Policies

  • Late policy. Reasonable extensions may be granted if requested in advance. Work submitted late without prior arrangement may receive a grade reduction, such as a percentage deduction per day, at the instructor’s discretion.

  • Team work policy. Teams of 1–3 students are allowed. Interdisciplinary teams are encouraged. Expectations will scale with team size. Teams should document roles and contributions, and all members are expected to participate meaningfully in the project.

  • Conduct and inclusion. The course follows the University of Washington’s standards for student conduct. All students are expected to contribute to a respectful, inclusive, and intellectually generous seminar environment. Disagreement is welcome.
  • Academic integrity and use of tools. All submitted work must reflect the student’s or team’s own thinking and analysis. External tools, including software and computational resources, may be used as part of the research process. If students use automated tools (for example, for coding assistance, drafting, or analysis), they must:
    • Be transparent about how these tools were used.
    • Take responsibility for verifying and interpreting all outputs.
    • Ensure that the final work reflects their own intellectual contribution.
  • Collaboration. Collaboration within teams is expected. Discussion and informal collaboration across teams is also encouraged, as long as each team’s submitted work is clearly their own and not copied or duplicated.

  • Accessibility. Students who require accommodations are encouraged to contact the instructor as early as possible. The course will make every effort to ensure accessible materials, flexible formats, and equitable participation.
  • Well-being. Graduate study can be intellectually and emotionally demanding. Students are encouraged to maintain a healthy balance, to seek support from university services when needed, and to communicate with the instructor about any challenges that may affect their participation.