Paul G. Allen School of Computer Science & Engineering

CSEP 517 - Natural Language Processing - Winter 2020
Th 6:30-9:20 in CSE2 G10

Teaching Staff

Personnel Contact
Instructor: Luke Zettlemoyer (lsz at cs dot washington dot edu)
TA: Terra Blevins (blvns at cs dot washington dot edu)
TA: Christopher Clark (csquared at cs dot washington dot edu)
TA: Mandar Joshi (mandar90 at cs dot washington dot edu)
TA: Victor Zhong (vzhong at cs dot washington dot edu)

Approximate Schedule [Subject To Change]


Week | Dates | Topics & Lecture Slides | Notes (Required) | Textbook & Recommended Reading
1 | Jan 9 | Introduction [slides]; Language Models (LMs) [slides] | LM | JM 4.1-4; MS 6
2 | Jan 16 | Hidden Markov Models (HMMs) [slides] | HMM | JM 4.5-7; MS 6; JM 5.1-5.3, 6.1-6.5; MS 9, 10.1-10.3
3 | Jan 23 | Probabilistic Context Free Grammars (PCFGs) [slides] | PCFG |
4 | Jan 30 | Log-Linear Models [slides]; Linear Sequence Models [slides] | LogLinear; Inside-outside |
5 | Feb 6 | Word Embeddings [slides]; Neural Networks [slides] | Feed Forward NNs | JMv3 Vector Semantics
6 | Feb 13 | Recurrent Neural Networks (RNNs) [slides] | | JM Chapter 9
7 | Feb 20 | Machine Translation (MT) [slides]; Neural MT [slides] | IBM Models 1 and 2; Phrase MT; Neural MT |
8 | Feb 27 | Contextualized Embeddings: ELMo, BERT, etc. [slides] | |
9 | Mar 5 | Frame Semantics [slides]; Coreference Resolution [slides] | Frame Semantics | JM 19.4; JM 20.7
10 | Mar 12 | Question Answering; Language Grounding | | JM 25; MS 13

Textbooks

JM: Jurafsky & Martin, Speech and Language Processing (2nd edition)
JMv3: Jurafsky & Martin, Speech and Language Processing (3rd edition draft)
MS: Manning & Schütze, Foundations of Statistical Natural Language Processing

Assignments, Discussion Board

Available on Canvas.

Grading

The course grade will consist of four assignments, each worth 25% and containing both written and programming problems. We will also award up to 5% extra credit for participation (in class or on the discussion board).
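As a rough illustration of how these weights combine, here is a minimal sketch; the function name and the scores below are hypothetical and not part of any official course tooling:

    # A minimal sketch of the stated weighting: four assignments at 25% each,
    # plus up to 5% extra credit for participation.
    def final_grade(assignment_scores, participation_credit=0.0):
        assert len(assignment_scores) == 4                      # four assignments
        weighted = sum(0.25 * s for s in assignment_scores)     # each worth 25%
        extra = min(max(participation_credit, 0.0), 0.05)       # extra credit capped at 5%
        return weighted + extra                                 # maximum possible: 1.05 (105%)

    # Example: 90/85/95/80 on the assignments plus full participation credit.
    print(final_grade([0.90, 0.85, 0.95, 0.80], participation_credit=0.05))  # 0.925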

Course Administration and Policies