CSE 490U: Natural Language Processing

University of Washington

Winter 2017



The syllabus is subject to change; always get the latest version from the class website.
Website: http://courses.cs.washington.edu/courses/cse490u/17wi
Lectures: ARC 160, Mondays, Wednesdays, and Fridays, 10:30–11:20 am
Quiz sections: MGH 228, Thursdays, 9:30–10:20 am; CSE 305, Thursdays, 10:30–11:20 am
Instructor: Noah A. Smith (nasmith@cs.washington.edu)
Instructor office hours: CSE 532, Tuesdays and Thursdays, 5–6 pm, or by appointment
Teaching assistants: Sam Thomson (samt@cs.washington.edu) and Joshua Crowgey (jcrowgey@u.washington.edu)
TA office hours: GUG 407, Mondays 4–5 pm (Joshua); CSE 218, Wednesdays 12:30–1:30 pm (Sam)
Final exam: ARC 160, Monday, March 13, 8:30–10:20 am


Natural language processing (NLP) seeks to endow computers with the ability to intelligently process human language. NLP components are used in conversational agents and other systems that engage in dialogue with humans, automatic translation between human languages, automatic answering of questions using large text collections, the extraction of structured information from text, tools that help human authors, and many, many more. This course will teach you the fundamental ideas used in key NLP components. It is organized into several parts:

1. Probabilistic language models, which define probability distributions over text passages (a short illustrative sketch follows this list).
2. Text classifiers, which infer attributes of a piece of text by “reading” it.
3. Sequence models, which transduce input sequences into output sequences, such as words into part-of-speech tags.
4. Parsers, which map sentences into syntactic representations.
5. Semantics, which includes a range of representations of meaning.
6. Machine translation, which maps text in one language to text in another.
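
As a concrete taste of part 1, here is a minimal sketch of a bigram language model with add-one smoothing, written in Python. It is provided for orientation only; it is not part of the course materials or assignments, and all names in it are invented for the example.

    # Minimal bigram language model with add-one (Laplace) smoothing.
    # Illustration only; not starter code for any assignment.
    from collections import Counter

    def train(sentences):
        """Count context unigrams and bigrams over tokenized sentences,
        padding each sentence with start/end markers."""
        unigrams, bigrams, vocab = Counter(), Counter(), set()
        for sent in sentences:
            tokens = ["<s>"] + sent + ["</s>"]
            vocab.update(tokens)
            unigrams.update(tokens[:-1])                  # context counts
            bigrams.update(zip(tokens[:-1], tokens[1:]))  # (prev, word) counts
        return unigrams, bigrams, len(vocab)

    def prob(prev, word, unigrams, bigrams, v):
        """P(word | prev) with add-one smoothing over a vocabulary of size v."""
        return (bigrams[(prev, word)] + 1.0) / (unigrams[prev] + v)

    def sentence_prob(sent, unigrams, bigrams, v):
        """Probability of a complete sentence via the chain rule."""
        tokens = ["<s>"] + sent + ["</s>"]
        p = 1.0
        for a, b in zip(tokens[:-1], tokens[1:]):
            p *= prob(a, b, unigrams, bigrams, v)
        return p

    # Toy corpus of tokenized sentences.
    corpus = [["the", "cat", "sat"], ["the", "dog", "sat"]]
    u, b, v = train(corpus)
    print(sentence_prob(["the", "cat", "sat"], u, b, v))

Real language models replace these raw counts with better smoothing, features, or neural parameterizations; that is the progression the 1/6–18 lectures follow.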

1 Course Plan

dates | topic | readings | deadlines
1/4 | introduction | [1, ch. 1], [2] |
1/6–18 | language models (intro; featurized; neural) | [3, 4, 5]; if you want more details on neural nets, see [6] |
1/20–23 | text classifiers | [7, 8, 9] | A1 due W. 1/18 F. 1/20
1/25–27 | sequence models (hidden Markov models) | [10, 11] |
1/30–2/3 | … applied to part-of-speech tagging and other problems | [12, 13] | A2 due Th. 2/2 F. 2/3
2/6 | snow day! | |
2/8–17 | context-free syntax and parsing (pumping lemma covered in quiz section by Sam) | [1, ch. 12–14], [14] |
2/17 | dependency syntax | |
2/22 | dependency parsing (guest lecture by Sam) | [15, ch. 1, 2, 6, and any others that interest you] | A3 due M. 2/20
2/24–27 | predicate-argument semantics | [16] | A4 due T. 2/28
3/1–3 | compositional semantics | [1, ch. 18], [17] |
3/6–8 | machine translation | [1, ch. 25], [18, 19] |
3/10 | finale | | A5 due 3/10

The table above shows the planned lectures, along with readings. The official textbook for the course is Jurafsky and Martin [1], but some chapters of the forthcoming third edition are available online [20], so we link to those where appropriate.

2 Evaluation

Students will be evaluated on the five assignments (A1–A5), participation, and the final exam.

Participation points are earned by submitting questions to the TAs in advance of quiz sections, and by posting to the class discussion board.

3 Computing Resources

CSE has reserved the host umnak.cs.washington.edu for you to use for this course.

4 Academic Integrity

Read, sign, and return the academic integrity policy for this course before turning in any work.

References

[1]    Daniel Jurafsky and James H. Martin. Speech and Language Processing: An Introduction to Natural Language Processing, Computational Linguistics, and Speech Recognition. Prentice Hall, second edition, 2008.

[2]    Julia Hirschberg and Christopher D. Manning. Advances in natural language processing. Science, 349(6245):261–266, 2015. URL https://www.sciencemag.org/content/349/6245/261.full.

[3]    Michael Collins. Course notes for COMS W4705: Language modeling, 2011. URL http://www.cs.columbia.edu/~mcollins/courses/nlp2011/notes/lm.pdf.

[4]    Daniel Jurafsky and James H. Martin. N-grams (draft chapter), 2016. URL https://web.stanford.edu/~jurafsky/slp3/4.pdf.

[5]    Michael Collins. Log-linear models, MEMMs, and CRFs, 2011. URL http://www.cs.columbia.edu/~mcollins/crf.pdf.

[6]    Yoav Goldberg. A primer on neural network models for natural language processing, 2015. URL http://u.cs.biu.ac.il/~yogo/nnlp.pdf.

[7]    Daniel Jurafsky and James H. Martin. Naive Bayes and sentiment classification (draft chapter), 2016. URL https://web.stanford.edu/~jurafsky/slp3/6.pdf.

[8]    Daniel Jurafsky and James H. Martin. Logistic regression (draft chapter), 2016. URL https://web.stanford.edu/~jurafsky/slp3/7.pdf.

[9]    Michael Collins. The naive Bayes model, maximum-likelihood estimation, and the EM algorithm, 2011. URL http://www.cs.columbia.edu/~mcollins/em.pdf.

[10]    Daniel Jurafsky and James H. Martin. Hidden Markov models (draft chapter), 2016. URL https://web.stanford.edu/~jurafsky/slp3/9.pdf.

[11]    Michael Collins. Tagging with hidden Markov models, 2011. URL http://www.cs.columbia.edu/~mcollins/courses/nlp2011/notes/hmms.pdf.

[12]    Daniel Jurafsky and James H. Martin. Part-of-speech tagging (draft chapter), 2016. URL https://web.stanford.edu/~jurafsky/slp3/10.pdf.

[13]    Daniel Jurafsky and James H. Martin. Information extraction (draft chapter), 2016. URL https://web.stanford.edu/~jurafsky/slp3/21.pdf.

[14]    Michael Collins. Probabilistic context-free grammars, 2011. URL http://www.cs.columbia.edu/~mcollins/courses/nlp2011/notes/pcfgs.pdf.

[15]    Sandra Kübler, Ryan McDonald, and Joakim Nivre. Dependency Parsing. Synthesis Lectures on Human Language Technologies. Morgan and Claypool, 2009. URL http://www.morganclaypool.com/doi/pdf/10.2200/S00169ED1V01Y200901HLT002.

[16]    Daniel Jurafsky and James H. Martin. Semantic role labeling and argument structure (draft chapter), 2016. URL https://web.stanford.edu/~jurafsky/slp3/22.pdf.

[17]    Mark Steedman. A very short introduction to CCG, 1996. URL http://www.inf.ed.ac.uk/teaching/courses/nlg/readings/ccgintro.pdf.

[18]    Michael Collins. Statistical machine translation: IBM models 1 and 2, 2011. URL http://www.cs.columbia.edu/~mcollins/courses/nlp2011/notes/ibm12.pdf.

[19]    Michael Collins. Phrase-based translation models, 2013. URL http://www.cs.columbia.edu/~mcollins/pb.pdf.

[20]    Daniel Jurafsky and James H. Martin. Speech and Language Processing: An Introduction to Natural Language Processing, Computational Linguistics, and Speech Recognition. Prentice Hall, third edition, forthcoming. URL https://web.stanford.edu/~jurafsky/slp3/.