CSE 447/547M: Natural Language Processing

University of Washington

Winter 2019



The syllabus is subject to change; always get the latest version from the class website.
Website: https://courses.cs.washington.edu/courses/cse447/19wi
Schedule: https://docs.google.com/spreadsheets/d/1qlesNdjSTPakfB9L0j7s7CZO_IbtIsLKO3mSI8JFd4Q/edit#gid=0
Piazza: https://piazza.com/washington/winter2019/cse447
Canvas: https://canvas.uw.edu/courses/1253227
Lectures: MLR 301, Mondays and Wednesdays 3:30–4:50 pm
Quiz sections: Thursdays 12:30–1:20 pm, SMI 304
Thursdays 1:30–2:20 pm, PAA A110
Thursdays 2:30–3:20 pm, MGH 271
Instructor: Noah A. Smith (nasmith@cs.washington.edu)
Instructor office hours: Mondays and Wednesdays 5–6 pm or by appointment, CSE 532
Email for course staff: cse447-staff@cs.washington.edu
Teaching assistants: Ethan Chau (Fridays 10:30–11:30, CSE 5th floor breakout)
Elizabeth Clark (Thursdays 9–10 am, CSE 021)
Lucy Lin (Mondays 1–2 pm, CSE 007)
Nelson Liu (Tuesdays 3:30–4:30 pm, CSE 220)
Deric Pang (Fridays 2–3 pm, CSE 220)
Kaidi Pei (Thursdays 3:30–4:30 pm, CSE 021)
Final exam: MLR 301, Thursday, March 21, 2:30–4:20 pm


Natural language processing (NLP) seeks to endow computers with the ability to intelligently process human language. NLP components are used in conversational agents and other systems that engage in dialogue with humans, automatic translation between human languages, automatic answering of questions using large text collections, the extraction of structured information from text, tools that help human authors, and many, many more. This course will teach you the fundamental ideas used in key NLP components. It is organized into several parts:

1. Probabilistic language models, which define probability distributions over text passages.
2. Text classifiers, which infer attributes of a piece of text by “reading” it.
3. Sequence models, which transduce sequences into other sequences.
4. Parsers, which map sentences into syntactic representations.
5. Semantics, which includes a range of representations of meaning.
6. Machine translation, which maps text in one language to text in another.
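As a concrete taste of the first topic, here is a toy sketch (not course material, and the names are illustrative) of a bigram language model: the probability of each word given the previous word, estimated by relative frequency on a tiny corpus.

```python
# Toy bigram language model with maximum-likelihood estimates.
# Illustrative only; real models need smoothing for unseen bigrams.
from collections import Counter, defaultdict

def train_bigram_lm(sentences):
    """Estimate P(word | previous word) by relative frequency."""
    bigram_counts = defaultdict(Counter)
    for sentence in sentences:
        # Pad with start/end symbols so sentence boundaries are modeled too.
        tokens = ["<s>"] + sentence.split() + ["</s>"]
        for prev, curr in zip(tokens, tokens[1:]):
            bigram_counts[prev][curr] += 1
    # Normalize each row of counts into a conditional distribution.
    return {
        prev: {w: c / sum(counts.values()) for w, c in counts.items()}
        for prev, counts in bigram_counts.items()
    }

corpus = ["the cat sat", "the dog sat", "the cat ran"]
lm = train_bigram_lm(corpus)
print(lm["the"]["cat"])  # "cat" follows "the" in 2 of 3 sentences: 2/3
```

Real language models (the subject of the first unit) must also handle words and bigrams never seen in training, which is where smoothing and neural parameterizations come in.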

1 Course Plan

This spreadsheet shows the planned lectures, along with readings and assignments. The official textbook for the course is Eisenstein [1]. A secondary text that is likely useful, especially for assignments, is Goldberg [2]. Both of these texts are available online.

2 Evaluation

Students will be evaluated as follows:

Participation points are earned by submitting questions to the TAs in advance of quiz sections, and by posting to the class discussion board.

Late policy: a total of three no-penalty late days may be used for assignments (they may not be used on quizzes or the exam). We round up; if your assignment is 25 hours late, you have used two late days. Once your three late days are used up, late assignments receive zero credit. When feasible (i.e., when you turn them in within three days of the deadline), we will report the grade you would have received on a late assignment, so that you can still learn from it. You are strongly encouraged to complete and turn in all assignments, as this is part of your preparation for the exam.

3 Computing Resources

CSE has reserved the host aziak.cs.washington.edu for you to use for this course. You can also use GPUs; see here for information.

4 Academic Integrity

Read, sign, and upload (via Canvas) the academic integrity policy for this course before turning in any work.

References

[1]    Jacob Eisenstein. Natural Language Processing. 2018. URL https://github.com/jacobeisenstein/gt-nlp-class/blob/master/notes/eisenstein-nlp-notes.pdf.

[2]    Yoav Goldberg. Neural Network Methods for Natural Language Processing. Morgan Claypool, 2017. URL https://www.morganclaypool.com/doi/abs/10.2200/S00762ED1V01Y201703HLT037.