The syllabus is subject to change; always get the latest version from the class website.
Website: http://courses.cs.washington.edu/courses/cse490u/17wi
Lectures: ARC 160, Mondays, Wednesdays, and Fridays, 10:30–11:20 am
Quiz sections: MGH 228, Thursdays, 9:30–10:20 am; CSE 305, Thursdays, 10:30–11:20 am
Instructor: Noah A. Smith (nasmith@cs.washington.edu)
Instructor office hours: CSE 532, Tuesdays and Thursdays, 5–6 pm, or by appointment
Teaching assistants: Sam Thomson (samt@cs.washington.edu) and Joshua Crowgey (jcrowgey@u.washington.edu)
TA office hours: GUG 407, Mondays, 4–5 pm (Joshua); CSE 218, Wednesdays, 12:30–1:30 pm (Sam)
Final exam: ARC 160, Monday, March 13, 8:30–10:20 am
Natural language processing (NLP) seeks to endow computers with the ability to intelligently process human language. NLP components are used in conversational agents and other systems that engage in dialogue with humans, automatic translation between human languages, automatic answering of questions using large text collections, the extraction of structured information from text, tools that help human authors, and many, many more. This course will teach you the fundamental ideas used in key NLP components. It is organized into several parts:
| dates | topic | readings | deadlines |
| --- | --- | --- | --- |
| 1/4 | introduction | [1, ch. 1], [2] | |
| 1/6–18 | language models (intro; featurized; neural) | [3, 4, 5]; if you want more details on neural nets, see [6] | |
| 1/20–23 | text classifiers | [7, 8, 9] | A1 due F. 1/20 (moved from W. 1/18) |
| 1/25–27 | sequence models (hidden Markov models) | [10, 11] | |
| 1/30–2/3 | …applied to part-of-speech tagging and other problems | [12, 13] | A2 due F. 2/3 (moved from Th. 2/2) |
| 2/6 | snow day! | | |
| 2/8–17 | context-free syntax and parsing (pumping lemma covered in quiz section by Sam) | [1, ch. 12–14], [14] | |
| 2/17 | dependency syntax | | |
| 2/22 | dependency parsing (guest lecture by Sam) | [15, ch. 1, 2, and 6, plus any others that interest you] | A3 due M. 2/20 |
| 2/24–27 | predicate-argument semantics | [16] | A4 due T. 2/28 |
| 3/1–3 | compositional semantics | [1, ch. 18], [17] | |
| 3/6–8 | machine translation | [1, ch. 25], [18, 19] | |
| 3/10 | finale | | A5 due 3/10 |
The table above shows the planned lectures, along with readings. The official textbook for the course is Jurafsky and Martin [1], but some chapters of the forthcoming third edition are available online [20], so we link to those where appropriate.
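To give a concrete taste of the first unit, here is a minimal sketch of a bigram language model with add-alpha smoothing, in the spirit of the language-modeling readings [3, 4]. It is purely illustrative, not starter code for any assignment, and all names in it (e.g., `train_bigram_lm`) are our own invention.

```python
from collections import defaultdict

def train_bigram_lm(sentences, alpha=1.0):
    """Estimate add-alpha-smoothed bigram probabilities P(w | prev)
    from tokenized sentences (a list of lists of word strings)."""
    bigram = defaultdict(int)   # counts of (prev, w) pairs
    history = defaultdict(int)  # counts of prev appearing as a history
    vocab = set()
    for sent in sentences:
        tokens = ["<s>"] + sent + ["</s>"]  # sentence-boundary symbols
        vocab.update(tokens)
        for prev, w in zip(tokens, tokens[1:]):
            bigram[(prev, w)] += 1
            history[prev] += 1
    V = len(vocab)

    def prob(prev, w):
        # Add-alpha smoothing keeps unseen bigrams from getting zero probability.
        return (bigram[(prev, w)] + alpha) / (history[prev] + alpha * V)

    return prob

# Toy usage: probabilities reflect counts, and unseen pairs stay nonzero.
p = train_bigram_lm([["the", "cat", "sat"], ["the", "dog", "sat"]])
print(p("the", "cat"))   # seen bigram: relatively high (0.25 here)
print(p("the", "bird"))  # unseen bigram: small but nonzero (0.125 here)
```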
Students will be evaluated on their coursework, including the assignments (A1–A5), the final exam, and participation. Participation points are earned by submitting questions to the TAs in advance of quiz sections and by posting to the class discussion board.
CSE has reserved the host umnak.cs.washington.edu for you to use for this course.
Read, sign, and return the academic integrity policy for this course before turning in any work.
[1] Daniel Jurafsky and James H. Martin. Speech and Language Processing: An Introduction to Natural Language Processing, Computational Linguistics, and Speech Recognition. Prentice Hall, second edition, 2008.
[2] Julia Hirschberg and Christopher D. Manning. Advances in natural language processing. Science, 349(6245):261–266, 2015. URL https://www.sciencemag.org/content/349/6245/261.full.
[3] Michael Collins. Course notes for COMS W4705: Language modeling, 2011. URL http://www.cs.columbia.edu/~mcollins/courses/nlp2011/notes/lm.pdf.
[4] Daniel Jurafsky and James H. Martin. N-grams (draft chapter), 2016. URL https://web.stanford.edu/~jurafsky/slp3/4.pdf.
[5] Michael Collins. Log-linear models, MEMMs, and CRFs, 2011. URL http://www.cs.columbia.edu/~mcollins/crf.pdf.
[6] Yoav Goldberg. A primer on neural network models for natural language processing, 2015. URL http://u.cs.biu.ac.il/~yogo/nnlp.pdf.
[7] Daniel Jurafsky and James H. Martin. Naive Bayes and sentiment classification (draft chapter), 2016. URL https://web.stanford.edu/~jurafsky/slp3/6.pdf.
[8] Daniel Jurafsky and James H. Martin. Logistic regression (draft chapter), 2016. URL https://web.stanford.edu/~jurafsky/slp3/7.pdf.
[9] Michael Collins. The naive Bayes model, maximum-likelihood estimation, and the EM algorithm, 2011. URL http://www.cs.columbia.edu/~mcollins/em.pdf.
[10] Daniel Jurafsky and James H. Martin. Hidden Markov models (draft chapter), 2016. URL https://web.stanford.edu/~jurafsky/slp3/9.pdf.
[11] Michael Collins. Tagging with hidden Markov models, 2011. URL http://www.cs.columbia.edu/~mcollins/courses/nlp2011/notes/hmms.pdf.
[12] Daniel Jurafsky and James H. Martin. Part-of-speech tagging (draft chapter), 2016. URL https://web.stanford.edu/~jurafsky/slp3/10.pdf.
[13] Daniel Jurafsky and James H. Martin. Information extraction (draft chapter), 2016. URL https://web.stanford.edu/~jurafsky/slp3/21.pdf.
[14] Michael Collins. Probabilistic context-free grammars, 2011. URL http://www.cs.columbia.edu/~mcollins/courses/nlp2011/notes/pcfgs.pdf.
[15] Sandra Kübler, Ryan McDonald, and Joakim Nivre. Dependency Parsing. Synthesis Lectures on Human Language Technologies. Morgan and Claypool, 2009. URL http://www.morganclaypool.com/doi/pdf/10.2200/S00169ED1V01Y200901HLT002.
[16] Daniel Jurafsky and James H. Martin. Semantic role labeling and argument structure (draft chapter), 2016. URL https://web.stanford.edu/~jurafsky/slp3/22.pdf.
[17] Mark Steedman. A very short introduction to CCG, 1996. URL http://www.inf.ed.ac.uk/teaching/courses/nlg/readings/ccgintro.pdf.
[18] Michael Collins. Statistical machine translation: IBM models 1 and 2, 2011. URL http://www.cs.columbia.edu/~mcollins/courses/nlp2011/notes/ibm12.pdf.
[19] Michael Collins. Phrase-based translation models, 2013. URL http://www.cs.columbia.edu/~mcollins/pb.pdf.
[20] Daniel Jurafsky and James H. Martin. Speech and Language Processing: An Introduction to Natural Language Processing, Computational Linguistics, and Speech Recognition. Prentice Hall, third edition, forthcoming. URL https://web.stanford.edu/~jurafsky/slp3/.