The syllabus is subject to change; always get the latest version from the class website.

| Website: | http://courses.cs.washington.edu/courses/cse446/17au |
| Canvas: | https://canvas.uw.edu/courses/1173938/ |
| Lectures: | SIG 134, Mondays, Wednesdays, Fridays 9:30–10:20 am |
| Quiz: | Thursdays, various times and places (check time schedule) |
| Instructor: | Noah A. Smith |
| Instructor office hours: | CSE 532, Mondays 12–1 pm and Wednesdays 5–6 pm, or by appointment |
| Teaching assistants: | Kousuke Ariga, John Kaltenbach, Deric Pang, Patrick Spieker, Qiang Andrew Yu, Jane Jianyang Zhang |
| TA office hours: | |
| Final exam: | tentatively Wednesday, December 13, 8:30–10:20 am |
| Email course staff: | cse446-staff@cs.washington.edu |
Official catalogue description: Methods for designing systems that learn from data and improve with experience. Supervised learning and predictive modeling: decision trees, rule induction, nearest neighbors, Bayesian methods, neural networks, support vector machines, and model ensembles. Unsupervised learning and clustering.
| week | date | topic | reading (chapter in [1]) | deadlines |
| (week 1) | 9/27 | introduction | | project part 1 released |
| | 9/28 | quiz: Python review | | |
| | 9/29 | decision trees | 1 | A1 released |
| | 10/2 | (continued) | | |
| (week 2) | 10/4 | limits of learning | 2 | |
| | 10/5 | quiz: linear algebra review [tarball] | | |
| | 10/6 | geometry and nearest neighbors | 3 | |
| | 10/9 | perceptron | 4 | |
| (week 3) | 10/11 | (continued) | | |
| | 10/12 | quiz review [tarball] | | A1 due Th. 10/12 |
| | 10/13 | unsupervised learning: K-means++ (guest lecture by Swabha Swayamdipta) | 15 | |
| | 10/16 | unsupervised learning: principal components analysis | | dataset due Tu. 10/17; A2 released |
| (week 4) | 10/18 | practical issues: features, performance | 5 | project part 2 released |
| | 10/19 | quiz: dataset discussion | | |
| | 10/20 | practical issues: performance, bias/variance | | |
| | 10/23 | learning as minimizing loss | 7 | A2 due Tu. 10/24 |
| (week 5) | 10/25 | and specifically, log loss | | |
| | 10/26 | quiz: loss and related topics | | |
| | 10/27 | optimization algorithms; regularizers | | A3 released |
| | 10/30 | probabilistic models: logistic and linear regression | 9 | project part 2 due M. 10/30 |
| (week 6) | 11/1 | MLE and MAP | | |
| | 11/2 | quiz: probability review | | |
| | 11/3 | probabilistic generative models and naïve Bayes | | official datasets announced; A3 due Su. 11/5 |
| | 11/6 | neural networks | 10 | A4 released; project part 3 released |
| (week 7) | 11/8 | neural networks: backpropagation | | |
| | 11/9 | quiz TBD | | |
| | 11/10 | holiday | | |
| | 11/13 | bias and fairness | 8 | A4 due Tu. 11/14 |
| | 11/15 | (continued); kernel methods | 11 | |
| | 11/16 | quiz TBD | | |
| (week 8) | 11/17 | support vector machines | 7.7 | |
| | 11/20 | (continued); beyond binary classification | 11.6, 6 | |
| | 11/22 | “office hour” | | |
| | 11/23 | holiday | | |
| | 11/24 | holiday | | |
| (week 9) | 11/27 | learning theory | 12 | A5 due Tu. 11/28 |
| | 11/29 | ensemble methods | 13 | |
| | 11/30 | quiz TBD | | |
| | 12/1 | (continued) | | |
| (week 10) | 12/4 | expectation-maximization | 16 | project due Tu. 12/5 |
| | 12/6 | probabilistic graphical models | [2] | |
| | 12/7 | quiz TBD | | |
| | 12/8 | finale | | |
| | 12/13 | exam (8:30–10:20 am) | | |
The table above shows the planned lectures, along with the associated readings. The official textbook for the course is Daumé III [1], which is available online at no charge (http://ciml.info).
Students will be evaluated as follows:
Read, sign, and return the academic integrity policy for this course before turning in any work.
[1] Hal Daumé III. A Course in Machine Learning (v0.9), 2017. URL http://ciml.info/.
[2] Daphne Koller, Nir Friedman, Lise Getoor, and Ben Taskar. Graphical models in a nutshell, 2007. URL https://ai.stanford.edu/~koller/Papers/Koller+al:SRL07.pdf.