CSE 446: Machine Learning

University of Washington

Autumn 2017



The syllabus is subject to change; always get the latest version from the class website.
Website: http://courses.cs.washington.edu/courses/cse446/17au
Canvas: https://canvas.uw.edu/courses/1173938/
Lectures: SIG 134, Mondays, Wednesdays, Fridays 9:30–10:20 am
Quiz: Thursdays, various times and places (check time schedule)
Instructor: Noah A. Smith
Instructor office hours: CSE 532, Mondays 12–1 pm and Wednesdays 5–6 pm, or by appointment
Teaching assistants: Kousuke Ariga, John Kaltenbach, Deric Pang, Patrick Spieker, Qiang Andrew Yu, Jane Jianyang Zhang
TA office hours:
  CSE 007                 Mondays 1:30–2:30 pm       Jane
  CSE 2nd floor breakout  Tuesdays 2:30–3:30 pm      Deric
  CSE 5th floor breakout  Tuesdays 3:30–4:30 pm      Kousuke
  CSE 220                 Wednesdays 12–1 pm         John
  CSE 3rd floor breakout  Thursdays 12:30–1:30 pm    Patrick
  CSE 021                 Fridays 3:30–4:30 pm       Andrew
Final exam: tentatively Wednesday, December 13, 8:30–10:20 am
Email course staff: cse446-staff@cs.washington.edu


Official catalogue description: Methods for designing systems that learn from data and improve with experience. Supervised learning and predictive modeling: decision trees, rule induction, nearest neighbors, Bayesian methods, neural networks, support vector machines, and model ensembles. Unsupervised learning and clustering.

1 Course Plan

date             topic and readings (chapter in [1])                       deadlines

(week 1)   9/27  introduction                                              project part 1 released
           9/28  quiz: Python review
           9/29  decision trees (1)                                        A1 released
           10/2  (continued)
(week 2)   10/4  limits of learning (2)
           10/5  quiz: linear algebra review [tarball]
           10/6  geometry and nearest neighbors (3)
           10/9  perceptron (4)
(week 3)   10/11 (continued)
           10/12 quiz review [tarball]                                     A1 due Th. 10/12
           10/13 unsupervised learning: K-means++ (guest lecture
                 by Swabha Swayamdipta) (15)
           10/16 unsupervised learning: principal components analysis      dataset due Tu. 10/17; A2 released
(week 4)   10/18 practical issues: features, performance (5)               project part 2 released
           10/19 quiz: dataset discussion
           10/20 practical issues: performance, bias/variance
           10/23 learning as minimizing loss (7)                           A2 due Tu. 10/24
(week 5)   10/25 and specifically, log loss
           10/26 quiz: loss and related topics
           10/27 optimization algorithms; regularizers                     A3 released
           10/30 probabilistic models: logistic and linear regression (9)  project part 2 due M. 10/30
(week 6)   11/1  MLE and MAP
           11/2  quiz: probability review
           11/3  probabilistic generative models and naïve Bayes           official datasets announced;
                                                                           A3 due Su. 11/5
           11/6  neural networks (10)                                      A4 released; project part 3 released
(week 7)   11/8  neural networks: backpropagation
           11/9  quiz TBD
           11/10 holiday
           11/13 bias and fairness (8)                                     A4 due Tu. 11/14
           11/15 (continued); kernel methods (11)
           11/16 quiz TBD
(week 8)   11/17 support vector machines (7.7)
           11/20 (continued); beyond binary classification (11.6, 6)
           11/22 “office hour”
           11/23 holiday
           11/24 holiday
(week 9)   11/27 learning theory (12)                                      A5 due Tu. 11/28
           11/29 ensemble methods (13)
           11/30 quiz TBD
           12/1  (continued)
(week 10)  12/4  expectation-maximization (16)                             project due Tu. 12/5
           12/6  probabilistic graphical models [2]
           12/7  quiz TBD
           12/8  finale
           12/13 exam (8:30–10:20 am)
The table above shows the planned lectures, along with readings. The official textbook for the course is Daumé [1], which is available online at no charge (http://ciml.info).

2 Evaluation

Students will be evaluated as follows:

3 Academic Integrity

Read, sign, and return the academic integrity policy for this course before turning in any work.

References

[1]    Hal Daumé III. A Course in Machine Learning (v0.9). Self-published at http://ciml.info/, 2017.

[2]    Daphne Koller, Nir Friedman, Lise Getoor, and Ben Taskar. Graphical models in a nutshell, 2007. URL https://ai.stanford.edu/~koller/Papers/Koller+al:SRL07.pdf.