The syllabus is subject to change; always get the latest version from the class website.
|Lectures:||SIG 134, Mondays, Wednesdays, Fridays 9:30–10:20 am|
|Quiz:||Thursdays, various times and places (check time schedule)|
|Instructor:||Noah A. Smith|
|Instructor office hours:||CSE 532, Mondays 12–1 pm and Wednesdays 5–6 pm or by appointment|
|Teaching assistants:||Kousuke Ariga, Qiang Andrew Yu, Jane Jianyang Zhang|
|TA office hours:||
|Final exam:||tentatively Wednesday, December 13, 8:30–10:20 am|
|Email course staff:||email@example.com|
Official catalogue description: Methods for designing systems that learn from data and improve with experience. Supervised learning and predictive modeling: decision trees, rule induction, nearest neighbors, Bayesian methods, neural networks, support vector machines, and model ensembles. Unsupervised learning and clustering.
topic and readings (chapter in CIML)
|(week 1)||9/27||introduction||project part 1 released|
|9/28||quiz: Python review|
|9/29||decision trees||1||A1 released|
|(week 2)||10/4||limits of learning||2|
|10/5||quiz: linear algebra review [tarball]|
|10/6||geometry and nearest neighbors||3|
|(week 3)||10/12||quiz review [tarball]||A1 due Th. 10/12|
|10/13||unsupervised learning: K-means++ (guest lecture by Swabha Swayamdipta)||15|
|10/16||unsupervised learning: principal components analysis||dataset due Tu. 10/17; A2 released|
|(week 4)||10/18||practical issues: features, performance||5||project part 2 released|
|10/19||quiz: dataset discussion|
|10/20||practical issues: performance, bias/variance|
|10/23||learning as minimizing loss||7||A2 due Tu. 10/24|
|(week 5)||10/25||and specifically, log loss|
|10/26||quiz: loss and related topics|
|10/27||optimization algorithms; regularizers||A3 released|
|10/30||probabilistic models: logistic and linear regression||9||project part 2 due M. 10/30|
|(week 6)||11/1||MLE and MAP|
|11/2||quiz: probability review|
|11/3||probabilistic generative models and naïve Bayes||official datasets announced; A3 due Su. 11/5|
|11/6||neural networks||10||A4 released; project part 3 released|
|(week 7)||11/8||neural networks: backpropagation|
|11/13||bias and fairness||8||A4 due Tu. 11/14|
|11/15||(continued); kernel methods||11|
|(week 8)||11/17||support vector machines||7.7|
|11/20||(continued); beyond binary classification||11.6, 6|
|(week 9)||11/27||learning theory||12||A5 due Tu. 11/28|
|(week 10)||12/4||expectation-maximization||16||project due Tu. 12/5|
|12/6||probabilistic graphical models|||
|12/13||exam (8:30 am–10:20 am)|
The table above shows the planned lectures, along with readings. The official textbook for the course is Daumé (2017), which is available online at no charge (http://ciml.info).
Students will be evaluated as follows:
Read, sign, and return the academic integrity policy for this course before turning in any work.
Hal Daumé III. A Course in Machine Learning (v0.9). Self-published at http://ciml.info/, 2017.
Daphne Koller, Nir Friedman, Lise Getoor, and Ben Taskar. Graphical models in a nutshell, 2007. URL https://ai.stanford.edu/~koller/Papers/Koller+al:SRL07.pdf.