Exam:
The final exam is on Wednesday, June 7, 2017, 10:30am–12:20pm. The exam is open book: you are welcome to bring the textbook, the lecture slides, and any handwritten notes you have. Laptops/tablets are allowed, but internet access is NOT allowed.
Syllabus Overview:

3/28
 Introduction
 Readings (Murphy): 1.1–1.4

3/30
 Introduction (continued)
 Readings (Murphy): Same as 3/28

4/4
 Decision Trees
 Readings (Murphy): 16.2, 16.4

4/6
 Point Estimation
 Homework 1: [pdf] [tex]
 Readings (Murphy): 2.4.1

4/11
 Linear Regression
 Linear Regression (marked)

Readings (Murphy):
 Linear Regression Model: 7.1–7.3
 Error metrics, overfitting, bias-variance tradeoff: 6.4
 Regularization, ridge regression: 7.5.1
 LASSO: 13.1, 13.3–13.4.1

4/13
 Linear Regression (continued)
 Linear Regression (marked) (continued)
 Readings (Murphy): Same as 4/11

4/18
 Naive Bayes
 Readings (Murphy): 3.5

4/20
 Logistic Regression
 Readings (Murphy): 8.1–8.3, 8.5.2
 Homework 1 is due before the class.
 Homework 2: [pdf] [tex]

4/25
 Logistic Regression (continued)
 Readings (Murphy): Same as 4/20

4/27
 Perceptron
 Readings (Murphy): 8.5.0, 8.5.4

5/2
 Support Vector Machines
 Support Vector Machines (marked)
 Readings (Murphy): 14.5

5/4
 Kernels (annotated)
 Readings (Murphy): 14.4
 Homework 2 is due before the class.
 Homework 3: [pdf] [tex]

5/9
 Boosting
 Readings (Murphy): 16.4

5/11
 Boosting (continued)
 Readings (Murphy): Same as 5/9

5/16
 Clustering

Readings (Murphy):
 Clustering & K-means: 11.4.2.5
 Mixtures of Gaussians: 11.2
 EM: 11.4–11.4.2.3

5/18
 Clustering (annotated)
 Readings (Murphy): Same as 5/16
 Homework 3 is due before the class.
 Homework 4: [pdf] [tex] [country data]

5/23
 PCA
 Readings (Murphy): 12.1.0, 12.2

5/25
 PCA (continued)
 Readings (Murphy): Same as 5/23

5/30

6/1
 Neural Networks
 Homework 4 is due 11:59PM.
Text Books:
 Machine Learning: a Probabilistic Perspective, Kevin Murphy, MIT Press, 2013.
 Optional: Pattern Recognition and Machine Learning, Christopher Bishop, Springer, 2007.
 Optional: Machine Learning, Tom Mitchell, McGraw-Hill, 1997.
 Optional: The Elements of Statistical Learning, Friedman, Tibshirani, Hastie, Springer, 2001.
Homeworks:
We will have 4 homework assignments, listed below as they are assigned. The assignments will be given out roughly in weeks 2, 4, 6, and 8, and you will have two weeks to complete each one.
 Assignment 1: Decision Trees
 Assignment 2: Classifiers: Naive Bayes, Perceptron, Logistic Regression
 Assignment 3: SVMs and Ensembles
 Assignment 4: k-Means and EM.
Note that each assignment has a deadline. Anything uploaded after the deadline will be marked late. Please be careful not to overwrite an on-time submission with a late one when uploading near the deadline.
Each student has three penalty-free late days for the whole quarter; beyond that, any late submission will be penalized for each day it is late.
Please let the TA know if you cannot access any of the pages.
Grading:
The final grade will consist of homeworks (70%), a final exam (25%), and course participation (5%).
Course Administration and Policies
 Assignments will be done individually unless otherwise specified. You may discuss the subject matter with other students in the class, but all final answers must be your own work. You are expected to maintain the utmost level of academic integrity in the course.
 As we sometimes reuse problem set questions from previous years, please do not copy, refer to, or look at any solution keys while preparing your answers. Doing so will be regarded as cheating. We expect you to want to learn, not to Google for answers.
 Each student has three penalty-free late days for the whole quarter. Beyond that, late submissions are penalized (10% of the maximum grade per day).
 Comments can be sent to the instructor or TA using this anonymous feedback form (coming soon). We take all feedback very seriously and will do whatever we can to address any concerns.