Syllabus Overview:
-
Week 1: Introduction & decision trees
-
3/28
- Introduction
- Reading: Murphy 1.1 - 1.3
-
3/30
- Decision trees
- Slides
- Lecture Notes
- Reading: Murphy 16.2.1 - 16.2.4
- Read Murphy Chapter 2 for probability background (if needed)
- Optional reading: Mitchell: Chapter 3
- Optional reading: Friedman: 9.2
-
4/1
- Decision trees
- Slides
- Lecture Notes
- Reading: [same as 3/30]
-
Week 2: Decision trees & point estimation
-
4/4 (Monday)
- Decision trees
- Slides
- Lecture Notes
- Reading: [same as 3/30]
- Homework 1 out.
-
4/6
- Point estimation
- Lecture Notes
- Reading: probability review (as needed): Murphy 2.1, 2.2, 2.5
- Reading: generative models: Murphy 3.1, 3.2, 3.3
- Reading: Bayesian statistics: Murphy 5.1, 5.2
- Reading: Gaussians (we will probably only get this far on Fri): Murphy 4.1
- Optional reading: Mitchell 6.1 - 6.6
-
4/7
- Recitation: Python Review
-
4/8
- Point estimation
- Lecture Notes
- Reading: [same as 4/6]
-
Week 3: Linear regression
-
4/11
- Linear regression
- Lecture Notes
- Reading: Murphy 7.1, 7.2, 7.3, 7.5.1, 7.5.4
- Optional reading: Friedman 3.1, 3.2, 3.4.1, 3.4.2
- Optional reading: Bishop 3.1.1, 3.1.2, 3.1.3, 3.1.4
-
4/13
- Linear regression
- Lecture Notes
- Slides
- Reading: [same as 4/11]
-
4/14
- Recitation: Linear Algebra Review
-
4/15
- Naive Bayes
- Lecture Notes
- Slides
- Reading: Murphy 3.5
- Optional reading: Mitchell 6.9
-
Week 4: Naive Bayes
-
4/18 (Monday)
- Homework 2 out. [pdf] [tex (zip)] [data (mnist.zip)]
- Naive Bayes
- Lecture Notes
- Slides
- Reading: [same as 4/15]
-
4/20
- Logistic Regression
- Lecture Notes
- Slides
- Reading: Murphy 8.1, 8.2, 8.3
- Optional reading: Friedman 4.4
- Programming section of Homework 2 out.
-
4/22 (Friday)
- Homework 1 due.
- Logistic Regression
- Lecture Notes
- Reading: Murphy 8.6
-
Week 5: Neural networks
-
4/25
- Logistic Regression and Neural Networks
- Lecture Notes
- Reading: Murphy 16.5
- Optional reading: Bishop 5.1, 5.2, 5.3
- Optional reading: Friedman 11.3, 11.4
- Optional reading: Mitchell 4.1, 4.2, 4.3, 4.5, 4.6
-
4/27
- Neural Networks
- Lecture Notes
- Reading: [same as 4/25]
-
4/29
- Neural Networks
- Lecture Notes
- Reading: [same as 4/25]
-
Week 6: Support vector machines (SVMs)
-
5/2 (Monday)
- Support vector machines (SVMs)
- Homework 3 out.
- Midterm Study Topics
- Lecture Notes
- Reading: Murphy 8.5.4
- Additional reading: Andrew Ng's lecture notes 1-6 (highly recommended, though notation is a little different from mine)
- Optional reading: Bishop 7.1
- Optional reading: Friedman 12.1, 12.2
-
5/4 (Wednesday)
- Lecture Notes
- Reading: Murphy 14.2, 14.5.2, 14.5.3, 14.5.4
- Additional reading: Andrew Ng's lecture notes 7
- Optional reading: Friedman 12.3
-
5/5 (Thursday)
- Midterm review.
-
5/6 (Friday)
- Midterm.
-
Week 7: Model ensembles
-
5/9 (Monday)
- Homework 2 due.
- SVM Lecture Notes
- Ensembles Lecture Notes
- Ensembles Slides
- Reading: Murphy 16.2.5, 16.4
- Additional reading: John Duchi's lecture notes
- Optional reading: Bishop 14.1, 14.3
- Optional reading: Friedman 10.1, 10.2, 10.3, 10.4
-
5/11 (Wednesday)
- Lecture Notes
- Slides
- Reading: [same as 5/9]
-
5/13 (Friday)
- Lecture Notes
- Slides
- Reading: Murphy 6.3, 6.4, 6.5
- Optional reading: Friedman 7.1 - 7.10
- Optional reading: Mitchell Chap 7
-
Week 8: Learning theory, clustering
-
5/16 (Monday)
- Lecture Notes
- Slides
- Additional reading: No Free Lunches for Anyone
- Reading: [same as 5/13]
-
5/18 (Wednesday)
- Lecture Notes
- Slides
- Lecture: watch online and come to class with questions! See Piazza
- Reading: Murphy 11.1, 11.2, 11.3, 11.4.1, 11.4.2
- Optional reading: Friedman 13.2.1 - 13.2.3
- Optional reading: Bishop Chap 9
-
5/19 (Thursday)
- Section: review midterm questions and differentiation
-
5/20 (Friday)
- Lecture Notes
- Slides
- Lecture: watch online and come to class with questions! See Piazza
- Reading: [same as 5/18]
-
Week 9: Dimensionality reduction
-
5/23 (Monday)
- Homework 3 due.
- Homework 4 out. [pdf] [tex] [data (zip)]
- Lecture Notes
- Reading: [same as 5/18]
-
5/25 (Wednesday)
- Lecture Notes
- Reading: Murphy 1.3.2, 12.2
-
5/27 (Friday)
- Lecture Notes
- Reading: [same as 5/25]
-
Week 10: Review & conclusions
-
Week 11: Finals week
-
6/6 (Monday)
- Homework 4 due.
-
6/8 (Wednesday)
- Final exam: 8:30 am
Textbooks:
- Machine Learning: a Probabilistic Perspective, Kevin Murphy, MIT Press, 2013.
- Optional: Pattern Recognition and Machine Learning, Christopher Bishop, Springer, 2007.
- Optional: Machine Learning, Tom Mitchell, McGraw-Hill, 1997.
- Optional: The Elements of Statistical Learning, Trevor Hastie, Robert Tibshirani, and Jerome Friedman, Springer, 2001 (cited as "Friedman" in the readings above).
Homeworks:
We will have 4 homework assignments, which will be listed below as they are assigned. The assignments will be given out roughly in weeks 2, 4, 6, and 8, and you will have two weeks to complete each one.
- Assignment 1: Decision Trees, Point Estimation (Due: Friday 4/22)
- Assignment 2: Supervised Learning I: Regression, Naive Bayes, Neural nets (Due: Monday 5/9)
- Assignment 3: Supervised Learning II: SVMs and Ensembles (Due: Monday 5/23)
- Assignment 4: Unsupervised learning (Due: Monday 6/6)
Submission instructions will be posted here once the first homework is assigned.
Please be careful not to overwrite an on-time submission with a late one when uploading near the deadline.
Each student has three penalty-free late days for the whole quarter; beyond that, any late submission will be penalized for each day it is late.
Exam:
There will be a midterm exam (in the sixth week) and a final exam for this course (time and location TBA). The exams are open note: you are welcome to bring the textbook, the lecture slides, and any handwritten notes you have.
Grading:
The final grade will consist of homeworks (65%), a midterm exam (10%), a cumulative final exam (20%), and in-class participation (5%).
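As a worked example of how these weights combine, here is a minimal sketch in Python (the component scores are made-up illustrative values, not real data):

    # Hypothetical final-grade calculation using the weights above.
    # Each component score is assumed to be on a 0-100 scale.
    weights = {"homework": 0.65, "midterm": 0.10, "final": 0.20, "participation": 0.05}
    scores = {"homework": 88.0, "midterm": 75.0, "final": 82.0, "participation": 100.0}  # example values

    final_grade = sum(weights[k] * scores[k] for k in weights)
    print(round(final_grade, 1))  # 0.65*88 + 0.10*75 + 0.20*82 + 0.05*100 = 86.1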
Course Administration and Policies
- Assignments will be done individually unless otherwise specified. You may discuss the subject matter with other students in the class, but all final answers must be your own work. You are expected to maintain the utmost level of academic integrity in the course.
- As we sometimes reuse problem set questions from previous years, please do not copy, refer to, or look at any solution keys while preparing your answers. Doing so will be regarded as cheating. We expect you to want to learn, not to google for answers.
- Each student has three penalty-free late days for the whole quarter. Beyond that, late submissions are penalized (10% of the maximum grade per day); a worked example appears after this list.
- Comments can be sent to the instructor or TA using this anonymous feedback form. We take all feedback very seriously and will do whatever we can to address any concerns.
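To make the late-penalty arithmetic concrete, here is a minimal sketch in Python; it assumes the 10% deduction applies to the assignment's maximum grade for each full day late beyond any remaining free late days (check with the course staff for the exact rule):

    # Hypothetical late-penalty calculation (assumed interpretation of the policy above).
    def penalized_score(raw_score, max_score, days_late, free_late_days_left):
        # Only days beyond the remaining free late days incur the 10%-per-day penalty.
        penalized_days = max(0, days_late - free_late_days_left)
        penalty = 0.10 * max_score * penalized_days
        return max(0.0, raw_score - penalty)

    # Example: a 45/50 assignment turned in 2 days late with no free late days left
    # loses 2 * 10% * 50 = 10 points.
    print(penalized_score(45, 50, 2, 0))  # 35.0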