Syllabus Overview:

  • Week 1: Introduction & decision trees

    • 3/28
    • 3/30
      • Decision trees
      • Slides
      • Lecture Notes
      • Reading: Murphy 16.2.1 - 16.2.4
      • Read Murphy Chapter 2 for probability background (if needed)
      • Optional reading: Mitchell: Chapter 3
      • Optional reading: Friedman: 9.2
    • 4/1
  • Week 2: Decision trees & point estimation

    • 4/4 (Monday)
    • 4/6
      • Point estimation
      • Lecture Notes
      • Reading: probability review (as needed): Murphy 2.1, 2.2, 2.5
      • Reading: generative models: Murphy 3.1, 3.2, 3.3
      • Reading: Bayesian statistics: Murphy 5.1, 5.2
      • Reading: Gaussians (we will probably only get this far on Fri): Murphy 4.1
      • Optional reading: Mitchell 6.1 - 6.6
    • 4/7
      • Recitation: Python Review
    • 4/8
  • Week 3: Linear regression

    • 4/11
      • Linear regression
      • Lecture Notes
      • Reading: Murphy 7.1, 7.2, 7.3, 7.5.1, 7.5.4
      • Optional reading: Friedman 3.1, 3.2, 3.4.1, 3.4.2
      • Optional reading: Bishop 3.1.1, 3.1.2, 3.1.3, 3.1.4
    • 4/13
    • 4/14
      • Recitation: Linear Algebra Review
    • 4/15
  • Week 4: Naive Bayes

  • Week 5: Neural networks

    • 4/25
      • Logistic Regression and Neural Networks
      • Lecture Notes
      • Reading: Murphy 16.5
      • Optional reading: Bishop 5.1, 5.2, 5.3
      • Optional reading: Friedman 11.3, 11.4
      • Optional reading: Mitchell 4.1, 4.2, 4.3, 4.5, 4.6
    • 4/27
    • 4/29
  • Week 6: Support vector machines (SVMs)

    • 5/2 (Monday)
    • 5/4 (Wednesday)
    • 5/5 (Thursday)
      • Midterm review.
    • 5/6 (Friday)
      • Midterm.
  • Week 7: Model ensembles

  • Week 8: Learning theory, clustering

    • 5/16 (Monday)
    • 5/18 (Wednesday)
      • Lecture Notes
      • Slides
      • Lecture: watch online and come to class with questions! See Piazza
      • Reading: Murphy 11.1, 11.2, 11.3, 11.4.1, 11.4.2
      • Optional reading: Friedman 13.2.1 - 13.2.3
      • Optional reading: Bishop Chap 9
    • 5/19 (Thursday)
      • Section: review midterm questions and differentiation
    • 5/20 (Friday)
      • Lecture Notes
      • Slides
      • Lecture: watch online and come to class with questions! See Piazza
      • Reading: [same as 5/18]
  • Week 9: Dimensionality reduction

  • Week 10: Review & conclusions

    • 6/1 (Wednesday)
    • 6/2 (Thursday)
      • Section: final review
    • 6/3 (Friday)
  • Week 11: Finals week

    • 6/6 (Monday)
      • Homework 4 due.
    • 6/8 (Wednesday)
      • Final exam: 8:30 am

Text Books:

The readings above refer to the following books:

  • Murphy, Machine Learning: A Probabilistic Perspective
  • Bishop, Pattern Recognition and Machine Learning
  • Hastie, Tibshirani & Friedman, The Elements of Statistical Learning (cited as "Friedman")
  • Mitchell, Machine Learning

Homeworks:

We will have 4 homework assignments, which will be listed below as they are assigned. The assignments will be given out roughly in weeks 2, 4, 6, and 8, and you will have two weeks to complete each one.

  • Assignment 1: Decision Trees, Point Estimation (Due: Friday 4/22)
  • Assignment 2: Supervised Learning I: Regression, Naive Bayes, Neural Nets (Due: Monday 5/9)
  • Assignment 3: Supervised Learning II: SVMs and Ensembles (Due: Monday 5/23)
  • Assignment 4: Unsupervised Learning (Due: Monday 6/6)

Submission instructions will be posted here once the first homework is assigned.

Please be careful not to overwrite an on-time submission with a late one when uploading near the deadline.

Each student has three penalty-free late days for the whole quarter; beyond that, any late submission will be penalized for each day it is late.


Exam:

There will be a midterm (6th week) and a final exam for this course (time and location TBA). The exams are open note: you are welcome to bring the textbook, the lecture slides, and any handwritten notes you have.

Grading:

The final grade will consist of homeworks (65%), a midterm exam (10%), a cumulative final exam (20%), and in-class participation (5%).
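
For concreteness, here is a minimal Python sketch of how such a weighted average could be computed. The weights are the ones stated above; the component scores are hypothetical, for illustration only.

    # Weights from the grading breakdown above; scores are made-up example values.
    weights = {"homeworks": 0.65, "midterm": 0.10, "final": 0.20, "participation": 0.05}
    scores = {"homeworks": 90.0, "midterm": 80.0, "final": 85.0, "participation": 100.0}

    final_grade = sum(weights[k] * scores[k] for k in weights)
    print(f"Final grade: {final_grade:.1f}%")  # 88.5% for these example scores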


Course Administration and Policies:

  • Assignments will be done individually unless otherwise specified. You may discuss the subject matter with other students in the class, but all final answers must be your own work. You are expected to maintain the utmost level of academic integrity in the course.
  • As we sometimes reuse problem set questions from previous years, please do not copy, refer to, or look at any solution keys while preparing your answers. Doing so will be regarded as cheating. We expect you to want to learn, not to google for answers.
  • Each student has three penalty-free late days for the whole quarter. Beyond that, late submissions are penalized at 10% of the maximum grade per day; see the sketch after this list.
  • Comments can be sent to the instructor or TA using this anonymous feedback form. We take all feedback very seriously and will do whatever we can to address any concerns.
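
To make the late-day arithmetic concrete, here is a small illustrative Python sketch. The 10%-per-day figure comes from the policy above; the function name, the example numbers, and the assumption that remaining free days are applied before any penalty are ours, not an official calculator.

    def late_penalty(raw_score, max_score, days_late, free_days_left):
        # Days beyond the remaining penalty-free allowance each cost
        # 10% of the maximum grade (our reading of the policy above).
        penalized_days = max(0, days_late - free_days_left)
        return max(0.0, raw_score - 0.10 * max_score * penalized_days)

    # Example: a 100-point assignment scored 92, submitted 2 days late
    # with 1 penalty-free late day remaining -> one penalized day, -10 points.
    print(late_penalty(92, 100, days_late=2, free_days_left=1))  # 82.0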