Outline

Logistics

Course Topics by Week

Learning: Mature Technology

Defining a Learning Problem

Choosing the Training Experience

Choosing the Target Function

The Ideal Evaluation Function

Choosing a Representation for the Target Function

Example: Checkers

Target Function

Representation

AI = Representation + Search

Concept Learning

Decision Tree Representation of Edible

Space of Decision Trees

Example: “Good day for tennis”

Experience: “Good day for tennis”

Decision Tree Representation

DT Learning as Search

Simplest Tree

Successors

To be decided:

Intuition: Information Gain

Entropy (disorder) is bad; homogeneity is good

Entropy

Information Gain

Gain of Splitting on Wind
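The entropy and information-gain computations behind this split can be sketched directly. The counts below (9 positive / 5 negative overall; Wind=Weak: 6+/2−, Wind=Strong: 3+/3−) are assumed from the standard 14-example "good day for tennis" dataset in Mitchell's textbook, which these slides appear to follow.

```python
from math import log2

def entropy(pos, neg):
    """Entropy of a boolean-labeled sample with pos positives and neg negatives."""
    total = pos + neg
    result = 0.0
    for count in (pos, neg):
        if count:  # 0 * log(0) is taken as 0
            p = count / total
            result -= p * log2(p)
    return result

def info_gain(parent, splits):
    """Gain = entropy(parent) minus the weighted entropy of the children.
    parent and each split are (pos, neg) counts."""
    total = sum(p + n for p, n in splits)
    return entropy(*parent) - sum(
        (p + n) / total * entropy(p, n) for p, n in splits)

# Splitting the 9+/5- sample on Wind: Weak -> 6+/2-, Strong -> 3+/3-
gain_wind = info_gain((9, 5), [(6, 2), (3, 3)])
```

With these counts the gain of splitting on Wind comes out to about 0.048 bits, which is why Wind loses to better attributes at the root.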

Evaluating Attributes

Resulting Tree …

Recurse!

One Step Later…

Overfitting…

Summary: Learning = Search

Hill Climbing is Incomplete

Version Spaces

Restricted Hypothesis Representation

Consistency

General to Specific Ordering

Correspondence: A hypothesis = set of instances

Version Space: Compact Representation

Boundary Sets

Candidate Elimination Algorithm

Initialization

Training Example 1

Training Example 2

Training Example 3
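The boundary-set updates traced in these examples can be sketched for conjunctive hypotheses, where '?' matches any attribute value. The attribute domains and the four EnjoySport training examples below are assumed from Mitchell's standard presentation of the algorithm, not necessarily the slides' own data.

```python
def matches(h, x):
    """True iff hypothesis h (tuple of values, '?' = any) covers instance x."""
    return all(a == '?' or a == b for a, b in zip(h, x))

def more_general(h1, h2):
    """True iff h1 covers every instance that h2 covers."""
    return all(a == '?' or a == b for a, b in zip(h1, h2))

def min_generalize(s, x):
    """Minimal generalization of s that covers positive instance x."""
    return tuple(a if a == b else '?' for a, b in zip(s, x))

def min_specializations(g, x, domains):
    """Minimal specializations of g; each excludes negative instance x."""
    out = []
    for i, a in enumerate(g):
        if a == '?':
            for v in domains[i]:
                if v != x[i]:
                    out.append(g[:i] + (v,) + g[i + 1:])
    return out

def candidate_elimination(examples, domains):
    # Assumes the first example is positive; S starts there
    # (skipping the initial all-empty hypothesis).
    S = [examples[0][0]]
    G = [tuple('?' for _ in domains)]
    for x, positive in examples:
        if positive:
            G = [g for g in G if matches(g, x)]
            S = [s if matches(s, x) else min_generalize(s, x) for s in S]
            S = [s for s in S if any(more_general(g, s) for g in G)]
        else:
            S = [s for s in S if not matches(s, x)]
            new_G = []
            for g in G:
                if not matches(g, x):
                    new_G.append(g)
                    continue
                for h in min_specializations(g, x, domains):
                    if any(more_general(h, s) for s in S):
                        new_G.append(h)
            # drop any G member strictly more specific than another
            G = [g for g in new_G
                 if not any(h != g and more_general(h, g) for h in new_G)]
    return S, G

domains = [("Sunny", "Cloudy", "Rainy"), ("Warm", "Cold"),
           ("Normal", "High"), ("Strong", "Weak"),
           ("Warm", "Cool"), ("Same", "Change")]
examples = [  # Mitchell's EnjoySport training data (assumed)
    (("Sunny", "Warm", "Normal", "Strong", "Warm", "Same"), True),
    (("Sunny", "Warm", "High", "Strong", "Warm", "Same"), True),
    (("Rainy", "Cold", "High", "Strong", "Warm", "Change"), False),
    (("Sunny", "Warm", "High", "Strong", "Cool", "Change"), True),
]
S, G = candidate_elimination(examples, domains)
```

After these four examples the version space is bounded by S = (Sunny, Warm, ?, Strong, ?, ?) and G = {(Sunny, ?, …), (?, Warm, ?, …)}, the standard result for this dataset.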

A Biased Hypothesis Space

Comparison

An Unbiased Learner

Two kinds of bias

Formal model of learning

PAC Learning

Example of a PAC learner

Sample complexity
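For a finite hypothesis space and a consistent learner, the standard PAC sample-complexity bound is m ≥ (1/ε)(ln|H| + ln(1/δ)). A small sketch evaluating it for conjunctions of n boolean literals, where |H| = 3^n (each variable appears positive, negated, or not at all); the particular ε, δ, and n values are illustrative assumptions.

```python
from math import ceil, log

def pac_sample_bound(h_size, epsilon, delta):
    """Examples sufficient for a consistent learner over a finite hypothesis
    space to be probably (prob >= 1 - delta) approximately (error < epsilon)
    correct: m >= (1/epsilon) * (ln|H| + ln(1/delta))."""
    return ceil((log(h_size) + log(1 / delta)) / epsilon)

# Conjunctions of 10 boolean literals: |H| = 3**10, with eps=0.1, delta=0.05
m = pac_sample_bound(3 ** 10, epsilon=0.1, delta=0.05)
```

Note the bound grows only logarithmically in |H|, which is why even an exponentially large conjunctive space stays learnable from modest data.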

Infinite Hypothesis Spaces

Vapnik-Chervonenkis Dimension

Dichotomies of size 0 and 1

Dichotomies of size 2

Dichotomies of size 3 and 4
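Whether a set of points is shattered can be checked by brute force: a hypothesis class shatters the set iff it realizes all 2^n dichotomies. A minimal sketch using 1-D threshold classifiers h_t(x) = [x ≥ t] as an assumed illustrative class (not necessarily the one in the slides): they shatter any single point but no pair, since the label pattern (+, −) with the smaller point positive is unachievable, so their VC dimension is 1.

```python
def shatters(points, hypotheses):
    """True iff the hypotheses realize every dichotomy of the points.
    Each hypothesis is a predicate point -> bool."""
    achieved = {tuple(h(p) for p in points) for h in hypotheses}
    return len(achieved) == 2 ** len(points)

def threshold_hypotheses(points):
    """1-D threshold classifiers h_t(x) = (x >= t). Thresholds below,
    between, and above the points realize every achievable labeling."""
    xs = sorted(points)
    cands = ([xs[0] - 1]
             + [(a + b) / 2 for a, b in zip(xs, xs[1:])]
             + [xs[-1] + 1])
    return [lambda x, t=t: x >= t for t in cands]

one = [2.0]
two = [1.0, 3.0]
shatter_one = shatters(one, threshold_hypotheses(one))  # True: both labels reachable
shatter_two = shatters(two, threshold_hypotheses(two))  # False: (+, -) impossible
```

The same brute-force check, with a richer class and more points, is what the dichotomy-counting slides carry out by hand.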

Ensembles of Classifiers

How voting helps
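The benefit of voting can be quantified with a binomial tally: if n classifiers err independently, each with rate p < 0.5, a majority vote errs only when more than half are wrong simultaneously. A sketch with assumed illustrative figures (five voters, 30% individual error); the independence assumption is the key caveat, and it is often violated in practice.

```python
from math import comb

def majority_vote_error(p, n):
    """Error rate of a majority vote over n independent binary classifiers,
    each wrong with probability p: P(more than n/2 are wrong)."""
    return sum(comb(n, k) * p ** k * (1 - p) ** (n - k)
               for k in range(n // 2 + 1, n + 1))

# Five independent classifiers, each wrong 30% of the time:
ensemble_err = majority_vote_error(0.3, 5)
```

Here the ensemble error drops to roughly 0.163, well below the 0.3 of any single voter, and it shrinks further as more independent classifiers are added.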

Constructing Ensembles

Review: Learning

Email: weld@cs.washington.edu

Other information: CSE 592, Lecture 8
