Final Examination
CSE 415: Introduction to Artificial Intelligence
The University of Washington, Seattle, Spring 2016
Date: Tuesday, June 7 (2:30-4:30PM)
Format: The final exam will be similar in format to the midterm exam, but longer. The topics covered will be drawn from the following list, which includes some topics from the first part of the course and some from the second.
Topics:
The Turing Test

Python Data Structures

  Dictionaries

  Lists:
    creating, accessing (including slices), copying
    deep vs. shallow copying
    list comprehensions
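As a quick review of the list topics above, here is a minimal sketch of the shallow-vs-deep copy distinction, slices, and a comprehension (the variable names are just for illustration):

```python
import copy

# A nested list: a shallow copy shares the inner lists,
# while a deep copy duplicates them recursively.
grid = [[1, 2], [3, 4]]
shallow = list(grid)           # same effect as grid.copy() or grid[:]
deep = copy.deepcopy(grid)

grid[0][0] = 99                # mutate an inner list shared with `shallow`
# Now shallow[0][0] is 99, but deep[0][0] is still 1.

# List comprehensions and slices:
evens = [x for x in range(10) if x % 2 == 0]
first_three = evens[:3]        # a slice copies the selected elements
```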

ISA hierarchies
  Knowledge representation
  Inferences using partial order properties
  Redundancy via partial order properties
  Inferences using inheritance
    inheritable and noninheritable properties
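A toy sketch of property inheritance over an ISA hierarchy (the concepts and properties here are invented for illustration): a property stored locally on a concept overrides what would otherwise be inherited from an ancestor.

```python
# Parent links of a small ISA hierarchy, plus local properties.
isa = {"canary": "bird", "penguin": "bird", "bird": "animal"}
props = {
    "animal": {"alive": True},
    "bird": {"can_fly": True},
    "penguin": {"can_fly": False},   # local value blocks inheritance
}

def lookup(concept, prop):
    """Look for prop at concept, then walk up the ISA chain."""
    while concept is not None:
        if prop in props.get(concept, {}):
            return props[concept][prop]
        concept = isa.get(concept)
    return None                       # not found anywhere in the chain
```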

State-space search
  States, state spaces, operators, preconditions, moves
  Heuristic evaluation functions
  Iterative depth-first search, recursive depth-first search
  Breadth-first search, best-first search, uniform-cost search
  Iterative deepening, A* search
  Admissible heuristics
  Genetic search
    Application to the Traveling Salesman Problem
  Case-based reasoning
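The searches above share one skeleton; a minimal A* sketch (the helper names `successors` and `h` are illustrative, not from the course code) shows the priority-queue pattern, with f = g + h ordering the frontier:

```python
import heapq

def astar(start, goal, successors, h):
    """A* search.  successors(s) yields (next_state, step_cost) pairs;
    h is an admissible heuristic.  Returns (cost, path) or None."""
    frontier = [(h(start), 0, start, [start])]   # (f, g, state, path)
    best_g = {start: 0}
    while frontier:
        f, g, s, path = heapq.heappop(frontier)
        if s == goal:
            return g, path
        for s2, c in successors(s):
            g2 = g + c
            if g2 < best_g.get(s2, float("inf")):
                best_g[s2] = g2
                heapq.heappush(frontier, (g2 + h(s2), g2, s2, path + [s2]))
    return None

# Toy example: states are integers on a line; stepping right costs 1.
succ = lambda n: [(n + 1, 1)]
h = lambda n: max(0, 3 - n)        # admissible: never overestimates
cost, path = astar(0, 3, succ, h)
```

Setting h to zero everywhere turns the same code into uniform-cost search.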

Problem formulation
  States, operators, goal criteria
  Rittel and Webber's 10 characteristics of wicked problems

Minimax search for 2-player, zero-sum games
  Static evaluation functions
  Backed-up values
  Alpha-beta pruning
  Zobrist hashing
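A compact sketch of minimax with alpha-beta pruning (the `moves` and `evaluate` parameters are illustrative placeholders for a real game's move generator and static evaluation function):

```python
def alphabeta(state, depth, alpha, beta, maximizing, moves, evaluate):
    """Minimax with alpha-beta pruning for a 2-player zero-sum game."""
    children = moves(state)
    if depth == 0 or not children:
        return evaluate(state)          # static evaluation at the frontier
    if maximizing:
        value = float("-inf")
        for child in children:
            value = max(value, alphabeta(child, depth - 1, alpha, beta,
                                         False, moves, evaluate))
            alpha = max(alpha, value)
            if alpha >= beta:
                break                   # beta cutoff: Min avoids this branch
        return value
    else:
        value = float("inf")
        for child in children:
            value = min(value, alphabeta(child, depth - 1, alpha, beta,
                                         True, moves, evaluate))
            beta = min(beta, value)
            if alpha >= beta:
                break                   # alpha cutoff
        return value

# A two-ply game tree as nested lists; the leaves are static values.
tree = [[3, 5], [2, 9]]
moves = lambda s: s if isinstance(s, list) else []
root_value = alphabeta(tree, 4, float("-inf"), float("inf"),
                       True, moves, lambda s: s)
```

On this tree, Max backs up min(3, 5) = 3 from the first branch; the second branch is cut off after the leaf 2 is seen, since Min can already force a value below 3 there.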

Propositional Logic
  Satisfiability, consistency
  Perfect induction
  Modus Ponens
  Resolution, including clause form
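Perfect induction means checking a formula against every truth assignment; the same table answers both satisfiability (some row true) and validity (all rows true). A small sketch, encoding formulas as Python functions of an assignment dict:

```python
from itertools import product

def truth_table_check(variables, formula):
    """Perfect induction: evaluate formula over every assignment.
    Returns (satisfiable, valid)."""
    results = [formula(dict(zip(variables, vals)))
               for vals in product([False, True], repeat=len(variables))]
    return any(results), all(results)

# Modus Ponens as a validity check: ((P -> Q) and P) -> Q,
# with (A -> B) written as (not A or B).
modus_ponens = lambda v: (not ((not v["P"] or v["Q"]) and v["P"])) or v["Q"]
```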

Probabilistic reasoning
  Conditional probability
  Priors, likelihoods, and posteriors
  Bayes' rule
  Odds and conversion between odds and probability
  The joint probability distribution
  Marginal probabilities
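A worked numeric sketch of Bayes' rule and the odds/probability conversion (the prior and likelihood values below are made up for illustration):

```python
def bayes(prior, likelihood, evidence_prob):
    """Posterior P(H|E) = P(E|H) * P(H) / P(E)."""
    return likelihood * prior / evidence_prob

def prob_to_odds(p):
    return p / (1 - p)

def odds_to_prob(o):
    return o / (1 + o)

# Hypothetical numbers: P(H) = 0.01, P(E|H) = 0.9, P(E|not H) = 0.05.
prior = 0.01
p_e = 0.9 * prior + 0.05 * (1 - prior)   # total probability of the evidence
posterior = bayes(prior, 0.9, p_e)       # about 0.154, despite the 0.9 likelihood
```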

Markov Decision Processes
  States, actions, transition model, reward function
  Values, Q-states, and Q-values
  Bellman updates
  Policies, policy extraction
  Parameters alpha and epsilon used in Q-learning
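The Bellman update and policy extraction can be sketched in a few lines of value iteration (the function signatures for T and R below are one possible encoding, chosen for this illustration):

```python
def value_iteration(states, actions, T, R, gamma=0.9, iters=100):
    """Repeated Bellman updates:
    V(s) <- max_a sum_{s'} T(s,a,s') * [R(s,a,s') + gamma * V(s')].
    T(s, a) returns a list of (next_state, probability) pairs."""
    V = {s: 0.0 for s in states}
    for _ in range(iters):
        V = {s: max(sum(p * (R(s, a, s2) + gamma * V[s2])
                        for s2, p in T(s, a))
                    for a in actions(s)) if actions(s) else 0.0
             for s in states}
    return V

def extract_policy(states, actions, T, R, V, gamma=0.9):
    """Policy extraction: in each state pick the action of maximum Q-value."""
    return {s: max(actions(s),
                   key=lambda a: sum(p * (R(s, a, s2) + gamma * V[s2])
                                     for s2, p in T(s, a)))
            for s in states if actions(s)}

# Toy chain MDP: move right from 0 toward terminal state 2;
# reward 1 on entering state 2.
chain_states = [0, 1, 2]
chain_actions = lambda s: ["right"] if s < 2 else []
chain_T = lambda s, a: [(s + 1, 1.0)]
chain_R = lambda s, a, s2: 1.0 if s2 == 2 else 0.0
V = value_iteration(chain_states, chain_actions, chain_T, chain_R, gamma=0.5)
pi = extract_policy(chain_states, chain_actions, chain_T, chain_R, V, gamma=0.5)
```

With gamma = 0.5 the reward discounts by half per step, so V(1) = 1 and V(0) = 0.5.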

Perceptrons
  How to compute AND, OR, and NOT
  Simple pattern recognition (e.g., 5 x 5 binary image
    inputs for optical character recognition)
  Training sets, training sequences, and the perceptron
    training algorithm
  Linear separability and the perceptron training theorem
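A minimal sketch of the perceptron training algorithm learning AND (the learning-rate and epoch parameters are illustrative defaults; by the perceptron training theorem the loop converges because AND is linearly separable):

```python
def train_perceptron(examples, n_inputs, rate=1.0, epochs=100):
    """Perceptron training.  examples: list of (inputs, target) pairs
    with targets in {0, 1}.  Returns learned (weights, bias)."""
    w = [0.0] * n_inputs
    b = 0.0
    for _ in range(epochs):
        errors = 0
        for x, t in examples:
            y = 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0
            if y != t:
                errors += 1
                for i in range(n_inputs):
                    w[i] += rate * (t - y) * x[i]   # nudge weights toward target
                b += rate * (t - y)
        if errors == 0:     # converged (guaranteed if linearly separable)
            break
    return w, b

AND_examples = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w, b = train_perceptron(AND_examples, 2)
predict = lambda x: 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0
```

OR trains the same way with different targets; NOT needs only one input and a positive bias.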

Classification using Naive Bayes classifiers
  The naive Bayes assumption
  Division by P(E) not necessary for classification
  Adding 1 to counts when estimating P(Ei | Cj): why and how
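The three naive Bayes points above fit in one short sketch (the spam/ham data is invented for illustration): the naive independence assumption turns the score into a product of per-feature terms, P(E) is dropped since it is constant across classes, and add-1 smoothing keeps unseen feature values from zeroing out a class.

```python
from math import log

def train_naive_bayes(examples):
    """examples: list of (binary feature tuple, class label) pairs.
    Returns a classify(features) function using add-1 smoothing."""
    classes = {}
    for feats, c in examples:
        classes.setdefault(c, []).append(feats)
    n = len(examples)

    def classify(feats):
        best, best_score = None, float("-inf")
        for c, rows in classes.items():
            # log P(C) + sum_i log P(Ei | C); division by P(E) is
            # skipped because it is identical for every class.
            score = log(len(rows) / n)
            for i, v in enumerate(feats):
                match = sum(1 for r in rows if r[i] == v)
                # add 1 to the count (and 2 to the denominator, one per
                # binary value) so no estimate is ever exactly zero
                score += log((match + 1) / (len(rows) + 2))
            if score > best_score:
                best, best_score = c, score
        return best
    return classify

toy_data = [((1, 1), "spam"), ((1, 0), "spam"), ((0, 0), "ham"), ((0, 1), "ham")]
classify = train_naive_bayes(toy_data)
```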

Hunt's Algorithm and ID3 Decision Tree Learning
  Training sets, attributes, values, classes (categories)
  Entropy corresponding to a bag of elements
  Greedy construction of the tree
  Generalization
  Overfitting
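The entropy of a bag of class labels, H = -sum_c p_c log2 p_c, is the quantity ID3 greedily reduces at each split; a minimal sketch:

```python
from math import log2

def entropy(bag):
    """Entropy (in bits) of a bag (list) of class labels."""
    n = len(bag)
    counts = {}
    for c in bag:
        counts[c] = counts.get(c, 0) + 1
    # -sum over classes of p * log2(p); a pure bag has entropy 0,
    # a 50/50 two-class bag has entropy 1 bit.
    return -sum((k / n) * log2(k / n) for k in counts.values())
```

ID3 picks the attribute whose split minimizes the weighted average entropy of the child bags (i.e., maximizes information gain).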

Natural Language Understanding
  Grammars, nonterminals, terminals, productions
  Sentential forms, derivations, the language specified by a grammar
  Sentence generation, parsing
  Case frames
  Controlled language, semantic grammar
  n-grams
  Bag-of-words representation
  Stopwords, stemming, and reference vocabularies
  Vector representation of documents
  Cosine similarity of documents
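The bag-of-words and cosine-similarity ideas above can be sketched with sparse term-frequency dicts (the example sentences and stopword set are invented for illustration):

```python
from math import sqrt

def bag_of_words(text, stopwords=()):
    """Term-frequency vector (a dict) over the words in text."""
    vec = {}
    for w in text.lower().split():
        if w not in stopwords:
            vec[w] = vec.get(w, 0) + 1
    return vec

def cosine(u, v):
    """Cosine similarity of two sparse term-frequency vectors:
    dot(u, v) / (|u| * |v|)."""
    dot = sum(u[w] * v.get(w, 0) for w in u)
    norm = (sqrt(sum(x * x for x in u.values()))
            * sqrt(sum(x * x for x in v.values())))
    return dot / norm if norm else 0.0

d1 = bag_of_words("the cat sat", stopwords={"the"})
d2 = bag_of_words("the cat ran", stopwords={"the"})
```

Here d1 and d2 share one of two words each, giving a similarity of 0.5; a document is always similarity 1 with itself.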

Robotics
  Asimov's Three Laws of Robotics

The Future of AI
  Kurzweil's "singularity"