Machine Learning for Computer Graphics
CSE590B, Winter 2002

Instructor: Aaron Hertzmann (hertzman@cs), Sieg 407a
Time: Monday 2:30-4:00pm; Wednesday 4:00-5:30pm
Place: EE1-031
Credits: 1 or 3 (depending on participation)

Sophisticated computer graphics applications require complex models of appearance, human motion, natural phenomena, and even artistic style. Such models are often difficult or impossible to design by hand. Recent research in machine learning demonstrates that, instead, we can "learn" a dynamical and/or appearance model from captured data, and then use the model to synthesize plausible new data. For example, we can capture the motions of a human actor, and then generate new motions as they might be performed by that actor.

In this course, we will survey basic principles of machine learning and how they can be applied to real problems in computer graphics and animation. The format will be a mix of lectures, student paper presentations, and discussion. The final projects will be research-oriented, intended to explore new areas of this emerging field and, ultimately, to lead to quals projects and publications in leading graphics, vision, and learning conferences.

Machine learning topics covered: Density estimation, Bayes' rule, mixture models and the EM algorithm, Markov Random Fields, regression, information theoretic methods, style-content separation.

Computer graphics applications covered: Face and body modeling, motion synthesis, image and video texture synthesis, non-photorealistic rendering.

Prerequisites: CSE grads or instructor permission; familiarity with probability, statistics, and linear algebra required. Experience with computer graphics useful but not required.

Recommended textbooks:
  • C. M. Bishop. Neural Networks for Pattern Recognition. Oxford University Press.
  • T. M. Cover and J. A. Thomas. Elements of Information Theory. Wiley.

Calendar: Winter 2002 is Jan 7-March 15 (Jan 21 and Feb 18 are holidays); project presentations during exam week.


Date Topic Readings
January 9 Introduction and overview  
January 14 Statistics and probability
  • Density estimation
  • Multinomial densities
  • Gaussian distributions
  • Bayes' rule
  • PCA

  • Bishop, p. 1-23, 33-46


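To make these topics concrete, here is a minimal pure-Python sketch of maximum-likelihood Gaussian density estimation and classification via Bayes' rule (the data and function names are illustrative, not from the readings):

```python
import math

def fit_gaussian(xs):
    """Maximum-likelihood estimates of the mean and variance of 1-D data."""
    n = len(xs)
    mu = sum(xs) / n
    var = sum((x - mu) ** 2 for x in xs) / n
    return mu, var

def gaussian_pdf(x, mu, var):
    """Density of a 1-D Gaussian at x."""
    return math.exp(-(x - mu) ** 2 / (2.0 * var)) / math.sqrt(2.0 * math.pi * var)

def posterior(x, params, priors):
    """Bayes' rule: p(class | x) is proportional to p(x | class) p(class)."""
    joint = [gaussian_pdf(x, mu, var) * pr for (mu, var), pr in zip(params, priors)]
    z = sum(joint)
    return [j / z for j in joint]

# Two classes fit from toy samples, with equal priors.
params = [fit_gaussian([0.9, 1.0, 1.1]), fit_gaussian([2.9, 3.0, 3.1])]
post = posterior(1.05, params, [0.5, 0.5])
```

A query near the first cluster is assigned to it with near-certainty, since the likelihood under the second class is vanishingly small.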
January 16 Mixture models
  • K-means clustering
  • Gaussian mixture models
  • EM algorithm

Project 1 hand-out

  • Bishop, p. 49-73, 310-313
Related readings:
  • J. Bilmes. A Gentle Tutorial of the EM Algorithm and its Application to Parameter Estimation for Gaussian Mixture and Hidden Markov Models. ICSI TR-97-021, 1997. paper
  • T.P. Minka. Expectation-Maximization as lower bound maximization. website
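The hard-assignment limit of EM for a Gaussian mixture is k-means clustering, which fits in a few lines; a minimal 1-D sketch (data and names are illustrative):

```python
def kmeans(points, centers, iters=20):
    """Plain k-means: alternate nearest-center assignment and mean updates."""
    for _ in range(iters):
        clusters = [[] for _ in centers]
        for p in points:
            i = min(range(len(centers)), key=lambda j: abs(p - centers[j]))
            clusters[i].append(p)
        # Keep a center in place if its cluster goes empty.
        centers = [sum(c) / len(c) if c else centers[i]
                   for i, c in enumerate(clusters)]
    return centers

centers = kmeans([1.0, 1.1, 0.9, 5.0, 5.1, 4.9], [0.0, 10.0])
```

Full EM replaces the hard assignment with posterior responsibilities for each point under each Gaussian, and updates variances and mixing weights as well as means.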
January 23 Paper presentations and discussion
  • V. Blanz, T. Vetter. A Morphable Model for the Synthesis of 3D Faces. SIGGRAPH 99. website with demos [Brett]
  • Y. Weiss and E.H. Adelson. Perceptually organized EM: A framework for motion segmentation that combines information about form and motion. MIT Media Lab Perceptual Computing Section TR #315 (1994) paper [Aseem]
  • N. Jojic, B. Frey. Learning flexible sprites in video layers. CVPR 2001. paper [Aseem]
Related reading:
  • M. Kirby and L. Sirovich. Application of the Karhunen-Loève procedure for the characterization of human faces. IEEE Trans. on Pattern Analysis and Machine Intelligence, 12(1):103-108, Jan. 1990. paper
  • M. Turk and A. Pentland, "Face recognition using eigenfaces," CVPR 1991. paper
  • T.F. Cootes and C.J. Taylor, "Statistical models of appearance for medical image analysis and computer vision", Proc. SPIE Medical Imaging 2001. website paper
January 28 Paper presentations and discussion
  • H. Lensch, J. Kautz, M. Goesele, W. Heidrich, and H.-P. Seidel. Image-Based Reconstruction of Spatially Varying Materials. Proceedings of the EG Rendering Workshop '01. paper website with demos [Gary]
  • J. Malik, S. Belongie, T. Leung, and J. Shi. Contour and Texture Analysis for Image Segmentation. To appear in International Journal of Computer Vision, 2001. website [Steve]
  • G. Doretto, P. Pundir, Y. Wu, S. Soatto. Dynamic Textures. ICCV 2001. website paper [Wil]
January 30 and Feb 4 Hidden Markov Models
  • L. R. Rabiner. A Tutorial on Hidden Markov Models and Selected Applications in Speech Recognition. Proc. of the IEEE, Vol. 77, No. 2, pp. 257-286, 1989. (You only need to read up to page 276.)
Related reading:
  • Z. Ghahramani. An Introduction to Hidden Markov Models and Bayesian Networks. IJPRAI. Vol 15, No 1, 2001, p. 9. paper
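The core computation in Rabiner's tutorial, the forward algorithm, is short enough to sketch directly; the parameters below are made up for illustration:

```python
def forward(obs, pi, A, B):
    """HMM forward algorithm: total probability of an observation
    sequence, summing over all hidden state paths."""
    S = range(len(pi))
    alpha = [pi[s] * B[s][obs[0]] for s in S]
    for o in obs[1:]:
        alpha = [sum(alpha[s] * A[s][t] for s in S) * B[t][o] for t in S]
    return sum(alpha)

# Two hidden states, two output symbols.
pi = [0.5, 0.5]
A = [[0.9, 0.1], [0.1, 0.9]]   # state transition probabilities
B = [[0.8, 0.2], [0.2, 0.8]]   # observation probabilities
p = forward([0, 0], pi, A, B)
```

As a sanity check, the probabilities of all possible observation sequences of a fixed length sum to one.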
Feb 4, 6 Paper presentations and discussion
  • M. Brand. Voice Puppetry. SIGGRAPH 99. website paper [Antoine]
  • M. Brand and A. Hertzmann. Style Machines. SIGGRAPH 2000. website [Li]
  • C. Guo, S. Zhu, Y. Wu. Visual Learning by Integrating Descriptive and Generative Methods. paper. website [Dan G]
  • J. B. Tenenbaum, W. T. Freeman. Separating style and content with bilinear models. Neural Computation 12 (6), 1247-1283. website paper [David G]
Feb 11 Graphical models and Markov Random Fields

Project 1 due

  • R. Cowell. Introduction to Inference for Bayesian Networks. In Learning in Graphical Models.
  • J. S. Yedidia, W. T. Freeman, Y. Weiss. Understanding Belief Propagation and its Generalizations. website paper
  • A. Efros, T. Leung. Texture Synthesis by Non-parametric Sampling. ICCV 99. webpage, paper

Related reading:

  • Readings on texture synthesis and texture transfer
  • W. T. Freeman, E. C. Pasztor, and O. T. Carmichael. Learning low-level vision. Intl. Journal of Computer Vision, 40(1), pp. 25-47, 2000. website
  • S. Geman and D. Geman. Stochastic Relaxation, Gibbs Distributions, and the Bayesian Restoration of Images. IEEE Transactions on Pattern Analysis and Machine Intelligence. Vol. PAMI-6, No 6, Nov. 1984.
  • S.Z. Li. "Modeling Image Analysis Problems Using Markov Random Fields". In C.R. Rao and D.N. Shanbhag (ed), Stochastic Processes: Modeling and Simulation, Volume 20 of Handbook of Statistics. Elsevier Science. (to appear in 2001). paper
  • M. Wainwright, T. Jaakkola, and A. Willsky. Tree-based reparameterization for approximate estimation on loopy graphs. In Advances in Neural Information processing systems 14, 2001. paper
  • T. P. Minka. Expectation Propagation for approximate Bayesian inference. website
  • C. Liu, H. Shum, C. Zhang. A Two-Step Approach to Hallucinating Faces: Global Parametric Model and Local Nonparametric Model. CVPR 2001. paper, powerpoint
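The flavor of the Efros-Leung sampler can be shown in 1-D: grow a sequence by sampling, uniformly at random, a continuation whose preceding window matches the current context somewhere in the example. This is a toy sketch (the real method matches 2-D pixel neighborhoods, with a distance threshold rather than exact matches):

```python
import random

def synthesize(sample, k, length, seed=0):
    """Grow a new sequence by sampling continuations of matching
    length-k contexts found anywhere in the example sequence."""
    rng = random.Random(seed)
    out = list(sample[:k])
    while len(out) < length:
        ctx = tuple(out[-k:])
        candidates = [sample[i + k] for i in range(len(sample) - k)
                      if tuple(sample[i:i + k]) == ctx]
        out.append(rng.choice(candidates))
    return ''.join(out)

new = synthesize("abcabcabc", k=2, length=9)
```

On a periodic example like this one, every context has a unique continuation, so the synthesis simply reproduces the pattern; richer examples yield varied output.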
Feb 13 Markov Chain Monte Carlo (MCMC)
  • D. J. MacKay. Introduction to Monte Carlo Methods. In M. Jordan (ed), Learning in Graphical Models. paper
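MacKay's chapter covers the Metropolis algorithm, which needs only an unnormalized (log-)density; a minimal random-walk sketch, with step size and target chosen purely for illustration:

```python
import math
import random

def metropolis(log_p, x0, step, n, seed=0):
    """Random-walk Metropolis: propose x' = x + noise, accept with
    probability min(1, p(x') / p(x))."""
    rng = random.Random(seed)
    x, samples = x0, []
    for _ in range(n):
        xp = x + rng.gauss(0.0, step)
        if rng.random() < math.exp(min(0.0, log_p(xp) - log_p(x))):
            x = xp
        samples.append(x)
    return samples

# Sample a standard Gaussian from its unnormalized log-density.
samples = metropolis(lambda x: -0.5 * x * x, 0.0, 1.0, 20000)
mean = sum(samples) / len(samples)
var = sum((s - mean) ** 2 for s in samples) / len(samples)
```

With enough samples, the empirical mean and variance approach 0 and 1, even though the chain never sees the normalizing constant.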
Feb 18 Holiday --- No meeting  
Feb 20 Paper Presentations and Discussions
  • H. Chen, Y. Xu, H. Shum, S. Zhu, N. Zheng. Example-based Facial Sketch Generation with Non-Parametric Sampling. ICCV 2001. paper [Yung-Yu]
  • Stephen Chenney and D. A. Forsyth, "Sampling Plausible Solutions to Multi-Body Constraint Problems". SIGGRAPH 2000 Conference Proceedings, pages 219-228, July 2000. website [Daniel]
  • Eric Veach and Leonidas J. Guibas. Metropolis Light Transport. SIGGRAPH 97 Proceedings, pp. 65-76. website [Karen]
Related reading:
  • J. Kajiya. The Rendering Equation. SIGGRAPH 86. p. 143-150. paper
Feb 25 Regression and classification (deterministic methods)
  • k-nearest neighbors
  • RBFs
  • Neural nets
  • SVMs
  • Bishop, p. 77-97, 116-132, 140-146
Related reading:
  • C. J. C. Burges. A Tutorial on Support Vector Machines for Pattern Recognition. 1998 paper, more tutorials
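The simplest of these methods, k-nearest neighbors, takes only a few lines; the 1-D training data and names below are illustrative:

```python
def knn_predict(train, x, k=3):
    """Classify x by majority vote among its k nearest training points."""
    nearest = sorted(train, key=lambda pair: (pair[0] - x) ** 2)[:k]
    labels = [label for _, label in nearest]
    return max(set(labels), key=labels.count)

train = [(0.0, 'a'), (0.2, 'a'), (0.4, 'a'),
         (5.0, 'b'), (5.2, 'b'), (5.4, 'b')]
```

Unlike RBF networks, neural nets, or SVMs, there is no training phase at all; all the work happens at query time.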
Feb 27 Information theory; Paper presentations and discussion
  • Cover and Thomas, p. 1-33.
  • Radek Grzeszczuk, Demetri Terzopoulos, Geoffrey Hinton. NeuroAnimator: Fast Neural Network Emulation and Control of Physics-Based Models. SIGGRAPH 98. website. [Ben]
  • Petros Faloutsos, Michiel van de Panne and Demetri Terzopoulos "Composable Controllers for Physics-based Character Animation." SIGGRAPH 2001. website paper [Jia-Chi]
Related reading:
  • R. Grzeszczuk, D. Terzopoulos and G. Hinton. Fast Neural Network Emulation of Dynamical Systems for Computer Animation, in Advances in Neural Information Processing Systems: Proceedings of the 1998 Conference (NIPS11), MIT Press, pp.882-888.
  • W.T. Freeman, J.B. Tenenbaum, E. Pasztor. An example-based approach to style translation for line drawings. MERL TR99-11. website.
March 4 Model selection
  • E. T. Jaynes. On the Rationale of Maximum-Entropy Methods. Proc. IEEE, 70, 939, 1982. paper
  • P. Grünwald. The Minimum Description Length Principle and Reasoning under Uncertainty. Introduction and Chapter 1. webpage
  • M. Brand, Pattern Discovery via Entropy Minimization. UAI '99. website
  • D. J. MacKay. Bayesian Interpolation. Neural Computation: 4:3, p 448-472. paper
  • P. Domingos. The Role of Occam's Razor in Knowledge Discovery. Data Mining and Knowledge Discovery, 3 (4), 1999. paper
Related reading:
  • P.M.B. Vitanyi and M. Li. Minimum Description Length Induction, Bayesianism, and Kolmogorov Complexity. IEEE Trans. Inform. Theory, IT-46:2 (2000), 446-464. paper
  • J. Rissanen. Hypothesis Selection and Testing by the MDL Principle paper
  • MDL webpage
  • Jaakkola, Meila, Jebara. Maximum Entropy Discrimination. NIPS 99. paper
  • E. T. Jaynes. "How Does the Brain Do Plausible Reasoning?" In Maximum-Entropy and Bayesian Methods in Science and Engineering, 1, G. J. Erickson and C. R. Smith (eds.), Kluwer, Dordrecht, p. 1, 1988. paper
  • C.E. Rasmussen and Z. Ghahramani. Occam's Razor. NIPS 2000. paper
  • M. Brand, Structure Discovery in conditional probability distributions via an entropic estimator. website
March 6 Model selection, ensembles, aggregation
  • D. Pelleg and A. Moore, X-means: Extending K-means with Efficient Estimation of the Number of Clusters, International Conference on Machine Learning, 2000 (ICML2000). website
  • Robert E. Schapire. The boosting approach to machine learning: An overview. In MSRI Workshop on Nonlinear Estimation and Classification, 2002. paper
Related reading:
  • A. Stolcke and S. Omohundro. Hidden Markov Model Induction by Bayesian Model Merging. NIPS 1993.
March 11 Dimension reduction and feature selection; paper presentations and discussion
  • Bell, A.J. and Sejnowski, T.J. An information maximisation approach to blind separation and blind deconvolution. Neural Computation, 7, 6, 1129-1159, 1995. paper [Mira]
  • B. Schölkopf, A. Smola, and K.-R. Müller. Nonlinear component analysis as a kernel eigenvalue problem. Neural Computation, 10:1299-1319, 1998. paper [Colin]
  • S. Mika, B. Schölkopf, A. Smola, K.-R. Müller, M. Scholz, and G. Rätsch. Kernel PCA and de-noising in feature spaces. NIPS 99. paper [Colin]
Related reading:
  • S. Roweis, L. Saul. Nonlinear dimensionality reduction by locally linear embedding. Science v.290 no.5500, Dec 2000. webpage paper
  • C. Chennubhotla, A. Jepson. Sparse PCA: Extracting Multi-scale Structure from Data. website
  • D. D. Lee and H. S. Seung. Learning the parts of objects by non-negative matrix factorization. Nature 401, 788-791 (1999). paper
  • B. J. Frey and N. Jojic 2000. Transformation-invariant clustering and dimensionality reduction. Submitted to IEEE Transactions on Pattern Analysis and Machine Intelligence, Nov. 2000. website
  • S. Roweis, L. Saul, G. Hinton. Global Coordination of Local Linear Models. NIPS '01. paper
  • S. Roweis. EM Algorithms for PCA and SPCA. NIPS '97. paper
  • Ghahramani, Z. and Hinton, G.E. (1996) The EM Algorithm for Mixtures of Factor Analyzers University of Toronto Technical Report CRG-TR-96-1, 8 pages (short note). paper
  • J. W. Fisher III, T. Darrell, W. T. Freeman, P. Viola, Learning Joint Statistical Models for Audio-Visual Fusion and Segregation, Advances in Neural Information Processing Systems, Denver, Colorado, November 28-December 2, 2000. paper
  • J. Principe, D. Xu, and J. Fisher. Information-Theoretic Learning, chapter 7. Wiley, 1999. chapter
  • J. B. Tenenbaum, V. De Silva, J. C. Langford. A global geometric framework for nonlinear dimensionality reduction. Science 290 (5500): 22 December 2000. website
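Many of the methods above build on PCA; the leading principal component can be found with nothing more than power iteration on the sample covariance matrix. A pure-Python toy sketch (data chosen for illustration):

```python
import math

def top_pc(points, iters=100):
    """Leading principal component via power iteration on the
    sample covariance matrix."""
    n, d = len(points), len(points[0])
    mu = [sum(p[i] for p in points) / n for i in range(d)]
    X = [[p[i] - mu[i] for i in range(d)] for p in points]
    C = [[sum(row[i] * row[j] for row in X) / n for j in range(d)]
         for i in range(d)]
    v = [1.0] * d
    for _ in range(iters):
        w = [sum(C[i][j] * v[j] for j in range(d)) for i in range(d)]
        norm = math.sqrt(sum(c * c for c in w))
        v = [c / norm for c in w]
    return v

# Toy data lying exactly along the direction (1, 1).
v = top_pc([(1.0, 1.0), (2.0, 2.0), (3.0, 3.0), (-1.0, -1.0), (0.0, 0.0)])
```

Kernel PCA, LLE, and Isomap can all be viewed as nonlinear generalizations of this linear subspace fit.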
March 13 Wrap-up  
Tuesday, March 19, 2:30-3:20pm Project Presentations