"Non-Linear Least Squares and Sparse Matrix Techniques: Fundamentals and
Applications"
Rick Szeliski
Microsoft Research
Talk 1 (4/20): Non-Linear Least Squares and Sparse Matrix Techniques: Fundamentals
Talk 2 (5/3): Non-Linear Least Squares and Sparse Matrix Techniques: Applications
Linear and non-linear least squares come up everywhere in computer vision and
computer graphics. Not only are they practical techniques for solving inference
problems such as structure from motion and 3D surface fitting, but they also arise
in image restoration/manipulation and physical simulation problems. Many of the
interesting (larger) problems have a very sparse structure, i.e., each variable
typically interacts with only a small number of other variables. (Think of
points that are only seen in certain frames, or pixels that are only adjacent to
a small number of other pixels.)
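This sparsity is easy to see in the Jacobian of a structure-from-motion problem: each reprojection residual depends only on one camera's pose and one point's coordinates. The sketch below builds such a sparsity pattern for a tiny hypothetical problem (the frame/point counts, 6-DoF pose parameterization, and observation list are illustrative assumptions, not taken from the talk):

```python
# Sketch: sparsity pattern of a structure-from-motion Jacobian.
# Each residual (a point observed in a frame) touches only that frame's
# pose parameters and that point's 3D coordinates; everything else is zero.
import numpy as np
from scipy.sparse import lil_matrix

n_frames, n_points = 3, 4
pose_dim, point_dim = 6, 3          # 6-DoF pose, 3D point (illustrative)
# observations: (frame, point) pairs -- each point is seen in only some frames
obs = [(0, 0), (0, 1), (1, 1), (1, 2), (2, 2), (2, 3)]

n_params = n_frames * pose_dim + n_points * point_dim
J = lil_matrix((2 * len(obs), n_params))  # 2 residuals (u, v) per observation
for r, (f, p) in enumerate(obs):
    J[2*r:2*r+2, f*pose_dim:(f+1)*pose_dim] = 1.0      # camera pose block
    c = n_frames * pose_dim + p * point_dim
    J[2*r:2*r+2, c:c+point_dim] = 1.0                  # 3D point block

density = J.nnz / (J.shape[0] * J.shape[1])
print(f"Jacobian is {J.shape}, only {density:.0%} non-zero")
```

Even in this toy problem 70% of the entries are structurally zero; in realistic problems with thousands of frames and points the density drops to a tiny fraction of a percent, which is what makes sparse direct and iterative solvers pay off.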
In this pair of talks, I will cover how to formulate and solve linear and non-
linear least squares problems using both direct and iterative methods. The
topics covered include: linearization (Jacobians), robust norms (very briefly), QR
factorization and normal equations, the skyline storage method and sparse
Cholesky (LDL^T) factorization with minimal fill-in, conjugate gradients,
preconditioning (both diagonal scaling and hierarchical/multigrid approaches),
Levenberg-Marquardt, and trust region methods.
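To make the connection between several of these topics concrete, here is a minimal Levenberg-Marquardt sketch: the residuals are linearized via a Jacobian, and each step solves the damped normal equations (J^T J + lambda I) dx = -J^T r, with the damping adapted on success or failure. The exponential model, data, and step-control constants are all illustrative assumptions, not the talk's own code:

```python
# Minimal Levenberg-Marquardt sketch for fitting y = a*exp(b*x) to data.
# Each iteration linearizes the residuals (Jacobian J) and solves the
# damped normal equations (J^T J + lam*I) dx = -J^T r.
import numpy as np

def residuals(params, x, y):
    a, b = params
    return a * np.exp(b * x) - y

def jacobian(params, x):
    a, b = params
    e = np.exp(b * x)
    return np.column_stack([e, a * x * e])     # d r / d a, d r / d b

def levenberg_marquardt(x, y, params, lam=1e-3, iters=50):
    for _ in range(iters):
        r = residuals(params, x, y)
        J = jacobian(params, x)
        A = J.T @ J + lam * np.eye(len(params))   # damped normal equations
        dx = np.linalg.solve(A, -J.T @ r)
        new = params + dx
        if np.sum(residuals(new, x, y)**2) < np.sum(r**2):
            params, lam = new, lam * 0.5          # accept step, reduce damping
        else:
            lam *= 10.0                           # reject step, increase damping
    return params

x = np.linspace(0.0, 1.0, 20)
y = 2.0 * np.exp(1.5 * x)                # noise-free synthetic data
fit = levenberg_marquardt(x, y, np.array([1.0, 1.0]))
print(fit)                               # converges near [2.0, 1.5]
```

For small dense problems like this one, forming the normal equations directly is fine; the sparse-matrix machinery in the talks is about solving the same damped system efficiently when J has the sparse block structure described above.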
I will also cover a number of important application areas, including:
triangulation, pose estimation, and structure from motion; sparse data
interpolation and reconstruction from gradients (Poisson equations, shape from
shading); spatio-temporal snakes; and others. I will present recent
results on 3D reconstruction from very long sequences, including recent work on
"Visual Odometry and Map Correlation" (joint work with Anat Levin), and will
discuss some of the open problems that make the solution of such large systems
difficult.
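The sparse data interpolation application fits the same least-squares template: combine a smoothness term (a discrete Laplacian, as in Poisson-style problems) with a heavily weighted data term at the known samples, and solve the resulting sparse normal equations with conjugate gradients. A 1D sketch, with illustrative grid size, sample locations, and weights of my own choosing:

```python
# Sketch: 1D sparse data interpolation as a regularized least squares /
# Poisson-style problem, solved with conjugate gradients.  A few samples
# are pinned; elsewhere the solution must have small second differences.
import numpy as np
from scipy.sparse import lil_matrix
from scipy.sparse.linalg import cg

n = 50
known = {0: 0.0, 20: 1.0, 49: 0.0}        # sparse data constraints

# Smoothness term: second-difference (discrete Laplacian) rows
D = lil_matrix((n - 2, n))
for i in range(n - 2):
    D[i, i], D[i, i+1], D[i, i+2] = 1.0, -2.0, 1.0

# Data term: heavily weighted rows selecting the known samples
W = lil_matrix((len(known), n))
b = np.zeros(len(known))
for r, (i, v) in enumerate(known.items()):
    W[r, i] = 1.0
    b[r] = v
w = 100.0

# Normal equations: (D^T D + w^2 W^T W) u = w^2 W^T b  (sparse, SPD)
A = (D.T @ D + (w * w) * (W.T @ W)).tocsr()
rhs = (w * w) * (W.T @ b)
u, info = cg(A, rhs, maxiter=1000)
print(info == 0, u[20])    # interpolates the pinned sample near 1.0
```

Unpreconditioned CG works here because the system is tiny; for large 2D grids the condition number grows with the grid size, which is exactly where the diagonal-scaling and hierarchical/multigrid preconditioners mentioned above become important.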