Usually, eigenanalysis must be done on a matrix of size (W*H) x (W*H), where W and H are the width and height of the image. This matrix can be very large. With the speedup, the operation is instead performed on a matrix of size M x M, where M is the number of faces used to create the eigenfaces.

First, some definitions:

x_{i} = face image i as a vector, of length W*H

x_{mean} = mean of all face images

y_{i} = x_{i} - x_{mean} for i = 1, ..., M

A = [y_1 ... y_M] (dimension W*H x M)
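As a concrete illustration, the data matrix A can be assembled as follows (a minimal NumPy sketch; the image dimensions, face count, and random pixel values stand in for real face data):

```python
import numpy as np

# Placeholder dimensions: M face images of size W x H.
W, H, M = 64, 64, 10
rng = np.random.default_rng(0)
faces = rng.random((M, W * H))   # row i is x_i, a face image flattened to length W*H

x_mean = faces.mean(axis=0)      # x_mean: mean of all face images
Y = faces - x_mean               # y_i = x_i - x_mean, one per row
A = Y.T                          # A = [y_1 ... y_M], shape (W*H, M)
print(A.shape)                   # (4096, 10)
```

Each column of A is one mean-subtracted face, matching the definition above.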

We normally perform eigenanalysis on AA^{T}, which at (W*H) x (W*H) can be very large. Instead, consider an eigenvector v and eigenvalue l of the much smaller M x M matrix A^{T}A:

A^{T}A v = l v

Multiplying both sides on the left by A and regrouping:

AA^{T}(Av) = l (Av)

This shows that Av is an eigenvector of AA^{T} with the same eigenvalue l, whenever v is an eigenvector of A^{T}A. This is a much faster route to computing the eigenvectors of AA^{T}! We also obtain all the necessary eigenvectors, since the rank of A^{T}A equals the rank of AA^{T} (at most M).
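The derivation can be verified numerically (a NumPy sketch; the matrix sizes and random data are assumptions for the demo):

```python
import numpy as np

rng = np.random.default_rng(1)
W, H, M = 32, 32, 8
A = rng.random((W * H, M))
A -= A.mean(axis=1, keepdims=True)   # center columns, as in y_i = x_i - x_mean

# Small eigenproblem: A^T A is only M x M (symmetric, so eigh applies).
small = A.T @ A
vals, vecs = np.linalg.eigh(small)

# Map each eigenvector v of A^T A to Av, an eigenvector of AA^T.
U = A @ vecs

# Check (AA^T)(Av) = l (Av) for every eigenpair.
big = A @ A.T
for l, u in zip(vals, U.T):
    assert np.allclose(big @ u, l * u)
print("all eigenpairs verified")
```

Note that the full (W*H) x (W*H) matrix `big` is formed here only to check the result; in practice one would never build it, which is the whole point of the speedup.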
