Usually, eigenanalysis must be done on a matrix of size W*H x W*H, where W and H are the width and height of the image in pixels.  This can be quite large.  With the speedup, the operation is instead performed on a matrix of size M x M, where M is the number of faces used to create the eigenfaces, and M is typically far smaller than W*H.

First, some definitions:

x_i    = face image i as a vector, of length W*H
x_mean = mean of all M face images
y_i    = x_i - x_mean, for i = 1, ..., M
A      = [y_1 ... y_M]  (dimension W*H x M)
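
As a rough illustration (a sketch, not part of the original derivation), here is how A might be assembled with NumPy; the faces array and its dimensions below are hypothetical stand-ins:

    import numpy as np

    # Hypothetical stand-in data: M face images of size H x W
    # (in practice these would be loaded from an image database).
    M, H, W = 40, 64, 64
    rng = np.random.default_rng(0)
    faces = rng.random((M, H, W))

    # Flatten each image into a column vector of length W*H.
    X = faces.reshape(M, H * W).T            # shape (W*H, M); column i is x_i
    x_mean = X.mean(axis=1, keepdims=True)   # mean face x_mean
    A = X - x_mean                           # A = [y_1 ... y_M], shape (W*H, M)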


We normally perform eigenanalysis on A A^T.  At W*H x W*H, this matrix can be very large.  Instead, consider an eigenvector v and eigenvalue λ of the much smaller M x M matrix A^T A:

A^T A v = λ v

Multiplying both sides on the left by A:

A A^T (A v) = λ (A v)

This shows that if v is an eigenvector of A^T A with eigenvalue λ, then A v is an eigenvector of A A^T with the same eigenvalue.  Since A^T A is only M x M, this is a much faster route to computing the eigenvectors of A A^T.  We also find all the eigenvectors we need: the rank of A^T A equals the rank of A A^T, so A A^T has at most M nonzero eigenvalues, and every one of them is obtained this way.
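
A minimal NumPy sketch of the speedup, continuing from the A built above (the rank threshold and the normalization step are choices of this sketch, not part of the derivation):

    # Eigenanalysis of the small M x M matrix A^T A instead of the
    # W*H x W*H matrix A A^T.
    S = A.T @ A                              # shape (M, M)
    eigvals, V = np.linalg.eigh(S)           # S is symmetric, so eigh applies

    # Sort by eigenvalue (descending) and drop near-zero eigenvalues.
    order = np.argsort(eigvals)[::-1]
    eigvals, V = eigvals[order], V[:, order]
    keep = eigvals > 1e-10
    eigvals, V = eigvals[keep], V[:, keep]

    # Map each small eigenvector v back to A v, an eigenvector of A A^T.
    # If v has unit length, ||A v||^2 = v^T A^T A v = lambda, so dividing
    # by sqrt(lambda) gives unit-length eigenfaces.
    eigenfaces = (A @ V) / np.sqrt(eigvals)  # shape (W*H, number kept)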