In linear algebra an eigenvalue of a (square) matrix $A$ is a number $\lambda$ that satisfies the eigenvalue equation,
$$\det(A - \lambda I) = 0,$$
where $\det$ means the determinant, $I$ is the identity matrix of the same dimension as $A$,
and $\lambda$ in general can be complex.
The origin of this equation is the eigenvalue problem, which is to find the eigenvalues and associated eigenvectors of $A$.
That is, to find a number $\lambda$ and a vector $\vec{v}$ that together satisfy
$$A\vec{v} = \lambda\vec{v}.$$
What this equation says is that even though $A$ is a matrix, its action on $\vec{v}$ is the same as multiplying the vector by the number $\lambda$.
This means that the vector $A\vec{v}$ and the vector $\vec{v}$ are parallel (or anti-parallel if $\lambda$ is negative).
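As a concrete illustration (the numbers here are chosen for this revision and do not appear in the original text), take
$$A = \begin{pmatrix} 2 & 0 \\ 0 & 3 \end{pmatrix} \qquad\text{and}\qquad \vec{v} = \begin{pmatrix} 1 \\ 0 \end{pmatrix},$$
for which
$$A\vec{v} = \begin{pmatrix} 2 \\ 0 \end{pmatrix} = 2\vec{v},$$
so $\lambda = 2$ is an eigenvalue of this particular $A$, with eigenvector $\vec{v}$.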
Note that generally this will not be true. This is most easily seen with a quick example. Suppose
$$A = \begin{pmatrix} a & b \\ c & d \end{pmatrix} \qquad\text{and}\qquad \vec{v} = \begin{pmatrix} x \\ y \end{pmatrix}.$$
Then their matrix product is
$$A\vec{v} = \begin{pmatrix} a & b \\ c & d \end{pmatrix}\begin{pmatrix} x \\ y \end{pmatrix} = \begin{pmatrix} ax + by \\ cx + dy \end{pmatrix},$$
whereas the scalar product is
$$\lambda\vec{v} = \begin{pmatrix} \lambda x \\ \lambda y \end{pmatrix}.$$
Obviously then $A\vec{v} \neq \lambda\vec{v}$ unless $ax + by = \lambda x$
and simultaneously $cx + dy = \lambda y$,
and it is easy to pick numbers for the entries of $A$ and $\vec{v}$ such that this cannot happen for any value of $\lambda$.
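For instance (again with numbers chosen for this revision), the matrix $A = \bigl(\begin{smallmatrix} 2 & 0 \\ 0 & 3 \end{smallmatrix}\bigr)$ from above applied to $\vec{v} = \bigl(\begin{smallmatrix} 1 \\ 1 \end{smallmatrix}\bigr)$ gives $A\vec{v} = \bigl(\begin{smallmatrix} 2 \\ 3 \end{smallmatrix}\bigr)$, which would require $\lambda = 2$ from the first component and $\lambda = 3$ from the second. No single value of $\lambda$ works, so this $\vec{v}$ is not an eigenvector of $A$.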
The eigenvalue equation
So where did the eigenvalue equation come from? Well, we assume that we know the matrix $A$ and want to find a number $\lambda$ and a non-zero vector $\vec{v}$ so that $A\vec{v} = \lambda\vec{v}$. (Note that if $\vec{v} = \vec{0}$ then the equation is always true, and therefore uninteresting.) So now we have
$$A\vec{v} - \lambda\vec{v} = \vec{0}.$$
It doesn't make sense to subtract a number from a matrix, but we can factor out the vector if we first multiply the right-hand term by the identity matrix, giving us
$$(A - \lambda I)\vec{v} = \vec{0}.$$
Now we have to remember the fact that $A - \lambda I$ is a square matrix, and so it might be invertible.
If it were invertible then we could simply multiply on the left by its inverse to get
$$\vec{v} = (A - \lambda I)^{-1}\vec{0} = \vec{0},$$
but we have already said that $\vec{v}$ can't be the zero vector! The only way around this is if $A - \lambda I$ is in fact non-invertible. It can be shown that a square matrix is non-invertible if and only if its determinant is zero. That is, we require
$$\det(A - \lambda I) = 0,$$
which is the eigenvalue equation stated above.
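To make the recipe concrete, here is a small worked example (the matrix is chosen for this revision and does not appear in the original text). Take
$$A = \begin{pmatrix} 2 & 1 \\ 1 & 2 \end{pmatrix}.$$
Then
$$\det(A - \lambda I) = \det\begin{pmatrix} 2-\lambda & 1 \\ 1 & 2-\lambda \end{pmatrix} = (2-\lambda)^2 - 1 = (\lambda - 1)(\lambda - 3),$$
so the eigenvalue equation $\det(A - \lambda I) = 0$ gives $\lambda = 1$ and $\lambda = 3$. The same computation can be checked numerically; below is a minimal sketch using NumPy (assuming NumPy is available; this is not part of the original article).

```python
import numpy as np

# The illustrative matrix from the worked example above.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# np.linalg.eig returns the eigenvalues and a matrix whose columns
# are the corresponding (normalised) eigenvectors.
eigenvalues, eigenvectors = np.linalg.eig(A)
print(eigenvalues)  # expected: 1.0 and 3.0, in some order

# Verify the defining equation A v = lambda v for each pair.
for lam, v in zip(eigenvalues, eigenvectors.T):
    assert np.allclose(A @ v, lam * v)
```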
A more technical approach
So far we have looked at eigenvalues in terms of square matrices. As usual in mathematics, though, we like things to be as general as possible, since then anything we prove will be true in as many different applications as possible. So instead we can define eigenvalues in the following way.
Definition: Let $V$ be a vector space over a field $F$, and let $A : V \to V$ be a linear map. An eigenvalue associated with $A$ is an element $\lambda \in F$ for which there exists a non-zero vector $\vec{v} \in V$ such that
$$A\vec{v} = \lambda\vec{v}.$$
Then $\vec{v}$ is called an eigenvector of $A$ associated with $\lambda$.
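A standard illustration of this more general definition (added here for concreteness, and not taken from the original text): let $V$ be the vector space of infinitely differentiable functions $f : \mathbb{R} \to \mathbb{R}$ over the field $F = \mathbb{R}$, and let $D : V \to V$ be the differentiation map $D(f) = f'$, which is linear. For any $\lambda \in \mathbb{R}$, the function $f(x) = e^{\lambda x}$ satisfies
$$D(f) = \lambda f,$$
so every real number $\lambda$ is an eigenvalue of $D$, with eigenvector (often called an eigenfunction in this setting) $e^{\lambda x}$. No matrices appear anywhere, yet the definition applies verbatim.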