2.6 Eigenvalues and Eigenvectors
Consider a square matrix c. If

    cν = λν    [2.66]

for some scalar λ and some vector ν ≠ 0, then we call λ an eigenvalue and ν an eigenvector of c. An eigenvector of a matrix is thus a vector for which multiplication by the matrix is the same as multiplication by a scalar (the eigenvalue).
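The defining relation cν = λν is easy to check numerically. The sketch below, assuming NumPy and a hypothetical 2 × 2 matrix not taken from the text, verifies the definition and previews a fact established next: any scalar multiple of an eigenvector is again an eigenvector.

```python
import numpy as np

# Hypothetical 2x2 matrix for illustration (not from the text).
c = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# np.linalg.eig returns the eigenvalues and unit-length eigenvectors
# (one eigenvector per column of the second array).
eigenvalues, eigenvectors = np.linalg.eig(c)
lam = eigenvalues[0]
v = eigenvectors[:, 0]

# The defining property: multiplying by the matrix
# is the same as multiplying by the scalar lambda.
assert np.allclose(c @ v, lam * v)

# Any scalar multiple of an eigenvector is again an eigenvector.
assert np.allclose(c @ (5.0 * v), lam * (5.0 * v))
```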
Suppose λ is an eigenvalue and ν is a corresponding eigenvector of c. For any scalar a, the vector aν is also an eigenvector corresponding to eigenvalue λ. This follows because

    c(aν) = a(cν) = a(λν) = λ(aν)
Accordingly, eigenvectors are determined only up to scalar multiplication. If a set of eigenvectors is linearly independent, we say the eigenvectors are distinct. To determine the eigenvalues and eigenvectors of a matrix, we focus first on the eigenvalues. Rearranging [2.66], we obtain

    (c − λI)ν = 0    [2.68]

where I is the identity matrix. This equation holds for some nonzero vector ν if and only if the matrix (c − λI) is singular. Accordingly, we seek values of λ for which the determinant of (c − λI) is 0. Consider the matrix
Its determinant is a third-order polynomial in λ with roots λ = −1, 2 and 3. These are the eigenvalues. We find the corresponding eigenvectors ν by substituting each eigenvalue into [2.68] and solving. For example, substituting λ = −1 into [2.68] yields a system of three linear equations in the components of ν.
By inspection, a solution is ν = (−1, −3, 3). Obviously, any multiple of this is also a solution. We repeat the same analysis for the other eigenvalues. Results are indicated in Exhibit 2.7.
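The example's matrix is not reproduced in this excerpt, but the same procedure — find the roots of det(c − λI) = 0, then solve (c − λI)ν = 0 for each root — can be sketched on a hypothetical upper-triangular 3 × 3 matrix, whose eigenvalues can be read off its diagonal:

```python
import numpy as np

# Hypothetical upper-triangular matrix (not the text's example);
# its eigenvalues are the diagonal entries 1, 2 and -1.
c = np.array([[1.0, 2.0,  0.0],
              [0.0, 2.0,  0.0],
              [0.0, 0.0, -1.0]])

# The roots of det(c - lambda*I) = 0 are the eigenvalues.
w = np.sort(np.linalg.eigvals(c))
assert np.allclose(w, [-1.0, 1.0, 2.0])

# Substituting lambda = -1 back in, any solution v of (c - lambda*I)v = 0
# is a corresponding eigenvector; here we pick it out of eig's output.
vals, vecs = np.linalg.eig(c)
v = vecs[:, np.argmin(np.abs(vals + 1.0))]
assert np.allclose(c @ v, -v)
```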
The approach we employed in our example is useful for deriving an important result. Consider an arbitrary n × n matrix c. To find its eigenvalues, we set the determinant of (c − λI) equal to 0. This yields an nth-order polynomial equation, which, by the fundamental theorem of algebra, has n solutions, counted with multiplicity. We conclude that every n × n matrix has n eigenvalues. Some may be complex; others may be repeated. In practical applications, eigenvalues are not calculated in this manner. Although setting the determinant of (c − λI) equal to 0 and solving is theoretically useful, there are more efficient algorithms, which are implemented in various software packages. See Strang (2005).
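As a rough illustration of the two routes, the sketch below (assuming NumPy and a hypothetical symmetric 3 × 3 matrix) finds eigenvalues both by forming the characteristic polynomial and finding its roots, and by calling a dedicated eigenvalue routine; the two agree.

```python
import numpy as np

# Hypothetical symmetric matrix for illustration (not from the text).
c = np.array([[2.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])

# Textbook route: coefficients of det(c - lambda*I), then its roots.
coeffs = np.poly(c)
roots = np.sort_complex(np.roots(coeffs))

# Practical route: a dedicated eigenvalue algorithm.
eigs = np.sort_complex(np.linalg.eigvals(c))

assert np.allclose(roots, eigs)
```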
Eigenvalues have a number of convenient properties. A matrix and its transpose have the same eigenvalues. If λ is an eigenvalue of a nonsingular matrix, then 1/λ is an eigenvalue of its inverse. The product of the eigenvalues of a matrix, counted with multiplicity, equals its determinant.
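These properties can be spot-checked numerically. A sketch, assuming NumPy and a hypothetical nonsingular 3 × 3 matrix:

```python
import numpy as np

# Hypothetical nonsingular matrix for illustration (not from the text).
c = np.array([[4.0, 1.0, 0.0],
              [2.0, 3.0, 0.0],
              [0.0, 0.0, 1.0]])

w = np.sort_complex(np.linalg.eigvals(c))

# A matrix and its transpose have the same eigenvalues.
assert np.allclose(w, np.sort_complex(np.linalg.eigvals(c.T)))

# If lambda is an eigenvalue of c, 1/lambda is an eigenvalue of its inverse.
w_inv = np.sort_complex(np.linalg.eigvals(np.linalg.inv(c)))
assert np.allclose(np.sort_complex(1.0 / w), w_inv)

# The product of the eigenvalues equals the determinant.
assert np.isclose(np.prod(w), np.linalg.det(c))
```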
2.6.2 Intuitive Example
Consider an intuitive example. A sphere of unit radius is positioned at the center of a three-dimensional coordinate system. It is rotating about the x3-axis. The matrix

    c = [ .7071  −.7071   0
          .7071   .7071   0
          0       0       1 ]

describes a one-eighth turn (45°) rotation of the sphere. For example, multiplying c by the vector (1, 0, 0) yields the vector (.7071, .7071, 0), which is rotated 45°. This is depicted in Exhibit 2.8.
Intuitively, what might we expect to be an eigenvector of the matrix c? Is there a point on the unit sphere that a 45° rotation transforms into a multiple of itself? Of course! Consider the point at the north pole. It is the point (0, 0, 1), and the rotation transforms it into itself. We conclude that (0, 0, 1) is an eigenvector of c, with corresponding eigenvalue 1. Because c is a 3 × 3 matrix, it has two other eigenvalues, but they are both complex: cos 45° ± i sin 45°.
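This intuition can be checked numerically. The sketch below assumes NumPy and builds the 45° rotation about the x3-axis from its description above:

```python
import numpy as np

theta = np.pi / 4  # 45 degrees: a one-eighth turn

# Rotation about the x3-axis, reconstructed from the description above.
c = np.array([[np.cos(theta), -np.sin(theta), 0.0],
              [np.sin(theta),  np.cos(theta), 0.0],
              [0.0,            0.0,           1.0]])

# (1, 0, 0) rotates to roughly (.7071, .7071, 0).
assert np.allclose(c @ np.array([1.0, 0.0, 0.0]),
                   [0.7071, 0.7071, 0.0], atol=1e-4)

# The north pole (0, 0, 1) is unmoved: an eigenvector with eigenvalue 1.
w, v = np.linalg.eig(c)
i = int(np.argmin(np.abs(w - 1.0)))
assert np.isclose(w[i], 1.0)
assert np.allclose(np.abs(v[:, i]), [0.0, 0.0, 1.0])

# The other two eigenvalues are the complex pair cos(45) +/- i sin(45).
others = np.delete(w, i)
assert np.allclose(others.real, np.cos(theta))
assert np.allclose(np.sort(others.imag), [-np.sin(theta), np.sin(theta)])
```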
Find the eigenvalues and eigenvectors of the matrix
Prove that the eigenvalues of a diagonal matrix are its diagonal elements.
Use one of the stated properties of eigenvalues to prove that a matrix is singular if and only if it has 0 as one of its eigenvalues.