# 2.6 Eigenvalues and Eigenvectors

Consider a square matrix **c**. If

$$\mathbf{c}\boldsymbol{\nu} = \lambda\boldsymbol{\nu} \qquad [2.66]$$

for some scalar λ and some vector **ν** ≠ 0, then we call λ an **eigenvalue** and **ν** an **eigenvector** of **c**. This means that an eigenvector of a matrix is any vector for which multiplying by the matrix is equivalent to multiplying by a scalar (the eigenvalue).
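As a quick numerical check of the definition, the sketch below uses a hypothetical 2 × 2 matrix (not one from the text) and confirms that multiplying a chosen eigenvector by the matrix coincides with multiplying it by the eigenvalue:

```python
import numpy as np

# Hypothetical 2x2 matrix, used only to illustrate the definition.
c = np.array([[2.0, 1.0],
              [0.0, 3.0]])

v = np.array([1.0, 0.0])   # an eigenvector of c
lam = 2.0                  # the corresponding eigenvalue

# Multiplying by the matrix is the same as multiplying by the scalar.
print(np.allclose(c @ v, lam * v))   # True
```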

###### 2.6.1 Theory

Suppose λ is an eigenvalue and **ν** is a corresponding eigenvector of **c**. For any scalar *a*, the vector *a***ν** is also an eigenvector corresponding to eigenvalue λ. This follows because

$$\mathbf{c}(a\boldsymbol{\nu}) = a(\mathbf{c}\boldsymbol{\nu}) = a(\lambda\boldsymbol{\nu}) = \lambda(a\boldsymbol{\nu}) \qquad [2.67]$$

Accordingly, eigenvectors are uniquely determined only up to scalar multiplication. If a set of eigenvectors is linearly independent, we say the eigenvectors are **distinct**. To determine the eigenvalues and eigenvectors of a matrix, we focus first on the eigenvalues. Rearranging [2.66], we obtain

$$(\mathbf{c} - \lambda\mathbf{I})\boldsymbol{\nu} = \mathbf{0} \qquad [2.68]$$

where **I** is the identity matrix. This equation holds for some nonzero vector **ν** if and only if the matrix (**c** − λ**I**) is singular. Accordingly, we seek values λ for which the matrix (**c** − λ**I**) has a determinant of 0. Consider the matrix

[2.69]

for which

[2.70]

This has determinant

[2.71]

which is a third-order polynomial. It has roots λ = −1, 2 and 3. We find corresponding eigenvectors **ν** by substituting the eigenvalues into [2.68] and solving. For example, with λ = −1, [2.68] becomes:

[2.72]

By inspection, a solution is **ν** = (−1, −3, 3). Obviously, any multiple of this is also a solution. We repeat the same analysis for the other eigenvalues. Results are indicated in Exhibit 2.7.
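Neither matrix [2.69] nor the cubic [2.71] is reproduced above, but both steps can be sketched numerically. Up to sign, any cubic with roots −1, 2, and 3 is (λ + 1)(λ − 2)(λ − 3), and the eigenvector step can be mimicked on a stand-in 2 × 2 matrix:

```python
import numpy as np

# The text's cubic [2.71] is not reproduced here, but up to sign any cubic
# with roots -1, 2, 3 is (lam + 1)(lam - 2)(lam - 3) = lam**3 - 4*lam**2 + lam + 6.
print(np.sort(np.roots([1.0, -4.0, 1.0, 6.0])))   # the roots -1, 2, 3

# Eigenvector step on a stand-in matrix (eigenvalues 1 and 3), since matrix
# [2.69] is likewise not shown: substitute lam into (c - lam*I) v = 0 and solve.
c = np.array([[2.0, 1.0],
              [1.0, 2.0]])
m = c - 1.0 * np.eye(2)             # lam = 1
v = np.array([1.0, -1.0])           # found by inspection

print(np.allclose(m @ v, 0))        # True: v is an eigenvector for eigenvalue 1
print(np.allclose(m @ (5 * v), 0))  # True: any scalar multiple is also a solution
```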

The approach we employed in our example is useful for deriving an important result. Consider an arbitrary *n* × *n* matrix **c**. To find its eigenvalues, we construct the determinant of (**c** − λ**I**) and set it equal to 0. This results in an *n*^{th}-order polynomial equation. By the fundamental theorem of algebra, it has *n* solutions. We conclude that every *n* × *n* matrix has *n* eigenvalues. Of course, some may be complex. Others may be repeated. In practical applications, eigenvalues are not calculated in this manner. Although setting the determinant of (**c** − λ**I**) equal to 0 and solving is theoretically useful, there are more efficient algorithms, which are implemented in various software packages. See Strang (2005).
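For instance, a library routine such as NumPy's `numpy.linalg.eig` (which wraps LAPACK's eigensolvers) computes all *n* eigenvalues at once; the matrix below is a hypothetical example, not one from the text:

```python
import numpy as np

# Hypothetical matrix; np.linalg.eig uses LAPACK routines that are far
# more efficient than solving det(c - lam*I) = 0 symbolically.
c = np.array([[4.0, 1.0],
              [2.0, 3.0]])

eigenvalues, eigenvectors = np.linalg.eig(c)
print(np.sort(eigenvalues))   # the two eigenvalues: 2 and 5
# Each column of `eigenvectors` is an eigenvector, normalized to unit length.
```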

Eigenvalues have a number of convenient properties. A matrix and its transpose both have the same eigenvalues. If λ is an eigenvalue of a nonsingular matrix, then 1/λ is an eigenvalue of its inverse. The product of the eigenvalues of a matrix equals its determinant.
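These three properties can be spot-checked numerically on a hypothetical nonsingular matrix:

```python
import numpy as np

# Hypothetical nonsingular matrix; its eigenvalues are 2 and 5.
c = np.array([[4.0, 1.0],
              [2.0, 3.0]])
eig = np.linalg.eigvals

# A matrix and its transpose have the same eigenvalues.
print(np.allclose(np.sort(eig(c)), np.sort(eig(c.T))))                   # True

# If lam is an eigenvalue of c, then 1/lam is an eigenvalue of its inverse.
print(np.allclose(np.sort(1 / eig(c)), np.sort(eig(np.linalg.inv(c)))))  # True

# The product of the eigenvalues equals the determinant.
print(np.isclose(np.prod(eig(c)), np.linalg.det(c)))                     # True
```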

###### 2.6.2 Intuitive Example

Consider an intuitive example. A sphere of unit radius is positioned at the center of a three-dimensional coordinate system. It is rotating about the *x*_{3}-axis. The matrix

$$\mathbf{c} = \begin{bmatrix} .7071 & -.7071 & 0 \\ .7071 & .7071 & 0 \\ 0 & 0 & 1 \end{bmatrix} \qquad [2.73]$$

describes a one-eighth (45°) rotation of the sphere. For example, multiplying **c** by the vector (1, 0, 0) yields the vector (.7071, .7071, 0), which is rotated 45°. This is depicted in Exhibit 2.8.

Exhibit 2.8: **c** rotates points 45° about the *x*_{3}-axis. This is illustrated for the point (1, 0, 0), which it transforms into the point (.7071, .7071, 0).

Intuitively, what might we expect to be an eigenvector of the matrix **c**? Is there a point on the unit sphere that a 45° rotation transforms into a multiple of itself? Of course! Consider the point at the north pole. It is the point (0, 0, 1), and it is transformed into itself. We conclude that an eigenvector of **c** is the vector (0, 0, 1). The corresponding eigenvalue is 1. Because it is a 3 × 3 matrix, **c** has two other eigenvalues, but they are both complex numbers.
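This example can be sketched numerically: building the 45° rotation matrix and extracting its eigenvalues shows one eigenvalue equal to 1, with eigenvector along the rotation axis, plus a complex-conjugate pair:

```python
import numpy as np

theta = np.pi / 4  # a 45-degree rotation about the x3-axis, as in [2.73]
c = np.array([[np.cos(theta), -np.sin(theta), 0.0],
              [np.sin(theta),  np.cos(theta), 0.0],
              [0.0,            0.0,           1.0]])

# The north pole (0, 0, 1) maps to itself: an eigenvector with eigenvalue 1.
north_pole = np.array([0.0, 0.0, 1.0])
print(np.allclose(c @ north_pole, north_pole))   # True

# The other two eigenvalues form the complex-conjugate pair cos 45 +/- i sin 45.
print(np.linalg.eigvals(c))
```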

###### Exercises

Find the eigenvalues and eigenvectors of the matrix

[2.74]

Prove that the eigenvalues of a diagonal matrix are its diagonal elements.


Use one of the stated properties of eigenvalues to prove that a matrix is singular if and only if it has 0 as one of its eigenvalues.
