
Linear Algebra

Module 15 Eigenvalues and Eigenvectors

In this module you will learn
  • The definition of eigenvalues and eigenvectors.
  • That eigenvectors give a particularly nice basis in which to study a linear transformation.
  • How the characteristic polynomial relates to eigenvalues.
From here on out, we will only be considering linear transformations with the same domain and codomain (i.e., transformations \(\mathcal{T}:\R^{n}\to\R^{n}\)). Why? Because that will allow us to compare input and output vectors. By comparing inputs and outputs, we may describe a linear transformation as a stretch, twist, shear, rotation, projection, or some combination of all of these operations.
Figure 15.0.1.
It’s the stretched vectors that we’re most interested in now. If \(\mathcal{T}\) stretches the vector \(\vec v\text{,}\) then \(\mathcal{T}\text{,}\) in that direction, can be described by \(\vec v\mapsto \alpha\vec v\text{,}\) which is an easy-to-understand linear transformation. The “stretch” directions for a linear transformation have a special name—eigen directions—and the vectors that are stretched are called eigenvectors.

Definition 15.0.2. Eigenvector.

Let \(X\) be a linear transformation or a matrix. An eigenvector for \(X\) is a non-zero vector that doesn’t change directions when \(X\) is applied. That is, \(\vec v\neq \vec 0\) is an eigenvector for \(X\) if
\begin{equation*} X\vec v=\lambda \vec v \end{equation*}
for some scalar \(\lambda\text{.}\) We call \(\lambda\) the eigenvalue of \(X\) corresponding to the eigenvector \(\vec v\text{.}\)
The word eigen is German for characteristic, representative, or intrinsic, and we will see that eigenvectors provide one of the best contexts in which to understand a linear transformation.

Example 15.0.3.

Let \(\mathcal{P}:\R^{2}\to\R^{2}\) be projection onto the line \(\ell\) given by \(y=x\text{.}\) Find the eigenvectors and eigenvalues of \(\mathcal{P}\text{.}\)

Solution.

We are looking for vectors \(\vec v\neq \vec 0\) such that \(\mathcal{P}\vec v=\lambda \vec v\) for some \(\lambda\text{.}\) Since \(\mathcal{P}(\ell)=\ell\text{,}\) we know for any \(\vec v\in \ell\)
\begin{equation*} \mathcal{P}(\vec v)=1\vec v=\vec v. \end{equation*}
Therefore, any non-zero multiple of \(\mat{1\\1}\) is an eigenvector for \(\mathcal{P}\) with corresponding eigenvalue \(1\text{.}\)
By considering the null space of \(\mathcal{P}\text{,}\) we see, for example,
\begin{equation*} \mathcal{P}\mat{1\\-1}=\mat{0\\0}=0\mat{1\\-1}, \end{equation*}
and so \(\mat{1\\-1}\) and all its non-zero multiples are eigenvectors of \(\mathcal{P}\) with corresponding eigenvalue \(0\text{.}\)
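If you'd like to double-check this example numerically, here is a quick NumPy sketch. It assumes the standard matrix \(\frac{1}{2}\mat{1&1\\1&1}\) for projection onto the line \(y=x\) (an assumption, since the matrix itself is not computed in this example):

```python
import numpy as np

# Assumed matrix of projection onto the line y = x.
P = np.array([[0.5, 0.5],
              [0.5, 0.5]])

v = np.array([1.0, 1.0])    # lies on the line y = x
w = np.array([1.0, -1.0])   # perpendicular to the line

print(P @ v)  # unchanged: eigenvector with eigenvalue 1
print(P @ w)  # sent to the zero vector: eigenvector with eigenvalue 0
```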

Section 15.1 Finding Eigenvectors

Sometimes you can find the eigenvectors/values of a linear transformation just by thinking about it. For example, for reflections, projections, and dilations, the eigen directions are geometrically clear. However, for an arbitrary matrix transformation, it may not be obvious.
Our goal now will be to see if we can leverage linear algebra knowledge to find eigenvectors/values. So that we don’t have to switch back and forth between thinking about linear transformations and thinking about matrices, let’s just think about matrices for now.
Let \(M\) be a square matrix. The vector \(\vec v\neq \vec 0\) is an eigenvector for \(M\) if and only if there exists a scalar \(\lambda\) so that
\begin{equation} M\vec v=\lambda \vec v.\tag{15.1.1} \end{equation}
Put another way, \(\vec v\neq \vec 0\) is an eigenvector for \(M\) if and only if
\begin{equation*} M\vec v-\lambda \vec v=(M-\lambda I)\vec v=\vec 0. \end{equation*}
The middle equation provides a key insight. The operation \(\vec v\mapsto M\vec v-\lambda\vec v\) can be achieved by multiplying \(\vec v\) by the single matrix \(E_{\lambda}=M-\lambda I\text{.}\)
Now we have that \(\vec v\neq \vec 0\) is an eigenvector for \(M\) if and only if
\begin{equation*} E_{\lambda} \vec v=(M-\lambda I)\vec v = M\vec v-\lambda \vec v=\vec 0, \end{equation*}
or, phrased another way, \(\vec v\) is a non-zero vector satisfying \(\vec v\in \Null(E_{\lambda})\text{.}\)
We’ve reduced the problem of finding eigenvectors/values of \(M\) to finding the null space of \(E_{\lambda}\text{,}\) a related matrix.
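The key identity \(M\vec v-\lambda\vec v=(M-\lambda I)\vec v\) is easy to check numerically. In this sketch, the matrix \(M\text{,}\) the value \(\lambda=3\text{,}\) and the vector \(\vec v\) are illustrative choices, not taken from the text:

```python
import numpy as np

# v is an eigenvector of M with eigenvalue lam exactly when (M - lam*I) v = 0.
M = np.array([[2.0, 1.0],
              [1.0, 2.0]])
lam = 3.0
v = np.array([1.0, 1.0])    # M v = (3, 3) = 3 v

E = M - lam * np.eye(2)     # E_lambda = M - lam*I
print(E @ v)                # the zero vector, so v is in Null(E_lambda)
print(np.linalg.det(E))     # 0: E_lambda is singular, so its null space is non-trivial
```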

Section 15.2 Characteristic Polynomial

Let \(M\) be an \(n\times n\) matrix and define \(E_{\lambda}=M-\lambda I\text{.}\) Every eigenvector for \(M\) must be in the null space of \(E_{\lambda}\) for some \(\lambda\text{.}\) However, because eigenvectors must be non-zero, the only chance we have of finding an eigenvector is if \(\Null(E_{\lambda})\neq \Set{\vec 0}\text{.}\) In other words, we would like to know when \(\Null(E_{\lambda})\) is non-trivial.
We’re well equipped to answer this question. Because \(E_{\lambda}\) is an \(n\times n\) matrix, we know \(E_{\lambda}\) has a non-trivial null space if and only if \(E_{\lambda}\) is not invertible, which is true if and only if \(\det(E_{\lambda})=0\text{.}\) Every \(\lambda\) defines a different \(E_{\lambda}\) where eigenvectors could be hiding. By viewing \(\det(E_{\lambda})\) as a function of \(\lambda\text{,}\) we can use our mathematical knowledge of single-variable functions to figure out when \(\det(E_{\lambda})=0\text{.}\)
The quantity \(\det(E_{\lambda})\text{,}\) viewed as a function of \(\lambda\text{,}\) has a special name—it’s called the characteristic polynomial (this time the term is traditionally given the English name, rather than being called the eigenpolynomial).

Definition 15.2.1. Characteristic Polynomial.

For a matrix \(A\text{,}\) the characteristic polynomial of \(A\) is
\begin{equation*} \Char(A)=\det(A-\lambda I). \end{equation*}

Example 15.2.2.

Find the characteristic polynomial of \(A=\mat{1&2\\3&4}\text{.}\)

Solution.

By the definition of the characteristic polynomial of \(A\text{,}\) we have
\begin{align*} \Char(A)&=\det(A-\lambda I)\\ &=\det\left(\mat{1&2\\3&4}-\mat{\lambda&0\\0&\lambda}\right)=\det\left(\mat{1-\lambda&2\\3&4-\lambda}\right)\\ &=(1-\lambda)(4-\lambda) - 6=\lambda^{2} -5\lambda-2. \end{align*}
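This computation can be checked numerically. NumPy's `np.poly` returns the coefficients of the monic polynomial \(\det(\lambda I - A)\text{,}\) which agrees with \(\det(A-\lambda I)\) when \(n\) is even, as it is here:

```python
import numpy as np

# np.poly gives the coefficients of det(lam*I - A); for even n this
# coincides with det(A - lam*I).
A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
coeffs = np.poly(A)
print(coeffs)  # [1, -5, -2], i.e. lam^2 - 5*lam - 2
```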
For an \(n\times n\) matrix \(A\text{,}\) \(\Char(A)\) has some nice properties.
  • \(\Char(A)\) is a polynomial (a priori, it’s not obvious that \(\det(A-\lambda I)\) should be a polynomial as opposed to some other type of function).
  • \(\Char(A)\) has degree \(n\text{.}\)
  • The coefficient of the \(\lambda^{n}\) term in \(\Char(A)\) is \(\pm1\text{;}\) \(+1\) if \(n\) is even and \(-1\) if \(n\) is odd.
  • \(\Char(A)\) evaluated at \(\lambda = 0\) is \(\det(A)\text{.}\)
  • The roots of \(\Char(A)\) are precisely the eigenvalues of \(A\text{.}\)
We will just accept these properties as facts, but each of them can be proved with the tools we’ve developed.
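Although we won't prove these properties, a numerical spot-check is easy. This sketch tests them for one illustrative \(2\times 2\) matrix (with \(n\) even, so `np.poly`'s sign convention agrees with \(\det(A-\lambda I)\)):

```python
import numpy as np

# Spot-check the listed properties of Char(A) for one matrix.
A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
n = A.shape[0]

p = np.poly(A)                        # coefficients of det(lam*I - A)
print(len(p) - 1)                     # the degree is n
print(np.polyval(p, 0.0))             # Char(A) at lam = 0 ...
print(np.linalg.det(A))               # ... equals det(A) (n is even here)
print(np.sort(np.roots(p)))           # the roots of Char(A) ...
print(np.sort(np.linalg.eigvals(A)))  # ... are the eigenvalues of A
```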

Section 15.3 Using the Characteristic Polynomial to find Eigenvalues

With the characteristic polynomial in hand, finding eigenvectors/values becomes easier.

Example 15.3.1.

Find the eigenvectors/values of \(A=\mat{1&2\\3&2}\text{.}\)

Solution.

As in the previous example, we first compute \(\Char(A)\text{.}\)
\begin{align*} \Char(A)&=\det\left(\mat{1-\lambda&2\\3&2-\lambda}\right)\\ &=(1-\lambda)(2-\lambda)-6=\lambda^{2}-3\lambda-4=(4-\lambda)(-1-\lambda) \end{align*}
Next, we solve \(\Char(A)=0\) to find the eigenvalues, which are \(\lambda_{1}=-1\) and \(\lambda_{2}=4\text{.}\)
We know non-zero vectors in \(\Null(A-\lambda_{1}I)\) are eigenvectors with eigenvalue \(-1\text{.}\) Computing,
\begin{equation*} \Null(A-\lambda_{1}I) = \Null\left(\mat{2&2\\3&3}\right)= \Span\Set{\mat{1\\-1}}, \end{equation*}
and so the eigenvectors of \(A\) corresponding to eigenvalue \(\lambda_{1}=-1\) are the non-zero multiples of \(\mat{1\\-1}\text{.}\)
Similarly, for \(\lambda_{2}=4\text{,}\) we compute
\begin{equation*} \Null(A-\lambda_{2}I) = \Null\left(\mat{-3&2\\3&-2}\right)= \Span\Set{\mat{2\\3}}, \end{equation*}
and so the eigenvectors for \(A\) with eigenvalue \(4\) are the non-zero multiples of \(\mat{2\\3}\text{.}\)
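As a sanity check, `np.linalg.eig` computes eigenvalues and eigenvectors directly (it returns unit-length eigenvectors as columns, so they are scalar multiples of the ones found above):

```python
import numpy as np

# Check the example with np.linalg.eig.
A = np.array([[1.0, 2.0],
              [3.0, 2.0]])
vals, vecs = np.linalg.eig(A)
print(np.sort(vals))                   # [-1, 4]

# Each returned column really satisfies A v = lam v:
for lam, v in zip(vals, vecs.T):
    print(np.allclose(A @ v, lam * v))  # True
```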
Using the characteristic polynomial, we can show that every eigenvalue for a matrix is a root of some polynomial (the characteristic polynomial). In general, finding roots of polynomials is a hard problem (in fact, numerically approximating eigenvalues turns out to be easier than finding roots of a polynomial, so many numerical root-finding algorithms actually create a matrix with an appropriate characteristic polynomial and use numerical linear algebra to approximate its roots), and it’s not one we will focus on. However, it’s handy to have the quadratic formula in your back pocket for factoring particularly stubborn polynomials.

Example 15.3.2.

Find the eigenvectors/values of \(A=\mat{1&2\\3&4}\text{.}\)

Solution.

First, we find the roots of \(\Char(A)\) by setting it to \(0\text{.}\)
\begin{align*} \Char(A)&=\det\left(\mat{1-\lambda&2\\3&4-\lambda}\right)\\ &=(1-\lambda)(4-\lambda)-6=\lambda^{2}-5\lambda-2=0 \end{align*}
By the quadratic formula (recall that the roots of \(ax^{2}+bx+c\) are given by \(\frac{-b\pm\sqrt{b^{2}-4ac}}{2a}\)), we find that
\begin{equation*} \lambda_{1}=\frac{5-\sqrt{33}}{2}\qquad\lambda_{2}=\frac{5+\sqrt{33}}{2} \end{equation*}
are the roots of \(\Char(A)\text{.}\)
Following the procedure outlined above, we need to find \(\Null(A-\lambda_{1}I)\) and \(\Null(A-\lambda_{2}I)\text{.}\)
We will start by row reducing \(A-\lambda_{1}I\text{.}\)
\begin{align*} \matc{1-\frac{5-\sqrt{33}}{2}&2\\3&4-\frac{5-\sqrt{33}}{2}}&\to \matc{\frac{-3+\sqrt{33}}{2}&2\\3&\frac{3+\sqrt{33}}{2}}\to \matc{1&\frac{4}{-3+\sqrt{33}}\\1&\frac{3+\sqrt{33}}{6}}\\ &\to \matc{1&\frac{4(3+\sqrt{33})}{(-3+\sqrt{33})(3+\sqrt{33})}\\1&\frac{3+\sqrt{33}}{6}}\to \matc{1&\frac{3+\sqrt{33}}{6}\\1&\frac{3+\sqrt{33}}{6}}\\ &\to \matc{1&\frac{3+\sqrt{33}}{6}\\0&0} \end{align*}
Thus, we conclude that the eigenvectors with eigenvalue \(\frac{5-\sqrt{33}}{2}\) are the non-zero multiples of \(\matc{\frac{3+\sqrt{33}}{6}\\-1}\text{.}\) Similarly, the eigenvectors with eigenvalue \(\frac{5+\sqrt{33}}{2}\) are the non-zero multiples of \(\matc{\frac{3-\sqrt{33}}{6}\\-1}\text{.}\)
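The exact values \(\frac{5\pm\sqrt{33}}{2}\) and the eigenvector found for \(\lambda_{1}\) can be verified numerically:

```python
import numpy as np

# Check the exact eigenvalues (5 +- sqrt(33))/2 against np.linalg.eigvals.
A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
exact = np.array([(5 - np.sqrt(33)) / 2, (5 + np.sqrt(33)) / 2])
print(np.sort(np.linalg.eigvals(A)))  # matches `exact`

# The eigenvector found above for lambda_1 really satisfies A v = lambda_1 v:
lam1 = exact[0]
v = np.array([(3 + np.sqrt(33)) / 6, -1.0])
print(np.allclose(A @ v, lam1 * v))   # True
```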

Section 15.4 Transformations without Eigenvectors

Are there linear transformations without eigenvectors? Well, it depends on exactly what you mean. Let \(\mathcal{R}:\R^{2}\to\R^{2}\) be rotation counter-clockwise by \(90^{\circ}\text{.}\) Are there any non-zero vectors that don’t change direction when \(\mathcal{R}\) is applied? Certainly not.
Let’s examine further. We know \(M_{R}=\mat{0&-1\\1&0}\) is a matrix for \(\mathcal{R}\text{,}\) and
\begin{equation*} \Char(M_{R}) = \lambda^{2}+1. \end{equation*}
The polynomial \(\lambda^{2}+1\) has no real roots, which means that \(M_{R}\) (and \(\mathcal{R}\)) have no real eigenvalues. However, \(\lambda^{2}+1\) does have complex roots of \(\pm i\text{.}\) So far, we’ve always thought of scalars as real numbers, but if we allow complex numbers as scalars and view \(\mathcal{R}\) as a transformation from \(\mathbb{C}^{2}\to\mathbb{C}^{2}\text{,}\) it would have eigenvalues and eigenvectors.
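In fact, `np.linalg.eig` works over the complex numbers, so it happily reports the complex eigenvalues \(\pm i\) of the rotation matrix even though no real ones exist:

```python
import numpy as np

# The 90-degree rotation matrix has no real eigenvalues, but np.linalg.eig
# finds its complex eigenvalues +-i.
M = np.array([[0.0, -1.0],
              [1.0, 0.0]])
vals, vecs = np.linalg.eig(M)
print(vals)  # i and -i (up to floating point)
```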
Complex numbers play an invaluable role in advanced linear algebra and applications of linear algebra to physics. We will leave the following theorem as food for thought (it is a direct corollary of the fundamental theorem of algebra).

Exercises 15.5 Exercises

1.

For each linear transformation defined below, find its eigenvectors and eigenvalues. If it has no eigenvectors/values, explain why not.
  1. \(\mathcal{S}:\R^{2}\to\R^{2}\text{,}\) where \(\mathcal{S}\) stretches every vector by the factor of \(3\text{.}\)
  2. \(\mathcal{R}:\R^{2}\to\R^{2}\text{,}\) where \(\mathcal{R}\) rotates every vector clockwise by \(\frac{\pi}{4}\text{.}\)
  3. \(\mathcal{P}:\R^{2}\to\R^{2}\text{,}\) where \(\mathcal{P}\) projects every vector onto the line \(\ell\) given by \(y=-x\text{.}\)
  4. \(\mathcal{F}:\R^{2}\to\R^{2}\text{,}\) where \(\mathcal{F}\) reflects every vector over the line \(\ell\) given by \(y=-x\text{.}\)
  5. \(T:\R^{3}\to\R^{3}\text{,}\) where \(T\) is a linear transformation induced by the matrix \(\mat{1&2&3\\3&4&5\\5&6&7}\text{.}\)
  6. \(U:\R^{3}\to\R^{2}\text{,}\) where \(U\) is a linear transformation induced by the matrix \(\mat{1&2&3\\3&4&5}\text{.}\)
Solution.
  1. Every non-zero vector in \(\R^{2}\) is an eigenvector with eigenvalue 3.
  2. \(\Char(\mathcal{R})\) has no real roots, so \(\mathcal{R}\) has no real eigenvalues or eigenvectors.
  3. There are two eigenvalues. \(0\) is an eigenvalue with eigenvector \(\mat{1 \\ 1}\text{,}\) and \(1\) is an eigenvalue with eigenvector \(\mat{1\\-1}\text{.}\)
  4. There are two eigenvalues. \(-1\) is an eigenvalue with eigenvector \(\mat{1\\1}\text{,}\) and \(1\) is an eigenvalue with eigenvector \(\mat{1 \\ -1}\text{.}\)
  5. \(\Char(T) = - \lambda ( \lambda^{2}- 12 \lambda - 12)\text{.}\) So there are three eigenvalues. \(0\) is an eigenvalue with eigenvector \(\mat{1 \\ -2 \\ 1}\text{,}\) \(6 + 4 \sqrt{3}\) is an eigenvalue with eigenvector \(\mat{2 \\ 2 + \sqrt{3} \\ 2+2\sqrt{3}}\text{,}\) and \(6 - 4\sqrt{3}\) is an eigenvalue with eigenvector \(\mat{2 \\ 2 - \sqrt{3} \\ 2-2\sqrt{3}}\text{.}\)
  6. \(U\) is induced by a \(2 \times 3\) matrix, and eigenvalues/eigenvectors are only defined for linear maps from \(\R^{n}\) to itself. So, \(U\) has no eigenvalues or eigenvectors.

2.

Let \(A = \mat{a&b\\c&d}\text{,}\) where \(a,b,c,d \in \R\text{.}\)
  1. Find the characteristic polynomial of \(A\text{.}\)
  2. Find conditions on \(a,b,c,d\) so that \(A\) has (i) two distinct real eigenvalues, (ii) exactly one real eigenvalue, (iii) no real eigenvalues.
Solution.
  1. By definition,
    \begin{align*} \Char(A) = \det (A - \lambda I)&= \det \mat{a - \lambda & b \\ c & d - \lambda}\\ &= \lambda^{2}- (a + d)\lambda + ad - bc. \end{align*}
  2. By the quadratic formula, the discriminant \(\Delta\) of \(\Char(A)\) is \(\Delta = (a + d)^{2}- 4(ad - bc) = (a - d)^{2}+ 4 bc\text{.}\) So, \(A\) has two distinct real eigenvalues if \((a - d)^{2}+ 4bc > 0\text{,}\) one real eigenvalue if \((a - d)^{2}+ 4bc = 0\text{,}\) and no real eigenvalues if \((a - d)^{2}+ 4bc < 0.\)
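The discriminant condition above can be exercised on a few concrete matrices. The helper function below is ours, for illustration only:

```python
# Count real eigenvalues of [[a, b], [c, d]] via the discriminant (a-d)^2 + 4bc.
def num_real_eigenvalues(a, b, c, d):
    disc = (a - d) ** 2 + 4 * b * c
    if disc > 0:
        return 2   # two distinct real eigenvalues
    if disc == 0:
        return 1   # exactly one real eigenvalue
    return 0       # no real eigenvalues

print(num_real_eigenvalues(1, 2, 3, 4))   # 2: disc = 9 + 24 > 0
print(num_real_eigenvalues(1, 0, 0, 1))   # 1: the identity, disc = 0
print(num_real_eigenvalues(0, -1, 1, 0))  # 0: rotation by 90 degrees, disc = -4
```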

3.

Let \(B=\mat{1&2\\0&4}\text{.}\)
  1. Find the eigenvalues of \(B\text{.}\)
  2. Find the eigenvalues of \(B^{T}\text{.}\)
  3. A vector \(\vec v\neq\vec 0\) is called a left-eigenvector for \(B\) if \(\vec vB=\lambda \vec v\) for some scalar \(\lambda\) (Here we consider \(\vec v\) a row vector). Find all left eigenvectors for \(B\text{.}\)
Solution.
  1. \(\Char(B) = \det\mat{1 -\lambda & 2 \\ 0 & 4 - \lambda}= (1 - \lambda)(4 - \lambda)\text{.}\) So, \(B\) has eigenvalues \(1\) and \(4\text{.}\)
  2. \(\Char(B) = \Char(B^{T})\text{,}\) so \(B^{T}\) also has eigenvalues \(1\) and \(4\text{.}\)
  3. The equation \(\vec{v}^{T}B = \lambda \vec{v}^{T}\) holds if and only if \(B^{T}\vec{v}= \lambda \vec{v}\text{.}\) Here, we interpret \(\vec{v}\) as a column vector and \(\vec{v}^{T}\) as a row vector.
    We observe that \(B^{T}\) has eigenvector \(\mat{-3 \\ 2}\) with eigenvalue \(1\) and an eigenvector \(\mat{0\\1}\) with eigenvalue \(4\text{.}\) Hence \(B\) has left eigenvectors \(\mat{-3 & 2}\) with eigenvalue \(1\) and left eigenvector \(\mat{0 & 1}\) with eigenvalue \(4\text{.}\) It follows that all non-zero scalar multiples of \(\mat{ -3 & 2 }\) and \(\mat{ 0 & 1 }\) are also left eigenvectors.
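A short numerical check of the claimed left eigenvectors (row vectors multiplied on the left of \(B\)):

```python
import numpy as np

# Verify the left eigenvectors of B found above: v B = lam v.
B = np.array([[1.0, 2.0],
              [0.0, 4.0]])
v1 = np.array([-3.0, 2.0])   # claimed left eigenvector with eigenvalue 1
v2 = np.array([0.0, 1.0])    # claimed left eigenvector with eigenvalue 4

print(v1 @ B)  # equals 1 * v1
print(v2 @ B)  # equals 4 * v2
```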

4.

For each statement below, determine whether it is true or false. Justify your answer.
  1. Zero cannot be an eigenvalue of any matrix.
  2. \(\vec 0\) cannot be an eigenvector of any matrix.
  3. A \(2\times 2\) matrix always has a real eigenvalue.
  4. A \(3\times 3\) matrix always has a real eigenvalue.
  5. A \(3\times 2\) matrix always has a real eigenvalue.
  6. The matrix \(M = \mat{3&3&3&3\\3&3&3&3\\3&3&3&3\\3&3&3&3}\) has exactly one eigenvalue.
  7. An invertible square matrix can never have zero as an eigenvalue.
  8. A non-invertible square matrix always has zero as an eigenvalue.
Solution.
  1. False. \(0\) is an eigenvalue of \(\mat{0 & 0 \\ 0 & 0}\text{.}\)
  2. True. An eigenvector is a nonzero vector by definition.
  3. False. \(\mat{0 & 1 \\ -1 & 0}\) has no real eigenvalue.
  4. True. Its characteristic polynomial has degree \(3\) and hence has at least one real root.
  5. False. Eigenvalues are not defined for a non-square matrix.
  6. False. \(M\) has \(0\) as an eigenvalue with eigenvector \(\mat{1 \\ -1 \\ 0 \\ 0}\text{.}\) It also has eigenvalue \(12\) with eigenvector \(\mat{1 \\ 1 \\ 1 \\ 1}\text{.}\)
  7. True. If \(0\) were an eigenvalue, any corresponding eigenvector would be a non-zero vector in the null space of the matrix. The null space would then be non-trivial, so the matrix could not be invertible.
  8. True. Suppose \(A\) was a non-invertible square matrix. Then, we must have \(\Nullity(A)>0\text{,}\) and so \(\Null(A)\) contains at least one non-zero vector, \(\vec v\text{.}\) By definition \(A\vec v=\vec 0=0\vec v\text{,}\) and so \(\vec v\) is an eigenvector for \(A\) with eigenvalue \(0\text{.}\)
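The last two facts can be spot-checked numerically on a pair of illustrative matrices:

```python
import numpy as np

# A singular matrix has 0 as an eigenvalue; an invertible one does not.
singular = np.array([[1.0, 2.0],
                     [2.0, 4.0]])     # det = 0, so not invertible
invertible = np.array([[1.0, 2.0],
                       [3.0, 4.0]])   # det = -2, so invertible

print(np.isclose(np.linalg.eigvals(singular), 0).any())    # True
print(np.isclose(np.linalg.eigvals(invertible), 0).any())  # False
```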