Eigenvalues and Eigenvectors
Motivation
Many linear transformations have special directions that they scale but do not rotate. A vector pointing in such a direction is called an eigenvector, and the scaling factor is its eigenvalue (Trefethen and Bau 1997).
Understanding eigenvalues and eigenvectors is central to applied mathematics: they appear in principal component analysis, solving differential equations, computing matrix powers, and characterizing the long-run behavior of dynamical systems.
Definition
Figure: a linear map that leaves eigenvector directions invariant. Most vectors rotate or shear under a matrix, but an eigenvector only changes length and possibly sign.
Let \(A\) be an \(n \times n\) matrix. A nonzero vector \(\mathbf{v} \in \mathbb{R}^n\) is an eigenvector of \(A\) with eigenvalue \(\lambda \in \mathbb{R}\) if
\[ A\mathbf{v} = \lambda \mathbf{v}. \]
The equation says that multiplying \(\mathbf{v}\) by \(A\) produces the same result as scaling \(\mathbf{v}\) by \(\lambda\). The vector is not rotated — only stretched (if \(|\lambda| > 1\)), compressed (if \(|\lambda| < 1\)), or flipped (if \(\lambda < 0\)).
The requirement that \(\mathbf{v} \neq \mathbf{0}\) is necessary because \(A\mathbf{0} = \lambda\mathbf{0}\) holds trivially for any \(\lambda\) and carries no information.
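The defining relation \(A\mathbf{v} = \lambda\mathbf{v}\) is easy to check numerically. A minimal sketch with NumPy (the particular matrix and vector below are illustrative, not from the text):

```python
import numpy as np

# A diagonal matrix scales each axis independently,
# so the standard basis vectors are eigenvectors.
A = np.array([[2.0, 0.0],
              [0.0, 5.0]])
v = np.array([0.0, 1.0])   # points along the second axis
lam = 5.0

# Multiplying by A gives the same result as scaling by lam
Av = A @ v
print(np.allclose(Av, lam * v))  # True
```

Note that the zero vector would pass this check for every \(\lambda\), which is exactly why the definition excludes it.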
Finding Eigenvalues
Rewrite the eigenvalue equation as
\[ (A - \lambda I)\mathbf{v} = \mathbf{0}. \]
This homogeneous system has a nonzero solution if and only if the matrix \(A - \lambda I\) is singular, i.e., its determinant is zero:
\[ \det(A - \lambda I) = 0. \]
This is the characteristic equation of \(A\). Expanding the determinant produces a degree-\(n\) polynomial in \(\lambda\) called the characteristic polynomial. Its roots are the eigenvalues of \(A\). For a real matrix, some of these roots may be complex; a rotation matrix, for example, has no real eigenvectors.
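The characteristic polynomial and its roots can be computed directly; a sketch with NumPy, using the upper-triangular matrix that appears in the worked example below:

```python
import numpy as np

A = np.array([[3.0, 1.0],
              [0.0, 2.0]])

# Coefficients of det(A - lambda*I), highest degree first:
# lambda^2 - 5*lambda + 6  ->  [1, -5, 6]
coeffs = np.poly(A)

# The eigenvalues are the roots of the characteristic polynomial
eigenvalues = np.roots(coeffs)
print(np.sort(eigenvalues))  # eigenvalues 2 and 3
```

In practice, forming the characteristic polynomial explicitly is numerically fragile for large matrices; library routines such as `numpy.linalg.eig` use more stable algorithms instead.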
Finding Eigenvectors
Once an eigenvalue \(\lambda\) is known, the corresponding eigenvectors are the nonzero solutions to
\[ (A - \lambda I)\mathbf{v} = \mathbf{0}. \]
These form a subspace called the eigenspace of \(\lambda\). Any nonzero vector in this subspace is a valid eigenvector. The dimension of the eigenspace is the geometric multiplicity of \(\lambda\).
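Computing an eigenspace amounts to computing the null space of \(A - \lambda I\). One way, sketched here with NumPy's SVD (the matrix and the tolerance `1e-10` are illustrative choices), is to keep the right singular vectors whose singular values vanish:

```python
import numpy as np

A = np.array([[3.0, 1.0],
              [0.0, 2.0]])
lam = 3.0

# Null space of (A - lam*I): right singular vectors
# whose singular values are (numerically) zero
M = A - lam * np.eye(2)
_, s, Vt = np.linalg.svd(M)
null_mask = s < 1e-10
eigenspace_basis = Vt[null_mask]   # rows span the eigenspace of lam

# The number of rows is the geometric multiplicity of lam
print(eigenspace_basis)  # proportional to [1, 0]
```

`scipy.linalg.null_space` wraps the same idea if SciPy is available.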
Example
Let
\[ A = \begin{pmatrix} 3 & 1 \\ 0 & 2 \end{pmatrix}. \]
Step 1: characteristic equation.
\[ \det(A - \lambda I) = \det\begin{pmatrix} 3 - \lambda & 1 \\ 0 & 2 - \lambda \end{pmatrix} = (3 - \lambda)(2 - \lambda) = 0. \]
The eigenvalues are \(\lambda_1 = 3\) and \(\lambda_2 = 2\).
Step 2: eigenvectors for \(\lambda_1 = 3\).
\[ A - 3I = \begin{pmatrix} 0 & 1 \\ 0 & -1 \end{pmatrix}. \]
The system \((A - 3I)\mathbf{v} = \mathbf{0}\) gives \(v_2 = 0\) with \(v_1\) free, so we may take \(\mathbf{v}_1 = \begin{pmatrix}1\\0\end{pmatrix}\).
Step 3: eigenvectors for \(\lambda_2 = 2\).
\[ A - 2I = \begin{pmatrix} 1 & 1 \\ 0 & 0 \end{pmatrix}. \]
The system gives \(v_1 + v_2 = 0\), so we may take \(\mathbf{v}_2 = \begin{pmatrix}1\\-1\end{pmatrix}\).
Verification.
\[ A\mathbf{v}_1 = \begin{pmatrix}3\\0\end{pmatrix} = 3\mathbf{v}_1, \qquad A\mathbf{v}_2 = \begin{pmatrix}2\\-2\end{pmatrix} = 2\mathbf{v}_2. \]
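The same example can be reproduced with a library eigensolver. A sketch using `numpy.linalg.eig` (which returns eigenvectors normalized to unit length, so they match \(\mathbf{v}_1, \mathbf{v}_2\) only up to scaling):

```python
import numpy as np

A = np.array([[3.0, 1.0],
              [0.0, 2.0]])

eigvals, eigvecs = np.linalg.eig(A)   # columns of eigvecs are eigenvectors

# Each column satisfies A v = lam v
for lam, v in zip(eigvals, eigvecs.T):
    assert np.allclose(A @ v, lam * v)

print(np.sort(eigvals))  # eigenvalues 2 and 3
```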
Key Properties
Eigenvalues of a triangular matrix are its diagonal entries. The example above illustrates this: \(A\) is upper triangular with diagonal \(3, 2\).
Trace and determinant. For an \(n \times n\) matrix \(A\) with eigenvalues \(\lambda_1, \ldots, \lambda_n\), counted with algebraic multiplicity over the complex numbers:
\[ \operatorname{tr}(A) = \sum_{i=1}^n \lambda_i, \qquad \det(A) = \prod_{i=1}^n \lambda_i. \]
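These identities are easy to spot-check numerically. A sketch with NumPy (the random \(4 \times 4\) matrix is just an example; its eigenvalues may come in complex-conjugate pairs, so the imaginary parts of the sum and product cancel):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))
eigvals = np.linalg.eig(A)[0]

# Sum of eigenvalues equals the trace
print(np.isclose(np.trace(A), eigvals.sum().real))       # True

# Product of eigenvalues equals the determinant
print(np.isclose(np.linalg.det(A), eigvals.prod().real))  # True
```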
Linear independence. Eigenvectors corresponding to distinct eigenvalues are linearly independent.
Diagonalization. If \(A\) has \(n\) linearly independent eigenvectors \(\mathbf{v}_1, \ldots, \mathbf{v}_n\) with eigenvalues \(\lambda_1, \ldots, \lambda_n\), then
\[ A = P D P^{-1}, \]
where \(P = [\mathbf{v}_1 \;\cdots\; \mathbf{v}_n]\) and \(D = \operatorname{diag}(\lambda_1, \ldots, \lambda_n)\). Diagonalization simplifies many computations, including computing \(A^k = P D^k P^{-1}\).
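A sketch of diagonalization with NumPy, using the example matrix from above. Since \(A\) has two distinct eigenvalues, its eigenvectors are independent and \(P\) is invertible:

```python
import numpy as np

A = np.array([[3.0, 1.0],
              [0.0, 2.0]])

eigvals, P = np.linalg.eig(A)   # columns of P are eigenvectors
D = np.diag(eigvals)

# A = P D P^{-1}
assert np.allclose(A, P @ D @ np.linalg.inv(P))

# A^k via the diagonalization: only the diagonal entries get powered
k = 5
A_k = P @ np.diag(eigvals**k) @ np.linalg.inv(P)
assert np.allclose(A_k, np.linalg.matrix_power(A, k))
```

For repeated powers this replaces \(k\) matrix multiplications with elementwise powers of the eigenvalues, which is the point of the \(A^k = P D^k P^{-1}\) identity.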