One can think of decomposing a vector into two separate pieces of information: its direction and its length. This idea extends to matrices as well, and it is the idea behind eigenvalues and eigenvectors, which are defined according to this equation:
\[\tag{$\dagger$} \A\v = \lambda\v.\]Any nonzero vector $\v$ (conventionally taken to have unit length) which satisfies this equation is special to $\A$: when $\A$ operates in that direction, it acts merely to elongate or shrink (and not rotate, thereby preserving the line spanned by $\v$), and the factor by which it elongates or shrinks is $\lambda$. The $\v$’s which satisfy this equation are called eigenvectors, and the $\lambda$’s are the eigenvalues.
Note: We deal here only with symmetric matrices; the concept can be extended to non-symmetric and non-square matrices via the singular value decomposition.
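As a concrete sanity check, here is a minimal sketch in Python/numpy (the matrix below is made up purely for illustration) verifying that ($\dagger$) holds for a computed eigenpair:

```python
import numpy as np

# An arbitrary symmetric matrix, chosen only for illustration.
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])

# eigh is specialized for symmetric matrices; it returns eigenvalues
# in ascending order and unit-length eigenvectors as columns.
eigenvalues, eigenvectors = np.linalg.eigh(A)

v = eigenvalues[0] * 0 + eigenvectors[:, 0]  # an eigenvector of A
lam = eigenvalues[0]                         # its eigenvalue

# A v should equal lam * v: A acts on v by pure scaling.
print(np.allclose(A @ v, lam * v))  # True
```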
Below are a number of helpful facts about eigenvalues.
- Number of solutions
- Spectral radius
- Characteristic equation
- Eigenvalues and traces
- Eigenvalues and determinants
- Positive definite matrices
- Eigendecomposition
- Inverses
- Rank
- Eigenvalues of idempotent matrices
- Extremal values
Number of solutions
Suppose $\A$ is an $n \times n$ matrix. Then ($\dagger$) has exactly $n$ solutions in $\lam$, counted with multiplicity. These $n$ eigenvalues are collectively called the spectrum of $\A$.
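For example (illustrative random matrix, symmetrized by hand), numpy's `eigvalsh` returns exactly $n$ eigenvalues, repeats included:

```python
import numpy as np

n = 4
rng = np.random.default_rng(0)
B = rng.standard_normal((n, n))
A = (B + B.T) / 2               # symmetrize an arbitrary random matrix

eigenvalues = np.linalg.eigvalsh(A)
print(len(eigenvalues) == n)    # True: n eigenvalues, with multiplicity
```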
Spectral radius
The largest absolute value among the eigenvalues of $\A$ is called the spectral radius of $\A$, denoted here $\kappa(\A) = \max_i \abs{\lam_i}$. It follows that $\lim_{k \to \infty} \A^k = \zero$ if and only if $\kappa(\A) < 1$.
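A small sketch of this limit, assuming a made-up symmetric matrix whose spectral radius happens to be below 1:

```python
import numpy as np

A = np.array([[0.5, 0.2],
              [0.2, 0.3]])      # symmetric, made up for illustration

spectral_radius = np.max(np.abs(np.linalg.eigvalsh(A)))
print(spectral_radius < 1)      # True for this matrix

# Powers of A should then decay to the zero matrix.
print(np.allclose(np.linalg.matrix_power(A, 100), 0))  # True
```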
Characteristic equation
Any value $\lam$ satisfying ($\dagger$) for some $\v \neq \zero$ also satisfies
\[\abs{\A-\lam\I} = 0\]and vice versa. This equation is called the characteristic equation.
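Numerically, plugging each computed eigenvalue back into the determinant should give zero (same illustrative matrix as above):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
I = np.eye(2)

for lam in np.linalg.eigvalsh(A):
    # det(A - lam*I) should vanish at every eigenvalue.
    print(np.isclose(np.linalg.det(A - lam * I), 0.0))  # True, True
```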
Eigenvalues and traces
If $\A$ has eigenvalues ${\lam_1,\ldots,\lam_n}$,
\[\text{tr}(\A)=\sum_{i=1}^n\lam_i.\]
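A quick numerical check (illustrative matrix):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 3.0]])

# The trace (sum of diagonal entries) equals the sum of eigenvalues.
print(np.isclose(np.trace(A), np.linalg.eigvalsh(A).sum()))  # True
```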
Eigenvalues and determinants
If $\A$ has eigenvalues ${\lam_1,\ldots,\lam_n}$,
\[\al{det}{\abs{\A}=\prod_{i=1}^n\lam_i.}\]
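And the analogous check for the determinant:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 3.0]])

# The determinant equals the product of the eigenvalues.
print(np.isclose(np.linalg.det(A), np.prod(np.linalg.eigvalsh(A))))  # True
```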
Positive definite matrices
\[\A \href{positive-definite.html}{\text{ positive definite}} \Leftrightarrow \text{all eigenvalues of $\A$ are positive}.\]Similarly,
\[\A \href{positive-definite.html}{\text{ positive semidefinite}} \Leftrightarrow \text{all eigenvalues of $\A$ are nonnegative}.\]
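This suggests a simple, if not the most efficient, definiteness test; the helper below is a sketch for illustration, not a library routine:

```python
import numpy as np

def is_positive_definite(A, tol=1e-10):
    # Symmetric A is positive definite iff all eigenvalues exceed zero.
    return bool(np.all(np.linalg.eigvalsh(A) > tol))

A_pd  = np.array([[2.0, 1.0], [1.0, 3.0]])   # eigenvalues 1.38, 3.62
A_ind = np.array([[1.0, 2.0], [2.0, 1.0]])   # eigenvalues -1, 3

print(is_positive_definite(A_pd))   # True
print(is_positive_definite(A_ind))  # False
```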
Eigendecomposition
If $\A$ is a symmetric matrix, its eigenvectors can be chosen to be orthonormal. Such a matrix can thus be factored as:
\[\al{decomp}{\A=\Q\bL\Q^T,}\]where $\bL$ is a diagonal matrix containing the eigenvalues of $\A$, and the columns of $\Q$ contain its orthonormal eigenvectors (i.e., $\Q\Q^T = \Q^T\Q = \I$). This is also known as the spectral decomposition of $\A$.
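The factorization can be reproduced directly; `eigh` already returns the eigenvectors as orthonormal columns (same illustrative matrix):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 3.0]])

eigenvalues, Q = np.linalg.eigh(A)   # columns of Q are orthonormal
L = np.diag(eigenvalues)

print(np.allclose(Q @ L @ Q.T, A))       # A = Q L Q^T
print(np.allclose(Q.T @ Q, np.eye(2)))   # Q^T Q = I
```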
Inverses
Suppose $\A$ has eigenvectors $\Q$ and eigenvalues ${\lam_i}$, all nonzero (i.e., $\A$ is invertible). Then $\A^{-1}$ has the same eigenvectors $\Q$ and eigenvalues ${\lam_i^{-1}}$. In other words, if $\A=\Q\bL\Q^T$,
\[\al{inv}{\A^{-1}=\Q\bL^{-1}\Q^T.}\]
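Continuing the same sketch, inverting the diagonal factor recovers the ordinary inverse:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 3.0]])          # invertible: no zero eigenvalues

eigenvalues, Q = np.linalg.eigh(A)
L_inv = np.diag(1.0 / eigenvalues)  # invert each eigenvalue

# Q L^{-1} Q^T should reproduce the ordinary inverse.
print(np.allclose(Q @ L_inv @ Q.T, np.linalg.inv(A)))  # True
```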
Rank
Suppose $\A$ has rank $r$. Then $\A$ has $r$ nonzero eigenvalues, and the remaining $n-r$ eigenvalues are equal to zero.
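Sketch with a deliberately rank-one symmetric matrix $\v\v^T$ (made up for illustration):

```python
import numpy as np

v = np.array([1.0, 2.0, 2.0])
A = np.outer(v, v)               # symmetric, rank 1 by construction

eigenvalues = np.linalg.eigvalsh(A)

print(np.linalg.matrix_rank(A))                # 1
print(np.sum(~np.isclose(eigenvalues, 0.0)))   # 1 nonzero eigenvalue
print(np.sum(np.isclose(eigenvalues, 0.0)))    # 2 zero eigenvalues
```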
Eigenvalues of idempotent matrices
If $\A$ is idempotent ($\A\A = \A$), all its eigenvalues are equal to either 0 or 1: applying $\A$ twice to an eigenvector gives $\lam^2\v = \lam\v$, so $\lam^2 = \lam$, which forces $\lam \in \{0, 1\}$.
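A familiar idempotent example is the least-squares projection ("hat") matrix; the sketch below builds one from a made-up design matrix and checks its eigenvalues:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((5, 2))           # arbitrary full-rank design matrix

# The least-squares projection matrix is idempotent: H @ H == H.
H = X @ np.linalg.inv(X.T @ X) @ X.T
print(np.allclose(H @ H, H))              # True

eigenvalues = np.linalg.eigvalsh(H)
# Every eigenvalue is (numerically) either 0 or 1.
print(np.all(np.isclose(eigenvalues, 0) | np.isclose(eigenvalues, 1)))  # True
```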
Extremal values
Let the eigenvalues ${\lam_1,\ldots,\lam_n}$ of $\A$ be ordered from smallest to largest. Over the set of all vectors $\x$ such that $\norm{\x}_2 = 1$,
\[\min \x^T\A\x = \lam_1, \qquad \max \x^T\A\x = \lam_n.\]The minimum is attained at an eigenvector corresponding to $\lam_1$, and the maximum at one corresponding to $\lam_n$.
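Numerically, the extremes are hit at the corresponding eigenvectors, and any other unit vector lands in between (same illustrative matrix as above):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
eigenvalues, Q = np.linalg.eigh(A)        # ascending order

v_min, v_max = Q[:, 0], Q[:, -1]
print(np.isclose(v_min @ A @ v_min, eigenvalues[0]))   # min attained
print(np.isclose(v_max @ A @ v_max, eigenvalues[-1]))  # max attained

# Any other unit vector gives a value between the two extremes.
rng = np.random.default_rng(0)
x = rng.standard_normal(2)
x /= np.linalg.norm(x)
print(eigenvalues[0] <= x @ A @ x <= eigenvalues[-1])  # True
```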