
A matrix \(\mathbf{A}\) transforms a vector \(\mathbf{x}\) into another vector \(\mathbf{Ax}\). In general, \(\mathbf{Ax}\) points in a different direction than \(\mathbf{x}\), since the transformation may involve a rotation.

However, special vectors called eigenvectors keep their direction under the transformation by \(\mathbf{A}\). In this case, \(\mathbf{Ax}\) and \(\mathbf{x}\) are parallel, differing only by a constant scaling factor, which we call the associated eigenvalue.

Definition

For a square matrix \(\mathbf{A}\), a nonzero vector \(\mathbf{v}\) and a real or complex scalar \(\lambda\) are an eigenvector and its associated eigenvalue iff they satisfy

\[\mathbf{Av}=\lambda\mathbf{v}\]

There are infinitely many solutions, since \(c\mathbf{v}\) for any \(c\neq 0\) also satisfies this equation with the same \(\lambda\). Consequently, eigenvectors are assumed to be normalized, i.e., to satisfy the constraint \(\mathbf{v}^T\mathbf{v}=1\).
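As a quick numerical check, here is a minimal sketch using NumPy (the matrix \(\mathbf{A}\) is an arbitrary example): np.linalg.eig returns eigenvectors that are already normalized to unit length.

```python
import numpy as np

# Arbitrary example matrix; any square matrix works here.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# np.linalg.eig returns the eigenvalues and a matrix whose
# i-th column is the eigenvector for the i-th eigenvalue.
eigvals, eigvecs = np.linalg.eig(A)

for i in range(len(eigvals)):
    v = eigvecs[:, i]
    # Defining equation: Av = lambda * v
    assert np.allclose(A @ v, eigvals[i] * v)
    # Normalization constraint: v^T v = 1
    assert np.isclose(v @ v, 1.0)
```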

The original equation can be re-arranged a bit:

\[\begin{array}{rrl} &\mathbf{Av}&=\lambda\mathbf{v}\\ \Leftrightarrow&\mathbf{Av} - \lambda\mathbf{v}&=\mathbf{0}\\ \Leftrightarrow&(\mathbf{A} - \lambda\mathbf{I})\mathbf{v}&=\mathbf{0}\\ \end{array}\]

A nonzero solution \(\mathbf{v}\) exists only if \(\mathbf{A}-\lambda\mathbf{I}\) is singular, i.e., only if its determinant vanishes. The eigenvalues \(\lambda_i\) are therefore the roots of the equation

\[\det(\mathbf{A} - \lambda\mathbf{I}) = 0\]

The Characteristic Polynomial

Expanding \(\det(\mathbf{A}-\lambda\mathbf{I})\) yields a polynomial of degree \(n\) in \(\lambda\), called the characteristic polynomial of \(\mathbf{A}\); its roots are the eigenvalues.
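As a short worked example (the matrix is chosen purely for illustration):

\[\mathbf{A}=\begin{pmatrix}2&1\\1&2\end{pmatrix},\qquad\det(\mathbf{A}-\lambda\mathbf{I})=\begin{vmatrix}2-\lambda&1\\1&2-\lambda\end{vmatrix}=(2-\lambda)^2-1=\lambda^2-4\lambda+3\]

with roots \(\lambda_1=1\) and \(\lambda_2=3\) as the eigenvalues.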

Properties

The trace of \(\mathbf{A}\) equals the sum of its eigenvalues:

\[\text{Tr}(\mathbf{A})=\sum\limits_{i=1}^n\lambda_i\]

and its determinant equals their product:

\[\det(\mathbf{A})=\prod\limits_{i=1}^n\lambda_i\]
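For the \(2\times 2\) example above, both are easy to verify by hand:

\[\text{Tr}(\mathbf{A})=2+2=4=1+3=\lambda_1+\lambda_2,\qquad\det(\mathbf{A})=2\cdot 2-1\cdot 1=3=1\cdot 3=\lambda_1\lambda_2\]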

If \(\lambda_i\) is an eigenvalue of \(\mathbf{A}\) with eigenvector \(\mathbf{v}_i\), then \(c\lambda_i\) is an eigenvalue of \(c\mathbf{A}\). Proof: \(\mathbf{Av}_i=\lambda_i\mathbf{v}_i\) implies that \((c\mathbf{A})\mathbf{v}_i=(c\lambda_i)\mathbf{v}_i\)

Likewise, \(\lambda_i+c\) is an eigenvalue of \(\mathbf{A}+c\mathbf{I}\). Proof: \((\mathbf{A}+c\mathbf{I})\mathbf{v}_i=\mathbf{Av}_i+c\mathbf{v}_i=\lambda_i\mathbf{v}_i+c\mathbf{v}_i=(\lambda_i+c)\mathbf{v}_i\)

Finally, if \(\mathbf{A}\) is invertible, then \(\frac{1}{\lambda_i}\) is an eigenvalue of \(\mathbf{A}^{-1}\) (invertibility guarantees \(\lambda_i\neq 0\)). Proof:

\[\begin{array}{rrl} &\mathbf{Av}_i&=\lambda_i\mathbf{v}_i\\ \Leftrightarrow&\mathbf{A}^{-1}\mathbf{Av}_i&=\mathbf{A}^{-1}\lambda_i\mathbf{v}_i\\ \Leftrightarrow&\mathbf{v}_i&=\lambda_i\mathbf{A}^{-1}\mathbf{v}_i\\ \Leftrightarrow&\mathbf{A}^{-1}\mathbf{v}_i&=\frac{1}{\lambda_i}\mathbf{v}_i\quad\square\\ \end{array}\]
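All of these properties can be checked numerically; a minimal NumPy sketch, with the matrix and the constant \(c\) chosen arbitrarily:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
c = 5.0
eigvals = np.linalg.eigvals(A)

# Trace = sum of eigenvalues, determinant = product.
assert np.isclose(np.trace(A), eigvals.sum())
assert np.isclose(np.linalg.det(A), eigvals.prod())

# cA has eigenvalues c*lambda_i, A + cI has lambda_i + c,
# and A^{-1} has 1/lambda_i (sorted before comparison).
assert np.allclose(np.sort(np.linalg.eigvals(c * A)), np.sort(c * eigvals))
assert np.allclose(np.sort(np.linalg.eigvals(A + c * np.eye(2))), np.sort(eigvals + c))
assert np.allclose(np.sort(np.linalg.eigvals(np.linalg.inv(A))), np.sort(1.0 / eigvals))
```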

Eigendecomposition of a matrix

If \(\mathbf{A}\) is an \(n\times n\) matrix with \(n\) linearly independent eigenvectors \(\mathbf{v}_i\) (for \(i=1,\dots,n\)), i.e., \(\mathbf{A}\) is diagonalizable, then \(\mathbf{A}\) can be factorized as

\[\mathbf{A} = \mathbf{Q}\mathbf{\Lambda}\mathbf{Q}^{-1}\]

where \(\mathbf{Q}\) is the \(n\times n\) matrix whose \(i\)-th column is the eigenvector \(\mathbf{v}_i\) of \(\mathbf{A}\), and \(\mathbf{\Lambda}\) is the diagonal matrix whose diagonal elements are the corresponding eigenvalues, \({\Lambda}_{ii}=\lambda_i\). The decomposition follows directly from the initial statement about eigenvalues and eigenvectors: stacking the \(n\) equations \(\mathbf{Av}_i=\lambda_i\mathbf{v}_i\) column by column gives

\[\begin{array}{rrl} &\mathbf{A}\mathbf{v}_i &= \lambda_i \mathbf{v}_i\quad(i=1,\dots,n) \\ \Leftrightarrow&\mathbf{A} \mathbf{Q} &= \mathbf{Q} \mathbf{\Lambda} \\ \Leftrightarrow&\mathbf{A} &= \mathbf{Q}\mathbf{\Lambda}\mathbf{Q}^{-1} . \end{array}\]
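A brief NumPy sketch of the factorization (again with an arbitrary example matrix); np.linalg.eig already returns the eigenvector matrix \(\mathbf{Q}\):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

eigvals, Q = np.linalg.eig(A)  # columns of Q are the eigenvectors v_i
Lam = np.diag(eigvals)         # Lambda: eigenvalues on the diagonal

# Reconstruct A from its eigendecomposition: A = Q Lambda Q^{-1}
assert np.allclose(A, Q @ Lam @ np.linalg.inv(Q))
```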

Applications of Eigenvalues and Eigenvectors