# eigenvalue

Let $V$ be a vector space over a field $k$, and let $A$ be an
endomorphism of $V$ (meaning a linear mapping of $V$ into itself).
A scalar $\lambda \in k$ is said to be an
*eigenvalue* of $A$ if there is a nonzero $x\in V$ for which

$$Ax=\lambda x. \tag{1}$$

Geometrically, one thinks of a vector whose direction is unchanged by the action of $A$, but whose magnitude is multiplied by $\lambda $.
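To make (1) concrete, here is a minimal numerical sketch using NumPy (the matrix and vector are illustrative choices, not part of the original entry):

```python
import numpy as np

# A stretches the first coordinate by 3 and the second by 2, so the
# standard basis vectors are eigenvectors with eigenvalues 3 and 2.
A = np.array([[3.0, 0.0],
              [0.0, 2.0]])
x = np.array([1.0, 0.0])   # an eigenvector for lambda = 3
lam = 3.0

# Verify A x = lambda x componentwise: the direction of x is unchanged,
# its magnitude is multiplied by lambda.
assert np.allclose(A @ x, lam * x)
```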

If $V$ is finite dimensional, elementary linear algebra shows that
there are several equivalent definitions of an eigenvalue. Writing
$B=\lambda I-A$, where $I$ is the identity mapping, the scalar
$\lambda$ is an eigenvalue of $A$ if and only if any of the
following holds:

(2) $B$ is not invertible.

(3) $B$ is not injective.

(4) $B$ is not surjective.

(5) $\det(B)=0$, i.e. $\det(\lambda I-A)=0$.
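For a finite-dimensional example, these conditions can be checked numerically; a short NumPy sketch (the particular matrix is an illustrative assumption):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [0.0, 2.0]])
lam = 2.0                        # an eigenvalue of A
B = lam * np.eye(2) - A          # B = lambda*I - A

# (5): det(B) = 0, so B is singular.
assert abs(np.linalg.det(B)) < 1e-12

# (3): B is not injective -- its null space is nontrivial.
rank = np.linalg.matrix_rank(B)
assert rank < 2
# In finite dimension, rank < n also means B is not surjective (4),
# and a singular square matrix is not invertible (2).
```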

But if $V$ is of infinite dimension, (5) has no meaning and the
conditions (2) and (4) are not equivalent to (1).
A scalar $\lambda $ satisfying (2) (called a *spectral value* of
$A$) need not be an eigenvalue. Consider for example the complex
vector space $V$ of all sequences $(x_{n})_{n=1}^{\infty}$ of complex
numbers with the obvious operations, and the map $A:V\to V$ given by

$$A(x_{1},x_{2},x_{3},\dots)=(0,x_{1},x_{2},x_{3},\dots).$$

Zero is a spectral value of $A$, since $A$ is not surjective (every sequence in its range begins with $0$) and hence $0\cdot I-A=-A$ is not invertible; but zero is clearly not an eigenvalue, since $Ax=0$ forces $x=0$.
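The shift example can be mimicked on finitely supported sequences; the representation by Python tuples below is purely an illustrative assumption:

```python
def shift(xs):
    """Apply A: (x1, x2, ...) -> (0, x1, x2, ...) to a tuple of numbers."""
    return (0,) + tuple(xs)

# A is injective: shifting preserves every nonzero entry (one slot to
# the right), so Ax = 0 only when x = 0 -- zero is NOT an eigenvalue.
assert shift((1, 2)) == (0, 1, 2)

# A is not surjective: every sequence in the range starts with 0, so
# e.g. (1, 0, 0, ...) has no preimage -- zero IS a spectral value.
assert shift((1, 2))[0] == 0
```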

Now suppose again that $V$ is of finite dimension, say $n$. The function

$$\chi (\lambda )=\det(B)=\det(\lambda I-A)$$

is a polynomial of degree $n$ over $k$ in the
variable $\lambda $, called the *characteristic polynomial* of the
endomorphism $A$. (Note that some writers define the characteristic
polynomial as $\det(A-\lambda I)$ rather than $\det(\lambda I-A)$, but the
two have the same zeros.)

If $k$ is $\mathbb{C}$ or any other algebraically closed field, or if $k=\mathbb{R}$ and $n$ is odd, then $\chi $ has at least one zero, meaning that $A$ has at least one eigenvalue. In no case does $A$ have more than $n$ eigenvalues.
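Over $\mathbb{C}$ there are in fact exactly $n$ eigenvalues counted with multiplicity; a quick NumPy check (the random matrix is an arbitrary illustrative choice):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4
A = rng.standard_normal((n, n))   # a real matrix, viewed over C

eigvals = np.linalg.eigvals(A)    # zeros of the characteristic polynomial
assert len(eigvals) == n          # n of them with multiplicity, never more

# Each returned lambda makes lambda*I - A singular.
for lam in eigvals:
    assert abs(np.linalg.det(lam * np.eye(n) - A)) < 1e-8
```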

Although we didn’t need to do so here, one can compute the coefficients
of $\chi $ by introducing a basis of $V$ and the corresponding matrix for
$B$. Unfortunately, computing $n\times n$ determinants and finding roots
of polynomials of degree $n$ are computationally messy procedures
for even moderately large $n$, so for most practical purposes
variations on this naive scheme are needed. See the eigenvalue
problem for more information.
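For small $n$ the naive scheme can at least be sanity-checked against a production routine; a NumPy sketch (the matrix is an assumption, and this approach is numerically fragile for large $n$):

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

# Naive scheme: coefficients of det(lambda*I - A), then polynomial roots.
coeffs = np.poly(A)               # for a 2x2 matrix: [1, -trace, det]
naive = np.sort(np.roots(coeffs))

# Production routine: LAPACK-based eigenvalue solver.
lapack = np.sort(np.linalg.eigvals(A))

# chi(lambda) = lambda^2 - 7*lambda + 10 = (lambda - 2)(lambda - 5).
assert np.allclose(naive, [2.0, 5.0])
assert np.allclose(naive, lapack)
```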

If $k=\mathbb{C}$ but the coefficients of $\chi $ are real (and in particular if
$V$ has a basis for which the matrix of $A$ has only real entries), then
the non-real eigenvalues of $A$ appear in conjugate pairs. For example,
if $n=2$ and, for some basis, $A$ has the matrix

$$A=\begin{pmatrix}0 & -1 \\ 1 & 0\end{pmatrix}$$

then $\chi (\lambda )={\lambda}^{2}+1$, with the two zeros $\pm i$.
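The conjugate pair can be observed directly with NumPy:

```python
import numpy as np

A = np.array([[0.0, -1.0],
              [1.0,  0.0]])       # rotation by 90 degrees

eigvals = np.sort_complex(np.linalg.eigvals(A))

# chi(lambda) = lambda^2 + 1 has the conjugate zeros -i and +i.
assert np.allclose(eigvals, [-1j, 1j])
```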

Eigenvalues are of relatively little importance in connection with
an infinite-dimensional vector space, unless that space is endowed with
some additional structure, typically that of a Banach space or Hilbert
space. But in those cases the notion is of great value in
physics, engineering, and mathematics proper. Look for “spectral theory”
for more on that subject.

| Title | eigenvalue |
| --- | --- |
| Canonical name | Eigenvalue |
| Date of creation | 2013-03-22 12:11:52 |
| Last modified on | 2013-03-22 12:11:52 |
| Owner | Koro (127) |
| Last modified by | Koro (127) |
| Numerical id | 15 |
| Author | Koro (127) |
| Entry type | Definition |
| Classification | msc 15A18 |
| Related topic | EigenvalueProblem |
| Related topic | SimilarMatrix |
| Related topic | Eigenvector |
| Related topic | SingularValueDecomposition |
| Defines | eigenvalue |
| Defines | spectral value |