# diagonalization

Let $V$ be a finite-dimensional linear space over a field $K$, and $T:V\rightarrow V$ a linear transformation. To diagonalize $T$ is to find a basis of $V$ that consists of eigenvectors. The transformation is called diagonalizable if such a basis exists. The choice of terminology reflects the fact that the matrix of a linear transformation relative to a given basis is diagonal if and only if that basis consists of eigenvectors.

Next, we give necessary and sufficient conditions for $T$ to be diagonalizable. For $\lambda\in K$ set

$E_{\lambda}=\{u\in V:Tu=\lambda u\}.$

It isn’t hard to show that $E_{\lambda}$ is a subspace of $V$, and that this subspace is non-trivial if and only if $\lambda$ is an eigenvalue of $T$. In that case, $E_{\lambda}$ is called the eigenspace associated to $\lambda$.
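The eigenspace can be computed concretely: $E_{\lambda}$ is the kernel of $T-\lambda I$, so its dimension is $\dim V$ minus the rank of that matrix. A minimal numerical sketch (the matrix `T` below is our own illustrative example, not from the article):

```python
import numpy as np

# E_lambda is the kernel of T - lambda*I, so lambda is an eigenvalue
# exactly when T - lambda*I drops rank.
T = np.array([[2.0, 1.0],
              [1.0, 2.0]])  # hypothetical example; eigenvalues are 1 and 3

def eigenspace_dim(T, lam, tol=1e-10):
    """dim E_lambda = dim V - rank(T - lambda*I)."""
    n = T.shape[0]
    return n - np.linalg.matrix_rank(T - lam * np.eye(n), tol=tol)

print(eigenspace_dim(T, 3.0))  # 1: the eigenspace for 3 is a line
print(eigenspace_dim(T, 5.0))  # 0: 5 is not an eigenvalue, so E_5 is trivial
```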

###### Proposition 1.

A transformation is diagonalizable if and only if

$\dim V=\sum_{\lambda}\dim E_{\lambda},$

where the sum is taken over all eigenvalues of the transformation.
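Proposition 1 can be checked numerically by summing the eigenspace dimensions over the distinct eigenvalues. In this sketch (both matrices are our own examples, and rounding the computed eigenvalues to collapse numerical duplicates is our own device), the diagonal matrix passes the test while a Jordan block falls short:

```python
import numpy as np

def sum_of_eigenspace_dims(M, tol=1e-8):
    """Sum of dim E_lambda over the distinct eigenvalues of M."""
    n = M.shape[0]
    # Round to merge eigenvalues that differ only by floating-point noise.
    eigvals = np.unique(np.round(np.linalg.eigvals(M), 8))
    return sum(n - np.linalg.matrix_rank(M - lam * np.eye(n), tol=tol)
               for lam in eigvals)

diagable = np.array([[2.0, 0.0],
                     [0.0, 3.0]])  # diagonalizable: dims sum to dim V = 2
jordan = np.array([[2.0, 1.0],
                   [0.0, 2.0]])    # Jordan block: dims sum to only 1

print(sum_of_eigenspace_dims(diagable))  # 2
print(sum_of_eigenspace_dims(jordan))    # 1
```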

# The Matrix Approach.

As was already mentioned, the term “diagonalize” comes from a matrix-based perspective. Let $M$ be a matrix representation of $T$ relative to some basis $B$. Let

$P=[v_{1},\ldots,v_{n}],\quad n=\dim V,$

be a matrix whose column vectors are eigenvectors expressed relative to $B$. Thus,

$Mv_{i}=\lambda_{i}v_{i},\quad i=1,\ldots,n,$

where $\lambda_{i}$ is the eigenvalue associated to $v_{i}$. The above $n$ equations are expressed more succinctly as the matrix equation

$MP=PD,$

where $D$ is the diagonal matrix with $\lambda_{i}$ in the $i$-th position. Now the eigenvectors in question form a basis if and only if $P$ is invertible. In that case, we may write

$M=PDP^{-1}.$ (1)

Thus in the matrix-based approach, to “diagonalize” a matrix $M$ is to find an invertible matrix $P$ and a diagonal matrix $D$ such that equation (1) is satisfied.
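The recipe above can be carried out with standard numerical routines. In this sketch (the matrix `M` is our own example), `numpy.linalg.eig` supplies the eigenvalues and a matrix `P` whose columns are eigenvectors; since `P` turns out to be invertible here, reassembling $PDP^{-1}$ recovers $M$ as in equation (1):

```python
import numpy as np

M = np.array([[4.0, 1.0],
              [2.0, 3.0]])  # hypothetical example; eigenvalues are 5 and 2

# eig returns the eigenvalues and a matrix whose columns are eigenvectors.
eigvals, P = np.linalg.eig(M)
D = np.diag(eigvals)

# Reassemble M from its diagonalization, equation (1).
M_rebuilt = P @ D @ np.linalg.inv(P)
print(np.allclose(M, M_rebuilt))  # True
```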

# Subtleties.

There are two fundamental reasons why a transformation $T$ can fail to be diagonalizable.

1. The characteristic polynomial of $T$ does not factor into linear factors over $K$.

2. There exists an eigenvalue $\lambda$, such that the kernel of $(T-\lambda I)^{2}$ is strictly greater than the kernel of $(T-\lambda I)$. Equivalently, there exists an invariant subspace where $T$ acts as a nilpotent transformation plus some multiple of the identity. Such subspaces manifest as non-trivial Jordan blocks in the Jordan canonical form of the transformation.
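Both failure modes can be exhibited with small matrices (the examples below are our own). A 90-degree rotation of the real plane illustrates the first: its characteristic polynomial $x^{2}+1$ has no real roots, so over $K=\mathbb{R}$ there are no eigenvectors at all. A $2\times 2$ Jordan block illustrates the second: the kernel of $(T-\lambda I)^{2}$ is strictly larger than that of $T-\lambda I$:

```python
import numpy as np

# 1. Over the reals, a 90-degree rotation has characteristic polynomial
#    x^2 + 1, so its eigenvalues +-i are not real.
rotation = np.array([[0.0, -1.0],
                     [1.0,  0.0]])
print(np.isreal(np.linalg.eigvals(rotation)).any())  # False

# 2. For the Jordan block below, ker(T - 2I) is 1-dimensional but
#    ker((T - 2I)^2) is all of V, so eigenvectors cannot span V.
jordan = np.array([[2.0, 1.0],
                   [0.0, 2.0]])
N = jordan - 2.0 * np.eye(2)
print(np.linalg.matrix_rank(N))      # 1 -> dim ker(T - 2I) = 1
print(np.linalg.matrix_rank(N @ N))  # 0 -> dim ker((T - 2I)^2) = 2
```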

## Mathematics Subject Classification

15-00
