# simultaneous block-diagonalization of upper triangular commuting matrices

Let $\mathbf{e}_{i}$ denote the (column) vector whose $i$th position is $1$ and where all other positions are $0$. Denote by $[n]$ the set $\{1,\ldots,n\}$. Denote by $\mathrm{M}_{n}(\mathcal{K})$ the set of all $n\times n$ matrices over $\mathcal{K}$, and by $\mathrm{GL}_{n}(\mathcal{K})$ the set of all invertible elements of $\mathrm{M}_{n}(\mathcal{K})$. Let $d_{i}$ be the function which extracts the $i$th diagonal element of a matrix, i.e., $d_{i}(A)=\mathbf{e}_{i}^{\mathrm{T}}\!A\mathbf{e}_{i}$.

###### Theorem 1.

Let $\mathcal{K}$ be a field, let $n$ be a positive integer, and let $\sim$ be an equivalence relation on $[n]$ such that if $i\sim j$ and $i\leqslant k\leqslant j$ then $k\sim i$. Let $A_{1},\ldots,A_{r}\in\mathrm{M}_{n}(\mathcal{K})$ be pairwise commuting upper triangular matrices. If these matrices and $\sim$ are related such that

 $i\sim j\quad\text{if and only if}\quad d_{i}(A_{k})=d_{j}(A_{k})\text{ for all }k\in[r]\text{,}$

then there exists a matrix $B\in\mathrm{GL}_{n}(\mathcal{K})$ such that:

1. If $\mathbf{e}_{i}^{\mathrm{T}}\!B^{-1}A_{k}B\mathbf{e}_{j}\neq 0$ then $i\sim j$ and $i\leqslant j$.

2. If $i\sim j$ then $\mathbf{e}_{i}^{\mathrm{T}}\!B^{-1}A_{k}B\mathbf{e}_{j}=\mathbf{e}_{i}^{\mathrm{T}}A_{k}\mathbf{e}_{j}$.

Condition 1 says that if an element of $B^{-1}A_{k}B$ is nonzero then both its row and column indices must belong to the same equivalence class of $\sim$, i.e., the nonzero elements of $B^{-1}A_{k}B$ only occur in particular blocks (http://planetmath.org/PartitionedMatrix) along the diagonal, and these blocks correspond to equivalence classes of $\sim$. Condition 2 says that within one of these blocks, $B^{-1}A_{k}B$ is equal to $A_{k}$.
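For instance (an illustration of ours, not part of the original statement), take $n=3$ with equivalence classes $\{1,2\}$ and $\{3\}$. Conditions 1 and 2 then force every conjugated matrix into the shape

 $B^{-1}A_{k}B=\begin{pmatrix}\mathbf{e}_{1}^{\mathrm{T}}A_{k}\mathbf{e}_{1}&\mathbf{e}_{1}^{\mathrm{T}}A_{k}\mathbf{e}_{2}&0\\ 0&\mathbf{e}_{2}^{\mathrm{T}}A_{k}\mathbf{e}_{2}&0\\ 0&0&\mathbf{e}_{3}^{\mathrm{T}}A_{k}\mathbf{e}_{3}\end{pmatrix}\text{,}$

with the $2\times 2$ and $1\times 1$ diagonal blocks copied verbatim from $A_{k}$.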

The proof of the theorem requires the following lemma.

###### Lemma 2.

Let a sequence $A_{1},\ldots,A_{r}\in\mathrm{M}_{n}(\mathcal{K})$ of upper triangular matrices be given, and denote by $\mathcal{A}$ the unital (http://planetmath.org/unity) algebra (http://planetmath.org/Algebra) generated by these matrices. For every sequence $\lambda_{1},\ldots,\lambda_{r}\in\mathcal{K}$ of scalars there exists a matrix $C\in\mathcal{A}$ such that

 $d_{i}(C)=\begin{cases}1&\text{if $d_{i}(A_{k})=\lambda_{k}$ for all $k\in[r]$,}\\ 0&\text{otherwise}\end{cases}$

for all $i\in[n]$.

The proof of that lemma can be found in this article (http://planetmath.org/CharacteristicMatrixOfDiagonalElementCrossSection).
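To make the lemma concrete, here is a small numerical sketch (ours, not the construction of the linked article, and assuming exactly representable diagonal values): for each $A_{k}$ one can form a Lagrange-style polynomial sending the value $\lambda_{k}$ to $1$ and every other diagonal value of $A_{k}$ to $0$, then multiply the results, since diagonals of upper triangular matrices multiply entrywise. The function name `cross_section_matrix` is our own.

```python
import numpy as np

def cross_section_matrix(mats, lambdas):
    """Return C in the unital algebra generated by the upper triangular
    matrices `mats` with d_i(C) = 1 exactly when d_i(A_k) = lambdas[k]
    for every k, and d_i(C) = 0 otherwise."""
    n = mats[0].shape[0]
    C = np.eye(n)
    for A, lam in zip(mats, lambdas):
        # Lagrange-style factor: maps lam to 1 and the other diagonal
        # values of A to 0.
        P = np.eye(n)
        for mu in {x for x in np.diag(A) if x != lam}:
            P = P @ (A - mu * np.eye(n)) / (lam - mu)
        C = C @ P  # diagonals of triangular matrices multiply entrywise
    return C

# Example: pick out the positions i where d_i(A1) = 1 and d_i(A2) = 4.
A1 = np.array([[1., 5, 2], [0, 1, 3], [0, 0, 2]])
A2 = np.array([[4., 1, 0], [0, 4, 2], [0, 0, 4]])
C = cross_section_matrix([A1, A2], [1., 4.])  # diag(C) = (1, 1, 0)
```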

###### Proof of theorem.

The proof is by induction on the number of equivalence classes of $\sim$. If there is only one equivalence class then one can take $B=I$.

If there is more than one equivalence class, then let $S$ be the equivalence class that contains $n$; since equivalence classes are intervals of consecutive integers, $S$ consists of the last $\lvert S\rvert$ elements of $[n]$. By Lemma 2 there exists a matrix $C$ in the unital algebra generated by $A_{1},\ldots,A_{r}$ (hence necessarily upper triangular) such that $d_{i}(C)=1$ for all $i\in S$ and $d_{i}(C)=0$ for all $i\in[n]\setminus S$. Thus $C$ has the block form

 $C=\begin{pmatrix}C_{11}&C_{12}\\ 0&C_{22}\end{pmatrix}$

where $C_{22}$ is a $\lvert S\rvert\times\lvert S\rvert$ matrix that has all $1$s on the diagonal, and $C_{11}$ is an $(n-\lvert S\rvert)\times(n-\lvert S\rvert)$ matrix that has all $0$s on the diagonal.

Let $k\in[r]$ be arbitrary and similarly decompose

 $C^{n}=\begin{pmatrix}D_{11}&D_{12}\\ 0&D_{22}\end{pmatrix}\text{,}\qquad A_{k}=\begin{pmatrix}A_{11}&A_{12}\\ 0&A_{22}\end{pmatrix}\text{.}$

One can identify $D_{11}=(C_{11})^{n}$ and $D_{22}=(C_{22})^{n}$, but since $C_{11}$ is upper triangular with zero diagonal and its order $n-\lvert S\rvert$ is smaller than $n$, the more striking equality $D_{11}=0$ also holds. As for $D_{22}$, it is upper triangular with all diagonal elements equal to $1$, so one may conclude that it is invertible.
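The nilpotency used here is easy to check numerically; the sketch below (ours) shows that an upper triangular matrix of order $m$ with zero diagonal satisfies $C_{11}^{m}=0$, so a fortiori $C_{11}^{n}=0$ when $m=n-\lvert S\rvert<n$.

```python
import numpy as np

# An upper triangular matrix with zero diagonal: each multiplication
# pushes its nonzero entries one diagonal further from the main
# diagonal, so the 3rd power of this 3 x 3 example is already zero.
C11 = np.array([[0., 2., 5.],
                [0., 0., 7.],
                [0., 0., 0.]])
square = np.linalg.matrix_power(C11, 2)  # only the corner entry survives
cube = np.linalg.matrix_power(C11, 3)    # the zero matrix
```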

Since the algebra that $C^{n}$ belongs to is generated by pairwise commuting elements, it is a commutative (http://planetmath.org/Commutative) algebra, and in particular $C^{n}A_{k}=A_{k}C^{n}$. In terms of the individual blocks, this becomes

 $\begin{pmatrix}0&D_{12}A_{22}\\ 0&D_{22}A_{22}\end{pmatrix}=\begin{pmatrix}0&A_{11}D_{12}+A_{12}D_{22}\\ 0&A_{22}D_{22}\end{pmatrix}\text{.}$

Now let

 $D=\begin{pmatrix}I&D_{12}\\ 0&D_{22}\end{pmatrix}\text{, so that }D^{-1}=\begin{pmatrix}I&-D_{12}D_{22}^{-1}\\ 0&D_{22}^{-1}\end{pmatrix}$

and consider the matrix $D^{-1}A_{k}D$. Clearly

 $\displaystyle D^{-1}A_{k}D$ $\displaystyle=D^{-1}\begin{pmatrix}A_{11}&A_{12}\\ 0&A_{22}\end{pmatrix}\begin{pmatrix}I&D_{12}\\ 0&D_{22}\end{pmatrix}=D^{-1}\begin{pmatrix}A_{11}&A_{11}D_{12}+A_{12}D_{22}\\ 0&A_{22}D_{22}\end{pmatrix}$ $\displaystyle=\begin{pmatrix}I&-D_{12}D_{22}^{-1}\\ 0&D_{22}^{-1}\end{pmatrix}\begin{pmatrix}A_{11}&D_{12}A_{22}\\ 0&D_{22}A_{22}\end{pmatrix}=\begin{pmatrix}A_{11}&D_{12}A_{22}-D_{12}D_{22}^{-1}D_{22}A_{22}\\ 0&D_{22}^{-1}D_{22}A_{22}\end{pmatrix}=\begin{pmatrix}A_{11}&0\\ 0&A_{22}\end{pmatrix}$

so that the positions with row index not in $S$ and column index in $S$ are all zero, as required for $B^{-1}A_{k}B$. It should be observed that the choice of $D$ is independent of $k$, so the same $D$ works for all the $A_{k}$.

In order to complete the proof, one applies the induction hypothesis to the restriction of $\sim$ to $[n]\setminus S$ and the corresponding submatrices of $D^{-1}A_{k}D$, which satisfy the same conditions but have one equivalence class fewer. This produces a block-diagonalising matrix $B^{\prime}$ for these submatrices, and thus the sought $B$ can be constructed as $D\left(\begin{smallmatrix}B^{\prime}&0\\ 0&I\end{smallmatrix}\right)$. ∎
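The whole construction can be assembled into a short recursive sketch (our code, valid under the theorem's hypotheses: pairwise commuting upper triangular inputs whose equal-diagonal classes are contiguous; the Lemma 2 matrix is built here from Lagrange-style products, one possible choice rather than necessarily the linked article's):

```python
import numpy as np

def block_diagonalize(mats):
    """Return B with inv(B) @ A @ B block diagonal for every A in
    `mats`, following the proof's recursion.  (Name is ours.)"""
    n = mats[0].shape[0]
    diags = np.array([np.diag(A) for A in mats])      # r x n
    if all(np.allclose(col, diags[:, 0]) for col in diags.T):
        return np.eye(n)                              # one class: B = I
    # S = the equivalence class containing n (the trailing indices).
    target = diags[:, -1]
    m = n
    while m > 0 and np.allclose(diags[:, m - 1], target):
        m -= 1                                        # |S| = n - m
    # Lemma 2: C with d_i(C) = 1 on S and 0 elsewhere.
    C = np.eye(n)
    for A, lam in zip(mats, target):
        for mu in {x for x in np.diag(A) if x != lam}:
            C = C @ (A - mu * np.eye(n)) / (lam - mu)
    D = np.linalg.matrix_power(C, n)                  # D_11 block is 0
    D[:m, :m] = np.eye(m)                             # replace it by I
    Dinv = np.linalg.inv(D)
    # Recurse on the top-left blocks of the conjugated matrices.
    Bp = block_diagonalize([(Dinv @ A @ D)[:m, :m] for A in mats])
    B2 = np.eye(n)
    B2[:m, :m] = Bp
    return D @ B2
```

For example, with the single matrix $A=\left(\begin{smallmatrix}1&1&1\\ 0&1&1\\ 0&0&2\end{smallmatrix}\right)$ (classes $\{1,2\}$ and $\{3\}$), the routine returns a $B$ for which $B^{-1}AB$ has zeros in positions $(1,3)$ and $(2,3)$ while the diagonal blocks of $A$ are kept verbatim.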

Title: simultaneous block-diagonalization of upper triangular commuting matrices
Canonical name: SimultaneousBlockdiagonalizationOfUpperTriangularCommutingMatrices
Date of creation: 2013-03-22 15:29:42
Last modified on: 2013-03-22 15:29:42
Owner: lars_h (9802)
Last modified by: lars_h (9802)
Numerical id: 5
Author: lars_h (9802)
Entry type: Theorem
Classification: msc 15A21