# matrix exponential

The exponential of a real valued square matrix $A$, denoted by $e^{A}$, is defined as

$$e^{A}=\sum_{k=0}^{\infty}\frac{1}{k!}A^{k}=I+A+\frac{1}{2}A^{2}+\cdots$$

Let us check that $e^{A}$ is a real valued square matrix. Suppose $M$ is a real number such that $|A_{ij}|\leq M$ for all entries $A_{ij}$ of $A$. Then $|(A^{2})_{ij}|\leq nM^{2}$ for all entries of $A^{2}$, where $n$ is the order of $A$, and, by induction, $|(A^{k})_{ij}|\leq n^{k-1}M^{k}$ in general. Since $\sum_{k=0}^{\infty}\frac{n^{k-1}M^{k}}{k!}$ converges (to $\frac{1}{n}e^{nM}$), we see that the series for $e^{A}$ converges entrywise to a real valued $n\times n$ matrix. (Alternatively, one could argue using matrix norms: any submultiplicative norm satisfies $\|e^{A}\|\leq e^{\|A\|}$, so the entries of $e^{A}$ are bounded.)
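The definition translates directly into code. Below is a minimal sketch (using NumPy and SciPy; the sample matrix and truncation length are arbitrary choices of mine) that sums the truncated series and compares it against SciPy's `expm`:

```python
import numpy as np
from scipy.linalg import expm

def exp_series(A, terms=30):
    """Approximate e^A by the partial sum I + A + A^2/2! + ... (terms summands)."""
    n = A.shape[0]
    result = np.eye(n)
    power = np.eye(n)
    for k in range(1, terms):
        power = power @ A / k      # maintains A^k / k! incrementally
        result = result + power
    return result

A = np.array([[0.0, 1.0], [-2.0, -3.0]])   # arbitrary sample matrix
approx = exp_series(A)
exact = expm(A)                             # SciPy's matrix exponential
max_err = np.max(np.abs(approx - exact))
```

For a matrix of small norm, thirty terms already agree with the library result to near machine precision; the factorial in the denominator is what makes the truncation error shrink so quickly.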

Example 1. Suppose $A$ is nilpotent, i.e., $A^{r}=0$ for some natural number $r$. Then

$$e^{A}=I+A+\frac{1}{2!}A^{2}+\cdots+\frac{1}{(r-1)!}A^{r-1}.$$
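The termination of the series can be observed numerically. A sketch (the strictly upper triangular matrix below is my own choice of nilpotent example, with $r=3$):

```python
import numpy as np
from scipy.linalg import expm

# A strictly upper triangular matrix is nilpotent: here N^3 = 0.
N = np.array([[0.0, 1.0, 0.0],
              [0.0, 0.0, 1.0],
              [0.0, 0.0, 0.0]])
finite_sum = np.eye(3) + N + (N @ N) / 2   # series stops at k = r - 1 = 2
err = np.max(np.abs(finite_sum - expm(N)))
```

The finite sum is not an approximation here: every term from $A^{r}$ onward vanishes exactly.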

Example 2. If $A$ is diagonalizable, i.e., of the form $A=LDL^{-1}$, where $D$ is a diagonal matrix, then

$$e^{A}=\sum_{k=0}^{\infty}\frac{1}{k!}(LDL^{-1})^{k}=\sum_{k=0}^{\infty}\frac{1}{k!}LD^{k}L^{-1}=Le^{D}L^{-1}.$$

Further, if $D=\mathop{\mathrm{diag}}\{a_{1},\cdots,a_{n}\}$, then $D^{k}=\mathop{\mathrm{diag}}\{a_{1}^{k},\cdots,a_{n}^{k}\}$ whence

$$e^{A}=L\mathop{\mathrm{diag}}\{e^{a_{1}},\cdots,e^{a_{n}}\}L^{-1}.$$

For a diagonalizable matrix $A$, it follows that $\det e^{A}=e^{\mathop{\mathrm{tr}}A}$. This formula is, in fact, valid for all square $A$.
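Both observations can be checked numerically. A sketch (the symmetric sample matrix is an arbitrary choice; for a real symmetric matrix, `numpy.linalg.eigh` returns an orthogonal $L$, so $L^{-1}=L^{\intercal}$):

```python
import numpy as np
from scipy.linalg import expm

# A real symmetric matrix is diagonalizable: A = L D L^{-1}.
A = np.array([[2.0, 1.0], [1.0, 2.0]])
eigvals, L = np.linalg.eigh(A)
via_diag = L @ np.diag(np.exp(eigvals)) @ L.T   # L e^D L^{-1}, with L^{-1} = L^T
err = np.max(np.abs(via_diag - expm(A)))

# det e^A = e^{tr A}
det_gap = abs(np.linalg.det(expm(A)) - np.exp(np.trace(A)))
```

Exponentiating through the eigendecomposition is also how one computes $e^{A}$ efficiently for symmetric matrices, since only the scalar exponentials $e^{a_{i}}$ are needed.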

Let $A$ be an $n\times n$ real valued matrix. Then the matrix exponential satisfies the following properties:

1. For the $n\times n$ zero matrix $O$, $e^{O}=I$, where $I$ is the $n\times n$ identity matrix.

2. If $A=L\mathop{\mathrm{diag}}\{a_{1},\cdots,a_{n}\}L^{-1}$ for an invertible $n\times n$ matrix $L$, then

 $e^{A}=L\mathop{\mathrm{diag}}\{e^{a_{1}},\cdots,e^{a_{n}}\}L^{-1}.$
3. If $A$ and $B$ commute, then $e^{A+B}=e^{A}e^{B}$.

4. The trace of $A$ and the determinant of $e^{A}$ are related by the formula

 $\det e^{A}=e^{\mathop{\mathrm{tr}}A}.$

In particular, $e^{A}$ is always invertible. The inverse is given by

 $(e^{A})^{-1}=e^{-A}.$
5. If $e^{A}$ is a rotation matrix, then $\mathop{\mathrm{tr}}A=0$.
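These properties are easy to spot-check numerically. A sketch (the sample matrix, and the choice $B=A^{2}$, which always commutes with $A$, are mine):

```python
import numpy as np
from scipy.linalg import expm

n = 3
O = np.zeros((n, n))
prop1 = np.allclose(expm(O), np.eye(n))            # property 1: e^O = I

A = np.array([[0.0, 1.0, 0.0],
              [0.0, 0.0, 1.0],
              [1.0, 0.0, 0.0]])                    # arbitrary sample matrix
B = A @ A                                          # a power of A commutes with A
prop3 = np.allclose(expm(A + B), expm(A) @ expm(B))  # property 3

prop4 = np.isclose(np.linalg.det(expm(A)),
                   np.exp(np.trace(A)))            # property 4: det e^A = e^{tr A}
inverse_ok = np.allclose(expm(A) @ expm(-A), np.eye(n))  # (e^A)^{-1} = e^{-A}
```

Note that property 3 genuinely requires commutativity: replacing $B$ with a matrix that does not commute with $A$ generally breaks the equality $e^{A+B}=e^{A}e^{B}$.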

An example illustrating property 3.
We present an example where the cited property holds. Over the field of complex numbers, consider the complex matrix

 $C=A+iB,$ (1)

where $C$ is Hermitian, i.e. $C^{\intercal}=\bar{C}$ (here "$\intercal$" and the overline "$-$" stand for transposition and conjugation, respectively), and orthogonal, i.e. $C^{-1}=C^{\intercal}$. From (1),

 $C^{\intercal}=A^{\intercal}+iB^{\intercal}.$

Since $C$ is orthogonal, from the complex equation $CC^{\intercal}=I$ ($I$ is the identity matrix), we have

$CC^{\intercal}=(A+iB)(A^{\intercal}+iB^{\intercal})=(AA^{\intercal}-BB^{\intercal})+i(BA^{\intercal}+AB^{\intercal})=I,$

whence the imaginary part leads to the equation

 $BA^{\intercal}+AB^{\intercal}=0.$ (2)

But $C$ is also hermitian, so that

 $C^{\intercal}=A^{\intercal}+iB^{\intercal}=\bar{C}=A-iB,$

therefore $A^{\intercal}=A$ is symmetric and $B^{\intercal}=-B$ is skew-symmetric. Substituting these into (2) gives $BA-AB=0$, i.e. $BA=AB$, which implies $\exp(A)\cdot\exp(B)=\exp(A+B)$. Thus the real and imaginary parts of an orthogonal Hermitian matrix satisfy the property. Likewise, it is easy to show that if a complex matrix is symmetric and unitary, its real and imaginary parts also satisfy the property.
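One concrete instance of such a matrix (a construction of my own choosing, not taken from the text above) is $C=\cosh(t)\,I+i\sinh(t)\,J$, where $J$ is the $2\times 2$ skew-symmetric generator: since $\cosh^{2}t-\sinh^{2}t=1$, this $C$ is both Hermitian and orthogonal, and the claimed property can be verified numerically:

```python
import numpy as np
from scipy.linalg import expm

t = 0.7                                    # arbitrary parameter
I2 = np.eye(2)
J = np.array([[0.0, 1.0], [-1.0, 0.0]])    # skew-symmetric generator, J^2 = -I
A = np.cosh(t) * I2                        # symmetric real part
B = np.sinh(t) * J                         # skew-symmetric imaginary part
C = A + 1j * B

hermitian = np.allclose(C.T, C.conj())           # C^T = C-bar
orthogonal = np.allclose(C @ C.T, np.eye(2))     # C C^T = I
commute = np.allclose(A @ B, B @ A)              # real and imaginary parts commute
prop_holds = np.allclose(expm(A + B), expm(A) @ expm(B))
```

Here $A$ is a scalar multiple of the identity, so commutativity with $B$ is immediate; the point of the example is that $C$ itself satisfies both structural conditions at once.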
