# orthogonal matrices

A real square $n\times n$ matrix $Q$ is orthogonal if $Q^{\mathrm{T}}Q=I$, i.e., if $Q^{-1}=Q^{\mathrm{T}}$. The rows of an orthogonal matrix form an orthonormal basis of $\mathbb{R}^{n}$, as do its columns.

Orthogonal matrices play an important role in linear algebra. Inner products are preserved under an orthogonal transform: $(Qx)^{\mathrm{T}}Qy=x^{\mathrm{T}}Q^{\mathrm{T}}Qy=x^{\mathrm{T}}y$, and consequently so is the Euclidean norm: $\|Qx\|_{2}=\|x\|_{2}$. One place this is useful is in solving the least squares problem $Ax\approx b$: multiplying by an orthogonal $Q$ yields the equivalent problem $Q^{\mathrm{T}}Ax\approx Q^{\mathrm{T}}b$, which leaves the residual norm unchanged.
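A minimal numerical sketch of these invariance properties, assuming NumPy; the orthogonal matrix is obtained from the QR factorization of a random matrix:

```python
import numpy as np

rng = np.random.default_rng(0)

# QR factorization of a random matrix yields an orthogonal factor Q.
Q, _ = np.linalg.qr(rng.standard_normal((4, 4)))

x = rng.standard_normal(4)
y = rng.standard_normal(4)

# Q^T Q = I, i.e., Q^{-1} = Q^T.
assert np.allclose(Q.T @ Q, np.eye(4))

# Inner products are preserved: (Qx)^T (Qy) = x^T y.
assert np.isclose((Q @ x) @ (Q @ y), x @ y)

# Hence the Euclidean norm is preserved as well.
assert np.isclose(np.linalg.norm(Q @ x), np.linalg.norm(x))
```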

Orthogonal matrices can be thought of as the real case of unitary matrices. A unitary matrix $U\in\mathbb{C}^{n\times n}$ has the property $U^{*}U=I$, where $U^{*}=\overline{U^{\mathrm{T}}}$ (the conjugate transpose). Since $\overline{Q^{\mathrm{T}}}=Q^{\mathrm{T}}$ for real $Q$, orthogonal matrices are unitary.

An orthogonal matrix $Q$ has $\det(Q)=\pm 1$, since $1=\det(I)=\det(Q^{\mathrm{T}}Q)=\det(Q^{\mathrm{T}})\det(Q)=\det(Q)^{2}$.

Important examples of orthogonal matrices are Givens rotations and Householder transformations. They help maintain numerical stability because multiplication by an orthogonal matrix does not amplify rounding errors: its 2-norm condition number is 1.
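The two transformations mentioned above can be sketched as follows, assuming NumPy; `givens` and `householder` are illustrative helper names, not from the original entry. A Givens rotation zeroes a single entry of a 2-vector, while a Householder transformation reflects a whole vector onto a multiple of the first coordinate axis:

```python
import numpy as np

def givens(a, b):
    """Return c, s with [[c, s], [-s, c]] @ [a, b] = [r, 0], r = hypot(a, b)."""
    r = np.hypot(a, b)
    if r == 0.0:
        return 1.0, 0.0
    return a / r, b / r

def householder(x):
    """Return an orthogonal H = I - 2 v v^T with H @ x = [±||x||, 0, ..., 0]."""
    v = np.asarray(x, dtype=float).copy()
    # Add sign(x[0]) * ||x|| to v[0] to avoid cancellation.
    v[0] += np.copysign(np.linalg.norm(x), x[0])
    v /= np.linalg.norm(v)
    return np.eye(len(x)) - 2.0 * np.outer(v, v)

x = np.array([3.0, 4.0])

c, s = givens(*x)
G = np.array([[c, s], [-s, c]])
# G @ x == [5, 0]: the second component is annihilated.

H = householder(x)
# H @ x == [-5, 0]: x is reflected onto the first axis.
```

Both `G` and `H` are orthogonal, so applying them to a vector cannot change its Euclidean norm; this is why QR factorization built from them is numerically stable.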

Every orthogonal $2\times 2$ matrix is a rotation or a reflection; these have the forms

$$\begin{pmatrix}\cos(\alpha)&\sin(\alpha)\\ -\sin(\alpha)&\cos(\alpha)\end{pmatrix}\quad\text{or}\quad\begin{pmatrix}\cos(\alpha)&\sin(\alpha)\\ \sin(\alpha)&-\cos(\alpha)\end{pmatrix}$$

respectively.
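The two forms can be distinguished by their determinants, $+1$ for a rotation and $-1$ for a reflection. A short check, assuming NumPy (the function names are illustrative):

```python
import numpy as np

def rotation(alpha):
    """2x2 rotation matrix in the form above; det = +1."""
    c, s = np.cos(alpha), np.sin(alpha)
    return np.array([[c, s], [-s, c]])

def reflection(alpha):
    """2x2 reflection matrix in the form above; det = -1."""
    c, s = np.cos(alpha), np.sin(alpha)
    return np.array([[c, s], [s, -c]])

R = rotation(0.7)
F = reflection(0.7)

# Both are orthogonal...
assert np.allclose(R.T @ R, np.eye(2))
assert np.allclose(F.T @ F, np.eye(2))

# ...but the determinant tells them apart.
assert np.isclose(np.linalg.det(R), 1.0)
assert np.isclose(np.linalg.det(F), -1.0)
```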

This entry is based on content from The Data Analysis Briefbook (http://rkb.home.cern.ch/rkb/titleA.html).
