# matrix

A matrix is simply a mapping $M\colon A\times B\to C$ of the product of two sets into some third set. As a rule, though, the word matrix and the notation associated with it are used only in connection with linear mappings. In such cases $C$ is the ring or field of scalars.

## Matrix of a linear mapping

Definition: Let $V$ and $W$ be finite-dimensional vector spaces over the same field $k$, with bases $A$ and $B$ respectively, and let $f\colon V\to W$ be a linear mapping. For each $a\in A$ let $(M_{{ab}})_{{b\in B}}$ be the unique family of scalars (elements of $k$) such that

$f(a)=\sum_{{b\in B}}M_{{ab}}b\;.$

Then the family $(M_{{ab}})$ (or equivalently the mapping $(a,b)\mapsto M_{{ab}}$
from $A\times B$ to $k$)
is called the matrix of $f$ with respect to the given bases $A$ and $B$.
The scalars $M_{{ab}}$ are called the *components* of the matrix. The matrix $M$ is said to be of *size* $\lvert A\rvert$-by-$\lvert B\rvert$, or simply an $\lvert A\rvert\times\lvert B\rvert$ matrix.

The matrix describes the function $f$ completely; for any element

$x=\sum_{{a\in A}}x_{a}a$

of $V$, we have

$f(x)=\sum_{{b\in B}}\Bigl(\sum_{{a\in A}}x_{a}M_{{ab}}\Bigr)b$

as is readily verified.
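
As a quick illustrative sketch, the formula above can be computed directly in plain Python, with the bases $A$ and $B$ indexed $0,1,\dots$; the function name and the example matrix here are invented for illustration, not taken from the article.

```python
# Apply a linear map f through its matrix of components M[a][b].
# Rows are indexed by the basis A of V, columns by the basis B of W.

def apply_matrix(M, x):
    """Return the coordinates of f(x) in the basis B, i.e.
    y_b = sum over a of x_a * M[a][b]."""
    rows, cols = len(M), len(M[0])
    return [sum(x[a] * M[a][b] for a in range(rows)) for b in range(cols)]

# Example: f(a_1) = b_1 + 2 b_2 and f(a_2) = 3 b_2, say:
M = [[1, 2],
     [0, 3]]
print(apply_matrix(M, [1, 1]))  # f(a_1 + a_2) = b_1 + 5 b_2 -> [1, 5]
```

The point is that the matrix alone, plus the coordinates of $x$, suffices to recover $f(x)$.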

Any two linear mappings $V\to W$ have a sum, defined pointwise; it is easy to verify that the matrix of the sum is the sum, componentwise, of the two given matrices.
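
The componentwise sum is equally direct in code; a minimal sketch (the function name is illustrative):

```python
def matrix_sum(M, N):
    """Componentwise sum: (M + N)[a][b] = M[a][b] + N[a][b].
    M and N must have the same size."""
    return [[M[a][b] + N[a][b] for b in range(len(M[0]))]
            for a in range(len(M))]

print(matrix_sum([[1, 2], [3, 4]], [[10, 0], [0, 10]]))  # [[11, 2], [3, 14]]
```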

The formalism of matrices extends somewhat to linear mappings between
*modules*, i.e. extends to a ring $k$, not necessarily commutative,
rather than just a field.

## Rows and columns; product of two matrices

Suppose we are given three modules $V,W,X$, with bases $A,B,C$ respectively,
and two linear mappings $f\colon V\to W$ and $g\colon W\to X$.
$f$ and $g$ have some matrices $(M_{{ab}})$ and $(N_{{bc}})$ with respect to
those bases. The *product* matrix $NM$ is defined as the matrix
$(P_{{ac}})$ of the function

$x\mapsto g(f(x))\colon V\to X$

with respect to the bases $A$ and $C$. Straight from the definitions of a linear mapping and a basis, one verifies that

$P_{{ac}}=\sum_{{b\in B}}M_{{ab}}N_{{bc}}\qquad(1)$

for all $a\in A$ and $c\in C$.
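
Formula (1) translates directly into code. Here is a minimal sketch in plain Python (the numeric matrices are invented for illustration), using the same index convention: $M$ is $\lvert A\rvert$-by-$\lvert B\rvert$ and $N$ is $\lvert B\rvert$-by-$\lvert C\rvert$.

```python
def matrix_product(M, N):
    """P[a][c] = sum over b of M[a][b] * N[b][c], per formula (1)."""
    return [[sum(M[a][b] * N[b][c] for b in range(len(N)))
             for c in range(len(N[0]))]
            for a in range(len(M))]

# A 2-by-3 times a 3-by-2 gives a 2-by-2.
M = [[1, 2, 3],
     [4, 5, 6]]
N = [[1, 0],
     [0, 1],
     [1, 1]]
print(matrix_product(M, N))  # [[4, 5], [10, 11]]
```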

To illustrate the notation of matrices in terms of rows and columns, suppose the spaces $V,W,X$ have dimensions 2, 3, and 2 respectively, and bases

$A=\{a_{1},a_{2}\}\qquad B=\{b_{1},b_{2},b_{3}\}\qquad C=\{c_{1},c_{2}\}\;.$

We write

$\begin{pmatrix}M_{{11}}&M_{{12}}&M_{{13}}\\ M_{{21}}&M_{{22}}&M_{{23}}\end{pmatrix}\begin{pmatrix}N_{{11}}&N_{{12}}\\ N_{{21}}&N_{{22}}\\ N_{{31}}&N_{{32}}\end{pmatrix}=\begin{pmatrix}P_{{11}}&P_{{12}}\\ P_{{21}}&P_{{22}}\end{pmatrix}\;.$

(Notice that we have taken a liberty with the notation, by writing e.g. $M_{{12}}$ instead of $M_{{a_{1}a_{2}}}$.) The equation (1) shows that the multiplication of two matrices proceeds “rows by columns”. Also, in an expression such as $N_{{23}}$, the first index refers to the row, and the second to the column, in which that component appears.

Similar notation can describe the calculation of $f(x)$ whenever $f$ is a linear mapping. For example, if $f\colon V\to W$ is linear, and $x=\sum_{i}x_{i}a_{i}$ and $f(x)=\sum_{i}y_{i}b_{i}$, we write

$\begin{pmatrix}x_{1}&x_{2}\end{pmatrix}\begin{pmatrix}M_{{11}}&M_{{12}}&M_{{13}}\\ M_{{21}}&M_{{22}}&M_{{23}}\end{pmatrix}=\begin{pmatrix}y_{1}&y_{2}&y_{3}\end{pmatrix}\;.$

When, as above, a “row vector” denotes an element of a space, a “column vector” denotes an element of the dual space. If, say, $\overline{f}\colon W^{*}\to V^{*}$ is the transpose of $f$, then, with respect to the bases dual to $A$ and $B$, an equation $\overline{f}(\sum_{j}\nu_{j}\beta_{j})=\sum_{i}\mu_{i}\alpha_{i}$ may be written

$\begin{pmatrix}\mu_{1}\\ \mu_{2}\end{pmatrix}=\begin{pmatrix}M_{{11}}&M_{{12}}&M_{{13}}\\ M_{{21}}&M_{{22}}&M_{{23}}\end{pmatrix}\begin{pmatrix}\nu_{1}\\ \nu_{2}\\ \nu_{3}\end{pmatrix}\;.$

One more illustration: Given a bilinear form $L\colon V\times W\to k$, we can denote $L(v,w)$ by

$\begin{pmatrix}v_{1}&v_{2}\end{pmatrix}\begin{pmatrix}L_{{11}}&L_{{12}}&L_{{13}}\\ L_{{21}}&L_{{22}}&L_{{23}}\end{pmatrix}\begin{pmatrix}w_{1}\\ w_{2}\\ w_{3}\end{pmatrix}\;.$
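
The same evaluation can be sketched in code, computing $L(v,w)=\sum_{i,j}v_{i}L_{ij}w_{j}$ directly; the function name and numbers below are illustrative.

```python
def bilinear_value(L, v, w):
    """Evaluate the bilinear form: L(v, w) = sum over i, j of
    v_i * L[i][j] * w_j, with L given by its component matrix."""
    return sum(v[i] * L[i][j] * w[j]
               for i in range(len(v)) for j in range(len(w)))

L = [[1, 0, 2],
     [0, 1, 0]]
print(bilinear_value(L, [2, 3], [1, 1, 1]))  # 2*1 + 2*2 + 3*1 = 9
```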

## Square matrix

A matrix $M\colon A\times B\to C$ is called square if $A=B$, or if some
bijection $A\to B$ is implicit in the context.
(It is not enough for $A$ and $B$ to be equipotent.)
Square matrices naturally arise in connection with a linear mapping of
a space into *itself* (called an endomorphism), and in the related
case of a change of basis (from one basis of some space, to another
basis of the same space). When $A$ is finite of cardinality $n$ (and thus so is $B$), then $n$ is often called the *order* of the matrix $M$. Unfortunately, equally often the order of $M$ means the order of $M$ as an element of the group $GL_{n}(C)$.

## Miscellaneous usages of “matrix”

The word matrix has come into use in some areas where linear mappings are not at issue. An example would be a combinatorial statement, such as Hall’s marriage theorem, phrased in terms of “0-1 matrices” instead of subsets of $A\times B$.

## Remark

Matrices are heavily used in the physical sciences, engineering, statistics, and computer programming. But for purely mathematical purposes, they are less important than one might expect, and indeed are frequently irrelevant in linear algebra. Linear mappings, determinants, traces, transposes, and a number of other simple notions can and should be defined without matrices, simply because they have a meaning independent of any basis or bases. Many little theorems in linear algebra can be proved in a simpler and more enlightening way without matrices than with them. One more illustration: The derivative (at a point) of a mapping from one surface to another is a linear mapping; it is not a matrix of partial derivatives, because the matrix depends on a choice of basis but the derivative does not.

## Mathematics Subject Classification

15-01

## Attached Articles

matrix operations by djao

derivative of matrix by matte

monomial matrix by GrafZahl

fully indecomposable matrix by Mathprof

direct sum of matrices by CWoo

generalized Bézout theorem on matrices by perucho

vectorization of matrix by pahio

matrix unit by CWoo

elementary matrix by CWoo

## Comments

## A little suggestion for the article "matrix"

Dear Mr. Hammick,

I think that matrices are also important as representations of tensors, especially second-order tensors (either Cartesian or generalized), as is easy to appreciate in physics and continuum mechanics. So I believe that this fact could be included in the next version of "matrix".

Regards,

Pedro

## Re: A little suggestion for the article "matrix"

Yes, thanks. This was just a start on "matrix". The item does need considerable expansion.

Larry