A matrix is simply a mapping M : A×B → C of the product of two sets into some third set. As a rule, though, the word matrix and the notation associated with it are used only in connection with linear mappings. In such cases C is the ring or field of scalars.

Matrix of a linear mapping

Definition: Let V and W be finite-dimensional vector spaces over the same field k, with bases A and B respectively, and let f : V → W be a linear mapping. For each a ∈ A let (M_{ab})_{b∈B} be the unique family of scalars (elements of k) such that

  f(a) = ∑_{b∈B} M_{ab} b .

Then the family (M_{ab}) (or equivalently the mapping (a,b) ↦ M_{ab} of A×B → k) is called the matrix of f with respect to the given bases A and B. The scalars M_{ab} are called the components of the matrix. The matrix M is said to be of size |A|-by-|B|, or simply an |A|×|B| matrix.

The matrix describes the function f completely; for any element

  x = ∑_{a∈A} x_a a

of V, we have

  f(x) = ∑_{b∈B} ( ∑_{a∈A} x_a M_{ab} ) b ,

as is readily verified.
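The formula above can be checked concretely. Here is a minimal Python sketch, with a hypothetical 2×3 matrix and made-up coordinates (none of these numbers come from the text); it follows the text's convention that M_{ab} is the coefficient of basis vector b in f(a), so the b-th coordinate of f(x) is ∑_a x_a M_{ab}:

```python
def apply_matrix(M, x):
    """Coordinates of f(x), given the matrix M of f and the coordinates x.

    Convention as in the text: M[a][b] is the coefficient of basis
    vector b in f(a), so y_b = sum over a of x[a] * M[a][b].
    """
    return [sum(x[a] * M[a][b] for a in range(len(M)))
            for b in range(len(M[0]))]

# Hypothetical matrix of a map f : V -> W with dim V = 2, dim W = 3.
M = [[1, 0, 2],
     [3, 1, 0]]
x = [5, 7]                       # coordinates of x in the basis A
print(apply_matrix(M, x))        # -> [26, 7, 10]
```

Note that with this indexing, x acts as a row vector multiplied on the left of M, matching the row-vector convention used later in the entry.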

Any two linear mappings V → W have a sum, defined pointwise; it is easy to verify that the matrix of the sum is the componentwise sum of the two given matrices.
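As a quick illustration of the componentwise sum, a Python sketch with two small hypothetical matrices:

```python
def matrix_sum(M, N):
    """Componentwise sum of two matrices of the same size: this is
    the matrix of the pointwise sum f + g of the two linear maps."""
    return [[m + n for m, n in zip(row_m, row_n)]
            for row_m, row_n in zip(M, N)]

# Hypothetical 2x2 example.
print(matrix_sum([[1, 2], [3, 4]], [[5, 6], [7, 8]]))  # -> [[6, 8], [10, 12]]
```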

The formalism of matrices extends somewhat to linear mappings between modules, i.e. to the case where k is a ring, not necessarily commutative, rather than a field.

Rows and columns; product of two matrices

Suppose we are given three modules V, W, X, with bases A, B, C respectively, and two linear mappings f : V → W and g : W → X. With respect to those bases, f and g have matrices (M_{ab}) and (N_{bc}). The product matrix NM is defined as the matrix (P_{ac}) of the composition

  g∘f : V → X

with respect to the bases A and C. Straight from the definitions of a linear mapping and a basis, one verifies that

  P_{ac} = ∑_{b∈B} M_{ab} N_{bc}    (1)

for all a ∈ A and c ∈ C.
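Equation (1) can be checked numerically. The following Python sketch uses hypothetical 2×3 and 3×2 matrices (the same shapes as in the illustration below) and verifies, under the row-vector convention of this entry, that applying the product matrix agrees with applying M and then N:

```python
def matmul(M, N):
    """Product matrix: P[a][c] = sum over b of M[a][b] * N[b][c] (equation (1))."""
    return [[sum(M[a][b] * N[b][c] for b in range(len(N)))
             for c in range(len(N[0]))]
            for a in range(len(M))]

def apply(M, x):
    """Row-vector convention: y_b = sum over a of x[a] * M[a][b]."""
    return [sum(x[a] * M[a][b] for a in range(len(M)))
            for b in range(len(M[0]))]

# Hypothetical matrices of f : V -> W and g : W -> X (dims 2, 3, 2).
M = [[1, 2, 0],
     [0, 1, 1]]
N = [[1, 0],
     [2, 1],
     [0, 3]]
P = matmul(M, N)

x = [4, 5]
assert apply(P, x) == apply(N, apply(M, x))   # matrix of g∘f is the product
```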

To illustrate the notation of matrices in terms of rows and columns, suppose the spaces V, W, X have dimensions 2, 3, and 2 respectively, and bases

  A = {a_1, a_2} ,  B = {b_1, b_2, b_3} ,  C = {c_1, c_2} .

We write

  M = [ M_11  M_12  M_13 ]      N = [ N_11  N_12 ]
      [ M_21  M_22  M_23 ] ,        [ N_21  N_22 ]
                                    [ N_31  N_32 ] .

(Notice that we have taken a liberty with the notation, by writing e.g. M_12 instead of M_{a_1 b_2}.) The equation (1) shows that the multiplication of two matrices proceeds "rows by columns". Also, in an expression such as M_23, the first index refers to the row, and the second to the column, in which that component appears.

Similar notation can describe the calculation of f(x) whenever f is a linear mapping. For example, if f : V → W is linear, and x = ∑_i x_i a_i and f(x) = ∑_i y_i b_i, we write

  ( y_1  y_2  y_3 ) = ( x_1  x_2 ) [ M_11  M_12  M_13 ]
                                   [ M_21  M_22  M_23 ] .
When, as above, a "row vector" denotes an element of a space, a "column vector" denotes an element of the dual space. If, say, f̄ : W* → V* is the transpose of f, then, with respect to the bases dual to A and B, the equation f̄(∑_j ν_j β_j) = ∑_i μ_i α_i may be written

  [ μ_1 ]   [ M_11  M_12  M_13 ] [ ν_1 ]
  [ μ_2 ] = [ M_21  M_22  M_23 ] [ ν_2 ]
                                 [ ν_3 ] .
One more illustration: given a bilinear form L : V×W → k, we can denote L(v, w) by

  ( v_1  v_2 ) [ L_11  L_12  L_13 ] [ w_1 ]
               [ L_21  L_22  L_23 ] [ w_2 ]
                                    [ w_3 ] .
A matrix M : A×B → C is called square if A = B, or if some bijection A → B is implicit in the context. (It is not enough for A and B to be equipotent.) Square matrices arise naturally in connection with a linear mapping of a space into itself (called an endomorphism), and in the related case of a change of basis (from one basis of some space to another basis of the same space). When A is finite of cardinality n (and thus so is B), n is often called the order of the matrix M. Unfortunately, equally often the order of M means the order of M as an element of the group GL_n(C).

Miscellaneous usages of "matrix"

The word matrix has come into use in some areas where linear mappings are not at issue. An example would be a combinatorial statement, such as Hall's marriage theorem, phrased in terms of "0-1 matrices" instead of subsets of A×B.


Matrices are heavily used in the physical sciences, engineering, statistics, and computer programming. But for purely mathematical purposes, they are less important than one might expect, and indeed are frequently irrelevant in linear algebra. Linear mappings, determinants, traces, transposes, and a number of other simple notions can and should be defined without matrices, simply because they have a meaning independent of any basis or bases. Many little theorems in linear algebra can be proved in a simpler and more enlightening way without matrices than with them. One more illustration: the derivative (at a point) of a mapping from one surface to another is a linear mapping; it is not a matrix of partial derivatives, because the matrix depends on a choice of basis but the derivative does not.

Title matrix
Canonical name Matrix
Date of creation 2013-03-22 12:25:41
Last modified on 2013-03-22 12:25:41
Owner bbukh (348)
Last modified by bbukh (348)
Numerical id 19
Author bbukh (348)
Entry type Definition
Classification msc 15-01
Related topic LinearTransformation
Related topic ZeroMatrix
Defines size
Defines order