matrix representation of a linear transformation
Linear transformations and matrices are the two most fundamental notions in the study of linear algebra, and the two concepts are intimately related. In this article, we will see how. We assume throughout that all vector spaces are finite dimensional and that all vectors are written as column vectors.
Linear transformations as matrices
Let $V,W$ be vector spaces (over a common field $k$) of dimension $n$ and $m$ respectively. Fix bases $A=\{v_{1},\dots,v_{n}\}$ and $B=\{w_{1},\dots,w_{m}\}$ for $V$ and $W$ respectively. We shall order these bases so that $v_{i}\le v_{j}$ and $w_{i}\le w_{j}$ whenever $i\le j$. To distinguish an ordinary set from an ordered set, we shall adopt the notation $\langle v_{1},\dots,v_{n}\rangle$ to mean the set $\{v_{1},\dots,v_{n}\}$ with ordering $v_{i}\le v_{j}$ whenever $i\le j$. The importance of ordering these bases will be apparent shortly.
For any linear transformation $T:V\to W$, we can write
$$T({v}_{j})=\sum _{i=1}^{m}{\alpha}_{ij}{w}_{i}$$ 
for each $j\in \{1,\dots,n\}$, where each ${\alpha}_{ij}\in k$. We define the matrix associated with the linear transformation $T$ and the ordered bases $A,B$ by
$${[T]}_{B}^{A}:=({\alpha}_{ij}),$$ 
where $1\le i\le m$ and $1\le j\le n$. ${[T]}_{B}^{A}$ is an $m\times n$ matrix whose entries are in $k$. When $A=B$, we often write ${[T]}_{A}:={[T]}_{A}^{A}$. In addition, when both ordered bases are the standard bases ${E}_{n},{E}_{m}$ ordered in the obvious way, we write $[T]:={[T]}_{{E}_{m}}^{{E}_{n}}$.
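The definition translates directly into a computation: column $j$ of ${[T]}_{B}^{A}$ holds the $B$-coordinates of $T(v_{j})$, which can be found by solving a linear system. A minimal numerical sketch (the function name `matrix_of` is illustrative, not from the text):

```python
import numpy as np

def matrix_of(T, basis_V, basis_W):
    """Column j of the result holds the B-coordinates of T(v_j)."""
    # Stacking the basis of W as columns, solving W_cols @ x = T(v_j)
    # recovers the coefficients alpha_{1j}, ..., alpha_{mj}.
    W_cols = np.column_stack(basis_W)
    return np.column_stack([np.linalg.solve(W_cols, T(v)) for v in basis_V])
```

With the standard ordered bases, this simply evaluates $T$ on $e_{1},\dots,e_{n}$ and records the images as columns.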
Examples.

1.
Let $T:{\mathbb{R}}^{3}\to {\mathbb{R}}^{4}$ be given by
$$T\begin{pmatrix}x\\ y\\ z\end{pmatrix}=\begin{pmatrix}x+2y+z\\ z\\ x+y+5z\\ 3x+2z\end{pmatrix}.$$ Using the standard ordered bases
$$E_{3}=\left\langle\begin{pmatrix}1\\ 0\\ 0\end{pmatrix},\begin{pmatrix}0\\ 1\\ 0\end{pmatrix},\begin{pmatrix}0\\ 0\\ 1\end{pmatrix}\right\rangle\ \text{for }\mathbb{R}^{3}\quad\text{and}\quad E_{4}=\left\langle\begin{pmatrix}1\\ 0\\ 0\\ 0\end{pmatrix},\begin{pmatrix}0\\ 1\\ 0\\ 0\end{pmatrix},\begin{pmatrix}0\\ 0\\ 1\\ 0\end{pmatrix},\begin{pmatrix}0\\ 0\\ 0\\ 1\end{pmatrix}\right\rangle\ \text{for }\mathbb{R}^{4},$$ ordered in the obvious way, we compute
$$T\begin{pmatrix}1\\ 0\\ 0\end{pmatrix}=\begin{pmatrix}1\\ 0\\ 1\\ 3\end{pmatrix},\quad T\begin{pmatrix}0\\ 1\\ 0\end{pmatrix}=\begin{pmatrix}2\\ 0\\ 1\\ 0\end{pmatrix},\quad T\begin{pmatrix}0\\ 0\\ 1\end{pmatrix}=\begin{pmatrix}1\\ 1\\ 5\\ 2\end{pmatrix},$$ so the matrix ${[T]}_{E_{4}}^{E_{3}}$ associated with $T$ and the standard ordered bases $E_{3}$ and $E_{4}$ is the $4\times 3$ matrix
$$\begin{pmatrix}1&2&1\\ 0&0&1\\ 1&1&5\\ 3&0&2\end{pmatrix}.$$
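The computation above is easy to check numerically; the following sketch rebuilds the matrix column by column (NumPy is used only for convenience):

```python
import numpy as np

def T(v):
    x, y, z = v
    return np.array([x + 2*y + z, z, x + y + 5*z, 3*x + 2*z])

# The columns of [T] are the images of the standard basis vectors.
M = np.column_stack([T(e) for e in np.eye(3)])
print(M)
```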
2.
Let $T$ be the same linear transformation as above. However, let ${E}_{3}^{\prime}$ be the same basis as ${E}_{3}$ except that the order is reversed: $$E_{3}^{\prime}=\left\langle\begin{pmatrix}0\\ 0\\ 1\end{pmatrix},\begin{pmatrix}0\\ 1\\ 0\end{pmatrix},\begin{pmatrix}1\\ 0\\ 0\end{pmatrix}\right\rangle.$$ Then
$${[T]}_{E_{4}}^{E_{3}^{\prime}}=\begin{pmatrix}1&2&1\\ 1&0&0\\ 5&1&1\\ 2&0&3\end{pmatrix}.$$ Note that this matrix is the matrix from the previous example with the first and last columns switched.

3.
Again, let $T$ be the same as before. Now, let ${E}_{4}^{\prime}$ be the ordered basis whose elements are those of ${E}_{4}$ but with the order now given by $$E_{4}^{\prime}=\left\langle\begin{pmatrix}0\\ 1\\ 0\\ 0\end{pmatrix},\begin{pmatrix}1\\ 0\\ 0\\ 0\end{pmatrix},\begin{pmatrix}0\\ 0\\ 0\\ 1\end{pmatrix},\begin{pmatrix}0\\ 0\\ 1\\ 0\end{pmatrix}\right\rangle.$$ Then
$${[T]}_{E_{4}^{\prime}}^{E_{3}^{\prime}}=\begin{pmatrix}1&0&0\\ 1&2&1\\ 2&0&3\\ 5&1&1\end{pmatrix}.$$ Note that this matrix is the matrix from the previous example with the first two rows swapped with each other, and likewise the last two rows.
Remarks.

•
From the examples above, we note several important features of a matrix representation of a linear transformation:

(a)
the matrix depends on the bases given to the vector spaces

(b)
the ordering of a basis is important

(c)
permuting the order of a given basis amounts to permuting the columns (if the domain basis is reordered) or the rows (if the codomain basis is reordered) of the matrix; this is the same as multiplying the matrix on the right or on the left by a permutation matrix.
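The permutation-matrix observation in (c) can be checked on the matrices from the examples above: reversing the domain basis multiplies $[T]$ on the right by a permutation matrix, and permuting the codomain basis multiplies it on the left. A sketch:

```python
import numpy as np

M = np.array([[1, 2, 1],
              [0, 0, 1],
              [1, 1, 5],
              [3, 0, 2]])       # [T] in the standard ordered bases

P3 = np.eye(3)[::-1]           # permutation reversing the domain basis
print(M @ P3)                  # columns of M reversed, as in Example 2

P4 = np.eye(4)[[1, 0, 3, 2]]   # swap rows 1<->2 and 3<->4
print(P4 @ M @ P3)             # the matrix of Example 3
```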


•
Some basic properties of matrix representations of linear transformations are

(a)
If $T:V\to W$ is a linear transformation and $r\in k$, then ${[rT]}_{B}^{A}=r{[T]}_{B}^{A}$, where $A,B$ are ordered bases for $V,W$ respectively.

(b)
If $S,T:V\to W$ are linear transformations, then ${[S+T]}_{B}^{A}={[S]}_{B}^{A}+{[T]}_{B}^{A}$, where $A$ and $B$ are ordered bases for $V$ and $W$ respectively.

(c)
If $S:U\to V$ and $T:V\to W$ are linear transformations, then ${[TS]}_{C}^{A}={[T]}_{C}^{B}{[S]}_{B}^{A}$, where $A,B,C$ are ordered bases for $U,V,W$ respectively.

(d)
As a result, $T$ is invertible iff ${[T]}_{B}^{A}$ is an invertible matrix; in particular, invertibility forces $\dim(V)=\dim(W)$.
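In the standard bases these properties reduce to familiar matrix identities, which a quick numerical sketch can confirm (the arrays are arbitrary illustrative data):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 3))   # [T] for some T : R^3 -> R^4
B = rng.standard_normal((3, 2))   # [S] for some S : R^2 -> R^3
v = rng.standard_normal(2)

# Property (c): composing the maps corresponds to multiplying the matrices.
lhs = A @ (B @ v)                 # T(S(v))
rhs = (A @ B) @ v                 # [TS] applied to v
print(np.allclose(lhs, rhs))
```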


•
We could have represented all vectors as row vectors instead. Under that convention, the matrix representation ${M}_{1}$ of a linear transformation $T$ is the transpose of its column-vector representation ${M}_{2}$, i.e. ${M}_{1}={M}_{2}^{T}$, and matrices act on vectors from the right:
$$\begin{pmatrix}a&b&c\end{pmatrix}\begin{pmatrix}1&0&1&3\\ 2&0&1&0\\ 1&1&5&2\end{pmatrix}\quad\text{instead of}\quad\begin{pmatrix}1&2&1\\ 0&0&1\\ 1&1&5\\ 3&0&2\end{pmatrix}\begin{pmatrix}a\\ b\\ c\end{pmatrix}.$$
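The two conventions can be compared directly: acting on a row vector from the right with the transposed matrix produces the same numbers as acting on a column vector from the left. A sketch using the example matrix:

```python
import numpy as np

M = np.array([[1, 2, 1],
              [0, 0, 1],
              [1, 1, 5],
              [3, 0, 2]])        # column-vector representation of T
v = np.array([1.0, 2.0, 3.0])    # an arbitrary vector (a, b, c)

# Row convention (v acting on M^T from the left) agrees with the
# column convention (M acting on v from the left).
print(np.allclose(v @ M.T, M @ v))
```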
Matrices as linear transformations
Every $m\times n$ matrix $A$ over a field $k$ can be thought of as a linear transformation from ${k}^{n}$ to ${k}^{m}$: view each vector $v\in {k}^{n}$ as an $n\times 1$ matrix (a column), and let the mapping be the matrix multiplication $Av$, which is an $m\times 1$ matrix (a column vector in ${k}^{m}$). Specifically, we define ${T}_{A}:{k}^{n}\to {k}^{m}$ by
$${T}_{A}(v):=Av.$$ 
It is easy to see that ${T}_{A}$ is indeed a linear transformation. Furthermore, $[{T}_{A}]={[{T}_{A}]}_{{E}_{m}}^{{E}_{n}}=A$, since representing a vector as an $n$-tuple of elements of $k$ is the same as expressing it in the standard ordered basis of ${k}^{n}$. Below we list some of the basic properties:

1.
${T}_{rA}=r{T}_{A}$ for any $r\in k$;

2.
${T}_{A}+{T}_{B}={T}_{A+B}$, where $A,B$ are $m\times n$ matrices over $k$;

3.
${T}_{A}\circ {T}_{B}={T}_{AB}$, where $A$ is an $m\times n$ matrix and $B$ is an $n\times p$ matrix over $k$;

4.
${T}_{A}$ is invertible iff $A$ is an invertible matrix.
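These properties are direct consequences of matrix algebra; a short sketch (the helper name `mat_to_map` is illustrative, not from the text) checks property 3:

```python
import numpy as np

def mat_to_map(A):
    """The linear transformation T_A : k^n -> k^m, v |-> Av."""
    return lambda v: A @ v

A = np.array([[1.0, 2.0], [3.0, 4.0]])
B = np.array([[0.0, 1.0], [1.0, 0.0]])
v = np.array([1.0, -1.0])

# Property 3: composing T_A with T_B agrees with T_{AB}.
composed = mat_to_map(A)(mat_to_map(B)(v))
product = mat_to_map(A @ B)(v)
print(np.allclose(composed, product))
```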
Remark. As the discussion above shows, once ordered bases are fixed for the vector spaces $V$ and $W$, there is a one-to-one correspondence between the set of $m\times n$ matrices over the underlying field $k$ and the set of linear transformations from $V$ to $W$.
Title  matrix representation of a linear transformation 
Canonical name  MatrixRepresentationOfALinearTransformation 
Date of creation  2013-03-22 17:29:59 
Last modified on  2013-03-22 17:29:59 
Owner  CWoo (3771) 
Last modified by  CWoo (3771) 
Numerical id  15 
Author  CWoo (3771) 
Entry type  Definition 
Classification  msc 15A04 
Synonym  ordered bases 
Synonym  standard ordered bases 
Related topic  LinearTransformation 
Defines  ordered basis 
Defines  matrix representation 
Defines  standard ordered basis 