# tensor

## Overview

A *tensor* is the mathematical idealization of a geometric or
physical quantity whose analytic description, relative to a fixed
frame of reference, consists of an array of numbers.^{1} Some well-known
examples of tensors in geometry are quadratic forms and the curvature
tensor. Examples of physical tensors are the energy-momentum tensor
and the polarization tensor.

^{1}“Ceci n’est pas une pipe,” as René Magritte put it (http://aux.planetmath.org/files/objects/3112/tensor-pipe.jpg). The image and the object represented by the image are not the same thing. The mass of a stone is not a number. Rather, the mass can be described by a number relative to some specified unit mass.

Geometric and physical quantities may be categorized by considering
the degrees of freedom inherent in their description. The scalar
quantities are those that can be represented by a single number —
speed, mass, temperature, for example. There are also vector-like
quantities, such as force, that require a list of numbers for their
description. Finally, quantities such as quadratic forms
naturally require a multiply indexed array for their description.
These latter quantities can only be conceived of as *tensors*.

Actually, the tensor notion is quite general, and applies to all of
the above examples; scalars and vectors are special kinds of
tensors. The feature that distinguishes a scalar from a vector, and
distinguishes both of those from a more general tensor quantity is
the number of indices in the representing array. This number is
called the *rank* of a tensor. Thus, scalars are rank zero tensors (no
indices at all), and vectors are rank one tensors.
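As a concrete illustration (using NumPy, which is an assumption of this sketch and not part of the original entry), the rank of a tensor corresponds to the number of indices, i.e. the number of axes (`ndim`) of its representing array:

```python
import numpy as np

# Rank = number of indices of the representing array = ndim in NumPy.
scalar = np.array(5.0)          # rank 0: no indices (e.g. mass, temperature)
vector = np.array([1.0, 2.0])   # rank 1: one index (e.g. force)
matrix = np.eye(2)              # rank 2: two indices (e.g. a quadratic form)

print(scalar.ndim, vector.ndim, matrix.ndim)  # 0 1 2
```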

It is also necessary to distinguish between two types of indices,
depending on whether the corresponding numbers transform covariantly
or contravariantly relative to a change in the frame of reference.
*Contravariant indices* are written as superscripts, while
*covariant indices* are written as subscripts. The *valence*
of a tensor is the pair $(p,q)$, where $p$ is the number of contravariant
indices and $q$ the number of covariant indices.

It is customary to represent the actual tensor, as a stand-alone
entity, by a bold-face symbol such as $\mathsf{A}$. The corresponding array
of numbers for a type $(p,q)$ tensor is denoted by the symbol
$A^{i_1\dots i_p}_{j_1\dots j_q}$, where the superscripts and
subscripts are indices that vary from $1$ to $n$. This number $n$, the
range of the indices, is called the dimension of the tensor. The
total number of components required for the specification of a
particular tensor is therefore $n^{p+q}$: the dimension raised to the power of the tensor’s rank.

Again, it must be emphasized that the tensor $\mathsf{A}$ and the representing array $A^{i_1\dots i_p}_{j_1\dots j_q}$ are not the same thing. The values of the representing array are given relative to some frame of reference, and undergo a linear transformation when the frame is changed.

Finally, it must be mentioned that most physical and geometric
applications are concerned with *tensor fields*, that is to say,
tensor-valued functions, rather than tensors themselves. Some care is
required, because it is common to see a tensor field called simply a
tensor. There is a difference, however; the entries of a tensor array
$A^{i_1\dots i_p}_{j_1\dots j_q}$ are numbers, whereas the entries
of a tensor field are functions. The present entry treats the purely
algebraic aspect of tensors. Tensor field concepts, which typically
involve derivatives of some kind, are discussed elsewhere.

## Definition.

The formal definition of a tensor quantity begins with a
finite-dimensional vector space $U$, which furnishes the uniform
“building blocks” for tensors of all valences. In typical
applications, $U$ is the tangent space at a point of a manifold; the
elements of $U$ represent velocities and forces. The space of
$(p,q)$-valent tensors, denoted here by $\mathcal{T}_{p,q}(U)$, is obtained by
taking the tensor product of $p$ copies of $U$ and $q$ copies of
the dual vector space $U^{*}$. To wit,

$$\mathcal{T}_{p,q}(U)=\underbrace{U\otimes \dots \otimes U}_{p\text{ times}}\otimes \underbrace{U^{*}\otimes \dots \otimes U^{*}}_{q\text{ times}}.$$

In order to represent a tensor by a concrete array of numbers, we require a frame of reference, which is essentially a basis of $U$, say ${e}_{1},\mathrm{\dots},{e}_{n}\in U.$ Every vector in $U$ can be “measured” relative to this basis, meaning that for every $\mathbf{v}\in U$ there exist unique scalars ${v}^{i}$, such that (note the use of the Einstein summation convention)

$$\mathbf{v}=v^{i}e_{i}.$$

These scalars are called the components of $\mathbf{v}$ relative to the frame
in question.
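A minimal NumPy sketch (an illustration, not part of the original entry), assuming a basis given as the columns of a matrix `E`: the components $v^i$ solve the linear system whose matrix is `E`.

```python
import numpy as np

# Basis vectors e_1 = (1, 0) and e_2 = (1, 1) of R^2, stored as columns.
E = np.array([[1.0, 1.0],
              [0.0, 1.0]])
v = np.array([3.0, 2.0])

# v = v^i e_i  <=>  E @ comps = v; the solution is unique since E is invertible.
comps = np.linalg.solve(E, v)
assert np.allclose(E @ comps, v)  # reconstructs v from its components
print(comps)                      # the components v^i in this frame
```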

Let $\epsilon^{1},\dots,\epsilon^{n}\in U^{*}$ be the corresponding dual
basis, i.e.,

$$\epsilon^{i}(e_{j})=\delta^{i}_{j},$$

where the latter is the Kronecker delta array. For every covector
$\alpha \in {U}^{*}$ there exists a unique array of components ${\alpha}_{i}$ such
that

$$\alpha =\alpha_{i}\epsilon^{i}.$$
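The dual basis and covector components can be computed concretely. In this NumPy sketch (not part of the original entry), with the basis vectors as the columns of `E`, the dual basis covectors are the rows of $E^{-1}$, since that is exactly what $\epsilon^i(e_j)=\delta^i_j$ says in matrix form:

```python
import numpy as np

E = np.array([[1.0, 1.0],
              [0.0, 1.0]])        # columns: e_1, e_2
Eps = np.linalg.inv(E)            # rows: epsilon^1, epsilon^2
assert np.allclose(Eps @ E, np.eye(2))   # epsilon^i(e_j) = delta^i_j

# The components of a covector are its values on the basis: alpha_i = alpha(e_i).
alpha = np.array([2.0, -1.0])     # alpha as a row of standard-frame coefficients
comps = alpha @ E                 # alpha_i = alpha(e_i)
assert np.allclose(comps @ Eps, alpha)   # alpha = alpha_i epsilon^i
```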

More generally, every tensor $\mathsf{A}\in \mathcal{T}_{p,q}(U)$ has a unique description in terms of components. That is to say, there exists a unique array of scalars $A^{i_1\dots i_p}_{j_1\dots j_q}$ such that

$$\mathsf{A}=A^{i_1\dots i_p}_{j_1\dots j_q}\,e_{i_1}\otimes \dots \otimes e_{i_p}\otimes \epsilon^{j_1}\otimes \dots \otimes \epsilon^{j_q}.$$

## Transformation rule.

Next, suppose that a change is made to a different frame of
reference, say
${\widehat{e}}_{1},\mathrm{\dots},{\widehat{e}}_{n}\in U.$
Any two frames are uniquely related by
an invertible transition matrix $X^{i}_{j}$, having the property that for
all values of $j$ we have

$$\widehat{e}_{j}=X^{i}_{j}e_{i}.\tag{1}$$

Let $\mathbf{v}\in U$ be a vector, and let ${v}^{i}$ and ${\widehat{v}}^{i}$ denote the corresponding component arrays relative to the two frames. From

$$\mathbf{v}=v^{i}e_{i}=\widehat{v}^{i}\widehat{e}_{i},$$

and from (1) we infer that

$$\widehat{v}^{i}=Y^{i}_{j}v^{j},\tag{2}$$

where $Y^{i}_{j}$ is the matrix inverse of $X^{i}_{j}$, i.e.,

$$X^{i}_{k}Y^{k}_{j}=\delta^{i}_{j}.$$

Thus, the transformation rule for a vector’s components (2) is contravariant to the transformation rule for the frame of reference (1). It is for this reason that the superscript indices of a vector are called contravariant.
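Rule (2) can be checked numerically. In this NumPy sketch (an illustration under stated assumptions, not part of the original entry), the frames are the columns of matrices, and `X` is built to be invertible by making it diagonally dominant:

```python
import numpy as np

n = 3
rng = np.random.default_rng(0)
X = rng.normal(size=(n, n)) + n * np.eye(n)   # invertible transition matrix
Y = np.linalg.inv(X)

E = np.eye(n)                 # old frame: columns e_1, ..., e_n
E_hat = E @ X                 # new frame: e_hat_j = X^i_j e_i  -- rule (1)

v = rng.normal(size=n)
v_comp = np.linalg.solve(E, v)         # components in the old frame
v_hat = Y @ v_comp                     # rule (2): v_hat^i = Y^i_j v^j
assert np.allclose(E_hat @ v_hat, v)   # both expansions yield the same vector
```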

To establish (2), we note that the transformation rule for the dual basis takes the form

$$\widehat{\epsilon}^{i}=Y^{i}_{j}\epsilon^{j},$$

and that

$$v^{i}=\epsilon^{i}(\mathbf{v}),$$

while

$$\widehat{v}^{i}=\widehat{\epsilon}^{i}(\mathbf{v}).$$

The transformation rule for covector components is covariant. Let $\alpha \in {U}^{*}$ be a given covector, and let ${\alpha}_{i}$ and ${\widehat{\alpha}}_{i}$ be the corresponding component arrays. Then

$$\widehat{\alpha}_{j}=X^{i}_{j}\alpha_{i}.$$

The above relation is easily established. We need only remark that

$$\alpha_{i}=\alpha (e_{i}),$$

and that

$$\widehat{\alpha}_{j}=\alpha (\widehat{e}_{j}),$$

and then use (1).
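The covariant rule can likewise be verified numerically; a useful by-product is that the pairing $\alpha(\mathbf{v})=\alpha_i v^i$ comes out the same in every frame. A NumPy sketch (an illustration, with `X` made invertible as before):

```python
import numpy as np

n = 3
rng = np.random.default_rng(1)
X = rng.normal(size=(n, n)) + n * np.eye(n)   # invertible transition matrix
Y = np.linalg.inv(X)

alpha = rng.normal(size=n)    # covector components in the old frame
v = rng.normal(size=n)        # vector components in the old frame

alpha_hat = X.T @ alpha       # covariant rule:     alpha_hat_j = X^i_j alpha_i
v_hat = Y @ v                 # contravariant rule: v_hat^i = Y^i_j v^j
assert np.isclose(alpha_hat @ v_hat, alpha @ v)   # frame-independent pairing
```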

In light of the above discussion, we see that the transformation rule
for a general type $(p,q)$ tensor takes the form

$$\widehat{A}^{i_1\dots i_p}_{j_1\dots j_q}=Y^{i_1}_{k_1}\cdots Y^{i_p}_{k_p}\,X^{l_1}_{j_1}\cdots X^{l_q}_{j_q}\,A^{k_1\dots k_p}_{l_1\dots l_q}.$$
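For valence $(1,1)$ the general rule specializes to $\widehat{A}^{i}_{j}=Y^{i}_{k}X^{l}_{j}A^{k}_{l}$, which in matrix form is $\widehat{A}=YAX$. A NumPy sketch (an illustration, not part of the original entry) using `einsum` to make the index pattern explicit:

```python
import numpy as np

n = 3
rng = np.random.default_rng(2)
X = rng.normal(size=(n, n)) + n * np.eye(n)   # invertible transition matrix
Y = np.linalg.inv(X)

A = rng.normal(size=(n, n))               # components A^k_l in the old frame

# A_hat^i_j = Y^i_k X^l_j A^k_l: the upper index contracts with Y,
# the lower index with X, matching the rules for vectors and covectors.
A_hat = np.einsum('ik,lj,kl->ij', Y, X, A)
assert np.allclose(A_hat, Y @ A @ X)      # same contraction in matrix form
```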

| Title | tensor |
|---|---|
| Canonical name | Tensor |
| Date of creation | 2013-03-22 12:47:46 |
| Last modified on | 2013-03-22 12:47:46 |
| Owner | rmilson (146) |
| Last modified by | rmilson (146) |
| Numerical id | 15 |
| Author | rmilson (146) |
| Entry type | Definition |
| Classification | msc 15A69 |
| Related topic | TensorProduct |
| Related topic | TensorArray |
| Defines | valence |
| Defines | rank |