Gram matrix
For a vector space $V$ of dimension $n$ over a field $k$, endowed with an inner product $\langle\cdot,\cdot\rangle$, and for any given sequence of elements $x_1,\dots,x_l\in V$, consider the following linear map $\iota$ associated to the $(x_i)_{i=1,\dots,l}$:
$$\begin{array}{cccc}
\iota\colon & k^l & \to & k^n = V \\
& (\lambda_1,\dots,\lambda_l) & \mapsto & \displaystyle\sum_{i=1}^{l}\lambda_i x_i
\end{array}$$
The Gram bilinear form of the $(x_i)_{i=1,\dots,l}$ is the function
$$\begin{array}{cccc}
G\colon & k^l\times k^l & \to & k \\
& (\lambda,\mu) & \mapsto & \langle \iota(\lambda),\iota(\mu)\rangle
\end{array}$$
The Gram matrix of the $(x_i)_{i=1,\dots,l}$ is the matrix associated to the Gram bilinear form in the canonical basis of $k^l$; its entries are $G_{ij}=\langle x_i,x_j\rangle$. The Gram form (resp. matrix) is a symmetric bilinear form (resp. symmetric matrix).
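As a concrete numerical illustration (a minimal NumPy sketch, not part of the original entry), the Gram matrix of vectors $x_1,\dots,x_l\in\mathbb{R}^n$ stored as the columns of a matrix $X$ is simply $X^{\mathrm{T}}X$:

```python
import numpy as np

# Three vectors x_1, x_2, x_3 in R^4, stored as the columns of X.
X = np.array([[1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0],
              [1.0, 1.0, 0.0],
              [0.0, 0.0, 1.0]])

# Gram matrix: G[i, j] = <x_i, x_j> for the usual scalar product.
G = X.T @ X

# G is symmetric, as stated in the definition above.
assert np.allclose(G, G.T)
print(G)
```

Here $G$ is an $l\times l$ matrix ($3\times 3$) even though the vectors live in $\mathbb{R}^4$: the Gram matrix only depends on the pairwise inner products of the $x_i$.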
Gram forms/matrices are usually considered with $k=\mathbb{R}$ and $\langle\cdot,\cdot\rangle$ the usual scalar product on vector spaces over $\mathbb{R}$. In that context they have a strong geometric meaning:

•
The determinant of the Gram form/matrix is $0$ iff $\iota$ fails to be injective, i.e. iff the $x_i$ are linearly dependent.

•
If $\iota$ is injective, the Gram matrix (resp. form) is a positive definite symmetric matrix (resp. bilinear form).

•
$\det(G)^{-\frac{1}{2}}=\mathrm{Vol}(\iota^{-1}(B_{0,1}))/V_l$, where $G$ is the Gram form/matrix, $\mathrm{Vol}$ denotes the volume of a subset of $\mathbb{R}^l$, $B_{0,1}$ is the unit ball of $\mathbb{R}^n$ centered at $0$, and $V_l$ is the volume of the unit ball of $\mathbb{R}^l$. Indeed, $\iota^{-1}(B_{0,1})$ is the ellipsoid $\{\lambda : G(\lambda,\lambda)\le 1\}$.

•
Let $\Delta\colon\lambda\in\mathbb{R}^l\mapsto G(\lambda,\lambda)$, where $G$ is the Gram bilinear form; then $\iota^{-1}(B_{0,1})=\Delta^{-1}([0,1])$.

•
If $\iota$ is injective and $s\colon\mathrm{Im}(\iota)\subset\mathbb{R}^n\to\mathbb{R}^l$ is an isometry, then $\det(s\circ\iota)^2=\det(G)$.

•
Let $f$ be an endomorphism of $\mathbb{R}^n$ and $M$ its matrix. Let $M=UH$ be the polar decomposition of $M$, where $H$ is a symmetric positive matrix and $U$ an orthogonal matrix. Let the $x_i$ be the columns of $M$ (so $l=n$) and let $G$ be the Gram matrix of the $x_i$. Then $H^2=G$. (N.B.: this is one way to prove the existence of the polar decomposition: take the square root of the Gram matrix, multiply $M$ by its inverse, and it easily follows that the matrix obtained is orthogonal.)
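The facts in the list above can be checked numerically. The following NumPy sketch (not part of the original entry; the matrices are illustrative) verifies that $\det(G)$ vanishes for a linearly dependent family, that $G$ is positive definite for an independent one, and that the symmetric square root of $G$ is the positive factor of the polar decomposition:

```python
import numpy as np

rng = np.random.default_rng(0)

# --- det(G) = 0 iff the x_i are linearly dependent ---
x1 = np.array([1.0, 2.0, 3.0])
x2 = 2.0 * x1                            # x2 is a multiple of x1
X_dep = np.column_stack([x1, x2])
assert abs(np.linalg.det(X_dep.T @ X_dep)) < 1e-10

# --- positive definiteness when iota is injective ---
X = rng.standard_normal((3, 3))          # generically independent columns
G = X.T @ X
assert np.all(np.linalg.eigvalsh(G) > 0) # all eigenvalues strictly positive

# --- polar decomposition via the Gram matrix: M = U H with H = sqrt(G) ---
w, V = np.linalg.eigh(G)                 # G = V diag(w) V^T
H = V @ np.diag(np.sqrt(w)) @ V.T        # symmetric positive square root of G
U = X @ np.linalg.inv(H)                 # multiply M by H^{-1}
assert np.allclose(H @ H, G)             # H^2 = G
assert np.allclose(U.T @ U, np.eye(3))   # U is orthogonal
```

The last block is exactly the construction sketched in the final bullet: $H=\sqrt{M^{\mathrm{T}}M}$ and $U=MH^{-1}$ satisfies $U^{\mathrm{T}}U=H^{-1}GH^{-1}=I$.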
Gram matrices are also used in statistics, in principal component analysis. One wants to determine the general trend of a large sample ($n$ individuals) in terms of a few characteristics ($l$ of them). Each $x_i\in\mathbb{R}^n$ records the results of the $n$ individuals for the $i^{\mathrm{th}}$ characteristic. One wants to know which characteristics are predominant and whether some linear relations hold between them. This is achieved by diagonalizing the Gram matrix (often called the dispersion matrix or covariance matrix in that context): the larger an eigenvalue, the more important the eigenvector associated to it.
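The procedure just described can be sketched as follows (a minimal NumPy illustration, not part of the original entry; the synthetic data and variable names are assumptions). The third characteristic is built as a near-linear combination of the first two, so diagonalizing the Gram/covariance matrix should expose one near-zero eigenvalue encoding that relation:

```python
import numpy as np

rng = np.random.default_rng(1)

# n = 200 individuals, l = 3 characteristics; the third is (almost) a
# linear combination of the first two, a relation PCA should detect.
n = 200
a = rng.standard_normal(n)
b = rng.standard_normal(n)
X = np.column_stack([a, b, a + b + 0.01 * rng.standard_normal(n)])
X -= X.mean(axis=0)                    # center each characteristic

G = X.T @ X / n                        # covariance (Gram) matrix, l x l
eigvals, eigvecs = np.linalg.eigh(G)   # symmetric eigendecomposition

# Sort eigenpairs by decreasing eigenvalue: the first eigenvector is the
# dominant trend, the last one the (near-)linear relation.
order = np.argsort(eigvals)[::-1]
eigvals, eigvecs = eigvals[order], eigvecs[:, order]
assert eigvals[-1] < 1e-2 * eigvals[0]
```

The smallest eigenvalue is of the order of the added noise ($\approx 10^{-4}$), while the dominant one is of order $1$, reflecting exactly the "higher eigenvalue, more important eigenvector" heuristic from the text.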
Title: Gram matrix
Canonical name: GramMatrix
Date of creation: 2013-03-22 17:56:55
Last modified on: 2013-03-22 17:56:55
Owner: lalberti (18937)
Last modified by: lalberti (18937)
Numerical id: 5
Author: lalberti (18937)
Entry type: Definition
Classification: msc 15A63
Synonym: Gram matrices
Synonym: Gram bilinear form