# random vector

A random vector is a finite-dimensional formal vector of random variables. A random vector can be written either as a column or as a row of random variables, depending on context and use. So if $X_{1},X_{2},\ldots,X_{n}$ are random variables, then

 $\textbf{X}=\begin{pmatrix}X_{1}\\ X_{2}\\ \vdots\\ X_{n}\end{pmatrix}=(X_{1},X_{2},\ldots,X_{n})^{\operatorname{T}}$

is a random (column) vector. Similarly, one defines a random matrix to be a formal matrix whose entries are all random variables. The size of a random vector and the size of a random matrix are assumed to be finite fixed constants.

The distribution of a random vector $\textbf{X}=(X_{1},X_{2},\ldots,X_{n})$ is defined to be the joint distribution of its coordinates $X_{1},\ldots,X_{n}$:

 $F_{\textbf{X}}(\textbf{x}):=F_{X_{1},\ldots,X_{n}}(x_{1},\ldots,x_{n}).$

Similarly, the distribution of a random matrix is the joint distribution of its matrix components.

Let $\textbf{X}=(X_{1},X_{2},\ldots,X_{n})$ be a random vector. If $\operatorname{E}[X_{i}]$ exists ($<\infty$) for each $i$, then the expectation of X, called the mean vector and denoted by $\mathbf{E}[\textbf{X}]$, is defined to be:

 $\mathbf{E}[\textbf{X}]:=(\operatorname{E}[X_{1}],\operatorname{E}[X_{2}],\ldots,\operatorname{E}[X_{n}]).$

Clearly $\mathbf{E}[\textbf{X}]^{T}=\mathbf{E}[\textbf{X}^{T}]$. The expectation of a random matrix is defined similarly. Note that these expectations can also be defined via measure theory; using Fubini's theorem, one can show that the two sets of definitions coincide.
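As a numerical illustration, the mean vector is just the vector of coordinate-wise expectations, which can be estimated by Monte Carlo. A minimal sketch in Python with NumPy; the particular distribution, sample size, and seed are illustrative choices, not part of the definition:

```python
import numpy as np

rng = np.random.default_rng(0)

# Draw 100_000 samples of a 3-dimensional random vector X whose
# coordinates are independent normals with means 1, 2, and 3.
n_samples = 100_000
X = rng.normal(loc=[1.0, 2.0, 3.0], scale=1.0, size=(n_samples, 3))

# E[X] is the vector of coordinate-wise expectations, so the sample
# mean is taken coordinate by coordinate.
mean_vector = X.mean(axis=0)
print(mean_vector)  # close to [1. 2. 3.]
```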

Again, let $\textbf{X}=(X_{1},X_{2},\ldots,X_{n})^{T}$ be a random vector. If $\boldsymbol{\mu}=\mathbf{E}[\textbf{X}]$ is defined and $\operatorname{E}[X_{i}X_{j}]$ is defined for all $1\leq i,j\leq n$, then the variance of X, denoted by $\textbf{Var}[\textbf{X}]$, is defined to be:

 $\textbf{Var}[\textbf{X}]:=\mathbf{E}\big{[}(\textbf{X}-\boldsymbol{\mu})(\textbf{X}-\boldsymbol{\mu})^{T}\big{]}.$

It is not hard to see that $\textbf{Var}[\textbf{X}]$ is an $n\times n$ symmetric matrix and is equal to the covariance matrix of the $X_{i}$'s.
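The identification of $\textbf{Var}[\textbf{X}]$ with the covariance matrix of the coordinates can be checked empirically. A minimal sketch with NumPy, where the mixing matrix `L` and the sample size are arbitrary illustrative choices (for this construction $\textbf{Var}[\textbf{X}]=LL^{T}$):

```python
import numpy as np

rng = np.random.default_rng(1)

# Sample a 2-dimensional random vector with correlated coordinates
# by mixing independent standard normals: X = L Z, so Var[X] = L L^T.
n = 200_000
Z = rng.standard_normal((n, 2))
L = np.array([[1.0, 0.0],
              [0.5, 1.0]])
X = Z @ L.T

# Var[X] = E[(X - mu)(X - mu)^T], estimated from the sample:
mu = X.mean(axis=0)
centered = X - mu
var_X = centered.T @ centered / n

# This agrees with the usual covariance matrix of the coordinates.
print(var_X)                      # close to [[1.0, 0.5], [0.5, 1.25]]
print(np.cov(X, rowvar=False))    # same matrix, up to sampling error
```

Note that `var_X` is symmetric by construction, as the text asserts.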

Below are some basic properties:

1. If $\mathbf{X}$ is an $n$-dimensional random vector, $\mathbf{A}$ an $m\times n$ constant matrix, and $\boldsymbol{\alpha}$ an $m$-dimensional constant vector, then

 $\mathbf{E}[\mathbf{AX}+\boldsymbol{\alpha}]=\mathbf{A}\mathbf{E}[\mathbf{X}]+\boldsymbol{\alpha}.$
2. With the same setup as above,

 $\mathbf{Var}[\mathbf{AX}+\boldsymbol{\alpha}]=\mathbf{A}\mathbf{Var}[\mathbf{X}]\mathbf{A}^{T}.$

If the $X_{i}$'s are iid (independent, identically distributed) with common variance $\sigma^{2}$, then

 $\mathbf{Var}[\mathbf{AX}+\boldsymbol{\alpha}]=\sigma^{2}\mathbf{A}\mathbf{A}^{T}.$
3. Let $\mathbf{X}$ be an $n$-dimensional random vector with $\boldsymbol{\mu}=\mathbf{E}[\mathbf{X}]$ and $\boldsymbol{\Sigma}=\mathbf{Var}[\mathbf{X}]$, and let $\mathbf{A}$ be an $n\times n$ constant matrix. Then

 $\mathbf{E}[\mathbf{X}^{T}\mathbf{AX}]=\operatorname{tr}(\mathbf{A}\boldsymbol{\Sigma})+\boldsymbol{\mu}^{T}\mathbf{A}\boldsymbol{\mu}.$
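The transformation rule for the variance and the trace identity for quadratic forms can both be verified numerically. A minimal sketch with NumPy; the parameters $\boldsymbol{\mu}$, the mixing matrix `L`, and the constant matrices `A` and `B` are arbitrary illustrative choices, and the agreement is only up to Monte Carlo error:

```python
import numpy as np

rng = np.random.default_rng(2)

# Population parameters of a 3-dimensional random vector X = mu + L Z,
# where Z has iid standard normal coordinates, so Var[X] = L L^T.
mu = np.array([1.0, -1.0, 2.0])
L = np.array([[1.0, 0.0, 0.0],
              [0.3, 1.0, 0.0],
              [0.2, 0.4, 1.0]])
Sigma = L @ L.T

n = 500_000
X = mu + rng.standard_normal((n, 3)) @ L.T

# Property: Var[AX + alpha] = A Var[X] A^T for a constant m x n matrix A.
A = np.array([[2.0, 0.0, 1.0],
              [0.0, 1.0, -1.0]])
alpha = np.array([1.0, 2.0])
Y = X @ A.T + alpha
var_Y = np.cov(Y, rowvar=False)
print(np.round(var_Y - A @ Sigma @ A.T, 2))   # near-zero matrix

# Property: E[X^T B X] = tr(B Sigma) + mu^T B mu for a square matrix B.
B = np.array([[1.0, 0.5, 0.0],
              [0.5, 2.0, 0.3],
              [0.0, 0.3, 1.0]])
quad_mean = np.einsum('ni,ij,nj->n', X, B, X).mean()
print(quad_mean, np.trace(B @ Sigma) + mu @ B @ mu)   # approximately equal
```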