# characteristic function

Let $X$ be a random variable. The characteristic function of $X$ is a function $\varphi_{X}:\mathbb{R}\rightarrow\mathbb{C}$ defined by

 $\varphi_{X}(t)=Ee^{itX}=E\cos(tX)+iE\sin(tX),$

that is, $\varphi_{X}(t)$ is the expectation of the random variable $e^{itX}$.
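The definition above can be checked numerically by a Monte Carlo estimate of $Ee^{itX}$. As a minimal sketch (assuming NumPy; the sample size and test points are arbitrary choices), for $X\sim N(0,1)$ the characteristic function is known to be $e^{-t^{2}/2}$:

```python
import numpy as np

rng = np.random.default_rng(0)

def char_fn_mc(samples, t):
    """Monte Carlo estimate of E[e^{itX}] from samples of X."""
    return np.exp(1j * t * samples).mean()

# For X ~ N(0, 1), the characteristic function is exp(-t^2 / 2).
x = rng.standard_normal(200_000)
for t in (0.0, 0.5, 1.0, 2.0):
    est = char_fn_mc(x, t)
    exact = np.exp(-t**2 / 2)
    print(f"t={t}: estimate={est:.4f}, exact={exact:.4f}")
```

With 200,000 samples the estimates agree with $e^{-t^{2}/2}$ to roughly two decimal places, the accuracy one expects from the $O(1/\sqrt{n})$ Monte Carlo error.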

Given a random vector $\overline{X}=(X_{1},\dots,X_{n})$, the characteristic function of $\overline{X}$, also called the joint characteristic function of $X_{1},\dots,X_{n}$, is the function $\varphi_{\overline{X}}:\mathbb{R}^{n}\rightarrow\mathbb{C}$ defined by

 $\varphi_{\overline{X}}(t)=Ee^{i{\overline{t}}\cdot{\overline{X}}},$

where $\overline{t}=(t_{1},\dots,t_{n})$ and ${\overline{t}}\cdot{\overline{X}}=t_{1}X_{1}+\cdots+t_{n}X_{n}$ is the dot product.

Remark. If $F_{X}$ is the distribution function associated to $X$, by the properties of expectation we have

 $\varphi_{X}(t)=\int_{\mathbb{R}}e^{itx}dF_{X}(x),$

which is known as the Fourier-Stieltjes transform of $F_{X}$, and provides an alternative definition of the characteristic function. From this it is clear that the characteristic function depends only on the distribution function of $X$, so one can define the characteristic function associated to a distribution even when there is no random variable involved. In particular, two random variables with the same distribution must have the same characteristic function. It is also true that each characteristic function determines a unique distribution; hence the name, since it characterizes the distribution function (see property 6).
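When $F_{X}$ has a density $f$, the transform above reduces to $\varphi_{X}(t)=\int e^{itx}f(x)\,dx$ and can be evaluated by quadrature with no random variable in sight. A sketch (assuming NumPy; the grid bounds and step are arbitrary choices) for the $\mathrm{Exp}(1)$ distribution, whose characteristic function is known to be $1/(1-it)$:

```python
import numpy as np

# Absolutely continuous case dF(x) = f(x) dx with f(x) = e^{-x} on [0, inf);
# the tail beyond x = 40 is negligible (e^{-40}).
x = np.linspace(0.0, 40.0, 400_001)
f = np.exp(-x)
dx = x[1] - x[0]

def char_fn_from_density(t):
    """phi(t) = integral of e^{itx} f(x) dx, via the trapezoidal rule."""
    integrand = np.exp(1j * t * x) * f
    return dx * (integrand.sum() - 0.5 * (integrand[0] + integrand[-1]))

t = 1.3
exact = 1 / (1 - 1j * t)  # known characteristic function of Exp(1)
```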

## Properties

1. The characteristic function is bounded by $1$, i.e. $|\varphi_{X}(t)|\leq 1$ for all $t$;

2. $\varphi_{X}(0)=1$;

3. $\overline{\varphi_{X}(t)}=\varphi_{X}(-t)$, where $\overline{z}$ denotes the complex conjugate of $z$;

4. $\varphi_{X}$ is uniformly continuous in $\mathbb{R}$;

5. If $X$ and $Y$ are independent random variables, then $\varphi_{X+Y}=\varphi_{X}\varphi_{Y}$;

6. The characteristic function determines the distribution function: $\varphi_{X}=\varphi_{Y}$ if and only if $F_{X}=F_{Y}$. This is a consequence of the inversion formula: given a random variable $X$ with characteristic function $\varphi$ and distribution function $F$, if $x$ and $y$ are continuity points of $F$ such that $x<y$, then

 $F(y)-F(x)=\frac{1}{2\pi}\lim_{s\rightarrow\infty}\int_{-s}^{s}\frac{e^{-itx}-e^{-ity}}{it}\varphi(t)\,dt;$
7. A random variable $X$ has a symmetric distribution (i.e. one such that $F_{X}=F_{-X}$) if and only if $\varphi_{X}(t)\in\mathbb{R}$ for all $t\in\mathbb{R}$;

8. For real numbers $a,b$, $\varphi_{aX+b}(t)=e^{itb}\varphi_{X}(at)$;

9. If $E|X|^{n}<\infty$, then $\varphi_{X}$ has continuous derivatives up to order $n$, and

 $\frac{d^{k}\varphi_{X}}{dt^{k}}(t)=\varphi_{X}^{(k)}(t)=\int_{\mathbb{R}}(ix)^{k}e^{itx}dF_{X}(x),\;\;1\leq k\leq n.$

In particular, $\varphi_{X}^{(k)}(0)=i^{k}EX^{k}$; in this sense, characteristic functions behave like moment generating functions.

Similar properties hold for joint characteristic functions. Another important result related to characteristic functions is Paul Lévy's continuity theorem.
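Several of the properties above can be verified numerically on empirical characteristic functions. A sketch (assuming NumPy; the distributions, sample size, and test points are arbitrary choices) checking properties 1, 2, 3, 5, and 8; note that 3 and 8 hold exactly for the empirical estimate, while 5 holds only up to Monte Carlo error:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 300_000
x = rng.standard_normal(n)   # X ~ N(0, 1)
y = rng.exponential(size=n)  # Y ~ Exp(1), independent of X

def phi(samples, t):
    """Empirical characteristic function at t."""
    return np.exp(1j * t * samples).mean()

t = 0.8
# Properties 1 and 2: |phi(t)| <= 1 and phi(0) = 1.
assert abs(phi(x, t)) <= 1.0
assert np.isclose(phi(x, 0.0), 1.0)
# Property 3: conjugate symmetry (exact for the empirical estimate).
assert np.isclose(np.conj(phi(x, t)), phi(x, -t))
# Property 5: independence gives phi_{X+Y} = phi_X * phi_Y (up to MC error).
assert abs(phi(x + y, t) - phi(x, t) * phi(y, t)) < 0.02
# Property 8: phi_{aX+b}(t) = e^{itb} phi_X(at) (an algebraic identity,
# so it holds exactly per sample).
a, b = 2.0, 1.5
assert np.isclose(phi(a * x + b, t), np.exp(1j * t * b) * phi(x, a * t))
```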

Title: characteristic function · Canonical name: CharacteristicFunction1 · Date: 2013-03-22 13:14:28 · Author: Koro (127) · Version: 5 · Type: Definition · MSC: 60E10 · Defines: joint characteristic function · Related: MomentGeneratingFunction, CumulantGeneratingFunction