that is, $\varphi_X(t)$ is the expectation of the random variable $e^{itX}$.
Given a random vector $X = (X_1, \dots, X_n)$, the characteristic function of $X$, also called the joint characteristic function of $X_1, \dots, X_n$, is the function $\varphi_X \colon \mathbb{R}^n \to \mathbb{C}$ defined by
$$\varphi_X(t) = E[e^{i t \cdot X}],$$
where $t = (t_1, \dots, t_n) \in \mathbb{R}^n$ and $t \cdot X = t_1 X_1 + \cdots + t_n X_n$ (the dot product).
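As a concrete illustration of the definition, the following sketch (assuming NumPy; the Bernoulli distribution and the particular values of $p$ and $t$ are chosen only for this example) compares a Monte Carlo estimate of $E[e^{itX}]$ with the closed form for $X \sim \mathrm{Bernoulli}(p)$, whose characteristic function is $(1-p) + p\,e^{it}$.

```python
import numpy as np

rng = np.random.default_rng(0)

# X ~ Bernoulli(p) takes the value 1 with probability p and 0 otherwise,
# so E[e^{itX}] = (1 - p) * e^{i*t*0} + p * e^{i*t*1} = (1 - p) + p * e^{it}.
p, t = 0.3, 1.7
X = rng.binomial(1, p, size=200_000)

empirical = np.mean(np.exp(1j * t * X))      # sample mean of e^{itX}
closed_form = (1 - p) + p * np.exp(1j * t)

# The two agree up to Monte Carlo error of order 1/sqrt(200_000).
print(abs(empirical - closed_form))
```

The same sample-mean estimate works for any distribution one can simulate, which makes it a convenient sanity check against a claimed closed form.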
Remark. If $F_X$ is the distribution function associated to $X$, by the properties of expectation we have
$$\varphi_X(t) = \int_{\mathbb{R}^n} e^{i t \cdot x}\, dF_X(x),$$
which is known as the Fourier-Stieltjes transform of $F_X$, and provides an alternate definition of the characteristic function. From this, it is clear that the characteristic function depends only on the distribution function of $X$, hence one can define the characteristic function associated to a distribution even when there is no random variable involved. This implies that two random variables with the same distribution must have the same characteristic function. It is also true that each characteristic function determines a unique distribution; hence the name, since it characterizes the distribution function (see property 6.)
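When $X$ has a density $f$, the Stieltjes integral reduces to an ordinary integral $\int e^{itx} f(x)\,dx$, which can be checked numerically. A minimal sketch (assuming NumPy; the standard normal is chosen because its characteristic function has the well-known closed form $e^{-t^2/2}$):

```python
import numpy as np

# Standard normal density; its characteristic function is exp(-t**2 / 2).
t = 1.3
x = np.linspace(-10.0, 10.0, 200_001)
f = np.exp(-x**2 / 2) / np.sqrt(2 * np.pi)

# Riemann-sum approximation of the integral of e^{itx} f(x) dx;
# the tails beyond |x| = 10 are negligible for this density.
dx = x[1] - x[0]
numeric = np.sum(np.exp(1j * t * x) * f) * dx

print(abs(numeric - np.exp(-t**2 / 2)))   # close to zero
```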
1. The characteristic function is bounded by $1$, i.e. $|\varphi_X(t)| \leq 1$ for all $t$;
2. $\varphi_X(-t) = \overline{\varphi_X(t)}$, where $\overline{z}$ denotes the complex conjugate of $z$;
3. $\varphi_X$ is uniformly continuous in $\mathbb{R}$;
4. If $X$ and $Y$ are independent random variables, then $\varphi_{X+Y}(t) = \varphi_X(t)\varphi_Y(t)$;
5. A random variable $X$ has a symmetrical distribution (i.e. one such that $F_X = F_{-X}$) if and only if $\varphi_X(t) \in \mathbb{R}$ for all $t$;
6. For real numbers $a, b$, $\varphi_{aX+b}(t) = e^{itb}\varphi_X(at)$;
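Several of these properties can be verified empirically on simulated data. The sketch below (assuming NumPy; `phi` is a hypothetical helper estimating $E[e^{itX}]$ by a sample mean, and the exponential and normal samples are arbitrary choices) checks boundedness, conjugate symmetry, the independence product rule, and the affine rule.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 500_000
X = rng.exponential(1.0, n)   # any simulable distribution works here
Y = rng.normal(0.0, 1.0, n)   # drawn independently of X

def phi(sample, t):
    """Empirical characteristic function: sample mean of e^{it * sample}."""
    return np.mean(np.exp(1j * t * sample))

t = 0.8

# Boundedness: |phi(t)| <= 1 (a mean of unit-modulus numbers).
assert abs(phi(X, t)) <= 1 + 1e-12

# Conjugate symmetry: phi(-t) equals the complex conjugate of phi(t).
assert np.isclose(phi(X, -t), np.conj(phi(X, t)))

# Independence: phi_{X+Y}(t) = phi_X(t) phi_Y(t), up to Monte Carlo error.
assert np.isclose(phi(X + Y, t), phi(X, t) * phi(Y, t), atol=0.01)

# Affine rule: phi_{aX+b}(t) = e^{itb} phi_X(at).
a, b = 2.0, -1.0
assert np.isclose(phi(a * X + b, t), np.exp(1j * t * b) * phi(X, a * t))

print("all property checks passed")
```

Note that the conjugate-symmetry and affine checks hold exactly on any fixed sample, while the independence check holds only in distribution and so carries a Monte Carlo tolerance.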
Similar properties hold for joint characteristic functions. Another important result related to characteristic functions is the Paul Lévy continuity theorem.
|Date of creation||2013-03-22 13:14:28|
|Last modified on||2013-03-22 13:14:28|
|Last modified by||Koro (127)|
|Synonym||joint characteristic function|