
Characteristic function

In the lecture entitled Moment generating function, we have explained that the distribution of a random variable can be characterized in terms of its moment generating function, a real function that enjoys two important properties: it uniquely determines its associated probability distribution, and its derivatives at zero are equal to the moments of the random variable. We have also explained that not all random variables possess a moment generating function.

The characteristic function (cf) enjoys properties that are almost identical to those of the moment generating function, but it has an important advantage: all random variables possess a characteristic function.

Definition

We start this lecture with a definition of characteristic function.

Definition Let X be a random variable. Let $i=\sqrt{-1}$ be the imaginary unit. The function $\varphi_X:\mathbb{R}\rightarrow\mathbb{C}$ defined by
$$\varphi_X(t)=\mathrm{E}\left[e^{itX}\right]$$
is called the characteristic function of X.

The first thing to be noted is that $\varphi_X(t)$ exists for any $t$. This can be proved as follows:
$$\varphi_X(t)=\mathrm{E}\left[e^{itX}\right]=\mathrm{E}\left[\cos(tX)+i\sin(tX)\right]=\mathrm{E}\left[\cos(tX)\right]+i\,\mathrm{E}\left[\sin(tX)\right]$$
and the last two expected values are well-defined, because the sine and cosine functions are bounded, taking values in the interval $[-1,1]$.
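The existence of the cf even when the moment generating function does not exist can be checked numerically. The following sketch (not part of the original lecture) estimates $\mathrm{E}\left[\cos(tX)\right]+i\,\mathrm{E}\left[\sin(tX)\right]$ by Monte Carlo for a standard Cauchy random variable, which has no mgf; the choice of distribution, the sample size, and the use of NumPy are illustrative assumptions, and the closed form $e^{-|t|}$ used for comparison is the known cf of the standard Cauchy distribution.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.standard_cauchy(size=1_000_000)  # the Cauchy distribution has no mgf, but its cf exists

t = 1.0
# E[cos(tX)] and E[sin(tX)] are always well-defined because |cos| and |sin| are bounded by 1
cf_estimate = np.mean(np.cos(t * x)) + 1j * np.mean(np.sin(t * x))
print(cf_estimate)       # close to 0.3679, i.e., exp(-|t|), the cf of the standard Cauchy
print(np.exp(-abs(t)))
```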

Deriving moments with the characteristic function

Like the moment generating function of a random variable, the characteristic function can be used to derive the moments of X, as stated in the following proposition.

Proposition Let X be a random variable and $\varphi_X(t)$ its cf. Let $n\in\mathbb{N}$. If the n-th moment of X, denoted by $\mu_n=\mathrm{E}\left[X^n\right]$, exists and is finite, then $\varphi_X(t)$ is n times continuously differentiable and
$$\mu_n=\mathrm{E}\left[X^n\right]=\frac{1}{i^n}\left.\frac{d^n\varphi_X(t)}{dt^n}\right\vert_{t=0}$$
where $\left.\frac{d^n\varphi_X(t)}{dt^n}\right\vert_{t=0}$ is the n-th derivative of $\varphi_X(t)$ with respect to $t$, evaluated at the point $t=0$.

Proof

The proof of this proposition is quite complex (see, e.g., Resnick, S. I. (1999) A Probability Path, Birkhauser) and we give here only a sketch, without taking technical details into consideration. By virtue of the linearity of the expected value and of the derivative operator, the derivative can be brought inside the expected value, as follows:
$$\frac{d^n\varphi_X(t)}{dt^n}=\frac{d^n}{dt^n}\mathrm{E}\left[e^{itX}\right]=\mathrm{E}\left[\frac{d^n}{dt^n}e^{itX}\right]=\mathrm{E}\left[i^nX^ne^{itX}\right]=i^n\mathrm{E}\left[X^ne^{itX}\right]$$
When $t=0$, the latter becomes
$$\left.\frac{d^n\varphi_X(t)}{dt^n}\right\vert_{t=0}=i^n\mathrm{E}\left[X^n\right]$$
and the stated formula follows by dividing both sides by $i^n$.
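As an illustration (not part of the original lecture), the following SymPy sketch applies the formula $\mathrm{E}\left[X^n\right]=\frac{1}{i^n}\left.\frac{d^n\varphi_X(t)}{dt^n}\right\vert_{t=0}$ to the standard normal distribution, whose cf is $e^{-t^2/2}$; the choice of distribution and the use of SymPy are assumptions made here purely for the sake of the example.

```python
import sympy as sp

t = sp.symbols('t', real=True)
phi = sp.exp(-t**2 / 2)  # cf of the standard normal distribution (assumed known)

# n-th moment = (1/i^n) * n-th derivative of phi, evaluated at t = 0
for n in range(1, 5):
    moment = sp.diff(phi, t, n).subs(t, 0) / sp.I**n
    print(n, sp.simplify(moment))  # prints 0, 1, 0, 3: the first four moments of N(0,1)
```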

In practice, the proposition above is not very useful when one wants to compute a moment of a random variable, because it requires one to know in advance whether the moment exists. A much more useful proposition is the following.

Proposition Let X be a random variable and $\varphi_X(t)$ its characteristic function. If $\varphi_X(t)$ is n times differentiable at the point $t=0$, then

  1. if n is even, the k-th moment of X exists and is finite for any $k\leq n$;

  2. if n is odd, the k-th moment of X exists and is finite for any $k<n$.

In both cases,
$$\mathrm{E}\left[X^k\right]=\frac{1}{i^k}\left.\frac{d^k\varphi_X(t)}{dt^k}\right\vert_{t=0}$$
where $\left.\frac{d^k\varphi_X(t)}{dt^k}\right\vert_{t=0}$ is the k-th derivative of $\varphi_X(t)$ with respect to $t$, evaluated at the point $t=0$.

Proof

See, e.g., Ushakov, N. G. (1999) Selected topics in characteristic functions, VSP.

The next example shows how this proposition can be used to compute the second moment of an exponential random variable.

Example Let X be an exponential random variable with parameter $\lambda>0$. Its support is the set of positive real numbers:
$$R_X=[0,\infty)$$
and its probability density function is
$$f_X(x)=\begin{cases}\lambda e^{-\lambda x} & \text{if }x\in R_X\\ 0 & \text{otherwise}\end{cases}$$
Its cf is
$$\varphi_X(t)=\frac{\lambda}{\lambda-it}$$
which is proved in the lecture entitled Exponential distribution. Note that the division $\lambda/(\lambda-it)$ above does not pose any division-by-zero problem, because the denominator is different from 0 also when $t=0$ (because $\lambda>0$). The first derivative of the cf is
$$\frac{d\varphi_X(t)}{dt}=\frac{i\lambda}{(\lambda-it)^2}$$
The second derivative of the cf is
$$\frac{d^2\varphi_X(t)}{dt^2}=-\frac{2\lambda}{(\lambda-it)^3}$$
Evaluating it at $t=0$, we obtain
$$\left.\frac{d^2\varphi_X(t)}{dt^2}\right\vert_{t=0}=-\frac{2\lambda}{\lambda^3}=-\frac{2}{\lambda^2}$$
Therefore, the second moment of X exists and is finite. Furthermore, it can be computed as
$$\mathrm{E}\left[X^2\right]=\frac{1}{i^2}\left.\frac{d^2\varphi_X(t)}{dt^2}\right\vert_{t=0}=\frac{1}{-1}\left(-\frac{2}{\lambda^2}\right)=\frac{2}{\lambda^2}$$
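The computation in this example can be double-checked symbolically. The following sketch (not part of the original lecture) uses SymPy, an assumption made purely for illustration, to differentiate the exponential cf twice and recover the second moment $2/\lambda^2$.

```python
import sympy as sp

t = sp.symbols('t', real=True)
lam = sp.symbols('lambda', positive=True)
phi = lam / (lam - sp.I * t)  # cf of the exponential distribution, as stated above

d2 = sp.diff(phi, t, 2)                             # -2*lambda/(lambda - i*t)^3
second_moment = sp.simplify(d2.subs(t, 0) / sp.I**2)
print(second_moment)                                # 2/lambda**2
```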

Characterization of a distribution via the characteristic function

Characteristic functions, like moment generating functions, can also be used to characterize the distribution of a random variable.

Proposition Let X and Y be two random variables. Denote by $F_X(x)$ and $F_Y(y)$ their distribution functions and by $\varphi_X(t)$ and $\varphi_Y(t)$ their cfs. Then, X and Y have the same distribution, i.e., $F_X(x)=F_Y(x)$ for any x, if and only if they have the same cf, i.e., $\varphi_X(t)=\varphi_Y(t)$ for any $t$.

Proof

See, e.g., Resnick, S. I. (1999) A Probability Path, Birkhauser.

In applications, this proposition is often used to prove that two distributions are equal, especially when it is too difficult to directly prove the equality of the two distribution functions $F_X(x)$ and $F_Y(x)$.
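The following sketch (an addition to the original text) illustrates the idea numerically: it builds an exponential random variable in two different ways, by the inverse-transform method and by direct sampling, and checks that their empirical cfs agree with each other and with the closed form $\lambda/(\lambda-it)$ derived above. The parameter values, sample size, and use of NumPy are illustrative assumptions, and a Monte Carlo check of this kind only suggests equality; the proposition itself concerns the exact cfs.

```python
import numpy as np

rng = np.random.default_rng(1)
lam = 1.5
u = rng.uniform(size=500_000)
x = -np.log(u) / lam                              # inverse-transform construction of Exp(lam)
y = rng.exponential(scale=1 / lam, size=500_000)  # direct Exp(lam) draws

ts = np.linspace(-3.0, 3.0, 7)
ecf = lambda sample: np.array([np.mean(np.exp(1j * t * sample)) for t in ts])
print(np.max(np.abs(ecf(x) - ecf(y))))                 # small: the two empirical cfs agree
print(np.max(np.abs(ecf(x) - lam / (lam - 1j * ts))))  # both match lambda/(lambda - i*t)
```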

More details

Characteristic function of a linear transformation

Let X be a random variable with cf $\varphi_X(t)$. Define
$$Y=a+bX$$
where $a,b\in\mathbb{R}$ are two constants and $b\neq 0$. Then, the cf of Y is
$$\varphi_Y(t)=e^{iat}\varphi_X(bt)$$

Proof

Using the definition of cf, we obtain
$$\varphi_Y(t)=\mathrm{E}\left[e^{itY}\right]=\mathrm{E}\left[e^{it(a+bX)}\right]=\mathrm{E}\left[e^{iat}e^{i(bt)X}\right]=e^{iat}\,\mathrm{E}\left[e^{i(bt)X}\right]=e^{iat}\varphi_X(bt)$$
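A quick numerical check of this property (not part of the original lecture) is sketched below, using an exponential X whose cf $\lambda/(\lambda-it)$ was derived above; the parameter values, the sample size, and the use of NumPy are assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(2)
lam, a, b = 2.0, 1.0, -3.0
x = rng.exponential(scale=1 / lam, size=500_000)
y = a + b * x                                    # linear transformation of X

t = 0.7
phi_X = lambda s: lam / (lam - 1j * s)           # exponential cf, as in the example above
print(np.mean(np.exp(1j * t * y)))               # empirical cf of Y at t
print(np.exp(1j * a * t) * phi_X(b * t))         # e^{iat} * phi_X(bt): should be close
```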

Characteristic function of a sum of mutually independent random variables

Let X_1, ..., X_n be n mutually independent random variables. Let Z be their sum:
$$Z=\sum_{j=1}^{n}X_j$$
Then, the cf of Z is the product of the cfs of X_1, ..., X_n:
$$\varphi_Z(t)=\prod_{j=1}^{n}\varphi_{X_j}(t)$$

Proof

It can be demonstrated as follows:
$$\varphi_Z(t)=\mathrm{E}\left[e^{itZ}\right]=\mathrm{E}\left[e^{it\sum_{j=1}^{n}X_j}\right]=\mathrm{E}\left[\prod_{j=1}^{n}e^{itX_j}\right]=\prod_{j=1}^{n}\mathrm{E}\left[e^{itX_j}\right]=\prod_{j=1}^{n}\varphi_{X_j}(t)$$
where the expected value of the product factors into the product of the expected values because X_1, ..., X_n are mutually independent.
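As a numerical illustration (added here, not in the original text), the sketch below compares the empirical cf of the sum of two independent exponential random variables with the product of their individual cfs $\lambda/(\lambda-it)$; the rates, sample size, and use of NumPy are assumptions chosen for the example.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 500_000
x1 = rng.exponential(scale=1.0, size=n)      # Exp(1)
x2 = rng.exponential(scale=0.5, size=n)      # Exp(2), drawn independently of x1
z = x1 + x2

t = 1.3
phi = lambda lam, s: lam / (lam - 1j * s)    # exponential cf, as in the example above
print(np.mean(np.exp(1j * t * z)))           # empirical cf of the sum
print(phi(1.0, t) * phi(2.0, t))             # product of the two cfs: should be close
```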

Computation of the characteristic function

When X is a discrete random variable with support R_X and probability mass function $p_X(x)$, its cf is
$$\varphi_X(t)=\mathrm{E}\left[e^{itX}\right]=\sum_{x\in R_X}e^{itx}p_X(x)$$
Thus, the computation of the characteristic function is pretty straightforward: all we need to do is to sum the complex numbers $e^{itx}p_X(x)$ over all values of x belonging to the support of X.
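A minimal sketch of this sum (not part of the original lecture) is given below for a fair six-sided die, an illustrative choice; the use of NumPy is likewise an assumption.

```python
import numpy as np

# probability mass function of a fair six-sided die (illustrative choice)
support = np.arange(1, 7)
pmf = np.full(6, 1 / 6)

def cf_discrete(t):
    # sum of e^{itx} * p_X(x) over all x in the support
    return np.sum(np.exp(1j * t * support) * pmf)

print(cf_discrete(0.0))   # equals 1, as every cf does at t = 0
print(cf_discrete(0.8))
```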

When X is an absolutely continuous random variable with probability density function $f_X(x)$, its cf is
$$\varphi_X(t)=\mathrm{E}\left[e^{itX}\right]=\int_{-\infty}^{\infty}e^{itx}f_X(x)\,dx$$
The right-hand side integral is a contour integral of a complex function along the real axis. As people reading these lecture notes are usually not familiar with contour integration (a topic in complex analysis), we avoid it altogether and instead exploit the fact that
$$e^{itx}=\cos(tx)+i\sin(tx)$$
to rewrite the contour integral as the complex sum of two ordinary integrals:
$$\int_{-\infty}^{\infty}e^{itx}f_X(x)\,dx=\int_{-\infty}^{\infty}\cos(tx)f_X(x)\,dx+i\int_{-\infty}^{\infty}\sin(tx)f_X(x)\,dx$$
and to compute the two integrals separately.
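The split into two ordinary integrals can also be carried out numerically. The sketch below (an addition to the original text) does so for the exponential density used earlier, comparing the result with the closed form $\lambda/(\lambda-it)$; the parameter values and the use of SciPy's quad routine are assumptions made for illustration.

```python
import numpy as np
from scipy.integrate import quad

lam, t = 2.0, 1.5
pdf = lambda x: lam * np.exp(-lam * x)       # exponential density, supported on [0, inf)

# compute the real and imaginary parts as two ordinary integrals
real_part, _ = quad(lambda x: np.cos(t * x) * pdf(x), 0, np.inf)
imag_part, _ = quad(lambda x: np.sin(t * x) * pdf(x), 0, np.inf)
print(real_part + 1j * imag_part)
print(lam / (lam - 1j * t))                  # closed form from the example above
```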

Solved exercises

Below you can find some exercises with explained solutions:

  1. Exercise set 1 (derivation and use of characteristic functions).
