
Characteristic function

by Marco Taboga, PhD

The characteristic function (cf) is a complex-valued function that completely characterizes the distribution of a random variable.


How it is used

The use of the characteristic function is almost identical to that of the moment generating function:

  1. it can be used to easily derive the moments of a random variable;

  2. it uniquely determines the associated probability distribution; as a consequence, it is often used to prove that two distributions are equal.

The cf has an important advantage over the moment generating function: while some random variables do not possess the latter, all random variables have a characteristic function. For example, the Cauchy distribution has no moment generating function, but its characteristic function is $\exp(-|t|)$.

Definition

We start this lecture with a definition of the characteristic function.

Definition Let $X$ be a random variable. Let $i=\sqrt{-1}$ be the imaginary unit. The function $\varphi_X : \mathbb{R} \rightarrow \mathbb{C}$ defined by

$$\varphi_X(t) = \mathrm{E}\left[\exp(itX)\right]$$

is called the characteristic function of $X$.

The first thing to note is that $\varphi_X(t)$ exists for any $t$. This can be proved as follows:

$$\varphi_X(t) = \mathrm{E}\left[\exp(itX)\right] = \mathrm{E}\left[\cos(tX) + i\sin(tX)\right] = \mathrm{E}\left[\cos(tX)\right] + i\,\mathrm{E}\left[\sin(tX)\right]$$

and the last two expected values are well-defined, because the sine and cosine functions are bounded in the interval $[-1,1]$.
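
To make the definition concrete, here is a minimal numerical sketch (added here, not part of the original lecture): it approximates $\varphi_X(t) = \mathrm{E}\left[\exp(itX)\right]$ by a sample mean for a standard normal $X$, whose characteristic function is known in closed form to be $\exp(-t^2/2)$.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.standard_normal(1_000_000)  # draws from X ~ N(0, 1)

for t in [0.0, 0.5, 1.0, 2.0]:
    # E[exp(itX)] = E[cos(tX)] + i E[sin(tX)]; estimate it by a sample mean
    empirical = np.mean(np.exp(1j * t * x))
    exact = np.exp(-t**2 / 2)
    print(f"t={t}: empirical {empirical:.4f}, exact {exact:.4f}")
```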

Deriving moments with the characteristic function

Like the moment generating function, the characteristic function can be used to derive the moments of a random variable $X$, as stated in the following proposition.

Proposition Let $X$ be a random variable and $\varphi_X(t)$ its cf. Let $n \in \mathbb{N}$. If the $n$-th moment of $X$, denoted by $\mu_n = \mathrm{E}\left[X^n\right]$, exists and is finite, then $\varphi_X(t)$ is $n$ times continuously differentiable and

$$\mu_n = \frac{1}{i^n} \left. \frac{d^n \varphi_X(t)}{dt^n} \right|_{t=0}$$

where $\left. \frac{d^n \varphi_X(t)}{dt^n} \right|_{t=0}$ is the $n$-th derivative of $\varphi_X(t)$ with respect to $t$, evaluated at the point $t=0$.

Proof

The proof of this proposition is quite complex (see, e.g., Resnick 2013) and we give here only a sketch, without taking technical details into consideration. By virtue of the linearity of the expected value and of the derivative operator, the derivative can be brought inside the expected value, as follows:

$$\frac{d^n \varphi_X(t)}{dt^n} = \frac{d^n}{dt^n} \mathrm{E}\left[\exp(itX)\right] = \mathrm{E}\left[\frac{d^n}{dt^n} \exp(itX)\right] = \mathrm{E}\left[(iX)^n \exp(itX)\right] = i^n\, \mathrm{E}\left[X^n \exp(itX)\right]$$

When $t=0$, the latter becomes

$$\left. \frac{d^n \varphi_X(t)}{dt^n} \right|_{t=0} = i^n\, \mathrm{E}\left[X^n\right] = i^n \mu_n$$

Dividing both sides by $i^n$ gives the formula in the proposition.

In practice, the proposition above is not very useful when one wants to compute a moment, because it requires knowing in advance whether the moment exists.

A much more useful proposition is the following.

Proposition Let $X$ be a random variable and $\varphi_X(t)$ its characteristic function. If $\varphi_X(t)$ is $n$ times differentiable at the point $t=0$, then

  1. if $n$ is even, the $k$-th moment of $X$ exists and is finite for any $k \leq n$;

  2. if $n$ is odd, the $k$-th moment of $X$ exists and is finite for any $k < n$.

In both cases,

$$\mu_k = \mathrm{E}\left[X^k\right] = \frac{1}{i^k} \left. \frac{d^k \varphi_X(t)}{dt^k} \right|_{t=0}$$

where $\left. \frac{d^k \varphi_X(t)}{dt^k} \right|_{t=0}$ is the $k$-th derivative of $\varphi_X(t)$ with respect to $t$, evaluated at the point $t=0$.

Proof

See, e.g., Ushakov (1999).

The next example shows how this proposition can be used to compute the second moment of an exponential random variable.

Example Let $X$ be an exponential random variable with parameter $\lambda > 0$. Its support is the set of non-negative real numbers:

$$R_X = [0, \infty)$$

and its probability density function is

$$f_X(x) = \begin{cases} \lambda \exp(-\lambda x) & \text{if } x \in R_X \\ 0 & \text{otherwise} \end{cases}$$

Its cf is

$$\varphi_X(t) = \frac{\lambda}{\lambda - it}$$

which is proved in the lecture entitled Exponential distribution. Note that the division above does not pose any division-by-zero problem: since $\lambda > 0$, the denominator is different from 0 also when $t=0$. The first derivative of the cf is

$$\frac{d \varphi_X(t)}{dt} = \frac{i\lambda}{(\lambda - it)^2}$$

The second derivative of the cf is

$$\frac{d^2 \varphi_X(t)}{dt^2} = \frac{-2\lambda}{(\lambda - it)^3}$$

Evaluating it at $t=0$, we obtain

$$\left. \frac{d^2 \varphi_X(t)}{dt^2} \right|_{t=0} = \frac{-2\lambda}{\lambda^3} = -\frac{2}{\lambda^2}$$

Therefore, the second moment of $X$ exists and is finite. Furthermore, it can be computed as

$$\mu_2 = \mathrm{E}\left[X^2\right] = \frac{1}{i^2} \left. \frac{d^2 \varphi_X(t)}{dt^2} \right|_{t=0} = -\left(-\frac{2}{\lambda^2}\right) = \frac{2}{\lambda^2}$$
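
The closed-form derivatives above can be cross-checked numerically. The following sketch (an illustration added here, with $\lambda = 2$ chosen arbitrarily) approximates the second derivative of the exponential cf at $t=0$ by a central finite difference and divides by $i^2$, recovering the second moment $2/\lambda^2$.

```python
import numpy as np

lam = 2.0
phi = lambda t: lam / (lam - 1j * t)  # cf of the exponential distribution

h = 1e-4
# central finite-difference approximation of the second derivative at t = 0
second_derivative = (phi(h) - 2 * phi(0.0) + phi(-h)) / h**2
second_moment = second_derivative / 1j**2  # divide by i^2 = -1
print(second_moment.real, 2 / lam**2)      # both approximately 0.5
```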

Characterization of a distribution via the characteristic function

Characteristic functions, like moment generating functions, can also be used to characterize the distribution of a random variable.

Proposition Let $X$ and $Y$ be two random variables. Denote by $F_X(x)$ and $F_Y(y)$ their distribution functions and by $\varphi_X(t)$ and $\varphi_Y(t)$ their cfs. Then, $X$ and $Y$ have the same distribution, i.e., $F_X(x) = F_Y(x)$ for any $x$, if and only if they have the same cf, i.e., $\varphi_X(t) = \varphi_Y(t)$ for any $t$.

Proof

See, e.g., Resnick (2013).

In applications, this proposition is often used to prove that two distributions are equal, especially when it is too difficult to directly prove the equality of the two distribution functions $F_X(x)$ and $F_Y(x)$.
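
As a numerical illustration of this use (an added example, not from the original text): the sum of two independent standard normal variables and a single $N(0,2)$ variable share the cf $\exp(-t^2)$, and matching empirical cfs are consistent with the two having the same distribution.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 1_000_000
z = rng.standard_normal(n) + rng.standard_normal(n)  # sum of two independent N(0,1)
w = rng.normal(0.0, np.sqrt(2.0), n)                 # a single N(0, 2) variable

for t in [0.5, 1.0, 1.5]:
    cf_z = np.mean(np.exp(1j * t * z))
    cf_w = np.mean(np.exp(1j * t * w))
    print(t, cf_z, cf_w, np.exp(-t**2))  # all three should be close
```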

More details

The following sections contain more details about the characteristic function.

Characteristic function of a linear transformation

Let $X$ be a random variable with cf $\varphi_X(t)$.

Define

$$Y = a + bX$$

where $a, b \in \mathbb{R}$ are two constants and $b \neq 0$.

Then, the cf of $Y$ is

$$\varphi_Y(t) = \exp(iat)\, \varphi_X(bt)$$

Proof

Using the definition of cf, we obtain

$$\varphi_Y(t) = \mathrm{E}\left[\exp(itY)\right] = \mathrm{E}\left[\exp(ita + itbX)\right] = \exp(iat)\, \mathrm{E}\left[\exp(ibtX)\right] = \exp(iat)\, \varphi_X(bt)$$
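
A quick Monte Carlo check of this formula (an added sketch; the exponential distribution and the constants are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(2)
lam, a, b = 1.5, 2.0, -3.0
x = rng.exponential(1 / lam, 1_000_000)  # numpy's scale parameter is 1/lambda
y = a + b * x

phi_x = lambda t: lam / (lam - 1j * t)   # exact cf of the exponential
for t in [0.3, 0.7]:
    # empirical cf of Y versus exp(iat) * phi_X(bt)
    print(np.mean(np.exp(1j * t * y)), np.exp(1j * a * t) * phi_x(b * t))
```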

Characteristic function of a sum of mutually independent random variables

Let $X_1, \ldots, X_n$ be $n$ mutually independent random variables.

Let $Z$ be their sum:

$$Z = \sum_{j=1}^{n} X_j$$

Then, the cf of $Z$ is the product of the cfs of $X_1, \ldots, X_n$:

$$\varphi_Z(t) = \prod_{j=1}^{n} \varphi_{X_j}(t)$$

Proof

It can be demonstrated as follows:

$$\varphi_Z(t) = \mathrm{E}\left[\exp(itZ)\right] = \mathrm{E}\left[\exp\left(it \sum_{j=1}^{n} X_j\right)\right] = \mathrm{E}\left[\prod_{j=1}^{n} \exp(itX_j)\right] = \prod_{j=1}^{n} \mathrm{E}\left[\exp(itX_j)\right] = \prod_{j=1}^{n} \varphi_{X_j}(t)$$

where the expected value of the product factors into a product of expected values because $X_1, \ldots, X_n$ are mutually independent.
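
Again, a short empirical check (added here for illustration; the exponential rates are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(3)
rates = [1.0, 2.0, 5.0]
n = 1_000_000
z = sum(rng.exponential(1 / lam, n) for lam in rates)  # Z = X_1 + X_2 + X_3

for t in [0.4, 1.0]:
    product = np.prod([lam / (lam - 1j * t) for lam in rates])  # product of exact cfs
    print(np.mean(np.exp(1j * t * z)), product)  # the two should be close
```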

Computation of the characteristic function

When $X$ is a discrete random variable with support $R_X$ and probability mass function $p_X(x)$, its cf is

$$\varphi_X(t) = \mathrm{E}\left[\exp(itX)\right] = \sum_{x \in R_X} \exp(itx)\, p_X(x)$$

Thus, the computation of the characteristic function is pretty straightforward: all we need to do is to sum the complex numbers $\exp(itx)\, p_X(x)$ over all values of $x$ belonging to the support of $X$.
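
The summation translates directly into code. In the sketch below, the support and the pmf values are hypothetical, chosen only for illustration:

```python
import numpy as np

support = np.array([0, 1, 3])      # R_X (hypothetical)
probs = np.array([0.2, 0.5, 0.3])  # p_X(x) on the support (hypothetical)

def cf(t):
    # phi_X(t) = sum over x in R_X of exp(itx) * p_X(x)
    return np.sum(np.exp(1j * t * support) * probs)

print(cf(0.0))  # equals 1, since the pmf sums to 1
print(cf(1.0))
```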

When $X$ is a continuous random variable with probability density function $f_X(x)$, its cf is

$$\varphi_X(t) = \mathrm{E}\left[\exp(itX)\right] = \int_{-\infty}^{\infty} \exp(itx)\, f_X(x)\, dx$$

The right-hand side integral is a contour integral of a complex function along the real axis.

As people reading these lecture notes are usually not familiar with contour integration (a topic in complex analysis), we avoid it altogether and instead exploit the fact that

$$\exp(itx) = \cos(tx) + i\sin(tx)$$

to rewrite the contour integral as the complex sum of two ordinary integrals:

$$\int_{-\infty}^{\infty} \exp(itx)\, f_X(x)\, dx = \int_{-\infty}^{\infty} \cos(tx)\, f_X(x)\, dx + i \int_{-\infty}^{\infty} \sin(tx)\, f_X(x)\, dx$$

and to compute the two integrals separately.
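
A numerical sketch of this decomposition (an added example, using the exponential density with rate $\lambda = 2$): compute the cosine and sine integrals separately and compare with the known closed form $\lambda/(\lambda - it)$.

```python
import numpy as np
from scipy.integrate import quad

lam = 2.0
pdf = lambda x: lam * np.exp(-lam * x)  # exponential density on [0, infinity)

def cf(t):
    # split exp(itx) into cos(tx) + i sin(tx) and integrate each part
    real_part, _ = quad(lambda x: np.cos(t * x) * pdf(x), 0, np.inf)
    imag_part, _ = quad(lambda x: np.sin(t * x) * pdf(x), 0, np.inf)
    return real_part + 1j * imag_part

t = 1.0
print(cf(t), lam / (lam - 1j * t))  # the two values should agree
```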

Multivariate generalization

The multivariate generalization of the cf is presented in the lecture on the joint characteristic function.

Solved exercises

Below you can find some exercises with explained solutions.

Exercise 1

Let $X$ be a discrete random variable having support $R_X$ and probability mass function $p_X(x)$.

Derive the characteristic function of X.

Solution

By using the definition of characteristic function, we get

$$\varphi_X(t) = \mathrm{E}\left[\exp(itX)\right] = \sum_{x \in R_X} \exp(itx)\, p_X(x)$$

Exercise 2

Use the characteristic function found in the previous exercise to derive the variance of X.

Solution

We can use the following formula for computing the variance:

$$\mathrm{Var}[X] = \mathrm{E}\left[X^2\right] - \left(\mathrm{E}[X]\right)^2$$

The expected value of $X$ is computed by taking the first derivative of the characteristic function:

$$\frac{d \varphi_X(t)}{dt} = \sum_{x \in R_X} ix \exp(itx)\, p_X(x)$$

evaluating it at $t=0$ and dividing it by $i$:

$$\mathrm{E}[X] = \frac{1}{i} \left. \frac{d \varphi_X(t)}{dt} \right|_{t=0} = \sum_{x \in R_X} x\, p_X(x)$$

The second moment of $X$ is computed by taking the second derivative of the characteristic function:

$$\frac{d^2 \varphi_X(t)}{dt^2} = \sum_{x \in R_X} (ix)^2 \exp(itx)\, p_X(x) = -\sum_{x \in R_X} x^2 \exp(itx)\, p_X(x)$$

evaluating it at $t=0$ and dividing it by $i^2$:

$$\mathrm{E}\left[X^2\right] = \frac{1}{i^2} \left. \frac{d^2 \varphi_X(t)}{dt^2} \right|_{t=0} = \sum_{x \in R_X} x^2\, p_X(x)$$

Therefore,

$$\mathrm{Var}[X] = \sum_{x \in R_X} x^2\, p_X(x) - \left(\sum_{x \in R_X} x\, p_X(x)\right)^2$$
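
For a concrete instance, the sketch below repeats the computation symbolically with sympy; the pmf values $p_X(1) = p_X(2) = 1/2$ are assumed here purely for illustration.

```python
import sympy as sp

t = sp.symbols('t', real=True)
# assumed pmf: p_X(1) = p_X(2) = 1/2, so phi(t) = (exp(it) + exp(2it)) / 2
phi = (sp.exp(sp.I * t) + sp.exp(2 * sp.I * t)) / 2

mean = sp.simplify(sp.diff(phi, t).subs(t, 0) / sp.I)                 # E[X] = 3/2
second_moment = sp.simplify(sp.diff(phi, t, 2).subs(t, 0) / sp.I**2)  # E[X^2] = 5/2
variance = sp.simplify(second_moment - mean**2)                       # 1/4

print(mean, second_moment, variance)
```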

Exercise 3

Read and try to understand how the characteristic functions of the uniform and of the exponential distributions are derived in the lectures entitled Uniform distribution and Exponential distribution.

References

Resnick, S. I. (2013) A Probability Path, Birkhauser.

Ushakov, N. G. (1999) Selected Topics in Characteristic Functions, VSP.

How to cite

Please cite as:

Taboga, Marco (2021). "Characteristic function", Lectures on probability theory and mathematical statistics. Kindle Direct Publishing. Online appendix. https://www.statlect.com/fundamentals-of-probability/characteristic-function.
