Moment generating function

by Marco Taboga, PhD

The moment generating function (mgf) is a function often used to characterize the distribution of a random variable.


How it is used

The moment generating function has great practical relevance because:

  1. it can be used to easily derive moments; its derivatives at zero are equal to the moments of the random variable;

  2. a probability distribution is uniquely determined by its mgf.

Fact 2, coupled with the analytical tractability of mgfs, makes them a handy tool for solving several problems, such as deriving the distribution of a sum of two or more random variables.

[Figure: the formula for the moment generating function and the relation between the moments and the derivatives of the mgf.]

Definition

The following is a formal definition.

Definition Let X be a random variable. If the expected value $\mathrm{E}\left[e^{tX}\right]$ exists and is finite for all real numbers $t$ belonging to a closed interval $[-h,h]$, with $h>0$, then we say that X possesses a moment generating function and the function $$M_X(t) = \mathrm{E}\left[e^{tX}\right], \quad t \in [-h,h]$$ is called the moment generating function of X.

Not all random variables possess a moment generating function. However, all random variables possess a characteristic function, another transform with properties similar to those of the mgf.

Example

The next example shows how the mgf of an exponential random variable is calculated.

Example Let X be a continuous random variable with support $$R_X = [0, \infty)$$ and probability density function $$f_X(x) = \begin{cases} \lambda e^{-\lambda x} & \text{if } x \in R_X \\ 0 & \text{otherwise} \end{cases}$$ where $\lambda$ is a strictly positive number. The expected value $\mathrm{E}\left[e^{tX}\right]$ can be computed as follows: $$\mathrm{E}\left[e^{tX}\right] = \int_0^{\infty} e^{tx} \lambda e^{-\lambda x} \, dx = \lambda \int_0^{\infty} e^{(t-\lambda)x} \, dx = \lambda \left[\frac{e^{(t-\lambda)x}}{t-\lambda}\right]_0^{\infty} = \frac{\lambda}{\lambda - t} \quad \text{for } t < \lambda.$$ Furthermore, the above expected value exists and is finite for any $t \in [-h,h]$, provided $0 < h < \lambda$. As a consequence, X possesses a mgf: $$M_X(t) = \frac{\lambda}{\lambda - t}.$$
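The integral above can also be checked symbolically. The following is a minimal sketch using SymPy; the variable names and the `conds='none'` shortcut (which drops the convergence condition $t < \lambda$) are illustrative choices, not part of the original example.

```python
# Symbolic sanity check of the exponential mgf (illustrative sketch).
import sympy as sp

t, x = sp.symbols('t x')
lam = sp.symbols('lambda', positive=True)

# E[exp(t*X)] = integral of exp(t*x) * lambda * exp(-lambda*x) over [0, oo).
# conds='none' suppresses the convergence condition, which holds when t < lambda.
mgf = sp.integrate(sp.exp(t * x) * lam * sp.exp(-lam * x),
                   (x, 0, sp.oo), conds='none')

print(sp.simplify(mgf))  # lambda/(lambda - t)
```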

Deriving moments with the mgf

The moment generating function takes its name from the fact that it can be used to derive the moments of X, as stated in the following proposition.

Proposition If a random variable X possesses a mgf $M_X(t)$, then the $n$-th moment of X, denoted by $\mu_n$, exists and is finite for any $n \in \mathbb{N}$. Furthermore, $$\mu_n = \mathrm{E}\left[X^n\right] = \frac{d^n M_X(t)}{dt^n}\bigg|_{t=0}$$ where $\frac{d^n M_X(t)}{dt^n}\big|_{t=0}$ is the n-th derivative of $M_X(t)$ with respect to $t$, evaluated at the point $t=0$.

Proof

Proving the above proposition is quite complicated, because a lot of analytical details must be taken care of (see e.g. Pfeiffer 1978). The intuition, however, is straightforward: since the expected value is a linear operator and differentiation is a linear operation, under appropriate conditions we can differentiate through the expected value: $$\frac{d^n M_X(t)}{dt^n} = \frac{d^n}{dt^n} \mathrm{E}\left[e^{tX}\right] = \mathrm{E}\left[\frac{d^n}{dt^n} e^{tX}\right] = \mathrm{E}\left[X^n e^{tX}\right].$$ Making the substitution $t=0$, we obtain $$\frac{d^n M_X(t)}{dt^n}\bigg|_{t=0} = \mathrm{E}\left[X^n e^{0 \cdot X}\right] = \mathrm{E}\left[X^n\right].$$

The next example shows how this proposition can be applied.

Example In the previous example we have demonstrated that the mgf of an exponential random variable is $$M_X(t) = \frac{\lambda}{\lambda - t}.$$ The expected value of X can be computed by taking the first derivative of the mgf: $$\frac{dM_X(t)}{dt} = \frac{\lambda}{(\lambda - t)^2}$$ and evaluating it at $t=0$: $$\mathrm{E}[X] = \frac{dM_X(t)}{dt}\bigg|_{t=0} = \frac{\lambda}{(\lambda - 0)^2} = \frac{1}{\lambda}.$$ The second moment of X can be computed by taking the second derivative of the mgf: $$\frac{d^2 M_X(t)}{dt^2} = \frac{2\lambda}{(\lambda - t)^3}$$ and evaluating it at $t=0$: $$\mathrm{E}\left[X^2\right] = \frac{d^2 M_X(t)}{dt^2}\bigg|_{t=0} = \frac{2\lambda}{(\lambda - 0)^3} = \frac{2}{\lambda^2}.$$ And so on for higher moments.
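As a quick cross-check, the two derivatives can be computed symbolically; the sketch below (again SymPy, with arbitrary symbol names) reproduces the two moments just derived.

```python
# Recover the first two moments of the exponential distribution
# by differentiating its mgf at t = 0.
import sympy as sp

t = sp.symbols('t')
lam = sp.symbols('lambda', positive=True)
mgf = lam / (lam - t)

mean = sp.diff(mgf, t).subs(t, 0)              # 1/lambda
second_moment = sp.diff(mgf, t, 2).subs(t, 0)  # 2/lambda**2

print(sp.simplify(mean), sp.simplify(second_moment))
```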

Characterization of a distribution via the moment generating function

The most important property of the mgf is the following.

Proposition Let X and Y be two random variables. Denote by $F_X(x)$ and $F_Y(x)$ their distribution functions and by $M_X(t)$ and $M_Y(t)$ their mgfs. X and Y have the same distribution (i.e., $F_X(x) = F_Y(x)$ for any x) if and only if they have the same mgf (i.e., $M_X(t) = M_Y(t)$ for any $t$).

Proof

For a fully general proof of this proposition see, for example, Feller (2008). We just give an informal proof for the special case in which X and Y are discrete random variables taking only finitely many values. The "only if" part is trivial: if X and Y have the same distribution, then $$M_X(t) = \mathrm{E}\left[e^{tX}\right] = \mathrm{E}\left[e^{tY}\right] = M_Y(t).$$ The "if" part is proved as follows. Denote by $R_X$ and $R_Y$ the supports of X and Y and by $p_X(x)$ and $p_Y(y)$ their probability mass functions. Denote by A the union of the two supports: $$A = R_X \cup R_Y$$ and by $a_1, \ldots, a_K$ the elements of A. The mgf of X can be written as $$M_X(t) = \mathrm{E}\left[e^{tX}\right] = \sum_{i=1}^{K} e^{t a_i} p_X(a_i).$$ By the same token, the mgf of Y can be written as $$M_Y(t) = \mathrm{E}\left[e^{tY}\right] = \sum_{i=1}^{K} e^{t a_i} p_Y(a_i).$$ If X and Y have the same mgf, then for any $t$ belonging to a closed neighborhood of zero $$M_X(t) = M_Y(t)$$ and $$\sum_{i=1}^{K} e^{t a_i} p_X(a_i) = \sum_{i=1}^{K} e^{t a_i} p_Y(a_i).$$ Rearranging terms, we obtain $$\sum_{i=1}^{K} e^{t a_i} \left[p_X(a_i) - p_Y(a_i)\right] = 0.$$ Since the functions $e^{t a_i}$ are linearly independent, this can be true for any $t$ belonging to a closed neighborhood of zero only if $$p_X(a_i) = p_Y(a_i)$$ for every $i$. It follows that the probability mass functions of X and Y are equal. As a consequence, also their distribution functions are equal.
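The finite discrete case can also be illustrated numerically: with K support points, the mgf evaluated at K distinct points near zero determines the pmf through an invertible linear system. In the sketch below the support, the pmf, and the chosen values of t are hypothetical numbers, used only for illustration.

```python
# Numeric illustration of the discrete-case argument: mgf values at a few
# distinct t's pin down the probability mass function uniquely.
import numpy as np

support = np.array([0.0, 1.0, 2.0])    # the elements a_1, ..., a_K of A
pmf = np.array([0.2, 0.5, 0.3])        # probability mass function of X

ts = np.array([-0.2, 0.0, 0.3])        # distinct points near zero
E = np.exp(np.outer(ts, support))      # matrix with entries exp(t_i * a_j)
mgf_values = E @ pmf                   # M_X(t_i) for each t_i

# E is invertible (a generalized Vandermonde matrix), so the pmf is
# recovered uniquely from the mgf values:
print(np.linalg.solve(E, mgf_values))  # [0.2 0.5 0.3]
```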

This proposition is extremely important and relevant from a practical viewpoint: in many cases where we need to prove that two distributions are equal, it is much easier to prove equality of the moment generating functions than to prove equality of the distribution functions.

Also note that equality of the distribution functions can be replaced in the proposition above by equality of the probability mass functions (if X and Y are discrete random variables) or by equality of the probability density functions (if X and Y are continuous random variables).

More details

The following sections contain more details about the mgf.

Moment generating function of a linear transformation

Let X be a random variable possessing a mgf $M_X(t)$.

Define $$Y = a + bX$$ where $a, b \in \mathbb{R}$ are two constants and $b \neq 0$.

Then, the random variable Y possesses a mgf $M_Y(t)$ and $$M_Y(t) = e^{at} M_X(bt).$$

Proof

By the very definition of mgf, we have $$M_Y(t) = \mathrm{E}\left[e^{tY}\right] = \mathrm{E}\left[e^{t(a+bX)}\right] = e^{at} \mathrm{E}\left[e^{(bt)X}\right] = e^{at} M_X(bt).$$ Obviously, if $M_X(t)$ is defined on a closed interval $[-h,h]$, then $M_Y(t)$ is defined on the interval $[-h/|b|, h/|b|]$.
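For the exponential example above, this rule can be checked by simulation. In the sketch below the parameter values are arbitrary; they only need to satisfy $bt < \lambda$ so that the mgf exists.

```python
# Monte Carlo sketch of the linear-transformation rule for the
# exponential example (parameter values are arbitrary).
import numpy as np

rng = np.random.default_rng(0)
lam, a, b, t = 2.0, 1.0, 0.5, 0.8   # b*t < lam, so the mgf exists

x = rng.exponential(scale=1 / lam, size=1_000_000)
y = a + b * x

empirical = np.exp(t * y).mean()                  # sample estimate of E[exp(tY)]
theoretical = np.exp(a * t) * lam / (lam - b * t)  # e^{at} * M_X(bt)

print(empirical, theoretical)                     # the two should be close
```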

Moment generating function of a sum of mutually independent random variables

Let X_1, ..., X_n be n mutually independent random variables.

Let Z be their sum: $$Z = \sum_{i=1}^{n} X_i.$$

Then, the mgf of Z is the product of the mgfs of X_1, ..., X_n: $$M_Z(t) = \prod_{i=1}^{n} M_{X_i}(t).$$

Proof

This is easily proved by using the definition of mgf and the properties of mutually independent variables: $$M_Z(t) = \mathrm{E}\left[e^{tZ}\right] = \mathrm{E}\left[e^{t(X_1 + \ldots + X_n)}\right] = \mathrm{E}\left[\prod_{i=1}^{n} e^{tX_i}\right] = \prod_{i=1}^{n} \mathrm{E}\left[e^{tX_i}\right] = \prod_{i=1}^{n} M_{X_i}(t).$$
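A simulation along the same lines illustrates the product rule for two independent exponential variables; the rates and the value of $t$ below are arbitrary choices (with $t$ in the common domain of the two mgfs).

```python
# Monte Carlo check: the empirical mgf of a sum of independent variables
# is close to the product of the individual mgfs.
import numpy as np

rng = np.random.default_rng(0)
t = 0.3                                          # must lie in both domains

x1 = rng.exponential(scale=1.0, size=1_000_000)  # exponential, rate 1
x2 = rng.exponential(scale=0.5, size=1_000_000)  # exponential, rate 2

empirical = np.exp(t * (x1 + x2)).mean()         # sample estimate of M_Z(t)
product = (1 / (1 - t)) * (2 / (2 - t))          # M_X1(t) * M_X2(t)

print(empirical, product)                        # the two should be close
```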

Multivariate generalization

The multivariate generalization of the mgf is discussed in the lecture on the joint moment generating function.

Solved exercises

Some solved exercises on moment generating functions can be found below.

Exercise 1

Let X be a discrete random variable having a Bernoulli distribution.

Its support R_X is $$R_X = \{0, 1\}$$ and its probability mass function $p_X(x)$ is $$p_X(x) = \begin{cases} p & \text{if } x = 1 \\ 1-p & \text{if } x = 0 \\ 0 & \text{otherwise} \end{cases}$$ where $p \in (0,1)$ is a constant.

Derive the moment generating function of X, if it exists.

Solution

By the definition of moment generating function, we have $$M_X(t) = \mathrm{E}\left[e^{tX}\right] = e^{t \cdot 1} p_X(1) + e^{t \cdot 0} p_X(0) = p e^t + (1-p) = 1 - p + p e^t.$$ Obviously, the moment generating function exists and is well-defined because the above expected value exists for any $t \in \mathbb{R}$.
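The same computation takes one line symbolically. The sketch below (SymPy, with an unspecified parameter $p$) also recovers the mean $p$ by differentiating at zero.

```python
# The Bernoulli mgf derived in the solution, plus a sanity check on the mean.
import sympy as sp

t = sp.symbols('t')
p = sp.symbols('p', positive=True)

mgf = (1 - p) + p * sp.exp(t)      # M_X(t) = 1 - p + p*e^t

print(sp.diff(mgf, t).subs(t, 0))  # first derivative at 0 gives E[X] = p
```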

Exercise 2

Let X be a random variable with moment generating function $$M_X(t) = e^{t^2 + 3t}.$$

Derive the variance of X.

Solution

We can use the following formula for computing the variance: $$\mathrm{Var}[X] = \mathrm{E}\left[X^2\right] - \left(\mathrm{E}[X]\right)^2.$$ The expected value of X is computed by taking the first derivative of the moment generating function: $$\frac{dM_X(t)}{dt} = (2t + 3) e^{t^2 + 3t}$$ and evaluating it at $t=0$: $$\mathrm{E}[X] = \frac{dM_X(t)}{dt}\bigg|_{t=0} = 3.$$ The second moment of X is computed by taking the second derivative of the moment generating function: $$\frac{d^2 M_X(t)}{dt^2} = \left(2 + (2t+3)^2\right) e^{t^2 + 3t}$$ and evaluating it at $t=0$: $$\mathrm{E}\left[X^2\right] = \frac{d^2 M_X(t)}{dt^2}\bigg|_{t=0} = 2 + 9 = 11.$$ Therefore, $$\mathrm{Var}[X] = 11 - 3^2 = 2.$$
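The derivative computations can be verified with SymPy; the sketch below simply re-does the calculation above for the mgf of this exercise.

```python
# Variance of X via the mgf of Exercise 2.
import sympy as sp

t = sp.symbols('t')
mgf = sp.exp(t**2 + 3 * t)

m1 = sp.diff(mgf, t).subs(t, 0)     # E[X]   = 3
m2 = sp.diff(mgf, t, 2).subs(t, 0)  # E[X^2] = 11

print(m2 - m1**2)                   # Var[X] = 2
```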

Exercise 3

A random variable X is said to have a Chi-square distribution with n degrees of freedom if its moment generating function is defined for any $t < \frac{1}{2}$ and it is equal to $$M_X(t) = (1 - 2t)^{-n/2}.$$

Define $$Y = X_1 + X_2$$ where X_1 and X_2 are two independent random variables having Chi-square distributions with $n_{1}$ and $n_{2}$ degrees of freedom respectively.

Prove that Y has a Chi-square distribution with $n_{1}+n_{2}$ degrees of freedom.

Solution

The moment generating functions of X_1 and X_2 are $$M_{X_1}(t) = (1 - 2t)^{-n_1/2}, \qquad M_{X_2}(t) = (1 - 2t)^{-n_2/2}.$$ The moment generating function of a sum of independent random variables is just the product of their moment generating functions: $$M_Y(t) = M_{X_1}(t) M_{X_2}(t) = (1 - 2t)^{-n_1/2} (1 - 2t)^{-n_2/2} = (1 - 2t)^{-(n_1 + n_2)/2}.$$ Therefore, $M_Y(t)$ is the moment generating function of a Chi-square random variable with $n_{1}+n_{2}$ degrees of freedom. As a consequence, Y has a Chi-square distribution with $n_{1}+n_{2}$ degrees of freedom.
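A quick simulation makes the result concrete; the degrees of freedom and the value of $t$ below are arbitrary choices with $t < 1/2$ so that the mgf exists.

```python
# Monte Carlo check: a sum of independent chi-squares with n1 and n2
# degrees of freedom has the mgf of a chi-square with n1 + n2.
import numpy as np

rng = np.random.default_rng(0)
n1, n2, t = 3, 5, 0.2                 # t < 1/2, so the mgf exists

y = rng.chisquare(n1, size=1_000_000) + rng.chisquare(n2, size=1_000_000)

empirical = np.exp(t * y).mean()               # sample estimate of M_Y(t)
theoretical = (1 - 2 * t) ** (-(n1 + n2) / 2)  # (1 - 2t)^(-(n1+n2)/2)

print(empirical, theoretical)                  # the two should be close
```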

References

Feller, W. (2008) An introduction to probability theory and its applications, Volume 2, Wiley.

Pfeiffer, P. E. (1978) Concepts of probability theory, Dover Publications.

How to cite

Please cite as:

Taboga, Marco (2021). "Moment generating function", Lectures on probability theory and mathematical statistics. Kindle Direct Publishing. Online appendix. https://www.statlect.com/fundamentals-of-probability/moment-generating-function.
