
Moment generating function

The distribution of a random variable is often characterized in terms of its moment generating function (mgf), a real function whose derivatives at zero are equal to the moments of the random variable. Moment generating functions have great practical relevance not only because they can be used to easily derive moments, but also because a probability distribution is uniquely determined by its mgf. This fact, coupled with the analytical tractability of mgfs, makes them a handy tool for solving several problems, such as deriving the distribution of a sum of two or more random variables.

It must be mentioned that not all random variables possess a moment generating function. However, all random variables possess a characteristic function, another transform that enjoys properties similar to those enjoyed by the mgf.


Definition

The following is a formal definition.

Definition Let X be a random variable. If the expected value $\mathrm{E}[e^{tX}]$ exists and is finite for all real numbers $t$ belonging to a closed interval $[-h,h]$, with $h>0$, then we say that X possesses a moment generating function and the function $$M_X(t) = \mathrm{E}[e^{tX}], \quad t \in [-h,h],$$ is called the moment generating function of X.

The next example shows how the mgf of an exponential random variable is calculated.

Example Let X be a continuous random variable with support $$R_X = [0,\infty)$$ and probability density function $$f_X(x) = \lambda e^{-\lambda x}, \quad x \in R_X,$$ where $\lambda$ is a strictly positive number. The expected value $\mathrm{E}[e^{tX}]$ can be computed as follows: $$\mathrm{E}[e^{tX}] = \int_0^\infty e^{tx} \lambda e^{-\lambda x}\,dx = \lambda \int_0^\infty e^{-(\lambda - t)x}\,dx = \frac{\lambda}{\lambda - t}.$$ Furthermore, the above expected value exists and is finite for any $t \in [-h,h]$, provided $0 < h < \lambda$. As a consequence, X possesses a mgf: $$M_X(t) = \frac{\lambda}{\lambda - t}.$$

Deriving moments with the mgf

The moment generating function takes its name from the fact that it can be used to derive the moments of X, as stated in the following proposition.

Proposition If a random variable X possesses a mgf $M_X(t)$, then the $n$-th moment of X, denoted by $\mu_X(n)$, exists and is finite for any $n \in \mathbb{N}$. Furthermore, $$\mu_X(n) = \mathrm{E}[X^n] = \left.\frac{d^n M_X(t)}{dt^n}\right|_{t=0},$$ where $\left.\frac{d^n M_X(t)}{dt^n}\right|_{t=0}$ is the $n$-th derivative of $M_X(t)$ with respect to $t$, evaluated at the point $t=0$.


Proving the above proposition is quite complicated, because a lot of analytical details must be taken care of (see e.g. Pfeiffer 1978). The intuition, however, is straightforward. Since the expected value is a linear operator and differentiation is a linear operation, under appropriate conditions we can differentiate through the expected value: $$\frac{d^n M_X(t)}{dt^n} = \frac{d^n}{dt^n}\mathrm{E}\left[e^{tX}\right] = \mathrm{E}\left[\frac{d^n}{dt^n} e^{tX}\right] = \mathrm{E}\left[X^n e^{tX}\right].$$ Making the substitution $t=0$, we obtain $$\left.\frac{d^n M_X(t)}{dt^n}\right|_{t=0} = \mathrm{E}\left[X^n e^{0}\right] = \mathrm{E}\left[X^n\right].$$

The next example shows how this proposition can be applied.

Example In the previous example we have demonstrated that the mgf of an exponential random variable is $$M_X(t) = \frac{\lambda}{\lambda - t}.$$ The expected value of X can be computed by taking the first derivative of the mgf, $$\frac{d M_X(t)}{dt} = \frac{\lambda}{(\lambda - t)^2},$$ and evaluating it at $t=0$: $$\mathrm{E}[X] = \left.\frac{d M_X(t)}{dt}\right|_{t=0} = \frac{\lambda}{\lambda^2} = \frac{1}{\lambda}.$$ The second moment of X can be computed by taking the second derivative of the mgf, $$\frac{d^2 M_X(t)}{dt^2} = \frac{2\lambda}{(\lambda - t)^3},$$ and evaluating it at $t=0$: $$\mathrm{E}[X^2] = \left.\frac{d^2 M_X(t)}{dt^2}\right|_{t=0} = \frac{2\lambda}{\lambda^3} = \frac{2}{\lambda^2}.$$ And so on for higher moments.
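The moments of the exponential distribution can also be recovered numerically by differentiating the mgf at zero with finite differences. The following sketch again assumes $\lambda = 2$ for illustration:

```python
lam = 2.0  # assumed rate parameter, for illustration only
M = lambda t: lam / (lam - t)  # mgf of the exponential distribution

h = 1e-4  # step for the finite-difference approximations
# Central differences approximating M'(0) and M''(0)
first_deriv_at_0 = (M(h) - M(-h)) / (2 * h)
second_deriv_at_0 = (M(h) - 2 * M(0.0) + M(-h)) / h**2

# The derivatives at zero should match the moments 1/lam and 2/lam**2
assert abs(first_deriv_at_0 - 1 / lam) < 1e-6
assert abs(second_deriv_at_0 - 2 / lam**2) < 1e-5
```

With $\lambda = 2$ both approximations come out close to $0.5$, matching $\mathrm{E}[X] = 1/\lambda$ and $\mathrm{E}[X^2] = 2/\lambda^2$.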

Characterization of a distribution via the moment generating function

The most important property of the mgf is the following:

Proposition (Equality of distributions) Let X and Y be two random variables. Denote by $F_X(x)$ and $F_Y(y)$ their distribution functions and by $M_X(t)$ and $M_Y(t)$ their mgfs. X and Y have the same distribution (i.e. $F_X(x) = F_Y(x)$ for any $x$) if and only if they have the same mgf (i.e. $M_X(t) = M_Y(t)$ for any $t$).


For a fully general proof of this proposition see e.g. Feller (2008). We just give an informal proof for the special case in which X and Y are discrete random variables taking only finitely many values. The "only if" part is trivial: if X and Y have the same distribution, then $$M_X(t) = \mathrm{E}\left[e^{tX}\right] = \mathrm{E}\left[e^{tY}\right] = M_Y(t).$$ The "if" part is proved as follows. Denote by $R_X$ and $R_Y$ the supports of X and Y and by $p_X(x)$ and $p_Y(y)$ their probability mass functions. Denote by A the union of the two supports, $$A = R_X \cup R_Y,$$ and by $a_1, \ldots, a_K$ the elements of A. The mgf of X can be written as $$M_X(t) = \mathrm{E}\left[e^{tX}\right] = \sum_{i=1}^K e^{t a_i} p_X(a_i).$$ By the same token, the mgf of Y can be written as $$M_Y(t) = \sum_{i=1}^K e^{t a_i} p_Y(a_i).$$ If X and Y have the same mgf, then for any $t$ belonging to a closed neighborhood of zero $$M_X(t) = M_Y(t)$$ and $$\sum_{i=1}^K e^{t a_i} p_X(a_i) = \sum_{i=1}^K e^{t a_i} p_Y(a_i).$$ Rearranging terms, we obtain $$\sum_{i=1}^K e^{t a_i}\left[p_X(a_i) - p_Y(a_i)\right] = 0.$$ This can be true for any $t$ belonging to a closed neighborhood of zero only if $$p_X(a_i) = p_Y(a_i)$$ for every $i$. It follows that the probability mass functions of X and Y are equal. As a consequence, their distribution functions are also equal.

This proposition is extremely important from a practical viewpoint: in many cases where we need to prove that two distributions are equal, it is much easier to prove equality of the moment generating functions than equality of the distribution functions. Also note that equality of the distribution functions can be replaced in the proposition above by equality of the probability mass functions (if X and Y are discrete random variables) or by equality of the probability density functions (if X and Y are absolutely continuous random variables).
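The finite discrete case in the proof above can be illustrated numerically: two hypothetical pmfs on a common support (values chosen here purely for illustration) have identical mgfs on a grid of $t$ values exactly when the pmfs coincide.

```python
import math

# Hypothetical pmfs on a common support {0, 1, 2}, for illustration only
p_X = {0: 0.2, 1: 0.5, 2: 0.3}
p_Y = {0: 0.2, 1: 0.5, 2: 0.3}  # same distribution as X
p_W = {0: 0.3, 1: 0.4, 2: 0.3}  # a different distribution

def mgf(pmf, t):
    """Discrete mgf: sum over the support of e^{t*a} * p(a)."""
    return sum(math.exp(t * a) * p for a, p in pmf.items())

ts = [k / 10 for k in range(-5, 6)]  # grid in a neighborhood of zero

# Equal pmfs give equal mgfs at every t on the grid ...
assert all(abs(mgf(p_X, t) - mgf(p_Y, t)) < 1e-12 for t in ts)
# ... while different pmfs disagree somewhere on the grid
assert any(abs(mgf(p_X, t) - mgf(p_W, t)) > 1e-3 for t in ts)
```

Note that every mgf equals 1 at $t = 0$, so the comparison must be made on a whole neighborhood of zero, not at a single point.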

More details

Moment generating function of a linear transformation

Let X be a random variable possessing a mgf $M_X(t)$. Define $$Y = a + bX,$$ where $a, b \in \mathbb{R}$ are two constants and $b \neq 0$. Then, the random variable Y possesses a mgf $M_Y(t)$ and $$M_Y(t) = e^{at} M_X(bt).$$


Using the definition of mgf, $$M_Y(t) = \mathrm{E}\left[e^{tY}\right] = \mathrm{E}\left[e^{t(a+bX)}\right] = e^{at}\,\mathrm{E}\left[e^{btX}\right] = e^{at} M_X(bt).$$ Obviously, if $M_X(t)$ is defined on a closed interval $[-h,h]$, then $M_Y(t)$ is defined on the interval $[-h/\left\vert b\right\vert, h/\left\vert b\right\vert]$.

Moment generating function of a sum of mutually independent random variables

Let $X_1, \ldots, X_n$ be $n$ mutually independent random variables. Let Z be their sum: $$Z = \sum_{i=1}^n X_i.$$ Then, the mgf of Z is the product of the mgfs of $X_1, \ldots, X_n$: $$M_Z(t) = \prod_{i=1}^n M_{X_i}(t).$$


This is easily proved using the definition of mgf and the properties of mutually independent variables: $$M_Z(t) = \mathrm{E}\left[e^{tZ}\right] = \mathrm{E}\left[e^{t\sum_{i=1}^n X_i}\right] = \mathrm{E}\left[\prod_{i=1}^n e^{tX_i}\right] = \prod_{i=1}^n \mathrm{E}\left[e^{tX_i}\right] = \prod_{i=1}^n M_{X_i}(t).$$
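A Monte Carlo sketch can illustrate this product rule for the sum of two independent exponentials (assumed values $\lambda = 2$, $t = 0.5$, chosen for illustration; note that NumPy parameterizes the exponential by the scale $1/\lambda$):

```python
import numpy as np

rng = np.random.default_rng(0)  # fixed seed for reproducibility
lam, t, n = 2.0, 0.5, 1_000_000  # assumed illustrative values

# Two independent exponential samples with rate lam (numpy uses scale = 1/lam)
x1 = rng.exponential(scale=1 / lam, size=n)
x2 = rng.exponential(scale=1 / lam, size=n)

# Empirical mgf of the sum Z = X1 + X2 at the point t
empirical_mgf_of_sum = np.exp(t * (x1 + x2)).mean()
# Product of the two closed-form exponential mgfs at t
product_of_mgfs = (lam / (lam - t)) ** 2

assert abs(empirical_mgf_of_sum - product_of_mgfs) < 0.01
```

With a million samples the empirical average lands close to $(\lambda/(\lambda-t))^2 = 16/9 \approx 1.778$, in line with the product rule.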

Solved exercises

Some solved exercises on moment generating functions can be found below:

  1. Exercise set 1


Feller, W. (2008) An introduction to probability theory and its applications, Volume 2, Wiley.

Pfeiffer, P. E. (1978) Concepts of probability theory, Dover Publications.
