
Joint moment generating function

by Marco Taboga, PhD

The joint moment generating function (joint mgf) is a multivariate generalization of the moment generating function.

Similarly to the univariate case, a joint mgf uniquely determines the joint distribution of its associated random vector, and it can be used to derive the cross-moments of the distribution by partial differentiation.

If you are not familiar with the univariate concept, you are advised to first read the lecture on moment generating functions.


Definition

Let us start with a formal definition.

Definition Let X be a Kx1 random vector. If the expected value
$\mathrm{E}\left[\exp\left(t^{\top}X\right)\right]$
exists and is finite for all Kx1 real vectors $t$ belonging to a closed rectangle H:
$H=\left[-h_{1},h_{1}\right]\times\left[-h_{2},h_{2}\right]\times\cdots\times\left[-h_{K},h_{K}\right]$
with $h_{i}>0$ for all $i=1,\ldots,K$, then we say that X possesses a joint moment generating function and the function $M_{X}:H\rightarrow\mathbb{R}$ defined by
$M_{X}(t)=\mathrm{E}\left[\exp\left(t^{\top}X\right)\right]$
is called the joint moment generating function of X.
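To make the definition concrete, here is a minimal numerical sketch (Python with NumPy; not part of the original lecture) that approximates the expected value in the definition by a sample average. The entry distributions and the point t are arbitrary illustrative choices.

import numpy as np

def joint_mgf_estimate(sample, t):
    # Monte Carlo estimate of M_X(t) = E[exp(t'X)] from an (n, K) array of draws.
    return np.mean(np.exp(sample @ t))

# Illustrative 2x1 random vector with independent entries (an assumption for this sketch):
# X_1 ~ Exponential(1), X_2 ~ Uniform(0, 1)
rng = np.random.default_rng(0)
sample = np.column_stack([rng.exponential(1.0, 100_000),
                          rng.uniform(0.0, 1.0, 100_000)])
t = np.array([0.3, 0.5])
print(joint_mgf_estimate(sample, t))  # approximates E[exp(0.3*X_1 + 0.5*X_2)]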

Not all random vectors possess a joint mgf. However, all random vectors possess a joint characteristic function, a transform that enjoys properties similar to those of the joint mgf.

Example

As an example, we derive the joint mgf of a standard multivariate normal random vector.

Example Let X be a Kx1 standard multivariate normal random vector. Its support R_X is
$R_{X}=\mathbb{R}^{K}$
and its joint probability density function $f_{X}(x)$ is
$f_{X}(x)=\left(2\pi\right)^{-K/2}\exp\left(-\tfrac{1}{2}x^{\top}x\right)$
As explained in the lecture entitled Multivariate normal distribution, the K components of X are K mutually independent standard normal random variables, because the joint probability density function of X can be written as
$f_{X}(x)=\prod_{i=1}^{K}f(x_{i})$
where $x_{i}$ is the i-th entry of x and $f(x_{i})$ is the probability density function of a standard normal random variable:
$f(x_{i})=\left(2\pi\right)^{-1/2}\exp\left(-\tfrac{1}{2}x_{i}^{2}\right)$
Therefore, the joint mgf of X can be derived as follows:
$M_{X}(t)=\mathrm{E}\left[\exp\left(t^{\top}X\right)\right]=\mathrm{E}\left[\exp\left(\sum_{i=1}^{K}t_{i}X_{i}\right)\right]=\mathrm{E}\left[\prod_{i=1}^{K}\exp\left(t_{i}X_{i}\right)\right]=\prod_{i=1}^{K}\mathrm{E}\left[\exp\left(t_{i}X_{i}\right)\right]=\prod_{i=1}^{K}M_{X_{i}}(t_{i})$
where the expectation factorizes thanks to the mutual independence of the components. Since the mgf of a standard normal random variable is
$M_{X_{i}}(t_{i})=\exp\left(\tfrac{1}{2}t_{i}^{2}\right)$
then
$M_{X}(t)=\prod_{i=1}^{K}\exp\left(\tfrac{1}{2}t_{i}^{2}\right)=\exp\left(\tfrac{1}{2}\sum_{i=1}^{K}t_{i}^{2}\right)=\exp\left(\tfrac{1}{2}t^{\top}t\right)$
$M_{X_{i}}(t_{i})$ is defined for any $t_{i}\in\mathbb{R}$. As a consequence, $M_{X}(t)$ is defined for any $t\in\mathbb{R}^{K}$.
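As a quick sanity check (a sketch, not part of the original text), the closed form exp(t't/2) derived above can be compared with a Monte Carlo estimate of the joint mgf; the dimension, sample size and point t below are arbitrary choices.

import numpy as np

rng = np.random.default_rng(42)
K = 3
X = rng.standard_normal((500_000, K))   # each row is a draw of a Kx1 standard normal vector
t = np.array([0.2, -0.5, 0.8])

mc_estimate = np.mean(np.exp(X @ t))    # E[exp(t'X)] by simulation
closed_form = np.exp(0.5 * t @ t)       # exp(t't / 2)
print(mc_estimate, closed_form)         # the two numbers should be close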

Relation to cross-moments

The next proposition shows how the joint mgf can be used to derive the cross-moments of a random vector.

Proposition If a Kx1 random vector X possesses a joint mgf $M_{X}(t)$, then X possesses finite cross-moments of order n for any $n\in\mathbb{N}$. Furthermore, if you define a cross-moment of order n as
$\mu\left(n_{1},\ldots,n_{K}\right)=\mathrm{E}\left[X_{1}^{n_{1}}X_{2}^{n_{2}}\cdots X_{K}^{n_{K}}\right]$
where $n_{1},\ldots,n_{K}\in\mathbb{Z}_{+}$ and $n_{1}+n_{2}+\cdots+n_{K}=n$, then
$\mu\left(n_{1},\ldots,n_{K}\right)=\left.\dfrac{\partial^{n}M_{X}(t)}{\partial t_{1}^{n_{1}}\partial t_{2}^{n_{2}}\cdots\partial t_{K}^{n_{K}}}\right|_{t=0}$
where the derivative on the right-hand side is the n-th order partial derivative of $M_{X}(t)$ evaluated at the point $t=0$.

Proof

We do not provide a rigorous proof of this proposition, but see, e.g., Pfeiffer (1978) and DasGupta (2010). The main intuition, however, is quite simple. Differentiation is a linear operation and the expected value is a linear operator. This allows us to differentiate through the expected value, provided appropriate technical conditions (omitted here) are satisfied:
$\dfrac{\partial^{n}M_{X}(t)}{\partial t_{1}^{n_{1}}\cdots\partial t_{K}^{n_{K}}}=\dfrac{\partial^{n}}{\partial t_{1}^{n_{1}}\cdots\partial t_{K}^{n_{K}}}\mathrm{E}\left[\exp\left(t^{\top}X\right)\right]=\mathrm{E}\left[\dfrac{\partial^{n}}{\partial t_{1}^{n_{1}}\cdots\partial t_{K}^{n_{K}}}\exp\left(t^{\top}X\right)\right]=\mathrm{E}\left[X_{1}^{n_{1}}\cdots X_{K}^{n_{K}}\exp\left(t^{\top}X\right)\right]$
Evaluating this derivative at the point $t=0$, we obtain
$\mathrm{E}\left[X_{1}^{n_{1}}\cdots X_{K}^{n_{K}}\exp(0)\right]=\mathrm{E}\left[X_{1}^{n_{1}}\cdots X_{K}^{n_{K}}\right]=\mu\left(n_{1},\ldots,n_{K}\right)$

The following example shows how this proposition can be applied.

Example Let's continue with the previous example. The joint mgf of a 2x1 standard normal random vector X is
$M_{X}(t)=\exp\left(\tfrac{1}{2}t_{1}^{2}+\tfrac{1}{2}t_{2}^{2}\right)$
The second cross-moment of X can be computed by taking the second cross partial derivative of $M_{X}(t)$:
$\mathrm{E}\left[X_{1}X_{2}\right]=\left.\dfrac{\partial^{2}M_{X}(t)}{\partial t_{1}\partial t_{2}}\right|_{t=0}=\left.\dfrac{\partial}{\partial t_{2}}\left[t_{1}\exp\left(\tfrac{1}{2}t_{1}^{2}+\tfrac{1}{2}t_{2}^{2}\right)\right]\right|_{t=0}=\left.t_{1}t_{2}\exp\left(\tfrac{1}{2}t_{1}^{2}+\tfrac{1}{2}t_{2}^{2}\right)\right|_{t=0}=0$
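The same computation can be reproduced symbolically (a SymPy sketch, added here for illustration): differentiate the joint mgf once with respect to each argument and evaluate the result at zero.

import sympy as sp

t1, t2 = sp.symbols('t1 t2')
M = sp.exp(sp.Rational(1, 2) * (t1**2 + t2**2))    # joint mgf of a 2x1 standard normal vector

cross_moment = sp.diff(M, t1, t2).subs({t1: 0, t2: 0})
print(cross_moment)   # 0, i.e. E[X_1 X_2] = 0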

Characterization of a joint distribution

One of the most important properties of the joint mgf is that it completely characterizes the joint distribution of a random vector.

Proposition Let X and Y be two Kx1 random vectors possessing joint mgfs $M_{X}(t)$ and $M_{Y}(t)$. Denote by $F_{X}(x)$ and $F_{Y}(y)$ their joint distribution functions. X and Y have the same joint distribution if and only if they have the same joint mgfs:
$F_{X}(x)=F_{Y}(x)$ for any $x$ if and only if $M_{X}(t)=M_{Y}(t)$ for any $t$

Proof

The reader may refer to Feller (2008) for a rigorous proof. The informal proof given here is almost identical to that given for the univariate case. We confine our attention to the case in which X and Y are discrete random vectors taking only finitely many values. As far as the left-to-right direction of the implication is concerned, it suffices to note that if X and Y have the same distribution, then
$M_{X}(t)=\mathrm{E}\left[\exp\left(t^{\top}X\right)\right]=\mathrm{E}\left[\exp\left(t^{\top}Y\right)\right]=M_{Y}(t)$
The right-to-left direction of the implication is proved as follows. Denote by R_X and $R_{Y}$ the supports of X and Y and by $p_{X}(x)$ and $p_{Y}(y)$ their joint probability mass functions. Define the union of the two supports:
$R=R_{X}\cup R_{Y}$
and denote its members by $x_{1},\ldots,x_{m}$. The joint mgf of X can be written as
$M_{X}(t)=\mathrm{E}\left[\exp\left(t^{\top}X\right)\right]=\sum_{j=1}^{m}\exp\left(t^{\top}x_{j}\right)p_{X}(x_{j})$
By the same line of reasoning, the joint mgf of Y can be written as
$M_{Y}(t)=\sum_{j=1}^{m}\exp\left(t^{\top}x_{j}\right)p_{Y}(x_{j})$
If X and Y have the same joint mgf, then
$M_{X}(t)=M_{Y}(t)$
for any $t$ belonging to a closed rectangle where the two mgfs are well-defined, and
$\sum_{j=1}^{m}\exp\left(t^{\top}x_{j}\right)p_{X}(x_{j})=\sum_{j=1}^{m}\exp\left(t^{\top}x_{j}\right)p_{Y}(x_{j})$
Rearranging terms, we obtain
$\sum_{j=1}^{m}\exp\left(t^{\top}x_{j}\right)\left[p_{X}(x_{j})-p_{Y}(x_{j})\right]=0$
This equality can be verified for every $t$ only if
$p_{X}(x_{j})=p_{Y}(x_{j})$
for every $j$. As a consequence, the joint probability mass functions of X and Y are equal, which implies that also their joint distribution functions are equal.
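The following sketch (SymPy; the pmfs are invented for illustration and are not from the lecture) mirrors the discrete argument above: the joint mgf of a discrete 2x1 vector is a finite sum of exponentials weighted by probabilities, so two vectors with the same pmf necessarily have the same mgf.

import sympy as sp

t1, t2 = sp.symbols('t1 t2')

def joint_mgf(pmf):
    # Joint mgf of a discrete 2x1 vector given as {(x1, x2): probability}.
    return sum(p * sp.exp(t1 * x1 + t2 * x2) for (x1, x2), p in pmf.items())

p_X = {(0, 0): sp.Rational(1, 4), (1, 0): sp.Rational(1, 4), (0, 1): sp.Rational(1, 2)}
p_Y = {(0, 1): sp.Rational(1, 2), (1, 0): sp.Rational(1, 4), (0, 0): sp.Rational(1, 4)}

print(sp.simplify(joint_mgf(p_X) - joint_mgf(p_Y)) == 0)   # True: equal pmfs give equal mgfs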

This proposition is used very often in applications where one needs to demonstrate that two joint distributions are equal. In such applications, proving equality of the joint moment generating functions is often much easier than proving equality of the joint distribution functions.

More details

The following sections contain more details about the joint mgf.

Joint moment generating function of a linear transformation

Let X be a Kx1 random vector possessing a joint mgf $M_{X}(t)$.

Define
$Y=A+BX$
where A is an Lx1 constant vector and B is an LxK constant matrix.

Then, the Lx1 random vector Y possesses a joint mgf $M_{Y}(t)$ and
$M_{Y}(t)=\exp\left(t^{\top}A\right)M_{X}\left(B^{\top}t\right)$

Proof

Using the definition of mgf, we get
$M_{Y}(t)=\mathrm{E}\left[\exp\left(t^{\top}Y\right)\right]=\mathrm{E}\left[\exp\left(t^{\top}A+t^{\top}BX\right)\right]=\exp\left(t^{\top}A\right)\mathrm{E}\left[\exp\left(\left(B^{\top}t\right)^{\top}X\right)\right]=\exp\left(t^{\top}A\right)M_{X}\left(B^{\top}t\right)$
If $M_{X}(t)$ is defined on a closed rectangle H, then $M_{Y}(t)$ is defined on another closed rectangle whose shape and location depend on A and B.
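A numerical sketch of this property (Python/NumPy; the vector A, the matrix B and the choice of a standard normal X are illustrative assumptions): estimate the joint mgf of Y = A + BX by simulation and compare it with exp(t'A) M_X(B't), using M_X(s) = exp(s's/2) for the standard normal vector.

import numpy as np

rng = np.random.default_rng(1)
n = 400_000
X = rng.standard_normal((n, 2))                       # draws of a 2x1 standard normal vector
A = np.array([1.0, -2.0, 0.5])                        # 3x1 constant vector
B = np.array([[1.0, 0.0], [0.5, 0.5], [0.0, 2.0]])    # 3x2 constant matrix
Y = A + X @ B.T                                       # Y = A + B X, one row per draw

t = np.array([0.1, 0.2, -0.1])
lhs = np.mean(np.exp(Y @ t))                                   # M_Y(t) by simulation
rhs = np.exp(t @ A) * np.exp(0.5 * (B.T @ t) @ (B.T @ t))      # exp(t'A) * M_X(B't)
print(lhs, rhs)   # the two values should be close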

Joint moment generating function of a random vector with independent entries

Let X be a Kx1 random vector.

Let its entries X_1, ..., $X_{K}$ be K mutually independent random variables possessing a mgf.

Denote the mgf of the i-th entry of X by $M_{X_{i}}(t_{i})$.

Then, the joint mgf of X is
$M_{X}(t)=\prod_{i=1}^{K}M_{X_{i}}(t_{i})$

Proof

This fact is demonstrated as follows:
$M_{X}(t)=\mathrm{E}\left[\exp\left(t^{\top}X\right)\right]=\mathrm{E}\left[\exp\left(\sum_{i=1}^{K}t_{i}X_{i}\right)\right]=\mathrm{E}\left[\prod_{i=1}^{K}\exp\left(t_{i}X_{i}\right)\right]=\prod_{i=1}^{K}\mathrm{E}\left[\exp\left(t_{i}X_{i}\right)\right]=\prod_{i=1}^{K}M_{X_{i}}(t_{i})$
where the expectation of the product factorizes into a product of expectations because the entries of X are mutually independent.
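A quick numerical check of this factorization (Python/NumPy; the entry distributions are assumptions made only for the illustration):

import numpy as np

rng = np.random.default_rng(2)
n = 500_000
x1 = rng.standard_normal(n)      # X_1 ~ N(0, 1)
x2 = rng.exponential(1.0, n)     # X_2 ~ Exponential(1), independent of X_1

t1, t2 = 0.4, 0.3                # t2 < 1, so the mgf of X_2 is finite
joint = np.mean(np.exp(t1 * x1 + t2 * x2))                      # M_X(t1, t2) by simulation
product = np.mean(np.exp(t1 * x1)) * np.mean(np.exp(t2 * x2))   # M_X1(t1) * M_X2(t2)
print(joint, product)            # the two values should be close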

Joint mgf of a sum of mutually independent random vectors

Let X_1, ..., X_n be n mutually independent random vectors, all of dimension Kx1.

Let Z be their sum:
$Z=\sum_{i=1}^{n}X_{i}$

Then, the joint mgf of Z is the product of the joint mgfs of X_1, ..., X_n:
$M_{Z}(t)=\prod_{i=1}^{n}M_{X_{i}}(t)$

Proof

This fact descends from the properties of mutually independent random vectors and from the definition of joint mgf:
$M_{Z}(t)=\mathrm{E}\left[\exp\left(t^{\top}Z\right)\right]=\mathrm{E}\left[\exp\left(\sum_{i=1}^{n}t^{\top}X_{i}\right)\right]=\mathrm{E}\left[\prod_{i=1}^{n}\exp\left(t^{\top}X_{i}\right)\right]=\prod_{i=1}^{n}\mathrm{E}\left[\exp\left(t^{\top}X_{i}\right)\right]=\prod_{i=1}^{n}M_{X_{i}}(t)$
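A numerical sketch of this result (Python/NumPy; the two summand distributions are arbitrary assumptions): simulate Z = X_1 + X_2 for two independent 2x1 vectors and compare M_Z(t) with the product of the two joint mgfs.

import numpy as np

rng = np.random.default_rng(3)
n = 500_000
X1 = rng.standard_normal((n, 2))              # first 2x1 vector: standard normal entries
X2 = rng.uniform(-1.0, 1.0, size=(n, 2))      # second 2x1 vector: independent uniform entries
Z = X1 + X2

t = np.array([0.3, -0.2])
lhs = np.mean(np.exp(Z @ t))                                  # M_Z(t) by simulation
rhs = np.mean(np.exp(X1 @ t)) * np.mean(np.exp(X2 @ t))       # M_X1(t) * M_X2(t)
print(lhs, rhs)   # the two values should be close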

Solved exercises

Some solved exercises on joint moment generating functions can be found below.

Exercise 1

Let X be a 2x1 discrete random vector and denote its components by X_1 and X_2.

Let the support of X be [eq58] and its joint probability mass function be [eq59]

Derive the joint moment generating function of X, if it exists.

Solution

By the definition of moment generating function, we have [eq60] Obviously, the joint moment generating function exists and is well-defined because the above expected value exists for any $t\in\mathbb{R}^{2}$.

Exercise 2

Let [eq61] be a 2x1 random vector with joint moment generating function [eq62]

Derive the expected value of X_1.

Solution

The moment generating function of X_1 can be obtained by setting $t_{2}=0$ in the joint moment generating function: [eq63] The expected value of X_1 is obtained by taking the first derivative of its moment generating function: [eq64] and evaluating it at $t_{1}=0$: [eq65]
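Since the specific mgf of this exercise is not reproduced here, the following SymPy sketch illustrates the same procedure on a hypothetical joint mgf (that of a vector with independent N(1,1) and N(2,1) entries, chosen only as an example).

import sympy as sp

t1, t2 = sp.symbols('t1 t2')
# Hypothetical joint mgf, used only to illustrate the procedure
M = sp.exp(t1 + t1**2 / 2) * sp.exp(2 * t2 + t2**2 / 2)

M1 = M.subs(t2, 0)                    # mgf of X_1: set t2 = 0 in the joint mgf
EX1 = sp.diff(M1, t1).subs(t1, 0)     # first derivative evaluated at t1 = 0
print(EX1)                            # 1, the expected value of X_1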

Exercise 3

Let [eq66] be a 2x1 random vector with joint moment generating function [eq67]

Derive the covariance between X_1 and X_2.

Solution

We can use the following covariance formula: $\mathrm{Cov}\left[X_{1},X_{2}\right]=\mathrm{E}\left[X_{1}X_{2}\right]-\mathrm{E}\left[X_{1}\right]\mathrm{E}\left[X_{2}\right]$ The moment generating function of X_1 is obtained by setting $t_{2}=0$ in the joint moment generating function: [eq69] The expected value of X_1 is obtained by taking the first derivative of its moment generating function: [eq70] and evaluating it at $t_{1}=0$: [eq71] The moment generating function of X_2 is obtained by setting $t_{1}=0$: [eq72] The expected value of X_2 is obtained by taking the first derivative of its moment generating function: [eq73] and evaluating it at $t_{2}=0$: [eq74] The second cross-moment of X is computed by taking the second cross-partial derivative of the joint moment generating function: [eq75] and evaluating it at [eq76]: [eq77] Therefore, [eq78]

References

DasGupta, A. (2010) Fundamentals of probability: a first course, Springer.

Feller, W. (2008) An introduction to probability theory and its applications, Volume 2, Wiley.

Pfeiffer, P. E. (1978) Concepts of probability theory, Dover Publications.

How to cite

Please cite as:

Taboga, Marco (2021). "Joint moment generating function", Lectures on probability theory and mathematical statistics. Kindle Direct Publishing. Online appendix. https://www.statlect.com/fundamentals-of-probability/joint-moment-generating-function.
