
Multivariate normal distribution

by Marco Taboga, PhD

The multivariate normal (MV-N) distribution is a multivariate continuous distribution that generalizes the one-dimensional normal distribution.


How the distribution is obtained

In its simplest form, which is called the "standard" MV-N distribution, it describes the joint distribution of a random vector whose entries are mutually independent univariate normal random variables, all having zero mean and unit variance.

In its general form, it describes the joint distribution of a random vector that can be represented as a linear transformation of a standard MV-N vector.

The remainder of this lecture illustrates the main characteristics of the multivariate normal distribution, dealing first with the "standard" case and then with the more general case.

Common misconception

It is a common mistake to think that any set of normal random variables, when considered together, forms a multivariate normal distribution. This is not the case.

In fact, it is possible to construct random vectors that are not MV-N, but whose individual elements have normal distributions.

The latter fact is very well known in the theory of copulae (a theory that allows us to specify the distribution of a random vector by first specifying the distributions of its components and then linking the univariate distributions through a function called a copula).

The standard multivariate normal distribution

The adjective "standard" is used to indicate that the mean of the distribution is equal to zero and its covariance matrix is equal to the identity matrix.

Definition

Standard MV-N random vectors are characterized as follows.

Definition Let X be a $K \times 1$ continuous random vector. Let its support be the set of K-dimensional real vectors:

$$R_X = \mathbb{R}^K$$

We say that X has a standard multivariate normal distribution if its joint probability density function is

$$f_X(x) = (2\pi)^{-K/2} \exp\left(-\tfrac{1}{2} x^\top x\right)$$
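
The following minimal sketch (assuming NumPy and SciPy are available) evaluates this density at an arbitrary point and checks it against SciPy's multivariate_normal with zero mean and identity covariance; the point x and the dimension K are chosen purely for illustration.

```python
import numpy as np
from scipy.stats import multivariate_normal

K = 3
x = np.array([0.5, -1.0, 2.0])   # arbitrary point in R^K

# Density from the formula: (2*pi)^(-K/2) * exp(-x'x / 2)
pdf_formula = (2 * np.pi) ** (-K / 2) * np.exp(-0.5 * x @ x)

# Reference value: multivariate normal with zero mean and identity covariance
pdf_scipy = multivariate_normal.pdf(x, mean=np.zeros(K), cov=np.eye(K))

print(pdf_formula, pdf_scipy)    # the two values coincide up to floating-point error
```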

Relation to the univariate normal distribution

Denote the i-th component of x by $x_{i}$. The joint probability density function can be written as

$$f_X(x) = \prod_{i=1}^{K} \phi(x_i)$$

where $\phi(x_i)$ is the probability density function of a standard normal random variable:

$$\phi(x_i) = (2\pi)^{-1/2} \exp\left(-\tfrac{1}{2} x_i^2\right)$$

Therefore, the K components of X are K mutually independent standard normal random variables (a more detailed proof follows).

Proof

As we have seen, the joint probability density function can be written as

$$f_X(x) = \prod_{i=1}^{K} \phi(x_i)$$

where $\phi(x_i)$ is the probability density function of a standard normal random variable:

$$\phi(x_i) = (2\pi)^{-1/2} \exp\left(-\tfrac{1}{2} x_i^2\right)$$

But $\phi(x_i)$ is also the marginal probability density function of the i-th component of X:

$$f_{X_i}(x_i) = \phi(x_i)$$

Therefore, the joint probability density function of X is equal to the product of its marginals, which implies that the components of X are mutually independent.
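
The factorization can be verified numerically. The sketch below (assuming NumPy and SciPy) compares the joint density at an arbitrary point with the product of the K univariate standard normal densities; the chosen point is purely illustrative.

```python
import numpy as np
from scipy.stats import norm

x = np.array([0.5, -1.0, 2.0])   # arbitrary point
K = x.size

# Joint density from the standard MV-N formula
joint_pdf = (2 * np.pi) ** (-K / 2) * np.exp(-0.5 * x @ x)

# Product of the K univariate standard normal marginal densities
product_of_marginals = np.prod(norm.pdf(x))

print(joint_pdf, product_of_marginals)   # equal up to floating-point error
```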

Expected value

The expected value of a standard MV-N random vector X is

$$\operatorname{E}[X] = 0$$

Proof

All the components of X are standard normal random variables and a standard normal random variable has mean 0.

Covariance matrix

The covariance matrix of a standard MV-N random vector X is

$$\operatorname{Var}[X] = I$$

where I is the $K \times K$ identity matrix, i.e. a $K \times K$ matrix whose diagonal elements are equal to 1 and whose off-diagonal entries are equal to 0.

Proof

This is proved using the structure of the covariance matrix:

$$\operatorname{Var}[X] = \begin{bmatrix} \operatorname{Var}[X_1] & \operatorname{Cov}[X_1,X_2] & \cdots & \operatorname{Cov}[X_1,X_K] \\ \operatorname{Cov}[X_2,X_1] & \operatorname{Var}[X_2] & \cdots & \operatorname{Cov}[X_2,X_K] \\ \vdots & \vdots & \ddots & \vdots \\ \operatorname{Cov}[X_K,X_1] & \operatorname{Cov}[X_K,X_2] & \cdots & \operatorname{Var}[X_K] \end{bmatrix}$$

where X_i is the i-th component of X. Since the components of X are all standard normal random variables, their variances are all equal to 1, i.e.,

$$\operatorname{Var}[X_i] = 1 \quad \text{for } i = 1, \ldots, K$$

Furthermore, since the components of X are mutually independent and independence implies zero covariance, all the covariances are equal to 0, i.e.,

$$\operatorname{Cov}[X_i, X_j] = 0 \quad \text{for } i \neq j$$

Therefore,

$$\operatorname{Var}[X] = I$$
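
Both moments can be illustrated by simulation. The sketch below (assuming NumPy) draws a large sample of standard MV-N vectors, i.e. vectors of independent N(0, 1) components, and checks that the sample mean is close to the zero vector and the sample covariance is close to the identity matrix; the dimension and sample size are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(0)
K, n = 4, 200_000

# n draws of a standard MV-N vector: K independent N(0, 1) components each
Z = rng.standard_normal((n, K))

print(Z.mean(axis=0))            # close to the zero vector
print(np.cov(Z, rowvar=False))   # close to the K x K identity matrix
```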

Joint moment generating function

The joint moment generating function of a standard MV-N random vector X is defined for any $t \in \mathbb{R}^K$:

$$M_X(t) = \operatorname{E}\left[\exp\left(t^\top X\right)\right] = \exp\left(\tfrac{1}{2} t^\top t\right)$$

Proof

The K components of X are K mutually independent standard normal random variables (see above). As a consequence, the joint mgf of X can be derived as follows:

$$M_X(t) = \operatorname{E}\left[\exp\left(t^\top X\right)\right] = \operatorname{E}\left[\exp\left(\sum_{i=1}^{K} t_i X_i\right)\right] = \operatorname{E}\left[\prod_{i=1}^{K} \exp\left(t_i X_i\right)\right] = \prod_{i=1}^{K} \operatorname{E}\left[\exp\left(t_i X_i\right)\right] = \prod_{i=1}^{K} M_{X_i}(t_i)$$

where we have used the definition of the moment generating function of a random variable and the fact that the components of X are mutually independent. Since the moment generating function of a standard normal random variable is

$$M_{X_i}(t_i) = \exp\left(\tfrac{1}{2} t_i^2\right)$$

the joint mgf of X is

$$M_X(t) = \prod_{i=1}^{K} \exp\left(\tfrac{1}{2} t_i^2\right) = \exp\left(\tfrac{1}{2} \sum_{i=1}^{K} t_i^2\right) = \exp\left(\tfrac{1}{2} t^\top t\right)$$

Note that the mgf $M_{X_i}(t_i)$ of a standard normal random variable is defined for any $t_i \in \mathbb{R}$. As a consequence, the joint mgf of X is defined for any $t \in \mathbb{R}^K$.
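
The formula can also be checked by Monte Carlo simulation. The following sketch (assuming NumPy; the vector t and the sample size are arbitrary) compares a sample average of exp(t'X) with exp(t't/2).

```python
import numpy as np

rng = np.random.default_rng(1)
K, n = 3, 1_000_000
t = np.array([0.2, -0.5, 0.3])           # arbitrary point at which to evaluate the mgf

Z = rng.standard_normal((n, K))          # draws of a standard MV-N vector

mgf_monte_carlo = np.exp(Z @ t).mean()   # estimate of E[exp(t'X)]
mgf_formula = np.exp(0.5 * t @ t)        # exp(t't / 2)

print(mgf_monte_carlo, mgf_formula)      # the two values should be close
```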

Joint characteristic function

The joint characteristic function of a standard MV-N random vector X is

$$\varphi_X(t) = \operatorname{E}\left[\exp\left(\mathrm{i}\, t^\top X\right)\right] = \exp\left(-\tfrac{1}{2} t^\top t\right)$$

Proof

The K components of X are K mutually independent standard normal random variables (see above). As a consequence, the joint characteristic function of X can be derived as follows:

$$\varphi_X(t) = \operatorname{E}\left[\exp\left(\mathrm{i}\, t^\top X\right)\right] = \operatorname{E}\left[\prod_{i=1}^{K} \exp\left(\mathrm{i}\, t_i X_i\right)\right] = \prod_{i=1}^{K} \operatorname{E}\left[\exp\left(\mathrm{i}\, t_i X_i\right)\right] = \prod_{i=1}^{K} \varphi_{X_i}(t_i)$$

where we have used the definition of the joint characteristic function of a random vector and the fact that the components of X are mutually independent. Since the characteristic function of a standard normal random variable is

$$\varphi_{X_i}(t_i) = \exp\left(-\tfrac{1}{2} t_i^2\right)$$

then the joint characteristic function of X is

$$\varphi_X(t) = \prod_{i=1}^{K} \exp\left(-\tfrac{1}{2} t_i^2\right) = \exp\left(-\tfrac{1}{2} t^\top t\right)$$

The multivariate normal distribution in general

While in the previous section we restricted our attention to the multivariate normal distribution with zero mean and identity covariance matrix, we now deal with the general case.

Definition

Multivariate normal random vectors are characterized as follows.

Definition Let X be a $K \times 1$ continuous random vector. Let its support be the set of K-dimensional real vectors:

$$R_X = \mathbb{R}^K$$

Let $\mu$ be a $K \times 1$ vector and V a $K \times K$ symmetric and positive definite matrix. We say that X has a multivariate normal distribution with mean $\mu$ and covariance V if its joint probability density function is

$$f_X(x) = (2\pi)^{-K/2} \left(\det V\right)^{-1/2} \exp\left(-\tfrac{1}{2}\left(x - \mu\right)^\top V^{-1} \left(x - \mu\right)\right)$$

We indicate that X has a multivariate normal distribution with mean $\mu$ and covariance V by

$$X \sim N(\mu, V)$$

The K random variables $X_1, \ldots, X_K$ constituting the vector X are said to be jointly normal.
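
As a check on the density formula, the sketch below (assuming NumPy and SciPy; the mean vector, covariance matrix and evaluation point are hypothetical) compares the expression above with SciPy's multivariate_normal.pdf.

```python
import numpy as np
from scipy.stats import multivariate_normal

mu = np.array([1.0, -2.0])                 # hypothetical mean vector
V = np.array([[2.0, 0.5],
              [0.5, 1.0]])                 # hypothetical symmetric positive definite covariance
x = np.array([0.3, -1.5])                  # hypothetical evaluation point

K = mu.size
d = x - mu
pdf_formula = ((2 * np.pi) ** (-K / 2)
               * np.linalg.det(V) ** (-0.5)
               * np.exp(-0.5 * d @ np.linalg.solve(V, d)))

pdf_scipy = multivariate_normal.pdf(x, mean=mu, cov=V)
print(pdf_formula, pdf_scipy)              # the two values coincide up to floating-point error
```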

Relation between standard and general

A random vector having a MV-N distribution with mean $\mu$ and covariance V is just a linear function of a "standard" MV-N vector:

Proposition Let X be a $K \times 1$ random vector having a MV-N distribution with mean $\mu$ and covariance V. Then,

$$X = \mu + \Sigma Z$$

where Z is a standard MV-N $K \times 1$ vector and $\Sigma$ is a $K \times K$ invertible matrix such that $\Sigma \Sigma^\top = V$.

Proof

This is proved using the formula for the joint density of a linear function of a continuous random vector ($x = \mu + \Sigma z$ is a linear one-to-one mapping since $\Sigma$ is invertible):

$$f_X(x) = \frac{1}{\left\vert \det \Sigma \right\vert} f_Z\left(\Sigma^{-1}\left(x - \mu\right)\right) = (2\pi)^{-K/2} \left(\det V\right)^{-1/2} \exp\left(-\tfrac{1}{2}\left(x - \mu\right)^\top V^{-1}\left(x - \mu\right)\right)$$

The existence of a matrix $\Sigma$ satisfying $\Sigma \Sigma^\top = V$ is guaranteed by the fact that V is symmetric and positive definite.
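
This representation is also how MV-N vectors are commonly simulated: take a matrix $\Sigma$ with $\Sigma \Sigma^\top = V$ (for example, the Cholesky factor of V), draw a standard MV-N vector Z and set $X = \mu + \Sigma Z$. The sketch below (assuming NumPy; the mean and covariance are hypothetical) does exactly this and checks the sample moments.

```python
import numpy as np

rng = np.random.default_rng(2)
mu = np.array([1.0, -2.0])                 # hypothetical mean vector
V = np.array([[2.0, 0.5],
              [0.5, 1.0]])                 # hypothetical covariance (symmetric, positive definite)

Sigma = np.linalg.cholesky(V)              # lower-triangular Sigma with Sigma @ Sigma.T == V

n = 500_000
Z = rng.standard_normal((n, mu.size))      # standard MV-N draws
X = mu + Z @ Sigma.T                       # X = mu + Sigma Z, applied row by row

print(X.mean(axis=0))                      # close to mu
print(np.cov(X, rowvar=False))             # close to V
```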

Expected value

The expected value of a MV-N random vector X is

$$\operatorname{E}[X] = \mu$$

Proof

This is an immediate consequence of the fact that $X = \mu + \Sigma Z$ (where Z has a multivariate standard normal distribution) and of the linearity of the expected value:

$$\operatorname{E}[X] = \operatorname{E}[\mu + \Sigma Z] = \mu + \Sigma \operatorname{E}[Z] = \mu$$

Covariance matrix

The covariance matrix of a MV-N random vector X is

$$\operatorname{Var}[X] = V$$

Proof

This is an immediate consequence of the fact that $X = \mu + \Sigma Z$ (where Z has a multivariate standard normal distribution) and of the Addition to constant vectors and Multiplication by constant matrices properties of the covariance matrix:

$$\operatorname{Var}[X] = \operatorname{Var}[\mu + \Sigma Z] = \Sigma \operatorname{Var}[Z] \Sigma^\top = \Sigma I \Sigma^\top = \Sigma \Sigma^\top = V$$

Joint moment generating function

The joint moment generating function of a MV-N random vector X is defined for any $t \in \mathbb{R}^K$:

$$M_X(t) = \operatorname{E}\left[\exp\left(t^\top X\right)\right] = \exp\left(t^\top \mu + \tfrac{1}{2} t^\top V t\right)$$

Proof

This is an immediate consequence of the fact that $X = \mu + \Sigma Z$ (where Z has a multivariate standard normal distribution and $\Sigma$ is a $K \times K$ invertible matrix such that $\Sigma \Sigma^\top = V$) and of the rule for deriving the joint mgf of a linear transformation:

$$M_X(t) = \exp\left(t^\top \mu\right) M_Z\left(\Sigma^\top t\right) = \exp\left(t^\top \mu\right) \exp\left(\tfrac{1}{2} t^\top \Sigma \Sigma^\top t\right) = \exp\left(t^\top \mu + \tfrac{1}{2} t^\top V t\right)$$
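
As with the standard case, the formula can be checked by Monte Carlo. The sketch below (assuming NumPy; the mean, covariance and the point t are hypothetical) compares a sample average of exp(t'X) with exp(t'mu + t'Vt/2).

```python
import numpy as np

rng = np.random.default_rng(3)
mu = np.array([1.0, -2.0])                       # hypothetical mean
V = np.array([[2.0, 0.5],
              [0.5, 1.0]])                       # hypothetical covariance
Sigma = np.linalg.cholesky(V)
t = np.array([0.3, -0.2])                        # arbitrary point at which to evaluate the mgf

n = 1_000_000
X = mu + rng.standard_normal((n, 2)) @ Sigma.T   # draws of X = mu + Sigma Z

mgf_monte_carlo = np.exp(X @ t).mean()           # estimate of E[exp(t'X)]
mgf_formula = np.exp(t @ mu + 0.5 * t @ V @ t)   # exp(t'mu + t'Vt / 2)
print(mgf_monte_carlo, mgf_formula)              # the two values should be close
```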

Joint characteristic function

The joint characteristic function of a MV-N random vector X is

$$\varphi_X(t) = \operatorname{E}\left[\exp\left(\mathrm{i}\, t^\top X\right)\right] = \exp\left(\mathrm{i}\, t^\top \mu - \tfrac{1}{2} t^\top V t\right)$$

Proof

This is an immediate consequence of the fact that $X = \mu + \Sigma Z$ (where Z has a multivariate standard normal distribution and $\Sigma$ is a $K \times K$ invertible matrix such that $\Sigma \Sigma^\top = V$) and of the rule for deriving the joint characteristic function of a linear transformation:

$$\varphi_X(t) = \exp\left(\mathrm{i}\, t^\top \mu\right) \varphi_Z\left(\Sigma^\top t\right) = \exp\left(\mathrm{i}\, t^\top \mu\right) \exp\left(-\tfrac{1}{2} t^\top \Sigma \Sigma^\top t\right) = \exp\left(\mathrm{i}\, t^\top \mu - \tfrac{1}{2} t^\top V t\right)$$

More details

The following sections contain more details about the MV-N distribution.

The univariate normal as a special case

The univariate normal distribution is just a special case of the multivariate normal distribution: setting $K=1$ in the joint density function of the multivariate normal distribution, one obtains the density function of the univariate normal distribution (remember that the determinant and the transpose of a scalar are equal to the scalar itself).
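
A quick numerical sanity check (assuming NumPy and SciPy; the scalar mean, variance and evaluation point are hypothetical) compares the K = 1 multivariate formula with the univariate normal density.

```python
import numpy as np
from scipy.stats import norm

mu, sigma2 = 1.5, 4.0                      # hypothetical scalar mean and variance
x = 0.7                                    # hypothetical evaluation point

# General MV-N density with K = 1 (determinant and transpose of a scalar are the scalar itself)
pdf_mvn_k1 = (2 * np.pi) ** (-0.5) * sigma2 ** (-0.5) * np.exp(-0.5 * (x - mu) ** 2 / sigma2)

# Univariate normal density with the same mean and variance
pdf_univariate = norm.pdf(x, loc=mu, scale=np.sqrt(sigma2))

print(pdf_mvn_k1, pdf_univariate)          # identical up to floating-point error
```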

Mutually independent normal random variables are jointly normal

Let $X_1, \ldots, X_K$ be K mutually independent random variables all having a normal distribution. Denote by $\mu_i$ the mean of X_i and by $\sigma_i^2$ its variance. Then the $K \times 1$ random vector X defined as

$$X = \begin{bmatrix} X_1 \\ X_2 \\ \vdots \\ X_K \end{bmatrix}$$

has a multivariate normal distribution with mean

$$\mu = \begin{bmatrix} \mu_1 \\ \mu_2 \\ \vdots \\ \mu_K \end{bmatrix}$$

and covariance matrix

$$V = \begin{bmatrix} \sigma_1^2 & 0 & \cdots & 0 \\ 0 & \sigma_2^2 & \cdots & 0 \\ \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & \cdots & \sigma_K^2 \end{bmatrix}$$

This can be proved by showing that the product of the probability density functions of $X_1, \ldots, X_K$ is equal to the joint probability density function of X (this is left as an exercise).
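
The exercise can be previewed numerically: with a diagonal covariance matrix, the joint MV-N density equals the product of the univariate normal densities. A sketch (assuming NumPy and SciPy; means, variances and the evaluation point are hypothetical) follows.

```python
import numpy as np
from scipy.stats import norm, multivariate_normal

mu = np.array([0.0, 1.0, -2.0])            # hypothetical means
sigma2 = np.array([1.0, 4.0, 0.25])        # hypothetical variances
x = np.array([0.3, 2.0, -1.8])             # hypothetical evaluation point

# Product of the K univariate normal densities
product_of_pdfs = np.prod(norm.pdf(x, loc=mu, scale=np.sqrt(sigma2)))

# Joint MV-N density with diagonal covariance matrix
joint_pdf = multivariate_normal.pdf(x, mean=mu, cov=np.diag(sigma2))

print(product_of_pdfs, joint_pdf)          # equal up to floating-point error
```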

Read more

The following lectures contain more material about the multivariate normal distribution.

Linear combinations of normal variables

Discusses the fact that linear transformations of MV-N random vectors are also MV-N

Partitioned multivariate normal distribution

Discusses some facts about partitioned MV-N vectors

Quadratic forms involving normal variables

Discusses the distribution of quadratic forms involving MV-N vectors

Solved exercises

Below you can find some exercises with explained solutions.

Exercise 1

Let [eq50] be a multivariate normal random vector with mean [eq51] and covariance matrix [eq52]. Prove that the random variable [eq53] has a normal distribution with mean equal to $3$ and variance equal to $7$.

Hint: use the joint moment generating function of X and its properties.

Solution

The random variable Y can be written as [eq54] where [eq55].

Using the formula for the joint moment generating function of a linear transformation of a random vector [eq56] and the fact that the mgf of a multivariate normal vector X is [eq57], we obtain [eq58] where, in the last step, we have also used the fact that $t$ is a scalar, because Y is unidimensional. Now [eq59] and [eq60]. Plugging the values just obtained into the formula for the mgf of Y, we get [eq61]. But this is the moment generating function of a normal random variable with mean equal to $3$ and variance equal to $7$ (see the lecture entitled Normal distribution). Therefore, Y is a normal random variable with mean equal to $3$ and variance equal to $7$ (remember that a distribution is completely characterized by its moment generating function).
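
The exercise's mean vector, covariance matrix and weight vector are not reproduced above, so the sketch below (assuming NumPy) uses hypothetical values chosen only to be consistent with the stated result, namely a linear combination Y = B X with B mu = 3 and B V B' = 7, and then confirms the moments by simulation.

```python
import numpy as np

# Hypothetical data, chosen so that B @ mu = 3 and B @ V @ B = 7;
# they are NOT necessarily the values used in the original exercise.
mu = np.array([1.0, 2.0])
V = np.array([[3.0, 1.0],
              [1.0, 2.0]])
B = np.array([1.0, 1.0])                   # Y = B @ X

print(B @ mu, B @ V @ B)                   # 3.0 and 7.0: mean and variance of Y

# Monte Carlo confirmation that Y is approximately normal with mean 3 and variance 7
rng = np.random.default_rng(4)
X = rng.multivariate_normal(mu, V, size=500_000)
Y = X @ B
print(Y.mean(), Y.var())                   # close to 3 and 7
```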

Exercise 2

Let [eq62] be a multivariate normal random vector with mean [eq63] and covariance matrix [eq64]. Using the joint moment generating function of X, derive the cross-moment [eq65].

Solution

The joint mgf of X is [eq66]. The third-order cross-moment we want to compute is equal to a third partial derivative of the mgf, evaluated at zero: [eq67]. The partial derivatives are [eq68].

Thus,[eq69]
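
Since the exercise's mean vector, covariance matrix and target cross-moment are not reproduced above, the following sketch (assuming SymPy) only illustrates the technique: it differentiates the joint mgf of a bivariate normal vector with hypothetical parameters to obtain the third-order cross-moment E[X1^2 X2].

```python
import sympy as sp

t1, t2 = sp.symbols("t1 t2")

# Hypothetical mean vector and covariance matrix (not the exercise's data)
mu = sp.Matrix([1, 2])
V = sp.Matrix([[3, 1],
               [1, 2]])
t = sp.Matrix([t1, t2])

# Joint mgf of a MV-N vector: exp(t'mu + t'Vt / 2)
M = sp.exp((t.T * mu)[0] + sp.Rational(1, 2) * (t.T * V * t)[0])

# Cross-moment E[X1^2 * X2] as a third partial derivative evaluated at t = 0
cross_moment = sp.diff(M, t1, 2, t2).subs({t1: 0, t2: 0})
print(sp.simplify(cross_moment))
```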

How to cite

Please cite as:

Taboga, Marco (2021). "Multivariate normal distribution", Lectures on probability theory and mathematical statistics. Kindle Direct Publishing. Online appendix. https://www.statlect.com/probability-distributions/multivariate-normal-distribution.
