The multivariate normal (MV-N) distribution is a multivariate continuous distribution that generalizes the one-dimensional normal distribution.


In its simplest form, which is called the "standard" MV-N distribution, it describes the joint distribution of a random vector whose entries are mutually independent univariate normal random variables, all having zero mean and unit variance.

In its general form, it describes the joint distribution of a random vector that can be represented as a linear transformation of a standard MV-N vector.

The remainder of this lecture illustrates the main characteristics of the multivariate normal distribution, dealing first with the "standard" case and then with the more general case.

It is a common mistake to think that any set of normal random variables, when considered together, form a multivariate normal distribution. This is not the case.

In fact, it is possible to construct random vectors that are not MV-N, but whose individual elements have normal distributions.

The latter fact is well known in the theory of copulas (a theory which allows us to specify the distribution of a random vector by first specifying the distributions of its components and then linking the univariate distributions through a function called a copula).

The adjective "standard" is used to indicate that the mean of the distribution is equal to zero and its covariance matrix is equal to the identity matrix.

Standard MV-N random vectors are characterized as follows.

Definition
Let $X$ be a $K \times 1$ continuous random vector. Let its support be the set of $K$-dimensional real vectors:
$$R_X = \mathbb{R}^K$$
We say that $X$ has a **standard multivariate normal distribution** if its joint probability density function is
$$f_X(x) = (2\pi)^{-K/2} \exp\left(-\frac{1}{2} x^\top x\right)$$

Denote the $i$-th component of $X$ by $X_i$. The joint probability density function can be written as
$$f_X(x) = \prod_{i=1}^{K} \phi(x_i)$$
where $\phi$ is the probability density function of a standard normal random variable:
$$\phi(x_i) = (2\pi)^{-1/2} \exp\left(-\frac{1}{2} x_i^2\right)$$

Therefore, the components of $X$ are mutually independent standard normal random variables (a more detailed proof follows).

Proof

As we have seen, the joint probability density function can be written as
$$f_X(x) = \prod_{i=1}^{K} \phi(x_i)$$
where $\phi$ is the probability density function of a standard normal random variable:
$$\phi(x_i) = (2\pi)^{-1/2} \exp\left(-\frac{1}{2} x_i^2\right)$$
But $\phi(x_i)$ is also the marginal probability density function of the $i$-th component of $X$:
$$f_{X_i}(x_i) = \phi(x_i)$$
Therefore, the joint probability density function of $X$ is equal to the product of its marginals, which implies that the components of $X$ are mutually independent.
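As a quick numerical illustration (not part of the original lecture), the factorization of the joint density can be checked at an arbitrary point with a short script; the dimension and the point below are illustrative choices:

```python
import numpy as np

# Evaluate the standard MV-N joint density at an arbitrary point x and
# compare it with the product of univariate standard normal densities.
rng = np.random.default_rng(0)
K = 4
x = rng.normal(size=K)

# Joint density: (2*pi)^(-K/2) * exp(-x'x / 2)
joint = (2 * np.pi) ** (-K / 2) * np.exp(-0.5 * x @ x)

# Product of the marginal densities phi(x_i)
phi = (2 * np.pi) ** (-0.5) * np.exp(-0.5 * x ** 2)
product = np.prod(phi)

print(joint, product)  # the two numbers agree up to floating-point error
```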

The expected value of a standard MV-N random vector $X$ is
$$\mathrm{E}[X] = 0$$

Proof

All the components of $X$ are standard normal random variables and a standard normal random variable has mean $0$. Therefore, $\mathrm{E}[X] = 0$.

The covariance matrix of a standard MV-N random vector $X$ is
$$\mathrm{Var}[X] = I$$
where $I$ is the $K \times K$ identity matrix, i.e. a matrix whose diagonal elements are equal to $1$ and whose off-diagonal entries are equal to $0$.

Proof

This is proved using the structure of the covariance matrix:
$$\mathrm{Var}[X] = \begin{bmatrix} \mathrm{Var}[X_1] & \mathrm{Cov}[X_1, X_2] & \cdots & \mathrm{Cov}[X_1, X_K] \\ \mathrm{Cov}[X_2, X_1] & \mathrm{Var}[X_2] & \cdots & \mathrm{Cov}[X_2, X_K] \\ \vdots & \vdots & \ddots & \vdots \\ \mathrm{Cov}[X_K, X_1] & \mathrm{Cov}[X_K, X_2] & \cdots & \mathrm{Var}[X_K] \end{bmatrix}$$
where $X_i$ is the $i$-th component of $X$. Since the components of $X$ are all standard normal random variables, their variances are all equal to $1$, i.e.,
$$\mathrm{Var}[X_i] = 1 \quad \text{for } i = 1, \ldots, K$$
Furthermore, since the components of $X$ are mutually independent and independence implies zero covariance, all the covariances are equal to $0$, i.e.,
$$\mathrm{Cov}[X_i, X_j] = 0 \quad \text{for } i \neq j$$
Therefore,
$$\mathrm{Var}[X] = I$$
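A sketch of a Monte Carlo check of this result (sample size and seed are arbitrary choices, not from the lecture):

```python
import numpy as np

# Draw many standard MV-N samples and verify that the sample covariance
# matrix is close to the K x K identity matrix.
rng = np.random.default_rng(42)
K, n = 3, 200_000
Z = rng.normal(size=(n, K))  # each row is a draw of a standard MV-N vector

sample_cov = np.cov(Z, rowvar=False)
max_err = np.max(np.abs(sample_cov - np.eye(K)))
print(sample_cov.round(3))  # approximately the identity matrix
```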

The joint moment generating function of a standard MV-N random vector $X$ is defined for any $t \in \mathbb{R}^K$:
$$M_X(t) = \mathrm{E}\left[\exp\left(t^\top X\right)\right] = \exp\left(\frac{1}{2} t^\top t\right)$$

Proof

The components of $X$ are mutually independent standard normal random variables (see above). As a consequence, the joint mgf of $X$ can be derived as follows:
$$M_X(t) = \mathrm{E}\left[\exp\left(t^\top X\right)\right] = \mathrm{E}\left[\prod_{i=1}^{K} \exp\left(t_i X_i\right)\right] = \prod_{i=1}^{K} \mathrm{E}\left[\exp\left(t_i X_i\right)\right] = \prod_{i=1}^{K} M_{X_i}(t_i)$$
where we have used the definition of the moment generating function of a random variable and the fact that the components of $X$ are mutually independent. Since the moment generating function of a standard normal random variable is
$$M_{X_i}(t_i) = \exp\left(\frac{1}{2} t_i^2\right)$$
the joint mgf of $X$ is
$$M_X(t) = \prod_{i=1}^{K} \exp\left(\frac{1}{2} t_i^2\right) = \exp\left(\frac{1}{2} \sum_{i=1}^{K} t_i^2\right) = \exp\left(\frac{1}{2} t^\top t\right)$$
Note that the mgf of a standard normal random variable is defined for any $t_i \in \mathbb{R}$. As a consequence, the joint mgf of $X$ is defined for any $t \in \mathbb{R}^K$.
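The closed form of the joint mgf can likewise be checked by simulation at a single point $t$ (a sketch; the point $t$, sample size, and seed are arbitrary illustrative values):

```python
import numpy as np

# Monte Carlo estimate of E[exp(t'X)] for a standard MV-N vector X,
# compared with the closed form exp(t't / 2).
rng = np.random.default_rng(1)
K, n = 3, 500_000
t = np.array([0.3, -0.2, 0.1])

Z = rng.normal(size=(n, K))
mc_mgf = np.mean(np.exp(Z @ t))   # sample analogue of E[exp(t'X)]
exact_mgf = np.exp(0.5 * t @ t)   # closed form exp(t't / 2)
print(mc_mgf, exact_mgf)
```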

The joint characteristic function of a standard MV-N random vector $X$ is
$$\varphi_X(t) = \mathrm{E}\left[\exp\left(i\, t^\top X\right)\right] = \exp\left(-\frac{1}{2} t^\top t\right)$$

Proof

The components of $X$ are mutually independent standard normal random variables (see above). As a consequence, the joint characteristic function of $X$ can be derived as follows:
$$\varphi_X(t) = \mathrm{E}\left[\exp\left(i\, t^\top X\right)\right] = \mathrm{E}\left[\prod_{i=1}^{K} \exp\left(i\, t_i X_i\right)\right] = \prod_{i=1}^{K} \mathrm{E}\left[\exp\left(i\, t_i X_i\right)\right] = \prod_{i=1}^{K} \varphi_{X_i}(t_i)$$
where we have used the definition of the joint characteristic function of a random variable and the fact that the components of $X$ are mutually independent. Since the characteristic function of a standard normal random variable is
$$\varphi_{X_i}(t_i) = \exp\left(-\frac{1}{2} t_i^2\right)$$
then the joint characteristic function of $X$ is
$$\varphi_X(t) = \prod_{i=1}^{K} \exp\left(-\frac{1}{2} t_i^2\right) = \exp\left(-\frac{1}{2} t^\top t\right)$$
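The same kind of simulation check works for the characteristic function, using complex exponentials (again with arbitrary illustrative values):

```python
import numpy as np

# Monte Carlo estimate of E[exp(i t'X)] for a standard MV-N vector X,
# compared with the closed form exp(-t't / 2).
rng = np.random.default_rng(2)
K, n = 3, 500_000
t = np.array([0.5, -0.4, 0.2])

Z = rng.normal(size=(n, K))
mc_cf = np.mean(np.exp(1j * (Z @ t)))  # sample analogue of E[exp(i t'X)]
exact_cf = np.exp(-0.5 * t @ t)        # closed form exp(-t't / 2)
print(mc_cf, exact_cf)
```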

While in the previous section we restricted our attention to the multivariate normal distribution with zero mean and unit covariance, we now deal with the general case.

Multivariate normal random vectors are characterized as follows.

Definition
Let $X$ be a $K \times 1$ continuous random vector. Let its support be the set of $K$-dimensional real vectors:
$$R_X = \mathbb{R}^K$$
Let $\mu$ be a $K \times 1$ vector and $V$ a $K \times K$ symmetric and positive definite matrix. We say that $X$ has a **multivariate normal distribution** with mean $\mu$ and **covariance** $V$ if its joint probability density function is
$$f_X(x) = (2\pi)^{-K/2} (\det V)^{-1/2} \exp\left(-\frac{1}{2} (x - \mu)^\top V^{-1} (x - \mu)\right)$$

We indicate that $X$ has a multivariate normal distribution with mean $\mu$ and covariance $V$ by
$$X \sim N(\mu, V)$$

The $K$ random variables $X_1, \ldots, X_K$ constituting the vector $X$ are said to be **jointly normal**.

A random vector $X$ having a MV-N distribution with mean $\mu$ and covariance $V$ is just a linear function of a "standard" MV-N vector.

Proposition Let $X$ be a $K \times 1$ random vector having a MV-N distribution with mean $\mu$ and covariance $V$. Then,
$$X = \mu + \Sigma Z$$
where $Z$ is a $K \times 1$ standard MV-N vector and $\Sigma$ is a $K \times K$ invertible matrix such that $\Sigma \Sigma^\top = V$.

Proof

This is proved using the formula for the joint density of a linear function of a continuous random vector ($x = \mu + \Sigma z$ is a linear one-to-one mapping since $\Sigma$ is invertible):
$$f_X(x) = \frac{1}{\left|\det \Sigma\right|} f_Z\left(\Sigma^{-1}(x - \mu)\right) = (2\pi)^{-K/2} \left|\det \Sigma\right|^{-1} \exp\left(-\frac{1}{2} (x - \mu)^\top \left(\Sigma^{-1}\right)^\top \Sigma^{-1} (x - \mu)\right) = (2\pi)^{-K/2} (\det V)^{-1/2} \exp\left(-\frac{1}{2} (x - \mu)^\top V^{-1} (x - \mu)\right)$$
where the last equality follows from $\left(\Sigma^{-1}\right)^\top \Sigma^{-1} = \left(\Sigma \Sigma^\top\right)^{-1} = V^{-1}$ and $\left|\det \Sigma\right| = (\det V)^{1/2}$. The existence of a matrix $\Sigma$ satisfying $\Sigma \Sigma^\top = V$ is guaranteed by the fact that $V$ is symmetric and positive definite.
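In practice this proposition is also the standard recipe for simulating MV-N draws: factor $V$ (e.g. by a Cholesky decomposition) and apply the linear map to standard normal draws. A sketch with arbitrary illustrative values of $\mu$ and $V$:

```python
import numpy as np

# Simulate X = mu + Sigma Z, where Sigma is the Cholesky factor of V
# (so Sigma @ Sigma.T == V), then check the empirical mean and covariance.
rng = np.random.default_rng(3)
n = 200_000
mu = np.array([1.0, -2.0])
V = np.array([[2.0, 0.6],
              [0.6, 1.0]])

Sigma = np.linalg.cholesky(V)   # lower-triangular, Sigma @ Sigma.T == V
Z = rng.normal(size=(n, 2))     # rows are standard MV-N draws
X = mu + Z @ Sigma.T            # each row is mu + Sigma z

mean_err = np.max(np.abs(X.mean(axis=0) - mu))
cov_err = np.max(np.abs(np.cov(X, rowvar=False) - V))
print(X.mean(axis=0), mean_err, cov_err)
```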

The expected value of a MV-N random vector $X$ is
$$\mathrm{E}[X] = \mu$$

Proof

This is an immediate consequence of the fact that $X = \mu + \Sigma Z$ (where $Z$ has a multivariate standard normal distribution) and of the linearity of the expected value:
$$\mathrm{E}[X] = \mathrm{E}[\mu + \Sigma Z] = \mu + \Sigma\, \mathrm{E}[Z] = \mu$$

The covariance matrix of a MV-N random vector $X$ is
$$\mathrm{Var}[X] = V$$

Proof

This is an immediate consequence of the fact that $X = \mu + \Sigma Z$ (where $Z$ has a multivariate standard normal distribution) and of the Addition to constant vectors and Multiplication by constant matrices properties of the covariance matrix:
$$\mathrm{Var}[X] = \mathrm{Var}[\mu + \Sigma Z] = \mathrm{Var}[\Sigma Z] = \Sigma\, \mathrm{Var}[Z]\, \Sigma^\top = \Sigma I \Sigma^\top = \Sigma \Sigma^\top = V$$

The joint moment generating function of a MV-N random vector $X$ is defined for any $t \in \mathbb{R}^K$:
$$M_X(t) = \mathrm{E}\left[\exp\left(t^\top X\right)\right] = \exp\left(t^\top \mu + \frac{1}{2} t^\top V t\right)$$

Proof

This is an immediate consequence of the fact that $X = \mu + \Sigma Z$ (where $Z$ has a multivariate standard normal distribution and $\Sigma$ is a $K \times K$ invertible matrix such that $\Sigma \Sigma^\top = V$) and of the rule for deriving the joint mgf of a linear transformation:
$$M_X(t) = \exp\left(t^\top \mu\right) M_Z\left(\Sigma^\top t\right) = \exp\left(t^\top \mu\right) \exp\left(\frac{1}{2} \left(\Sigma^\top t\right)^\top \left(\Sigma^\top t\right)\right) = \exp\left(t^\top \mu + \frac{1}{2} t^\top \Sigma \Sigma^\top t\right) = \exp\left(t^\top \mu + \frac{1}{2} t^\top V t\right)$$

The joint characteristic function of a MV-N random vector $X$ is
$$\varphi_X(t) = \mathrm{E}\left[\exp\left(i\, t^\top X\right)\right] = \exp\left(i\, t^\top \mu - \frac{1}{2} t^\top V t\right)$$

Proof

This is an immediate consequence of the fact that $X = \mu + \Sigma Z$ (where $Z$ has a multivariate standard normal distribution and $\Sigma$ is a $K \times K$ invertible matrix such that $\Sigma \Sigma^\top = V$) and of the rule for deriving the joint characteristic function of a linear transformation:
$$\varphi_X(t) = \exp\left(i\, t^\top \mu\right) \varphi_Z\left(\Sigma^\top t\right) = \exp\left(i\, t^\top \mu\right) \exp\left(-\frac{1}{2} \left(\Sigma^\top t\right)^\top \left(\Sigma^\top t\right)\right) = \exp\left(i\, t^\top \mu - \frac{1}{2} t^\top V t\right)$$

The following sections contain more details about the MV-N distribution.

The univariate normal distribution is just a special case of the multivariate normal distribution: setting $K = 1$ in the joint density function of the multivariate normal distribution, one obtains the density function of the univariate normal distribution (remember that the determinant and the transpose of a scalar are equal to the scalar itself).
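The reduction can be written out explicitly (a worked special case, not in the original text): with $K = 1$, the covariance matrix is a scalar $V = \sigma^2$, so $\det V = \sigma^2$ and $V^{-1} = 1/\sigma^2$, and the joint density becomes

```latex
f_X(x) = (2\pi)^{-1/2} (\sigma^2)^{-1/2}
         \exp\!\left( -\frac{1}{2} \, \frac{(x-\mu)^2}{\sigma^2} \right)
       = \frac{1}{\sigma \sqrt{2\pi}}
         \exp\!\left( -\frac{(x-\mu)^2}{2\sigma^2} \right)
```

which is the familiar univariate normal density.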

Let $X_1, \ldots, X_K$ be mutually independent random variables all having a normal distribution. Denote by $\mu_i$ the mean of $X_i$ and by $\sigma_i^2$ its variance. Then the random vector $X$ defined as
$$X = \begin{bmatrix} X_1 \\ \vdots \\ X_K \end{bmatrix}$$
has a multivariate normal distribution with mean
$$\mu = \begin{bmatrix} \mu_1 \\ \vdots \\ \mu_K \end{bmatrix}$$
and covariance matrix
$$V = \begin{bmatrix} \sigma_1^2 & 0 & \cdots & 0 \\ 0 & \sigma_2^2 & \cdots & 0 \\ \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & \cdots & \sigma_K^2 \end{bmatrix}$$

This can be proved by showing that the product of the probability density functions of $X_1, \ldots, X_K$ is equal to the joint probability density function of $X$ (this is left as an exercise).
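A numerical sketch of the claimed factorization, with arbitrary illustrative means and variances (the diagonal covariance matrix is built from the individual variances):

```python
import numpy as np

# Joint MV-N density with diagonal covariance V = diag(sigma_i^2),
# compared with the product of the univariate normal densities.
mu = np.array([1.0, -0.5, 2.0])
sig2 = np.array([0.5, 2.0, 1.0])  # variances sigma_i^2
V = np.diag(sig2)
K = len(mu)
x = np.array([0.7, 0.0, 1.5])     # an arbitrary evaluation point

d = x - mu
joint = ((2 * np.pi) ** (-K / 2) * np.linalg.det(V) ** (-0.5)
         * np.exp(-0.5 * d @ np.linalg.inv(V) @ d))
product = np.prod((2 * np.pi * sig2) ** (-0.5) * np.exp(-0.5 * d ** 2 / sig2))
print(joint, product)  # the two values coincide
```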

The following lectures contain more material about the multivariate normal distribution.

Linear combinations of normal variables

Discusses the fact that linear transformations of MV-N random vectors are also MV-N

Partitioned multivariate normal distribution

Discusses some facts about partitioned MV-N vectors

Quadratic forms involving normal variables

Discusses the distribution of quadratic forms involving MV-N vectors

Below you can find some exercises with explained solutions.

Let $X$ be a multivariate normal random vector with given mean $\mu$ and covariance matrix $V$. Prove that the random variable
$$Y = b^\top X$$
where $b$ is a given constant vector, has a normal distribution with mean equal to $b^\top \mu$ and variance equal to $b^\top V b$.

Hint: use the joint moment generating function of $X$ and its properties.

Solution

The random variable $Y$ can be written as
$$Y = b^\top X$$

Using the formula for the joint moment generating function of a linear transformation of a random vector and the fact that the mgf of a multivariate normal vector is
$$M_X(t) = \exp\left(t^\top \mu + \frac{1}{2} t^\top V t\right)$$
we obtain
$$M_Y(s) = \mathrm{E}\left[\exp(sY)\right] = \mathrm{E}\left[\exp\left((sb)^\top X\right)\right] = M_X(sb) = \exp\left(s\, b^\top \mu + \frac{1}{2} s^2\, b^\top V b\right)$$
where, in the last step, we have also used the fact that $s$ is a scalar, because $Y$ is unidimensional. But this is the moment generating function of a normal random variable with mean equal to $b^\top \mu$ and variance equal to $b^\top V b$ (see the lecture entitled Normal distribution). Therefore, $Y$ is a normal random variable with mean equal to $b^\top \mu$ and variance equal to $b^\top V b$ (remember that a distribution is completely characterized by its moment generating function).
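The conclusion can also be illustrated by simulation. The values of $\mu$, $V$ and $b$ below are hypothetical (they are not the numbers from the original exercise):

```python
import numpy as np

# Monte Carlo illustration: Y = b'X is univariate normal with mean b'mu
# and variance b'Vb when X is MV-N. mu, V and b are hypothetical values.
rng = np.random.default_rng(7)
n = 300_000
mu = np.array([1.0, 2.0])
V = np.array([[3.0, 1.0],
              [1.0, 2.0]])
b = np.array([1.0, -1.0])

Sigma = np.linalg.cholesky(V)
X = mu + rng.normal(size=(n, 2)) @ Sigma.T
Y = X @ b

expected_mean = b @ mu      # b'mu
expected_var = b @ V @ b    # b'Vb
print(Y.mean(), expected_mean, Y.var(), expected_var)
```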

Let $X$ be a $2 \times 1$ multivariate normal random vector with given mean $\mu = \begin{bmatrix} \mu_1 \\ \mu_2 \end{bmatrix}$ and covariance matrix $V = \begin{bmatrix} \sigma_{11} & \sigma_{12} \\ \sigma_{12} & \sigma_{22} \end{bmatrix}$. Using the joint moment generating function of $X$, derive the third cross-moment
$$\mathrm{E}\left[X_1^2 X_2\right]$$

Solution

The joint mgf of $X$ is
$$M_X(t_1, t_2) = \exp\left(t^\top \mu + \frac{1}{2} t^\top V t\right) = \exp\left(t_1 \mu_1 + t_2 \mu_2 + \frac{1}{2}\left(\sigma_{11} t_1^2 + 2 \sigma_{12} t_1 t_2 + \sigma_{22} t_2^2\right)\right)$$
The third-order cross-moment we want to compute is equal to a third partial derivative of the mgf, evaluated at zero:
$$\mathrm{E}\left[X_1^2 X_2\right] = \frac{\partial^3 M_X(t_1, t_2)}{\partial t_1^2\, \partial t_2}\bigg|_{t_1 = 0,\, t_2 = 0}$$
The partial derivatives are
$$\frac{\partial M_X}{\partial t_2} = \left(\mu_2 + \sigma_{12} t_1 + \sigma_{22} t_2\right) M_X(t_1, t_2)$$
$$\frac{\partial^2 M_X}{\partial t_1\, \partial t_2} = \left[\sigma_{12} + \left(\mu_1 + \sigma_{11} t_1 + \sigma_{12} t_2\right)\left(\mu_2 + \sigma_{12} t_1 + \sigma_{22} t_2\right)\right] M_X(t_1, t_2)$$
$$\frac{\partial^3 M_X}{\partial t_1^2\, \partial t_2} = \left[\sigma_{11}\left(\mu_2 + \sigma_{12} t_1 + \sigma_{22} t_2\right) + 2 \sigma_{12}\left(\mu_1 + \sigma_{11} t_1 + \sigma_{12} t_2\right) + \left(\mu_1 + \sigma_{11} t_1 + \sigma_{12} t_2\right)^2 \left(\mu_2 + \sigma_{12} t_1 + \sigma_{22} t_2\right)\right] M_X(t_1, t_2)$$

Thus, evaluating at $t_1 = t_2 = 0$ (where $M_X = 1$),
$$\mathrm{E}\left[X_1^2 X_2\right] = \mu_1^2 \mu_2 + \mu_2 \sigma_{11} + 2 \mu_1 \sigma_{12}$$

Please cite as:

Taboga, Marco (2021). "Multivariate normal distribution", Lectures on probability theory and mathematical statistics. Kindle Direct Publishing. Online appendix. https://www.statlect.com/probability-distributions/multivariate-normal-distribution.
