
Covariance matrix

Let $X$ be a $K\times 1$ random vector. The covariance matrix of $X$, or variance-covariance matrix of $X$, is denoted by $\operatorname{Var}[X]$. It is defined as follows:
$$\operatorname{Var}[X]=\operatorname{E}\left[ \left( X-\operatorname{E}[X]\right)\left( X-\operatorname{E}[X]\right)^{\top}\right]$$
provided the above expected values exist and are well-defined.

It is a multivariate generalization of the definition of variance for a scalar random variable $Y$:
$$\operatorname{Var}[Y]=\operatorname{E}\left[ \left( Y-\operatorname{E}[Y]\right)^{2}\right]$$
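As a numerical illustration of the definition, the following NumPy sketch (the mean, covariance, and sample size are hypothetical choices) estimates the covariance matrix by averaging outer products of centered draws:

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical example: 100,000 draws of a K = 3 dimensional random vector.
true_cov = [[2.0, 0.5, 0.0], [0.5, 1.0, 0.3], [0.0, 0.3, 1.5]]
samples = rng.multivariate_normal(mean=[0.0, 1.0, -1.0], cov=true_cov, size=100_000)

mean = samples.mean(axis=0)                 # estimate of E[X]
centered = samples - mean                   # X - E[X] for each draw
V = centered.T @ centered / len(samples)    # average of outer products
print(np.round(V, 2))                       # close to true_cov for large samples
```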

Structure

Let $X_1$, ..., $X_K$ denote the $K$ components of the vector $X$. From the definition of $\operatorname{Var}[X]$, it can easily be seen that $\operatorname{Var}[X]$ is a $K\times K$ matrix with the following structure:
$$\operatorname{Var}[X]=\begin{bmatrix} \operatorname{Var}[X_1] & \operatorname{Cov}[X_1,X_2] & \cdots & \operatorname{Cov}[X_1,X_K]\\ \operatorname{Cov}[X_2,X_1] & \operatorname{Var}[X_2] & \cdots & \operatorname{Cov}[X_2,X_K]\\ \vdots & \vdots & \ddots & \vdots\\ \operatorname{Cov}[X_K,X_1] & \operatorname{Cov}[X_K,X_2] & \cdots & \operatorname{Var}[X_K]\end{bmatrix}$$

Therefore, the covariance matrix of $X$ is a square $K\times K$ matrix whose generic $(i,j)$-th entry is equal to the covariance between $X_i$ and $X_j$.

Since $\operatorname{Cov}[X_i,X_j]=\operatorname{Var}[X_i]$ when $i=j$, the diagonal entries of the covariance matrix are equal to the variances of the individual components of $X$.

Example Suppose $X$ is a $2\times 1$ random vector with components $X_1$ and $X_2$. Let
$$\operatorname{Var}[X_1]=\sigma_{1}^{2},\quad \operatorname{Var}[X_2]=\sigma_{2}^{2},\quad \operatorname{Cov}[X_1,X_2]=\sigma_{12}$$
By the symmetry of covariance, it must also be
$$\operatorname{Cov}[X_2,X_1]=\sigma_{12}$$
Therefore, the covariance matrix of $X$ is
$$\operatorname{Var}[X]=\begin{bmatrix} \sigma_{1}^{2} & \sigma_{12}\\ \sigma_{12} & \sigma_{2}^{2}\end{bmatrix}$$
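The same structure can be verified numerically. The sketch below (hypothetical data) checks that the $(i,j)$-th entry of the estimated covariance matrix is the sample covariance of components $i$ and $j$, and that the diagonal entries are the sample variances (bias=True makes np.cov divide by $n$, matching the plain sample moments):

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.standard_normal((50_000, 2)) @ np.array([[1.0, 0.4], [0.0, 0.8]])

V = np.cov(x, rowvar=False, bias=True)  # rows are observations

# Entry (0, 1) equals the sample covariance of the two components.
c01 = np.mean((x[:, 0] - x[:, 0].mean()) * (x[:, 1] - x[:, 1].mean()))
print(np.isclose(V[0, 1], c01))                # True
print(np.allclose(np.diag(V), x.var(axis=0)))  # diagonal = variances
```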

Formula for computing the covariance matrix

The covariance matrix of a $K\times 1$ random vector $X$ can be computed as follows:
$$\operatorname{Var}[X]=\operatorname{E}[XX^{\top}]-\operatorname{E}[X]\operatorname{E}[X]^{\top}$$

Proof

The above formula can be derived as follows:
$$\begin{aligned}\operatorname{Var}[X] &=\operatorname{E}\left[ \left( X-\operatorname{E}[X]\right)\left( X-\operatorname{E}[X]\right)^{\top}\right]\\ &=\operatorname{E}\left[ XX^{\top}-X\operatorname{E}[X]^{\top}-\operatorname{E}[X]X^{\top}+\operatorname{E}[X]\operatorname{E}[X]^{\top}\right]\\ &=\operatorname{E}[XX^{\top}]-\operatorname{E}[X]\operatorname{E}[X]^{\top}-\operatorname{E}[X]\operatorname{E}[X]^{\top}+\operatorname{E}[X]\operatorname{E}[X]^{\top}\\ &=\operatorname{E}[XX^{\top}]-\operatorname{E}[X]\operatorname{E}[X]^{\top}\end{aligned}$$
where we have used the linearity of the expected value and the fact that $\operatorname{E}[X]$ is a constant vector.

This formula also makes clear that the covariance matrix exists and is well-defined only as long as the vector of expected values $\operatorname{E}[X]$ and the matrix of second cross-moments $\operatorname{E}[XX^{\top}]$ exist and are well-defined.
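A minimal sketch (hypothetical data) checking that the definition and the computational formula agree when the expectations are replaced by sample means:

```python
import numpy as np

rng = np.random.default_rng(2)
x = rng.exponential(scale=2.0, size=(10_000, 3))  # any distribution works
n = len(x)
mean = x.mean(axis=0)

# Definition: average outer product of centered vectors.
v_def = (x - mean).T @ (x - mean) / n

# Formula: E[X X^T] - E[X] E[X]^T, with expectations replaced by sample means.
v_formula = x.T @ x / n - np.outer(mean, mean)

print(np.allclose(v_def, v_formula))  # True
```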

More details

The following subsections contain more details about the covariance matrix.

Addition to constant vectors

Let $a\in \mathbb{R}^{K}$ be a constant $K\times 1$ vector and let $X$ be a $K\times 1$ random vector. Then,
$$\operatorname{Var}[a+X]=\operatorname{Var}[X]$$

Proof

This is a consequence of the fact that $\operatorname{E}[a+X]=a+\operatorname{E}[X]$ (by linearity of the expected value):
$$\begin{aligned}\operatorname{Var}[a+X] &=\operatorname{E}\left[ \left( a+X-\operatorname{E}[a+X]\right)\left( a+X-\operatorname{E}[a+X]\right)^{\top}\right]\\ &=\operatorname{E}\left[ \left( X-\operatorname{E}[X]\right)\left( X-\operatorname{E}[X]\right)^{\top}\right]\\ &=\operatorname{Var}[X]\end{aligned}$$
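A quick numerical sanity check (hypothetical shift vector): adding a constant vector to every draw leaves the sample covariance matrix unchanged:

```python
import numpy as np

rng = np.random.default_rng(3)
x = rng.standard_normal((20_000, 2))
a = np.array([10.0, -5.0])  # arbitrary constant shift

v_x = np.cov(x, rowvar=False)
v_shifted = np.cov(x + a, rowvar=False)
print(np.allclose(v_x, v_shifted))  # True
```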

Multiplication by constant matrices

Let $b$ be a constant $M\times K$ matrix and let $X$ be a $K\times 1$ random vector. Then,
$$\operatorname{Var}[bX]=b\operatorname{Var}[X]b^{\top}$$

Proof

This is easily proved using the fact that $\operatorname{E}[bX]=b\operatorname{E}[X]$ (by linearity of the expected value):
$$\begin{aligned}\operatorname{Var}[bX] &=\operatorname{E}\left[ \left( bX-\operatorname{E}[bX]\right)\left( bX-\operatorname{E}[bX]\right)^{\top}\right]\\ &=\operatorname{E}\left[ b\left( X-\operatorname{E}[X]\right)\left( X-\operatorname{E}[X]\right)^{\top}b^{\top}\right]\\ &=b\operatorname{E}\left[ \left( X-\operatorname{E}[X]\right)\left( X-\operatorname{E}[X]\right)^{\top}\right] b^{\top}\\ &=b\operatorname{Var}[X]b^{\top}\end{aligned}$$
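The identity also holds exactly for sample covariance matrices, as the following sketch (hypothetical matrix $b$) illustrates:

```python
import numpy as np

rng = np.random.default_rng(4)
x = rng.standard_normal((200_000, 3))               # K = 3
b = np.array([[1.0, 2.0, 0.0], [0.0, -1.0, 3.0]])   # M x K with M = 2

v_x = np.cov(x, rowvar=False)
v_bx = np.cov(x @ b.T, rowvar=False)                # covariance of bX

print(np.allclose(v_bx, b @ v_x @ b.T))             # True (exactly, on the sample)
```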

Linear transformations

Let $a\in \mathbb{R}^{M}$ be a constant $M\times 1$ vector, $b$ be a constant $M\times K$ matrix and $X$ a $K\times 1$ random vector. Then, combining the two properties above, one obtains
$$\operatorname{Var}[a+bX]=b\operatorname{Var}[X]b^{\top}$$
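A short check of the combined property (hypothetical $a$ and $b$); the constant shift drops out, as the Addition to constant vectors property predicts:

```python
import numpy as np

rng = np.random.default_rng(5)
x = rng.standard_normal((100_000, 3))
a = np.array([1.0, 2.0])                           # M x 1 shift
b = np.array([[0.5, 1.0, 0.0], [2.0, 0.0, 1.0]])   # M x K

v_x = np.cov(x, rowvar=False)
v_y = np.cov(a + x @ b.T, rowvar=False)            # covariance of a + bX
print(np.allclose(v_y, b @ v_x @ b.T))             # True: the shift a drops out
```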

Symmetry

The covariance matrix $\operatorname{Var}[X]$ is a symmetric matrix, that is, it is equal to its transpose:
$$\operatorname{Var}[X]=\operatorname{Var}[X]^{\top}$$
This follows from the symmetry of covariance: the $(i,j)$-th entry $\operatorname{Cov}[X_i,X_j]$ equals the $(j,i)$-th entry $\operatorname{Cov}[X_j,X_i]$.

Positive semi-definiteness

The covariance matrix $\operatorname{Var}[X]$ is a positive semi-definite matrix, that is, for any constant $1\times K$ vector $a$:
$$a\operatorname{Var}[X]a^{\top}\geq 0$$
This is easily proved using the Multiplication by constant matrices property above:
$$a\operatorname{Var}[X]a^{\top}=\operatorname{Var}[aX]\geq 0$$
where the last inequality follows from the fact that variance is always non-negative.
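Both properties are easy to observe numerically. The sketch below (hypothetical data) checks symmetry, the sign of the eigenvalues, and the sign of an arbitrary quadratic form:

```python
import numpy as np

rng = np.random.default_rng(6)
x = rng.standard_normal((10_000, 4))
v = np.cov(x, rowvar=False)

print(np.allclose(v, v.T))                        # symmetric
print(np.all(np.linalg.eigvalsh(v) >= -1e-12))    # eigenvalues >= 0 (up to rounding)

a = rng.standard_normal(4)                        # arbitrary 1 x K vector
print(a @ v @ a >= 0)                             # quadratic form is non-negative
```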

Covariance between linear transformations

Let $a$ and $b$ be two constant $1\times K$ vectors and $X$ a $K\times 1$ random vector. Then, the covariance between the two linear transformations $aX$ and $bX$ can be expressed as a function of the covariance matrix:
$$\operatorname{Cov}[aX,bX]=a\operatorname{Var}[X]b^{\top}$$

Proof

This can be proved as follows:
$$\begin{aligned}\operatorname{Cov}[aX,bX] &=\operatorname{E}\left[ \left( aX-\operatorname{E}[aX]\right)\left( bX-\operatorname{E}[bX]\right)\right]\\ &=\operatorname{E}\left[ a\left( X-\operatorname{E}[X]\right)\left( X-\operatorname{E}[X]\right)^{\top}b^{\top}\right]\\ &=a\operatorname{E}\left[ \left( X-\operatorname{E}[X]\right)\left( X-\operatorname{E}[X]\right)^{\top}\right] b^{\top}\\ &=a\operatorname{Var}[X]b^{\top}\end{aligned}$$
where the second equality uses the fact that $bX-\operatorname{E}[bX]$ is a scalar and is therefore equal to its own transpose.
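This identity, too, holds exactly for sample covariances. A minimal check (hypothetical vectors $a$ and $b$):

```python
import numpy as np

rng = np.random.default_rng(7)
x = rng.standard_normal((100_000, 3))
a = np.array([1.0, -1.0, 2.0])    # 1 x K
b = np.array([0.5, 0.0, 1.0])     # 1 x K

v = np.cov(x, rowvar=False)
# Sample covariance between the scalar variables aX and bX.
c = np.cov(x @ a, x @ b)[0, 1]
print(np.allclose(c, a @ v @ b))  # True
```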

Cross-covariance

The term covariance matrix is sometimes also used to refer to the matrix of covariances between the elements of two vectors.

Let $X$ be a $K\times 1$ random vector and $Y$ be an $L\times 1$ random vector. The covariance matrix between $X$ and $Y$, or cross-covariance between $X$ and $Y$, is denoted by $\operatorname{Cov}[X,Y]$. It is defined as follows:
$$\operatorname{Cov}[X,Y]=\operatorname{E}\left[ \left( X-\operatorname{E}[X]\right)\left( Y-\operatorname{E}[Y]\right)^{\top}\right]$$
provided the above expected values exist and are well-defined.

It is a multivariate generalization of the definition of covariance between two scalar random variables.

Let $X_1$, ..., $X_K$ denote the $K$ components of the vector $X$ and $Y_1$, ..., $Y_L$ denote the $L$ components of the vector $Y$. From the definition of $\operatorname{Cov}[X,Y]$, it can easily be seen that $\operatorname{Cov}[X,Y]$ is a $K\times L$ matrix with the following structure:
$$\operatorname{Cov}[X,Y]=\begin{bmatrix} \operatorname{Cov}[X_1,Y_1] & \cdots & \operatorname{Cov}[X_1,Y_L]\\ \vdots & \ddots & \vdots\\ \operatorname{Cov}[X_K,Y_1] & \cdots & \operatorname{Cov}[X_K,Y_L]\end{bmatrix}$$
Note that $\operatorname{Cov}[X,Y]$ is not the same as $\operatorname{Cov}[Y,X]$. In fact, $\operatorname{Cov}[Y,X]$ is an $L\times K$ matrix equal to the transpose of $\operatorname{Cov}[X,Y]$:
$$\operatorname{Cov}[Y,X]=\operatorname{Cov}[X,Y]^{\top}$$
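The shape of the cross-covariance matrix and the transpose relation can be checked numerically (hypothetical data; $Y$ is built to be correlated with $X$):

```python
import numpy as np

rng = np.random.default_rng(8)
x = rng.standard_normal((50_000, 3))                     # K = 3
y = rng.standard_normal((50_000, 2)) + 0.5 * x[:, :2]    # L = 2, correlated with X

# Cross-covariance as the average outer product of centered vectors.
cxy = (x - x.mean(axis=0)).T @ (y - y.mean(axis=0)) / len(x)   # K x L
cyx = (y - y.mean(axis=0)).T @ (x - x.mean(axis=0)) / len(x)   # L x K

print(cxy.shape, cyx.shape)     # (3, 2) (2, 3)
print(np.allclose(cyx, cxy.T))  # True
```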

Solved exercises

Below you can find some exercises with explained solutions.

Exercise 1

Let $X$ be a $2\times 1$ random vector and denote its components by $X_1$ and $X_2$. The covariance matrix of $X$ is
$$\operatorname{Var}[X]=\begin{bmatrix} \sigma_{1}^{2} & \sigma_{12}\\ \sigma_{12} & \sigma_{2}^{2}\end{bmatrix}$$
Compute the variance of the random variable $Y$ defined as
$$Y=c_{1}X_{1}+c_{2}X_{2}$$
where $c_1$ and $c_2$ are two constants.

Solution

By using a matrix notation, $Y$ can be written as
$$Y=bX$$
where we have defined
$$b=\begin{bmatrix} c_{1} & c_{2}\end{bmatrix}$$
Therefore, the variance of $Y$ can be computed using the formula for the covariance matrix of a linear transformation:
$$\operatorname{Var}[Y]=b\operatorname{Var}[X]b^{\top}=\begin{bmatrix} c_{1} & c_{2}\end{bmatrix}\begin{bmatrix} \sigma_{1}^{2} & \sigma_{12}\\ \sigma_{12} & \sigma_{2}^{2}\end{bmatrix}\begin{bmatrix} c_{1}\\ c_{2}\end{bmatrix}=c_{1}^{2}\sigma_{1}^{2}+2c_{1}c_{2}\sigma_{12}+c_{2}^{2}\sigma_{2}^{2}$$
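A numerical spot-check of the result, with hypothetical values for the covariance matrix and the coefficients:

```python
import numpy as np

V = np.array([[4.0, 1.5], [1.5, 9.0]])   # hypothetical covariance matrix
c1, c2 = 2.0, -1.0                       # hypothetical coefficients
b = np.array([c1, c2])

var_matrix = b @ V @ b                   # b Var[X] b^T
var_expanded = c1**2 * V[0, 0] + 2 * c1 * c2 * V[0, 1] + c2**2 * V[1, 1]
print(np.isclose(var_matrix, var_expanded))  # True
```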

Exercise 2

Let $X$ be a $3\times 1$ random vector and denote its components by $X_1$, $X_2$ and $X_3$. The covariance matrix of $X$ is
$$\operatorname{Var}[X]=\begin{bmatrix} \sigma_{11} & \sigma_{12} & \sigma_{13}\\ \sigma_{12} & \sigma_{22} & \sigma_{23}\\ \sigma_{13} & \sigma_{23} & \sigma_{33}\end{bmatrix}$$
Compute the following covariance:
$$\operatorname{Cov}\left[ X_{1}+X_{2},X_{2}+X_{3}\right]$$

Solution

Using the bilinearity of the covariance operator, we obtain
$$\begin{aligned}\operatorname{Cov}\left[ X_{1}+X_{2},X_{2}+X_{3}\right] &=\operatorname{Cov}[X_1,X_2]+\operatorname{Cov}[X_1,X_3]+\operatorname{Cov}[X_2,X_2]+\operatorname{Cov}[X_2,X_3]\\ &=\sigma_{12}+\sigma_{13}+\sigma_{22}+\sigma_{23}\end{aligned}$$
The same result can be obtained using the formula for the covariance between two linear transformations. Defining
$$a=\begin{bmatrix} 1 & 1 & 0\end{bmatrix},\quad b=\begin{bmatrix} 0 & 1 & 1\end{bmatrix}$$
we have
$$\operatorname{Cov}\left[ X_{1}+X_{2},X_{2}+X_{3}\right]=\operatorname{Cov}[aX,bX]=a\operatorname{Var}[X]b^{\top}=\sigma_{12}+\sigma_{13}+\sigma_{22}+\sigma_{23}$$
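A numerical spot-check with a hypothetical covariance matrix:

```python
import numpy as np

V = np.array([[2.0, 0.5, 0.1],
              [0.5, 3.0, 0.7],
              [0.1, 0.7, 1.0]])   # hypothetical covariance matrix
a = np.array([1.0, 1.0, 0.0])
b = np.array([0.0, 1.0, 1.0])

lhs = a @ V @ b                                  # a V b^T
rhs = V[0, 1] + V[0, 2] + V[1, 1] + V[1, 2]      # bilinearity expansion
print(np.isclose(lhs, rhs))  # True
```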

Exercise 3

Let $X$ be a $K\times 1$ random vector whose covariance matrix is equal to the identity matrix:
$$\operatorname{Var}[X]=I$$
Define a new random vector $Y$ as follows:
$$Y=AX$$
where $A$ is a $K\times K$ matrix of constants such that
$$AA^{\top}=I$$
that is, $A$ is an orthogonal matrix. Derive the covariance matrix of $Y$.

Solution

By the formula for the covariance matrix of a linear transformation, we have
$$\operatorname{Var}[Y]=A\operatorname{Var}[X]A^{\top}=AIA^{\top}=AA^{\top}=I$$
In other words, an orthogonal transformation of a random vector with identity covariance matrix again has identity covariance matrix.
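A numerical check (hypothetical data; the orthogonal matrix is obtained from a QR decomposition of a random matrix):

```python
import numpy as np

rng = np.random.default_rng(9)
K = 4
x = rng.standard_normal((200_000, K))     # components with Var[X] close to I

# Build an orthogonal matrix A from a QR decomposition of a random matrix.
A, _ = np.linalg.qr(rng.standard_normal((K, K)))
print(np.allclose(A @ A.T, np.eye(K)))    # A is orthogonal

y = x @ A.T                               # Y = AX for each draw
print(np.allclose(np.cov(y, rowvar=False), np.eye(K), atol=0.02))  # close to I
```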
