Independent random variables

by Marco Taboga, PhD

Two random variables are independent if they convey no information about each other and, as a consequence, receiving information about one of the two does not change our assessment of the probability distribution of the other.

This lecture provides a formal definition of independence and discusses how to verify whether two or more random variables are independent.

Definition

Recall (see the lecture entitled Independent events) that two events $A$ and $B$ are independent if and only if
$$P(A \cap B) = P(A) \, P(B).$$

This definition is extended to random variables as follows.

Definition Two random variables X and Y are said to be independent if and only if
$$P(\{X \in A\} \cap \{Y \in B\}) = P(X \in A) \, P(Y \in B)$$
for any couple of events $\{X \in A\}$ and $\{Y \in B\}$, where $A \subseteq \mathbb{R}$ and $B \subseteq \mathbb{R}$.

In other words, two random variables are independent if and only if the events related to those random variables are independent events.

The independence between two random variables is also called statistical independence.

Independence criterion

Checking the independence of all possible couples of events related to two random variables can be very difficult. This is the reason why the above definition is seldom used to verify whether two random variables are independent. The following criterion is more often used instead.

Proposition Two random variables X and Y are independent if and only if
$$F_{XY}(x,y) = F_X(x) \, F_Y(y) \quad \text{for all } x, y \in \mathbb{R},$$
where $F_{XY}(x,y)$ is their joint distribution function and $F_X(x)$ and $F_Y(y)$ are their marginal distribution functions.

Proof

By using some facts from measure theory (not proved here), it is possible to demonstrate that, when checking the condition
$$P(\{X \in A\} \cap \{Y \in B\}) = P(X \in A) \, P(Y \in B),$$
it is sufficient to confine attention to sets $A$ and $B$ taking the form
$$A = (-\infty, x], \quad B = (-\infty, y].$$
Thus, two random variables are independent if and only if
$$P(\{X \leq x\} \cap \{Y \leq y\}) = P(X \leq x) \, P(Y \leq y).$$
Using the definitions of joint and marginal distribution function, this condition can be written as
$$F_{XY}(x,y) = F_X(x) \, F_Y(y).$$

Example Let X and Y be two random variables with marginal distribution functions [eq13] and joint distribution function [eq14]. X and Y are independent if and only if
$$F_{XY}(x,y) = F_X(x) \, F_Y(y),$$
which is straightforward to verify. When $x < 0$ or $y < 0$, then
$$F_{XY}(x,y) = 0 = F_X(x) \, F_Y(y).$$
When $x \geq 0$ and $y \geq 0$, then [eq17].
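As a complement, the criterion can also be checked empirically on simulated data. The following Python sketch compares the empirical joint distribution function with the product of the empirical marginals at one point; the exponential samples, the dependent pair, the sample size and the evaluation point are illustrative assumptions, not taken from the example above.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000

# Hypothetical data: an independent pair of Exponential(1) draws, and a
# clearly dependent pair in which Y is X plus a little noise.
x_ind = rng.exponential(1.0, n)
y_ind = rng.exponential(1.0, n)
x_dep = rng.exponential(1.0, n)
y_dep = x_dep + rng.normal(0.0, 0.1, n)

def factorization_gap(x, y, a, b):
    """|F_XY(a, b) - F_X(a) * F_Y(b)| estimated from the sample."""
    joint = np.mean((x <= a) & (y <= b))         # empirical joint CDF
    product = np.mean(x <= a) * np.mean(y <= b)  # product of empirical marginals
    return abs(joint - product)

print("independent pair:", factorization_gap(x_ind, y_ind, 1.0, 1.0))  # close to 0
print("dependent pair:  ", factorization_gap(x_dep, y_dep, 1.0, 1.0))  # visibly > 0
```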

Independence between discrete random variables

When the two variables, taken together, form a discrete random vector, independence can also be verified using the following proposition:

Proposition Two random variables X and Y, forming a discrete random vector, are independent if and only if
$$p_{XY}(x,y) = p_X(x) \, p_Y(y) \quad \text{for all } x, y,$$
where $p_{XY}(x,y)$ is their joint probability mass function and $p_X(x)$ and $p_Y(y)$ are their marginal probability mass functions.

The following example illustrates how this criterion can be used.

Example Let [eq22] be a discrete random vector with support [eq23]. Let its joint probability mass function be [eq24]. In order to verify whether X and Y are independent, we first need to derive the marginal probability mass functions of X and Y. The support of X is [eq25] and the support of Y is [eq26]. We need to compute the probability of each element of the support of X: [eq27]. Thus, the probability mass function of X is [eq28]. We need to compute the probability of each element of the support of Y: [eq29]. Thus, the probability mass function of Y is [eq30]. The product of the marginal probability mass functions is [eq31], which is obviously different from [eq32]. Therefore, X and Y are not independent.
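The computations in the example (deriving the marginals by summing the joint pmf and then comparing products) can be mechanized. Below is a minimal Python sketch under assumed data: the 2x2 support and the pmf values are hypothetical, not the ones used in the example above.

```python
from itertools import product

# Hypothetical joint pmf on the support {0, 1} x {0, 1} (illustrative values).
joint_pmf = {
    (0, 0): 0.10, (0, 1): 0.30,
    (1, 0): 0.20, (1, 1): 0.40,
}

# Marginal pmfs: sum the joint pmf over the other variable.
support_x = sorted({x for x, _ in joint_pmf})
support_y = sorted({y for _, y in joint_pmf})
p_x = {x: sum(joint_pmf[(x, y)] for y in support_y) for x in support_x}
p_y = {y: sum(joint_pmf[(x, y)] for x in support_x) for y in support_y}

# X and Y are independent iff the joint pmf factorizes at every support point.
independent = all(
    abs(joint_pmf[(x, y)] - p_x[x] * p_y[y]) < 1e-12
    for x, y in product(support_x, support_y)
)
print(p_x, p_y, independent)  # with these values the factorization fails
```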

Independence between continuous random variables

When the two variables, taken together, form a continuous random vector, independence can also be verified by means of the following proposition.

Proposition Two random variables X and Y, forming a continuous random vector, are independent if and only if
$$f_{XY}(x,y) = f_X(x) \, f_Y(y),$$
where $f_{XY}(x,y)$ is their joint probability density function and $f_X(x)$ and $f_Y(y)$ are their marginal probability density functions.

The following example illustrates how this criterion can be used.

Example Let the joint probability density function of X and Y be [eq37]. Its marginals are [eq38] and [eq39]. Verifying that $f_{XY}(x,y) = f_X(x) \, f_Y(y)$ is straightforward. When [eq41] or [eq42], then [eq43]. When [eq44] and [eq45], then [eq46].
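The density criterion can be checked in the same spirit by integrating out one variable at a time. The sketch below uses an assumed joint density (the product of two unit exponentials), chosen only for illustration and not taken from the example above.

```python
import numpy as np
from scipy.integrate import quad

# Hypothetical joint density: f_XY(x, y) = exp(-x - y) on [0, +inf) x [0, +inf).
def f_xy(x, y):
    return np.exp(-x - y) if (x >= 0 and y >= 0) else 0.0

# Marginal densities obtained by integrating out the other variable.
def f_x(x):
    return quad(lambda y: f_xy(x, y), 0.0, np.inf)[0]

def f_y(y):
    return quad(lambda x: f_xy(x, y), 0.0, np.inf)[0]

# Check the factorization f_XY(x, y) = f_X(x) * f_Y(y) at a few points.
for x, y in [(0.5, 0.5), (1.0, 2.0), (3.0, 0.1)]:
    print((x, y), f_xy(x, y), f_x(x) * f_y(y))  # the two values should coincide
```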

More details

The following subsections contain more details about statistical independence.

Mutually independent random variables

The definition of mutually independent random variables extends the definition of mutually independent events to random variables.

Definition We say that n random variables X_1, ..., X_n are mutually independent (or jointly independent) if and only if
$$P\left( \bigcap_{j=1}^{k} \{ X_{i_j} \in A_{i_j} \} \right) = \prod_{j=1}^{k} P(X_{i_j} \in A_{i_j})$$
for any sub-collection of k random variables $X_{i_1}$, ..., $X_{i_k}$ (where $k \leq n$) and for any collection of events $\{X_{i_1} \in A_{i_1}\}$, ..., $\{X_{i_k} \in A_{i_k}\}$, where $A_{i_1} \subseteq \mathbb{R}$, ..., $A_{i_k} \subseteq \mathbb{R}$.

In other words, n random variables are mutually independent if the events related to those random variables are mutually independent events.

Denote by X a random vector whose components are X_1, ..., X_n. The above condition for mutual independence can be replaced by any one of the following equivalent conditions (a sketch of the discrete condition for three variables follows the list):

  1. in general, by a condition on the joint distribution function of X: $F_X(x_1, \ldots, x_n) = F_{X_1}(x_1) \cdots F_{X_n}(x_n)$;

  2. for discrete random variables, by a condition on the joint probability mass function of X: $p_X(x_1, \ldots, x_n) = p_{X_1}(x_1) \cdots p_{X_n}(x_n)$;

  3. for continuous random variables, by a condition on the joint probability density function of X: $f_X(x_1, \ldots, x_n) = f_{X_1}(x_1) \cdots f_{X_n}(x_n)$.
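To see why these conditions involve the joint function of all n variables at once, consider the following Python sketch. The fair-coin/XOR construction is a standard illustrative choice (not taken from the text): every pair of variables factorizes, yet the full joint probability mass function does not, so the three variables are not mutually independent.

```python
from itertools import product

# Hypothetical example: X1, X2 are independent fair coin flips and X3 = X1 XOR X2.
joint = {}
for x1, x2 in product([0, 1], repeat=2):
    key = (x1, x2, x1 ^ x2)
    joint[key] = joint.get(key, 0.0) + 0.25

def marginal(i, v):
    """Marginal pmf of the i-th variable (0-based) evaluated at v."""
    return sum(p for k, p in joint.items() if k[i] == v)

# Full factorization (condition 2 above) fails, e.g., at the point (0, 0, 0):
point = (0, 0, 0)
print(joint.get(point, 0.0))                             # 0.25
print(marginal(0, 0) * marginal(1, 0) * marginal(2, 0))  # 0.125

# ...even though each pair of variables factorizes, e.g. (X1, X2) at (0, 0):
pair_01 = sum(p for k, p in joint.items() if k[0] == 0 and k[1] == 0)
print(pair_01, marginal(0, 0) * marginal(1, 0))          # 0.25 and 0.25
```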

Mutual independence via expectations

It can be proved that n random variables X_1, ..., X_n are mutually independent if and only if
$$E\left[ g_1(X_1) \, g_2(X_2) \cdots g_n(X_n) \right] = E\left[ g_1(X_1) \right] \, E\left[ g_2(X_2) \right] \cdots E\left[ g_n(X_n) \right]$$
for any n functions $g_1$, ..., $g_n$ such that the above expected values exist and are well-defined.
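A quick Monte Carlo sketch of this characterization for n = 2; the standard normal samples and the test functions g1 and g2 are arbitrary illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 500_000

# Two independent standard normal samples and two test functions.
x1 = rng.standard_normal(n)
x2 = rng.standard_normal(n)
g1 = lambda t: np.cos(t)
g2 = lambda t: t ** 2

# Independence: the expectation of the product factorizes (up to Monte Carlo error).
print(np.mean(g1(x1) * g2(x2)), np.mean(g1(x1)) * np.mean(g2(x2)))  # close to each other

# Dependence (here the same variable is used twice): the factorization breaks down.
print(np.mean(g1(x1) * g2(x1)), np.mean(g1(x1)) * np.mean(g2(x1)))  # visibly different
```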

Independence and zero covariance

If two random variables X_1 and X_2 are independent, then their covariance is zero:
$$\operatorname{Cov}\left[ X_1, X_2 \right] = 0.$$

Proof

This is an immediate consequence of the fact that, if X_1 and X_2 are independent, then
$$E\left[ g_1(X_1) \, g_2(X_2) \right] = E\left[ g_1(X_1) \right] \, E\left[ g_2(X_2) \right]$$
(see the Mutual independence via expectations property above). When $g_1$ and $g_2$ are identity functions ($g_1(X_1) = X_1$ and $g_2(X_2) = X_2$), then
$$E\left[ X_1 X_2 \right] = E\left[ X_1 \right] \, E\left[ X_2 \right].$$
Therefore, by the covariance formula:
$$\operatorname{Cov}\left[ X_1, X_2 \right] = E\left[ X_1 X_2 \right] - E\left[ X_1 \right] E\left[ X_2 \right] = E\left[ X_1 \right] E\left[ X_2 \right] - E\left[ X_1 \right] E\left[ X_2 \right] = 0.$$

The converse is not true: two random variables that have zero covariance are not necessarily independent.
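A short numerical illustration of this point, using a standard counterexample that is not taken from the text: if X is standard normal and Y = X², then Y is a deterministic function of X (so the two are clearly dependent), yet their covariance is zero because Cov[X, X²] = E[X³] = 0.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 1_000_000

x = rng.standard_normal(n)
y = x ** 2  # completely determined by x, hence not independent of it

# The sample covariance is (approximately) zero ...
print("covariance:", np.cov(x, y)[0, 1])

# ... but probabilities do not factorize, revealing the dependence.
p_joint = np.mean((np.abs(x) <= 1.0) & (y <= 1.0))       # these are the same event
p_product = np.mean(np.abs(x) <= 1.0) * np.mean(y <= 1.0)
print("P(|X|<=1, Y<=1) vs product:", p_joint, p_product)  # about 0.68 vs 0.47
```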

Independent random vectors

The above notions are easily generalized to the case in which X and Y are two random vectors, having dimensions $K_X \times 1$ and $K_Y \times 1$ respectively. Denote their joint distribution functions by $F_X(x)$ and $F_Y(y)$, and the joint distribution function of X and Y together by $F_{XY}(x,y)$. Also, if the two vectors are discrete or continuous, replace $F$ with $p$ or $f$ to denote the corresponding probability mass or density functions.

Definition Two random vectors X and Y are independent if and only if one of the following equivalent conditions is satisfied:

  1. Condition 1: $P(\{X \in A\} \cap \{Y \in B\}) = P(X \in A) \, P(Y \in B)$ for any couple of events $\{X \in A\}$ and $\{Y \in B\}$, where $A \subseteq \mathbb{R}^{K_X}$ and $B \subseteq \mathbb{R}^{K_Y}$;

  2. Condition 2: $F_{XY}(x,y) = F_X(x) \, F_Y(y)$ for any $x \in \mathbb{R}^{K_X}$ and $y \in \mathbb{R}^{K_Y}$ (replace $F$ with $p$ or $f$ when the distributions are discrete or continuous respectively);

  3. Condition 3: $E\left[ g_1(X) \, g_2(Y) \right] = E\left[ g_1(X) \right] \, E\left[ g_2(Y) \right]$ for any functions $g_1: \mathbb{R}^{K_X} \rightarrow \mathbb{R}$ and $g_2: \mathbb{R}^{K_Y} \rightarrow \mathbb{R}$ such that the above expected values exist and are well-defined.

Mutually independent random vectors

The definition of mutual independence also extends in a straightforward manner to random vectors.

Definition We say that n random vectors X_1, ..., X_n are mutually independent (or jointly independent) if and only if
$$P\left( \bigcap_{j=1}^{k} \{ X_{i_j} \in A_{i_j} \} \right) = \prod_{j=1}^{k} P(X_{i_j} \in A_{i_j})$$
for any sub-collection of k random vectors $X_{i_1}$, ..., $X_{i_k}$ (where $k \leq n$) and for any collection of events $\{X_{i_1} \in A_{i_1}\}$, ..., $\{X_{i_k} \in A_{i_k}\}$.

All the equivalent conditions for the joint independence of a set of random variables (see above) also apply, with obvious modifications, to random vectors.

Solved exercises

Below you can find some exercises with explained solutions.

Exercise 1

Consider two random variables X and Y having marginal distribution functions [eq73]. If X and Y are independent, what is their joint distribution function?

Solution

For X and Y to be independent, their joint distribution function must be equal to the product of their marginal distribution functions: [eq74]

Exercise 2

Let [eq75] be a discrete random vector with support [eq76]. Let its joint probability mass function be [eq77]. Are X and Y independent?

Solution

In order to verify whether X and Y are independent, we first need to derive the marginal probability mass functions of X and Y. The support of X is [eq78] and the support of Y is [eq79]. We need to compute the probability of each element of the support of X: [eq80]. Thus, the probability mass function of X is [eq81]. We need to compute the probability of each element of the support of Y: [eq82]. Thus, the probability mass function of Y is [eq83]. The product of the marginal probability mass functions is [eq84], which is equal to [eq32]. Therefore, X and Y are independent.

Exercise 3

Let [eq86] be a continuous random vector with support [eq87] and let its joint probability density function be [eq88]. Are X and Y independent?

Solution

The support of Y is [eq89]. When [eq90], the marginal probability density function of Y is 0, while, when [eq91], the marginal probability density function of Y is [eq92]. Thus, summing up, the marginal probability density function of Y is [eq93]. The support of X is [eq94]. When [eq95], the marginal probability density function of X is 0, while, when [eq96], the marginal probability density function of X is [eq97]. Thus, the marginal probability density function of X is [eq98]. Verifying that $f_{XY}(x,y) = f_X(x) \, f_Y(y)$ is straightforward. When [eq95] or [eq101], then [eq43]. When [eq103] and [eq104], then [eq105]. Thus, X and Y are independent.

How to cite

Please cite as:

Taboga, Marco (2021). "Independent random variables", Lectures on probability theory and mathematical statistics. Kindle Direct Publishing. Online appendix. https://www.statlect.com/fundamentals-of-probability/independent-random-variables.
