
Relations among modes of convergence

by Marco Taboga, PhD

In the previous lectures, we have introduced several notions of convergence of a sequence of random variables (also called modes of convergence). There are several relations among the various modes of convergence, which are discussed below and are summarized by the following diagram (an arrow denotes implication in the arrow's direction):

$$\begin{array}{ccccc}
\text{almost sure} & & & & \text{mean square}\\
& \searrow & & \swarrow & \\
& & \text{in probability} & & \\
& & \downarrow & & \\
& & \text{in distribution} & &
\end{array}$$

Almost sure convergence implies convergence in probability

If a sequence of random variables $\{X_n\}$ converges almost surely to a random variable $X$, then $\{X_n\}$ also converges in probability to $X$.

Proof

See, e.g., Resnick (1999).
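
The following is a minimal numerical sketch (ours, not part of the proof) of this implication at work. We take $X_n$ to be the sample mean of $n$ i.i.d. Uniform(0,1) draws, which converges almost surely to $1/2$ by the strong law of large numbers, and we estimate by Monte Carlo the probability $\mathrm{P}\left(|X_n - 1/2| > \varepsilon\right)$, which the theorem says must go to zero; the tolerance, sample sizes, and number of replications are arbitrary choices.

```python
import numpy as np

# Illustrative sketch: X_n is the sample mean of n i.i.d. Uniform(0,1) draws.
# By the strong law of large numbers, X_n converges almost surely to X = 1/2,
# so by the theorem above it also converges in probability to 1/2, i.e.
# P(|X_n - 1/2| > eps) -> 0. We estimate this probability by Monte Carlo.
rng = np.random.default_rng(0)
eps = 0.05            # arbitrary tolerance
n_replications = 10_000

for n in (10, 100, 1000):
    # One sample mean per replication, each based on n uniform draws.
    sample_means = rng.uniform(0.0, 1.0, size=(n_replications, n)).mean(axis=1)
    prob_estimate = np.mean(np.abs(sample_means - 0.5) > eps)
    print(f"n = {n:5d}   estimated P(|X_n - 1/2| > {eps}) = {prob_estimate:.4f}")
```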

Convergence in probability implies convergence in distribution

If a sequence of random variables $\{X_n\}$ converges in probability to a random variable $X$, then $\{X_n\}$ also converges in distribution to $X$.

Proof

See, for example, Resnick (1999).
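
As a quick numerical illustration (ours, not part of the proof), one can build a sequence that clearly converges in probability and check that its distribution function approaches that of the limit. In the sketch below we take $X$ standard normal and $X_n = X + Z_n/n$ with $Z_n$ standard normal and independent of $X$; the evaluation point and sample size are arbitrary choices.

```python
import math
import numpy as np

# Illustrative sketch: X ~ N(0,1) and X_n = X + Z_n / n with Z_n ~ N(0,1)
# independent of X, so X_n converges in probability to X. Convergence in
# distribution means F_{X_n}(x) -> F_X(x) at every continuity point x; we
# compare the empirical CDF of X_n with the exact standard normal CDF at x = 1.
rng = np.random.default_rng(0)
n_draws = 100_000
x_point = 1.0
normal_cdf = 0.5 * (1.0 + math.erf(x_point / math.sqrt(2.0)))  # exact F_X(1)

X = rng.standard_normal(n_draws)
for n in (1, 10, 100, 1000):
    X_n = X + rng.standard_normal(n_draws) / n
    empirical_cdf = np.mean(X_n <= x_point)
    print(f"n = {n:4d}   empirical F_Xn({x_point}) = {empirical_cdf:.4f}   "
          f"exact F_X({x_point}) = {normal_cdf:.4f}")
```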

Almost sure convergence implies convergence in distribution

If a sequence of random variables $\{X_n\}$ converges almost surely to a random variable $X$, then $\{X_n\}$ also converges in distribution to $X$.

Proof

This is obtained by putting together the previous two relations: almost sure convergence implies convergence in probability, which in turn implies convergence in distribution.

Mean square convergence implies convergence in probability

If a sequence of random variables $\{X_n\}$ converges in mean square to a random variable $X$, then $\{X_n\}$ also converges in probability to $X$.

Proof

We can apply Markov's inequality to the generic term $(X_n - X)^2$ of the sequence of squared differences:
$$\mathrm{P}\left((X_n - X)^2 \geq c^2\right) \leq \frac{\mathrm{E}\left[(X_n - X)^2\right]}{c^2}$$
for any strictly positive real number $c$. Taking the square root of both sides of the inequality inside the probability on the left-hand side, we obtain
$$\mathrm{P}\left(|X_n - X| \geq c\right) \leq \frac{\mathrm{E}\left[(X_n - X)^2\right]}{c^2}$$
Taking limits on both sides, we get
$$\lim_{n\to\infty} \mathrm{P}\left(|X_n - X| \geq c\right) \leq \lim_{n\to\infty} \frac{\mathrm{E}\left[(X_n - X)^2\right]}{c^2} = 0$$
where we have used the fact that, by the very definition of convergence in mean square,
$$\lim_{n\to\infty} \mathrm{E}\left[(X_n - X)^2\right] = 0$$
Since, by the definition of probability, it must be that
$$\mathrm{P}\left(|X_n - X| \geq c\right) \geq 0$$
it must also be that
$$\lim_{n\to\infty} \mathrm{P}\left(|X_n - X| \geq c\right) = 0$$
Note that this holds for any arbitrarily small $c$. By the definition of convergence in probability, this means that $X_n$ converges in probability to $X$ (if you are wondering about strict and weak inequalities here and in the definition of convergence in probability, note that $\mathrm{P}\left(|X_n - X| \geq c\right) \to 0$ implies $\mathrm{P}\left(|X_n - X| > \varepsilon\right) \to 0$ for any strictly positive $\varepsilon < c$).
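
The bound used in the proof is easy to check numerically. The sketch below (an illustration of ours, not part of the argument) takes $X = 0$ and $X_n$ normal with mean zero and variance $1/n$, so that $\mathrm{E}\left[(X_n - X)^2\right] = 1/n \to 0$, and estimates both sides of the inequality $\mathrm{P}\left(|X_n - X| \geq c\right) \leq \mathrm{E}\left[(X_n - X)^2\right]/c^2$ by Monte Carlo; the value of $c$ and the sample sizes are arbitrary.

```python
import numpy as np

# Illustrative sketch of the Markov-inequality bound used in the proof:
# X = 0 and X_n ~ N(0, 1/n), so E[(X_n - X)^2] = 1/n -> 0 (mean square
# convergence). The bound P(|X_n - X| >= c) <= E[(X_n - X)^2] / c^2 then
# forces P(|X_n - X| >= c) -> 0. Both sides are estimated by Monte Carlo.
rng = np.random.default_rng(0)
c = 0.1               # arbitrary strictly positive threshold
n_draws = 200_000

for n in (10, 100, 1000, 10_000):
    X_n = rng.normal(loc=0.0, scale=1.0 / np.sqrt(n), size=n_draws)
    lhs = np.mean(np.abs(X_n) >= c)        # estimate of P(|X_n - X| >= c)
    rhs = np.mean(X_n ** 2) / c ** 2       # estimate of E[(X_n - X)^2] / c^2
    print(f"n = {n:6d}   P(|X_n| >= {c}) = {lhs:.4f}   bound = {rhs:.4f}")
```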

Mean square convergence implies convergence in distribution

If a sequence of random variables $\{X_n\}$ converges in mean square to a random variable $X$, then $\{X_n\}$ also converges in distribution to $X$.

Proof

This is obtained by putting together the previous two relations: mean square convergence implies convergence in probability, which in turn implies convergence in distribution.

References

Resnick, S. I. (1999) A Probability Path, Birkhäuser.

How to cite

Please cite as:

Taboga, Marco (2021). "Relations among modes of convergence", Lectures on probability theory and mathematical statistics. Kindle Direct Publishing. Online appendix. https://www.statlect.com/asymptotic-theory/relations-among-modes-of-convergence.
