
Continuous Mapping theorem

by Marco Taboga, PhD

The Continuous Mapping theorem states that stochastic convergence is preserved by continuous functions.


The problem

Suppose that a sequence of random vectors $\{X_{n}\}$ converges to a random vector $X$ (in probability, in distribution or almost surely).

Now, take a transformed sequence $\{g(X_{n})\}$, where $g$ is a function.

Under what conditions is $\{g(X_{n})\}$ also a convergent sequence?

The Continuous Mapping theorem states that stochastic convergence is preserved if $g$ is a continuous function.

The theorem

Here is a statement of the multivariate version of the Continuous Mapping theorem.

Proposition Let $\{X_{n}\}$ be a sequence of K-dimensional random vectors. Let $g:\mathbb{R}^{K}\rightarrow\mathbb{R}^{L}$ be a continuous function. Then,

$$X_{n}\overset{P}{\rightarrow}X\quad\Longrightarrow\quad g(X_{n})\overset{P}{\rightarrow}g(X)$$
$$X_{n}\overset{a.s.}{\rightarrow}X\quad\Longrightarrow\quad g(X_{n})\overset{a.s.}{\rightarrow}g(X)$$
$$X_{n}\overset{d}{\rightarrow}X\quad\Longrightarrow\quad g(X_{n})\overset{d}{\rightarrow}g(X)$$

where $\overset{P}{\rightarrow}$ denotes convergence in probability, $\overset{a.s.}{\rightarrow}$ denotes almost sure convergence and $\overset{d}{\rightarrow}$ denotes convergence in distribution.

Proof

See, e.g., Shao (2003).
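
As an illustration, here is a minimal simulation sketch of the convergence-in-probability case (not part of the original lecture; it assumes NumPy is available, and the sequence $X_{n}=X+Z_{n}/n$ and the choice $g=\exp$ are my own). The fraction of simulated paths on which $g(X_{n})$ is far from $g(X)$ should shrink as $n$ grows.

import numpy as np

rng = np.random.default_rng(0)
g = np.exp  # any continuous function would do

# One draw of the limit X per simulated path.
X = rng.normal(size=100_000)
for n in [1, 10, 100, 1000]:
    # X_n = X + Z_n / n converges in probability to X.
    X_n = X + rng.normal(size=X.shape) / n
    # The fraction of paths with |g(X_n) - g(X)| > 0.01 should shrink as n grows.
    print(n, np.mean(np.abs(g(X_n) - g(X)) > 0.01))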

Consequences

The next sections present some important consequences of the Continuous Mapping theorem.

Sums and products of sequences converging in probability

An important implication of the Continuous Mapping theorem is that arithmetic operations preserve convergence in probability.

Proposition If $X_{n}\overset{P}{\rightarrow}X$ and $Y_{n}\overset{P}{\rightarrow}Y$, then

$$X_{n}+Y_{n}\overset{P}{\rightarrow}X+Y$$
$$X_{n}Y_{n}\overset{P}{\rightarrow}XY$$

Proof

First of all, note that convergence in probability of $\{X_{n}\}$ and of $\{Y_{n}\}$ implies their joint convergence in probability (see the lecture entitled Convergence in probability), that is, their convergence as a vector:

$$\left(X_{n},Y_{n}\right)\overset{P}{\rightarrow}\left(X,Y\right)$$

Now, the sum and the product are continuous functions of the operands. Thus, for example,

$$g\left(x,y\right)=x+y$$

is a continuous function and, by using the Continuous Mapping theorem, we obtain

$$\operatorname*{plim}_{n\rightarrow\infty}\left(X_{n}+Y_{n}\right)=g\left(X,Y\right)=X+Y$$

where $\operatorname{plim}$ denotes a limit in probability.
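
The following simulation sketch is my own illustration of this proposition (it assumes NumPy; the distributions and thresholds are arbitrary choices). Here $X_{n}$ and $Y_{n}$ are sample means of i.i.d. exponential and uniform draws, so they converge in probability to 1 and 0.5 by the law of large numbers, and the sum and the product should concentrate around 1.5 and 0.5.

import numpy as np

rng = np.random.default_rng(1)
n_paths = 5_000
for n in [10, 100, 1_000]:
    # Sample means: X_n ->p 1 and Y_n ->p 0.5 by the law of large numbers.
    X_n = rng.exponential(scale=1.0, size=(n_paths, n)).mean(axis=1)
    Y_n = rng.uniform(0.0, 1.0, size=(n_paths, n)).mean(axis=1)
    # Fractions of paths far from the limits 1.5 and 0.5 should shrink as n grows.
    print(n,
          np.mean(np.abs(X_n + Y_n - 1.5) > 0.05),
          np.mean(np.abs(X_n * Y_n - 0.5) > 0.05))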

Sums and products of sequences converging almost surely

Everything said in the previous subsection also applies, with obvious modifications, to almost surely convergent sequences.

Proposition If $X_{n}\overset{a.s.}{\rightarrow}X$ and $Y_{n}\overset{a.s.}{\rightarrow}Y$, then

$$X_{n}+Y_{n}\overset{a.s.}{\rightarrow}X+Y$$
$$X_{n}Y_{n}\overset{a.s.}{\rightarrow}XY$$

Proof

Similar to the previous proof: just replace convergence in probability with almost sure convergence.

Sums and products of sequences converging in distribution

For almost sure convergence and convergence in probability, the convergence of $\{X_{n}\}$ and $\{Y_{n}\}$ individually implies their joint convergence as a vector (see the previous two proofs), but this is not the case for convergence in distribution. For example, if $X_{n}=Z$ and $Y_{n}=-Z$ for every $n$, where $Z$ is a standard normal random variable, then both sequences converge in distribution to a standard normal, but $X_{n}+Y_{n}=0$ for every $n$, so the limit of the sum depends on the joint behavior of the two sequences. Therefore, to obtain preservation of convergence in distribution under arithmetic operations, we need the stronger assumption of joint convergence in distribution.

Proposition If

$$\left(X_{n},Y_{n}\right)\overset{d}{\rightarrow}\left(X,Y\right)$$

then

$$X_{n}+Y_{n}\overset{d}{\rightarrow}X+Y$$
$$X_{n}Y_{n}\overset{d}{\rightarrow}XY$$

Proof

Again, similar to the proof for convergence in probability, but this time joint convergence is already in the assumptions.
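
A hedged simulation sketch of this proposition (my own example, assuming NumPy and SciPy): by the central limit theorem, $X_{n}=\sqrt{n}\left(\bar{Z}_{n}-1\right)$ converges in distribution to $N(0,1)$ when the $Z_{i}$ are i.i.d. exponential with mean and variance 1, while $Y_{n}=\bar{Z}_{n}$ converges in probability to the constant 1, which is enough for joint convergence in distribution of the pair. The product $X_{n}Y_{n}$ should then also be approximately standard normal for large $n$.

import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
n, n_paths = 10_000, 2_000
draws = rng.exponential(scale=1.0, size=(n_paths, n))
sample_mean = draws.mean(axis=1)
X_n = np.sqrt(n) * (sample_mean - 1.0)  # ->d N(0, 1) by the CLT (Exp(1) has variance 1)
Y_n = sample_mean                       # ->p 1, so (X_n, Y_n) converge jointly in distribution
# A Kolmogorov-Smirnov test against N(0, 1) should not reject for large n.
print(stats.kstest(X_n * Y_n, "norm"))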

More details

The following sections contain more details about the Continuous Mapping theorem.

Convergence of ratios

As a byproduct of the propositions stated above, we also have the following proposition.

Proposition If a sequence of random variables $\{X_{n}\}$ converges to $X$, then

$$\frac{1}{X_{n}}\rightarrow\frac{1}{X}$$

provided $X$ is almost surely different from 0 (we did not specify the kind of convergence, which can be in probability, almost surely or in distribution).

Proof

This is a consequence of the Continuous Mapping theorem and of the fact that

$$g\left(x\right)=\frac{1}{x}$$

is a continuous function for $x\neq 0$.

An immediate consequence of the previous proposition follows.

Proposition If two sequences of random variables $\{X_{n}\}$ and $\{Y_{n}\}$ converge to $X$ and $Y$ respectively, then

$$\frac{X_{n}}{Y_{n}}\rightarrow\frac{X}{Y}$$

provided $Y$ is almost surely different from 0. Convergence can be in probability, almost surely or in distribution (but the latter requires joint convergence in distribution of $\{X_{n}\}$ and $\{Y_{n}\}$).

Proof

This is a consequence of the fact that the ratio can be written as a product:

$$\frac{X_{n}}{Y_{n}}=X_{n}\cdot\frac{1}{Y_{n}}$$

The first operand of the product converges by assumption. The second converges because of the previous proposition. Therefore, their product converges because convergence is preserved under products.
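
For completeness, here is a brief sketch of the ratio result under the same kind of simulation setup used above (again an illustration of mine, assuming NumPy): with $X_{n}\overset{P}{\rightarrow}1$ and $Y_{n}\overset{P}{\rightarrow}0.5$, the ratio $X_{n}/Y_{n}$ should concentrate around 2.

import numpy as np

rng = np.random.default_rng(3)
n_paths = 5_000
for n in [10, 100, 1_000]:
    X_n = rng.exponential(scale=1.0, size=(n_paths, n)).mean(axis=1)  # plim = 1
    Y_n = rng.uniform(0.0, 1.0, size=(n_paths, n)).mean(axis=1)       # plim = 0.5, nonzero
    # The fraction of paths with |X_n / Y_n - 2| > 0.1 should shrink as n grows.
    print(n, np.mean(np.abs(X_n / Y_n - 2.0) > 0.1))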

Random matrices

The Continuous Mapping theorem applies also to random matrices because random matrices are just random vectors whose entries have been arranged into the columns of a matrix.


Applications

The Continuous Mapping theorem has several important applications. For example, it is used to prove Slutsky's theorem and the Delta method.

Solved exercises

Below you can find some exercises with explained solutions.

Exercise 1

Consider a sequence $\{X_{n}\}$ of random variables converging in distribution to a random variable $X$ having a standard normal distribution.

Consider the function

$$g\left(x\right)=x^{2}$$

which is a continuous function.

Find the limit in distribution of the sequence $\{g(X_{n})\}$.

Solution

The sequence

$$g\left(X_{n}\right)=X_{n}^{2}$$

converges in distribution to $X^{2}$ by the Continuous Mapping theorem. But the square of a standard normal random variable has a Chi-square distribution with one degree of freedom. Therefore, the sequence $\{g(X_{n})\}$ converges in distribution to a Chi-square distribution with one degree of freedom.
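
The solution can also be checked numerically. The sketch below is not part of the original exercise and assumes NumPy and SciPy: it builds $X_{n}$ as a standardized sample mean of uniform draws, which converges in distribution to a standard normal by the central limit theorem, and compares the distribution of $X_{n}^{2}$ with a Chi-square distribution with one degree of freedom.

import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
n, n_paths = 2_000, 5_000
draws = rng.uniform(0.0, 1.0, size=(n_paths, n))
# Standardized sample mean: ->d N(0, 1) by the CLT (the uniform has variance 1/12).
X_n = np.sqrt(n) * (draws.mean(axis=1) - 0.5) / np.sqrt(1.0 / 12.0)
# A Kolmogorov-Smirnov test against chi-square(1) should not reject.
print(stats.kstest(X_n ** 2, "chi2", args=(1,)))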

References

Shao, J. (2003) Mathematical Statistics, 2nd edition, Springer.

How to cite

Please cite as:

Taboga, Marco (2021). "Continuous Mapping theorem", Lectures on probability theory and mathematical statistics. Kindle Direct Publishing. Online appendix. https://www.statlect.com/asymptotic-theory/continuous-mapping-theorem.
