# Slutsky's theorem

Slutsky's theorem concerns the convergence in distribution of a transformation of two sequences of random vectors: one sequence converges in distribution and the other converges in probability to a constant.

## Joint convergence in distribution

Slutsky's theorem is based on the fact that if a sequence of random vectors converges in distribution and another sequence converges in probability to a constant, then they are jointly convergent in distribution.

Proposition (Joint convergence) Let $\{x_n\}$ and $\{y_n\}$ be two sequences of random vectors. If $x_n \xrightarrow{d} x$ and $y_n \xrightarrow{P} c$, where $c$ is a constant, then
$$(x_n, y_n) \xrightarrow{d} (x, c)$$

Proof

Let $f$ be a bounded Lipschitz function with Lipschitz constant $L$. By the portmanteau lemma, it suffices to show that $\mathrm{E}[f(x_n, y_n)] \to \mathrm{E}[f(x, c)]$. By the triangle inequality,
$$\left| \mathrm{E}[f(x_n, y_n)] - \mathrm{E}[f(x, c)] \right| \le \mathrm{E}\left[ \left| f(x_n, y_n) - f(x_n, c) \right| \right] + \left| \mathrm{E}[f(x_n, c)] - \mathrm{E}[f(x, c)] \right|$$
The second term converges to zero because $x_n \xrightarrow{d} x$ and $f(\cdot, c)$ is bounded and continuous. For any $\delta > 0$, the first term is bounded by
$$L \delta + 2 \sup |f| \cdot \mathrm{P}\left( \| y_n - c \| > \delta \right)$$
which can be made arbitrarily small because $y_n \xrightarrow{P} c$ and $\delta$ is arbitrary.

In the proposition above we have indicated convergence in probability by $\xrightarrow{P}$ and convergence in distribution by $\xrightarrow{d}$.

## The theorem

We provide a statement of Slutsky's theorem that is slightly more general than the statement usually found in standard references.

Proposition (Slutsky) Let $\{x_n\}$ and $\{y_n\}$ be two sequences of random vectors such that $x_n \xrightarrow{d} x$ and $y_n \xrightarrow{P} c$, where $c$ is a constant. Let $g(x, y)$ be a continuous function. Then,
$$g(x_n, y_n) \xrightarrow{d} g(x, c)$$

Proof

The couple $(x_n, y_n)$ is jointly convergent in distribution to $(x, c)$ by the proposition above (Joint convergence). Therefore, by the Continuous Mapping theorem, the fact that $g$ is continuous implies that $g(x_n, y_n)$ converges in distribution to $g(x, c)$.

The theorem also holds when $\{x_n\}$ and $\{y_n\}$ are sequences of random matrices (the reason being that a random matrix can be thought of as a random vector whose entries have been re-arranged into several columns).
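As a numerical illustration (not part of the original lecture), the Python snippet below checks Slutsky's theorem in its most classical application: a standardized sample mean divided by the sample standard deviation. The exponential distribution and the sample sizes are arbitrary choices made for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)
reps, n = 5_000, 2_000   # Monte Carlo repetitions, sample size (arbitrary)

# Draws from an exponential(1) law, whose mean and variance are both 1
samples = rng.exponential(1.0, size=(reps, n))

# x_n = sqrt(n) * (sample mean - 1) converges in distribution to N(0, 1)
x_n = np.sqrt(n) * (samples.mean(axis=1) - 1.0)

# y_n = sample standard deviation converges in probability to 1
y_n = samples.std(axis=1, ddof=1)

# Slutsky with the continuous g(x, y) = x / y (for y > 0): the ratio
# converges in distribution to N(0, 1) / 1 = N(0, 1)
t_n = x_n / y_n
print(t_n.mean(), t_n.std())   # both should be close to 0 and 1
```

With `n` large, the empirical mean and standard deviation of `t_n` are close to 0 and 1, consistent with the standard normal limit.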

## Implications

Since the sum and the product are continuous functions of their operands, Slutsky's theorem implies that
$$x_n + y_n \xrightarrow{d} x + c$$
$$x_n y_n \xrightarrow{d} x c$$
when $x_n \xrightarrow{d} x$ and $y_n \xrightarrow{P} c$, and the dimensions of $x_n$ and $y_n$ are such that their sum and/or their product are well-defined.
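These two implications can be checked numerically. In the sketch below (all distributions and sizes are arbitrary choices), $x_n$ is a standardized mean of uniforms, so it converges in distribution to $N(0, 1)$, while $y_n$ is the sample maximum of the same uniforms, which converges in probability to the constant $c = 1$.

```python
import numpy as np

rng = np.random.default_rng(1)
reps, n = 5_000, 2_000   # arbitrary Monte Carlo sizes

u = rng.uniform(0.0, 1.0, size=(reps, n))

# x_n = sqrt(12 n) * (sample mean - 1/2) converges in distribution to N(0, 1)
x_n = np.sqrt(12 * n) * (u.mean(axis=1) - 0.5)

# y_n = sample maximum converges in probability to c = 1
y_n = u.max(axis=1)

s = x_n + y_n   # converges in distribution to x + 1, i.e. N(1, 1)
p = x_n * y_n   # converges in distribution to 1 * x, i.e. N(0, 1)
print(s.mean(), s.std(), p.std())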

## Solved exercises

Below you can find some exercises with explained solutions.

### Exercise 1

Let $\{x_n\}$ be a sequence of $K \times 1$ random vectors such that
$$x_n \xrightarrow{d} x$$
where $x$ is a normal random vector with mean $\mu$ and invertible covariance matrix $V$.

Let $\{A_n\}$ be a sequence of random matrices such that
$$A_n \xrightarrow{P} A$$
where $A$ is a constant matrix. Find the limit in distribution of the sequence of products $\{A_n x_n\}$.

Solution

By Slutsky's theorem,
$$A_n x_n \xrightarrow{d} y$$
where
$$y = A x$$
The random vector $y$ has a multivariate normal distribution, because it is a linear transformation of a multivariate normal random vector (see the lecture entitled Linear combinations of normal random variables). The expected value of $y$ is
$$\mathrm{E}[y] = A \mathrm{E}[x] = A \mu$$
and its covariance matrix is
$$\mathrm{Var}[y] = A \mathrm{Var}[x] A^\intercal = A V A^\intercal$$
Therefore, the sequence of products $\{A_n x_n\}$ converges in distribution to a multivariate normal random vector with mean $A \mu$ and covariance matrix $A V A^\intercal$.
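A quick Monte Carlo check of this result (the particular $\mu$, $V$, and $A$ below are arbitrary choices, and adding vanishing noise to $A$ is just one simple way to build a sequence $A_n \xrightarrow{P} A$):

```python
import numpy as np

rng = np.random.default_rng(2)
reps, n = 50_000, 10_000   # Monte Carlo repetitions, index of the sequence

mu = np.array([1.0, -1.0])
V = np.array([[2.0, 0.5],
              [0.5, 1.0]])
A = np.array([[1.0, 2.0],
              [0.0, 1.0]])

# x_n drawn directly from the limit N(mu, V); A_n = A plus noise that
# shrinks with n, so A_n -> A in probability
x = rng.multivariate_normal(mu, V, size=reps)
A_n = A + rng.normal(0.0, 1.0, size=(reps, 2, 2)) / np.sqrt(n)

# the products A_n x_n, one per Monte Carlo repetition
y = np.einsum("rij,rj->ri", A_n, x)

print(y.mean(axis=0))    # should be close to A @ mu
print(np.cov(y.T))       # should be close to A @ V @ A.T
```

The empirical mean and covariance of the products match $A\mu$ and $A V A^\intercal$ up to Monte Carlo error.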

### Exercise 2

Let $\{x_n\}$ be a sequence of $K \times 1$ random vectors such that
$$x_n \xrightarrow{d} x$$
where $x$ is a normal random vector with mean $0$ and invertible covariance matrix $V$.

Let $\{V_n\}$ be a sequence of random matrices such that
$$V_n \xrightarrow{P} V$$
Find the limit in distribution of the sequence
$$\{x_n^\intercal V_n^{-1} x_n\}$$

Solution

By the Continuous Mapping theorem,
$$V_n^{-1} \xrightarrow{P} V^{-1}$$
because matrix inversion is continuous at every invertible matrix. Therefore, by Slutsky's theorem, applied to the continuous function $g(x, W) = x^\intercal W x$, we get
$$x_n^\intercal V_n^{-1} x_n \xrightarrow{d} x^\intercal V^{-1} x$$
Since $V$ is an invertible covariance matrix, there exists an invertible matrix $\Sigma$ such that
$$V = \Sigma \Sigma^\intercal$$
Therefore,
$$x^\intercal V^{-1} x = x^\intercal (\Sigma^\intercal)^{-1} \Sigma^{-1} x = (\Sigma^{-1} x)^\intercal (\Sigma^{-1} x) = z^\intercal z$$
where we have defined
$$z = \Sigma^{-1} x$$
The random vector $z$ has a multivariate normal distribution, because it is a linear transformation of a multivariate normal random vector (see the lecture entitled Linear combinations of normal random variables). The expected value of $z$ is
$$\mathrm{E}[z] = \Sigma^{-1} \mathrm{E}[x] = 0$$
and its covariance matrix is
$$\mathrm{Var}[z] = \Sigma^{-1} \mathrm{Var}[x] (\Sigma^{-1})^\intercal = \Sigma^{-1} \Sigma \Sigma^\intercal (\Sigma^\intercal)^{-1} = I$$
Thus, $z$ has a standard multivariate normal distribution (mean $0$ and variance $I$) and
$$x^\intercal V^{-1} x = z^\intercal z$$
is a quadratic form in a standard normal random vector. So, $x^\intercal V^{-1} x$ has a Chi-square distribution with $K$ degrees of freedom. In summary, the sequence $\{x_n^\intercal V_n^{-1} x_n\}$ converges in distribution to a Chi-square distribution with $K$ degrees of freedom.
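The Chi-square limit can be checked numerically. In the sketch below (the specific $V$ and the sizes are arbitrary, and adding vanishing symmetric noise to $V$ is one simple way to build a sequence $V_n \xrightarrow{P} V$), a Chi-square with $K$ degrees of freedom has mean $K$ and variance $2K$.

```python
import numpy as np

rng = np.random.default_rng(3)
reps, K, n = 100_000, 3, 10_000   # arbitrary Monte Carlo sizes

V = np.array([[2.0, 0.5, 0.0],
              [0.5, 1.0, 0.3],
              [0.0, 0.3, 1.5]])   # an arbitrary positive definite matrix

# x_n drawn from the limit N(0, V); V_n = V plus vanishing symmetric noise
x = rng.multivariate_normal(np.zeros(K), V, size=reps)
E = rng.normal(0.0, 1.0, size=(reps, K, K)) / np.sqrt(n)
V_n = V + (E + np.transpose(E, (0, 2, 1))) / 2

# the quadratic forms x_n' V_n^{-1} x_n, one per repetition
Vn_inv = np.linalg.inv(V_n)
q = np.einsum("ri,rij,rj->r", x, Vn_inv, x)

# chi-square with K = 3 degrees of freedom: mean 3, variance 6
print(q.mean(), q.var())
```

The empirical mean and variance of the quadratic forms are close to $K = 3$ and $2K = 6$, as the Chi-square limit predicts.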

### Exercise 3

Let everything be as in the previous exercise, except for the fact that now $x$ has mean $\mu$. Find the limit in distribution of the sequence
$$\{(x_n - \mu_n)^\intercal V_n^{-1} (x_n - \mu_n)\}$$
where $\{\mu_n\}$ is a sequence of random vectors converging in probability to $\mu$.

Solution

Define
$$w_n = x_n - \mu_n$$
By Slutsky's theorem,
$$w_n \xrightarrow{d} w$$
where
$$w = x - \mu$$
is a multivariate normal random vector with mean $0$ and variance $V$. Thus, we can use the results of the previous exercise on the sequence
$$\{w_n^\intercal V_n^{-1} w_n\}$$
which is the same as
$$\{(x_n - \mu_n)^\intercal V_n^{-1} (x_n - \mu_n)\}$$
and we find that it converges in distribution to a Chi-square distribution with $K$ degrees of freedom.
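The same kind of check works with a nonzero mean estimated by a sequence $\mu_n \xrightarrow{P} \mu$. In the sketch below (all specific values are arbitrary), $\mu_n$ is built by adding vanishing noise to $\mu$, and $V_n = V$ is taken as a constant sequence, which trivially converges in probability.

```python
import numpy as np

rng = np.random.default_rng(4)
reps, K, n = 100_000, 3, 10_000   # arbitrary Monte Carlo sizes

mu = np.array([1.0, 2.0, -1.0])
V = np.array([[1.3, 0.3, 0.3],
              [0.3, 1.3, 0.3],
              [0.3, 0.3, 1.3]])   # an arbitrary positive definite matrix

# x_n drawn from the limit N(mu, V); mu_n = mu plus vanishing noise,
# so that mu_n -> mu in probability
x = rng.multivariate_normal(mu, V, size=reps)
mu_n = mu + rng.normal(0.0, 1.0, size=(reps, K)) / np.sqrt(n)

# the quadratic forms (x_n - mu_n)' V^{-1} (x_n - mu_n)
w = x - mu_n
q = np.einsum("ri,ij,rj->r", w, np.linalg.inv(V), w)

# chi-square with K = 3 degrees of freedom: mean 3, variance 6
print(q.mean(), q.var())
```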

## References

van der Vaart, A. W. (2000) Asymptotic Statistics, Cambridge University Press.