
Transformation theorem

by Marco Taboga, PhD

A transformation theorem is one of several related results about the moments and the probability distribution of a transformation of a random variable (or vector).


The transformation

Suppose that X is a random variable whose distribution is known.

Given a function $g$, how do we derive the distribution of $Y=g(X)$?

Formulae for one-to-one functions

If the function $g$ is one-to-one (e.g., strictly increasing or strictly decreasing), there are formulae for the probability mass (or density) and the distribution function of Y.

These formulae, sometimes called transformation theorems, are explained and proved in the lecture on functions of random variables.

Their generalizations to the multivariate case (when X is a random vector) are discussed in the lecture on functions of random vectors.

The formulae for the joint density and mass functions of one-to-one transformations.
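As an illustration, the one-to-one formula for densities can be checked numerically. The sketch below (the choice of $X$ standard normal and $g(x)=e^{x}$ is ours, not from the text) applies the change-of-variables formula $f_Y(y)=f_X(g^{-1}(y))\left\vert \frac{d}{dy}g^{-1}(y)\right\vert$ to the strictly increasing map $g(x)=e^{x}$:

```python
import math

# Density of X ~ N(0, 1) (example distribution chosen for illustration)
def f_X(x):
    return math.exp(-x ** 2 / 2) / math.sqrt(2 * math.pi)

# For the strictly increasing map g(x) = exp(x), the inverse is
# g^{-1}(y) = log(y), so the one-to-one transformation formula gives
#   f_Y(y) = f_X(log(y)) * |d log(y) / dy| = f_X(log(y)) / y
def f_Y(y):
    return f_X(math.log(y)) / y

print(f_Y(1.0))  # f_X(0) = 1/sqrt(2*pi), approximately 0.3989
```

The resulting density is that of a lognormal random variable, as expected when exponentiating a normal one.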

Law of the unconscious statistician

When the function $g$ is not one-to-one and there are no simple ways to derive the distribution of Y, we can nonetheless easily compute the expected value and other moments of Y, thanks to the so-called Law Of the Unconscious Statistician (LOTUS).

The LOTUS, illustrated below, is also often called the transformation theorem.

LOTUS for discrete random variables

For discrete random variables, the theorem is as follows.

Proposition Let X be a discrete random variable and $g:\mathbb{R}\rightarrow \mathbb{R}$ a function. Define $Y=g(X)$. Then, $\mathrm{E}\left[ Y\right] =\sum_{x\in R_{X}}g(x)\,p_{X}(x)$, where $R_{X}$ is the support of X and $p_{X}(x)$ is its probability mass function.

Note that the above formula does not require us to know the support and the probability mass function of $Y$, unlike the standard formula $\mathrm{E}\left[ Y\right] =\sum_{y\in R_{Y}}y\,p_{Y}(y)$.
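The two formulae can be compared directly on a small example. In this sketch (the fair die and $g(x)=x^{2}$ are our choices for illustration), the LOTUS sums over the support of X, while the standard formula first requires deriving the pmf of Y:

```python
from fractions import Fraction

# X: fair six-sided die (example distribution); g(x) = x**2
p_X = {x: Fraction(1, 6) for x in range(1, 7)}
g = lambda x: x ** 2

# LOTUS: sum g(x) * p_X(x) over the support of X -- the pmf of Y is never needed
E_Y_lotus = sum(g(x) * p for x, p in p_X.items())

# Standard formula: first derive the pmf of Y = g(X), then sum y * p_Y(y)
p_Y = {}
for x, p in p_X.items():
    p_Y[g(x)] = p_Y.get(g(x), Fraction(0)) + p
E_Y_standard = sum(y * p for y, p in p_Y.items())

print(E_Y_lotus, E_Y_standard)  # both equal 91/6
```

Both routes give the same expected value; the LOTUS simply skips the intermediate step of computing the distribution of Y.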

LOTUS for continuous random variables

For continuous random variables, the theorem is as follows.

Proposition Let X be a continuous random variable and $g:\mathbb{R}\rightarrow \mathbb{R}$ a function. Define $Y=g(X)$. Then, $\mathrm{E}\left[ Y\right] =\int_{-\infty }^{\infty }g(x)\,f_{X}(x)\,dx$, where $f_{X}(x)$ is the probability density function of X.

Again, the above formula does not require us to know the probability density function of $Y$, unlike the standard formula $\mathrm{E}\left[ Y\right] =\int_{-\infty }^{\infty }y\,f_{Y}(y)\,dy$.
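The continuous version can be verified numerically. In this sketch (the uniform distribution and $g(x)=x^{2}$ are our choices), the integral $\int_{0}^{1}g(x)\,f_{X}(x)\,dx$ is approximated with a simple midpoint rule, again without ever computing the density of Y:

```python
# X ~ Uniform(0, 1) (example distribution), so f_X(x) = 1 on (0, 1);
# take g(x) = x**2.  By the LOTUS,
#   E[Y] = integral_0^1 g(x) * f_X(x) dx = integral_0^1 x**2 dx = 1/3.
# Midpoint-rule approximation of the integral:
n = 100_000
h = 1.0 / n
E_Y = sum(((i + 0.5) * h) ** 2 * h for i in range(n))
print(E_Y)  # close to 1/3
```

The exact value here is $1/3$; deriving the same result through the density of $Y=X^{2}$ would require the change-of-variables machinery first.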

LOTUS for other moments

The LOTUS can be used to compute any moment of Y, provided that the moment exists: $\mathrm{E}\left[ Y^{n}\right] =\sum_{x\in R_{X}}g(x)^{n}\,p_{X}(x)$ in the discrete case and $\mathrm{E}\left[ Y^{n}\right] =\int_{-\infty }^{\infty }g(x)^{n}\,f_{X}(x)\,dx$ in the continuous case.
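For instance, higher moments and the variance of Y follow from the same sum. In this sketch (the uniform distribution on $\{-1,0,1\}$ and $g(x)=x^{2}$ are our choices), each moment of Y is computed from the pmf of X alone:

```python
from fractions import Fraction

# X uniform on {-1, 0, 1} (example distribution); g(x) = x**2, so Y = X**2
p_X = {x: Fraction(1, 3) for x in (-1, 0, 1)}
g = lambda x: x ** 2

def moment(n):
    # n-th moment of Y via the LOTUS: E[Y**n] = sum of g(x)**n * p_X(x)
    return sum(g(x) ** n * p for x, p in p_X.items())

E_Y = moment(1)               # 2/3
var_Y = moment(2) - E_Y ** 2  # E[Y^2] - E[Y]^2 = 2/3 - 4/9 = 2/9
print(E_Y, var_Y)
```

Since $g(x)^{n}=x^{2n}$ takes only the values 0 and 1 here, every moment of Y equals $2/3$, which makes the variance easy to check by hand.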

LOTUS for moment generating and characteristic functions

The LOTUS can be used to compute the moment generating function (mgf) of Y: $M_{Y}(t)=\mathrm{E}\left[ e^{tY}\right] =\mathrm{E}\left[ e^{tg(X)}\right]$.

The mgf completely characterizes the distribution of Y.

If we are able to calculate the above expected value and we recognize that $M_{Y}(t)$ is the mgf of a known distribution, then that distribution is the distribution of Y. In fact, two random variables have the same distribution if and only if they have the same mgf, provided the latter exists.
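The recognition step can be illustrated concretely. In this sketch (the distributions are our choices), X is uniform on $\{-1,0,1\}$ and $g(x)=x^{2}$, so $Y=g(X)$ takes the value 1 with probability $2/3$ and 0 otherwise; the mgf computed via the LOTUS coincides with that of a Bernoulli(2/3) random variable, $1-p+pe^{t}$:

```python
import math

# X uniform on {-1, 0, 1} (example distribution); g(x) = x**2
p_X = {x: 1 / 3 for x in (-1, 0, 1)}
g = lambda x: x ** 2

def mgf_Y(t):
    # LOTUS: M_Y(t) = E[exp(t * g(X))], computed from the pmf of X alone
    return sum(math.exp(t * g(x)) * p for x, p in p_X.items())

def mgf_bernoulli(t, p=2 / 3):
    # Known mgf of a Bernoulli(p) random variable: 1 - p + p * exp(t)
    return 1 - p + p * math.exp(t)

print(mgf_Y(0.5), mgf_bernoulli(0.5))  # the two values coincide
```

Since the two mgfs agree for every t, we conclude that Y has a Bernoulli(2/3) distribution, without ever working out the pmf of Y directly.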

Similar comments apply to the characteristic function $\varphi _{Y}(t)=\mathrm{E}\left[ e^{itY}\right] =\mathrm{E}\left[ e^{itg(X)}\right]$.

More details

More details about the transformation theorem can be found in the lectures on functions of random variables and functions of random vectors.



How to cite

Please cite as:

Taboga, Marco (2021). "Transformation theorem", Lectures on probability theory and mathematical statistics. Kindle Direct Publishing. Online appendix. https://www.statlect.com/glossary/transformation-theorem.
