# Functions of random vectors and their distribution

Let $X$ be a $K \times 1$ random vector with known distribution. Let a $K \times 1$ random vector $Y$ be a function of $X$:
$$Y = g(X),$$
where $g: \mathbb{R}^K \rightarrow \mathbb{R}^K$. How do we derive the distribution of $Y$ from the distribution of $X$?

Although there is no general answer to this question, there are some special cases in which the distribution of $Y$ can easily be derived from the distribution of $X$. We discuss these cases below.

## One-to-one functions

In the cases in which the function $g$ is one-to-one (hence invertible) and the random vector $X$ is either discrete or continuous, there are readily applicable formulae for the distribution of $Y$. We report these formulae below.

### One-to-one function of a discrete random vector

When $X$ is a discrete random vector, the joint probability mass function of $Y = g(X)$ is given by the following proposition.

Proposition (probability mass of a one-to-one function) Let $X$ be a discrete random vector with support $R_X$ and joint probability mass function $p_X(x)$. Let $g$ be one-to-one on the support of $X$. Then, the support of $Y = g(X)$ is
$$R_Y = \{ y = g(x) : x \in R_X \}$$
and its probability mass function is
$$p_Y(y) = \begin{cases} p_X(g^{-1}(y)) & \text{if } y \in R_Y \\ 0 & \text{if } y \notin R_Y. \end{cases}$$

Proof

If $y \in R_Y$, then
$$p_Y(y) = P(Y = y) = P(g(X) = y) = P(X = g^{-1}(y)) = p_X(g^{-1}(y)).$$
If $y \notin R_Y$, then trivially $p_Y(y) = 0$.

Example Let $X$ be a discrete random vector and denote its components by $X_1$ and $X_2$. Let the support of $X$ be
$$R_X = \{ (0,0), (1,0), (1,1) \}$$
and its joint probability mass function be
$$p_X(x) = \begin{cases} 1/3 & \text{if } x \in R_X \\ 0 & \text{otherwise.} \end{cases}$$
Let
$$Y = g(X) = \begin{bmatrix} X_1 + X_2 \\ X_1 - X_2 \end{bmatrix}.$$
The support of $Y$ is
$$R_Y = \{ (0,0), (1,1), (2,0) \}.$$
The inverse function is
$$g^{-1}(y) = \begin{bmatrix} (y_1 + y_2)/2 \\ (y_1 - y_2)/2 \end{bmatrix}.$$
The joint probability mass function of $Y$ is
$$p_Y(y) = \begin{cases} 1/3 & \text{if } y \in R_Y \\ 0 & \text{otherwise.} \end{cases}$$
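The proposition can be checked numerically: transforming a discrete distribution through a one-to-one $g$ simply relabels each support point. The Python sketch below illustrates this with an assumed pmf (three equally likely points on $\{0,1\}^2$) and an assumed map $g(x) = (x_1 + x_2, x_1 - x_2)$; these specifics are choices of the illustration, not part of the general result.

```python
# Transform the pmf of a discrete random vector through a one-to-one function g.
from fractions import Fraction

def transform_pmf(pmf, g):
    """Given a pmf as a dict {support point: probability} and a one-to-one g,
    return the pmf of Y = g(X): each mass simply moves to the image point."""
    out = {}
    for x, p in pmf.items():
        y = g(x)
        assert y not in out, "g must be one-to-one on the support"
        out[y] = p
    return out

# Assumed example: three equally likely support points.
p_X = {(0, 0): Fraction(1, 3), (1, 0): Fraction(1, 3), (1, 1): Fraction(1, 3)}
g = lambda x: (x[0] + x[1], x[0] - x[1])
p_Y = transform_pmf(p_X, g)
```

Because $g$ is one-to-one on the support, no two masses are ever merged, which is exactly why the formula $p_Y(y) = p_X(g^{-1}(y))$ holds.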

### One-to-one function of a continuous random vector

When $X$ is a continuous random vector and $g$ is differentiable, then $Y$ is also continuous, and its joint probability density function is given by the following proposition.

Proposition (density of a one-to-one function) Let $X$ be a continuous random vector with support $R_X$ and joint probability density function $f_X(x)$. Let $g$ be one-to-one and differentiable on the support of $X$. Denote by $J_{g^{-1}}(y)$ the Jacobian matrix of $g^{-1}$, i.e.,
$$\left[ J_{g^{-1}}(y) \right]_{ij} = \frac{\partial g_i^{-1}(y)}{\partial y_j},$$
where $g_i^{-1}(y)$ is the $i$-th component of $g^{-1}(y)$ and $y_j$ is the $j$-th component of $y$. Then, the support of $Y = g(X)$ is
$$R_Y = \{ y = g(x) : x \in R_X \}.$$
If the determinant of the Jacobian matrix satisfies
$$\det J_{g^{-1}}(y) \neq 0 \quad \text{for all } y \in R_Y,$$
then the joint probability density function of $Y$ is
$$f_Y(y) = \begin{cases} f_X(g^{-1}(y)) \left| \det J_{g^{-1}}(y) \right| & \text{if } y \in R_Y \\ 0 & \text{if } y \notin R_Y. \end{cases}$$

Proof

See: Poirier, D. J. (1995) Intermediate statistics and econometrics: a comparative approach, MIT Press.

A special case of the above proposition obtains when the function $g$ is a linear one-to-one mapping.

Proposition Let $X$ be a continuous random vector with joint probability density $f_X(x)$. Let $Y$ be a random vector such that
$$Y = a + BX,$$
where $a$ is a constant $K \times 1$ vector and $B$ is a constant $K \times K$ invertible matrix. Then, $Y$ is a continuous random vector whose probability density function satisfies
$$f_Y(y) = \frac{1}{|\det B|} f_X\!\left( B^{-1}(y - a) \right),$$
where $\det B$ is the determinant of $B$.

Proof

In this case the inverse function is
$$g^{-1}(y) = B^{-1}(y - a).$$
The Jacobian matrix is
$$J_{g^{-1}}(y) = B^{-1}.$$
When $y \in R_Y$, the joint density of $Y$ is
$$f_Y(y) = f_X(g^{-1}(y)) \left| \det J_{g^{-1}}(y) \right| = f_X\!\left( B^{-1}(y - a) \right) \left| \det B^{-1} \right| = \frac{1}{|\det B|} f_X\!\left( B^{-1}(y - a) \right).$$

Example Let $X$ be a random vector with support
$$R_X = [0,1] \times [0,1]$$
and joint probability density function
$$f_X(x) = \begin{cases} 1 & \text{if } x \in R_X \\ 0 & \text{otherwise,} \end{cases}$$
where $x_1$ and $x_2$ are the two components of $x$. Define a random vector $Y$ with components $Y_1$ and $Y_2$ as follows:
$$Y_1 = X_1, \quad Y_2 = X_1 + X_2.$$
The inverse function $g^{-1}$ is defined by
$$x_1 = y_1, \quad x_2 = y_2 - y_1.$$
The Jacobian matrix of $g^{-1}$ is
$$J_{g^{-1}}(y) = \begin{bmatrix} 1 & 0 \\ -1 & 1 \end{bmatrix}.$$
Its determinant is
$$\det J_{g^{-1}}(y) = 1.$$
The support of $Y_1$ is $[0,1]$, the support of $Y_2$ is $[0,2]$, and the support of $Y$ is
$$R_Y = \{ (y_1, y_2) : 0 \leq y_1 \leq 1,\ y_1 \leq y_2 \leq y_1 + 1 \}.$$
For $y \in R_Y$, the joint probability density function of $Y$ is
$$f_Y(y) = f_X(g^{-1}(y)) \left| \det J_{g^{-1}}(y) \right| = 1,$$
while for $y \notin R_Y$, the joint probability density function is $f_Y(y) = 0$.
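A linear change of variables like this is easy to sanity-check by simulation. The sketch below assumes $X$ uniform on the unit square and $Y = (X_1, X_1 + X_2)$, so that the density of $Y$ is constant and equal to $1$ on its support; the sample size and the test rectangle are arbitrary choices of the illustration.

```python
# Monte Carlo check of a constant transformed density f_Y(y) = 1.
# The rectangle [0.2, 0.4] x [0.5, 0.7] lies inside the support of Y,
# so the probability that Y falls in it should be its area, 0.04.
import numpy as np

rng = np.random.default_rng(0)
n = 200_000
x = rng.uniform(size=(n, 2))          # X uniform on the unit square
y1, y2 = x[:, 0], x[:, 0] + x[:, 1]   # Y = (X1, X1 + X2)
in_box = (0.2 < y1) & (y1 < 0.4) & (0.5 < y2) & (y2 < 0.7)
estimate = in_box.mean()              # Monte Carlo estimate of P(Y in box)
```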

## Independent sums

When the components $X_1, \ldots, X_K$ of $X$ are independent and
$$Y = X_1 + X_2 + \ldots + X_K,$$
then the distribution of $Y$ can be derived using the convolution formulae illustrated in the lecture entitled Sums of independent random variables.
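In the discrete case, the convolution of two pmfs can be computed directly; the sketch below uses two fair dice (an assumed example, not from the lecture referenced above) and `numpy.convolve`.

```python
# pmf of the sum of two independent fair dice via convolution.
import numpy as np

die = np.full(6, 1 / 6)          # pmf of a fair die on {1, ..., 6}
pmf_sum = np.convolve(die, die)  # pmf of the sum on {2, ..., 12}
# pmf_sum[k] is the probability that the sum equals k + 2
```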

## Known moment generating function

The joint moment generating function of $Y$, provided it exists, can be computed as
$$M_Y(t) = \mathrm{E}\!\left[ \exp\!\left( t^\top Y \right) \right] = \mathrm{E}\!\left[ \exp\!\left( t^\top g(X) \right) \right]$$
using the transformation theorem. If $M_Y(t)$ is recognized as the joint moment generating function of a known distribution, then such a distribution is the distribution of $Y$ (two random vectors have the same distribution if and only if they have the same joint moment generating function, provided the latter exists).
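The transformation theorem makes this concretely computable whenever the distribution of $X$ is known. The sketch below assumes $X$ has two independent Bernoulli$(1/2)$ components and $g(x) = x_1 + x_2$, evaluates $\mathrm{E}[\exp(tY)]$ by summing over the support of $X$, and checks it against the known Binomial$(2, 1/2)$ MGF $((1 + e^t)/2)^2$.

```python
# MGF of Y = X1 + X2 for independent Bernoulli(1/2) components, computed by
# the transformation theorem: M_Y(t) = sum over x of p_X(x) * exp(t * g(x)).
import math
from itertools import product

def mgf_Y(t):
    # Each of the four support points of X has probability 1/4.
    return sum(0.25 * math.exp(t * (x1 + x2)) for x1, x2 in product((0, 1), repeat=2))

def mgf_binomial(t):
    # Known MGF of a Binomial(2, 1/2) random variable.
    return ((1 + math.exp(t)) / 2) ** 2
```

Since the two functions agree for every $t$, $Y$ is recognized as Binomial$(2, 1/2)$.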

## Known characteristic function

The joint characteristic function of $Y$ can be computed as
$$\varphi_Y(t) = \mathrm{E}\!\left[ \exp\!\left( i t^\top Y \right) \right] = \mathrm{E}\!\left[ \exp\!\left( i t^\top g(X) \right) \right]$$
using the transformation theorem. If $\varphi_Y(t)$ is recognized as the joint characteristic function of a known distribution, then such a distribution is the distribution of $Y$ (two random vectors have the same distribution if and only if they have the same joint characteristic function).
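The same computation works with complex exponentials. The sketch below assumes the same Bernoulli setup as an illustration ($Y = X_1 + X_2$ with independent Bernoulli$(1/2)$ components) and checks $\varphi_Y(t) = ((1 + e^{it})/2)^2$, the characteristic function of Binomial$(2, 1/2)$.

```python
# Characteristic function of Y = X1 + X2 via the transformation theorem:
# phi_Y(t) = sum over x of p_X(x) * exp(i * t * g(x)).
import cmath
from itertools import product

def cf_Y(t):
    # Each of the four support points of X has probability 1/4.
    return sum(0.25 * cmath.exp(1j * t * (x1 + x2)) for x1, x2 in product((0, 1), repeat=2))

def cf_binomial(t):
    # Known characteristic function of Binomial(2, 1/2).
    return ((1 + cmath.exp(1j * t)) / 2) ** 2
```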

## Solved exercises

Below you can find some exercises with explained solutions.

### Exercise 1

Let $X$ be a uniform random variable with support
$$R_X = [0,1]$$
and probability density function
$$f_X(x) = \begin{cases} 1 & \text{if } x \in R_X \\ 0 & \text{otherwise.} \end{cases}$$
Let $Y$ be a continuous random variable, independent of $X$, with support
$$R_Y = [0, \infty)$$
and probability density function
$$f_Y(y) = \begin{cases} \exp(-y) & \text{if } y \in R_Y \\ 0 & \text{otherwise.} \end{cases}$$
Let
$$Z_1 = X, \quad Z_2 = X + Y.$$
Find the joint probability density function of the random vector
$$Z = \begin{bmatrix} Z_1 \\ Z_2 \end{bmatrix}.$$

Solution

Since $X$ and $Y$ are independent, their joint probability density function is equal to the product of their marginal density functions:
$$f_{XY}(x, y) = f_X(x) f_Y(y).$$
The support of $X$ is $[0,1]$ and the support of $Y$ is $[0, \infty)$. The support of $Z$ is
$$R_Z = \{ (z_1, z_2) : 0 \leq z_1 \leq 1,\ z_2 \geq z_1 \}.$$
The function $g(x, y) = (x, x + y)$ is one-to-one and its inverse is defined by
$$x = z_1, \quad y = z_2 - z_1,$$
with Jacobian matrix
$$J_{g^{-1}}(z) = \begin{bmatrix} 1 & 0 \\ -1 & 1 \end{bmatrix}.$$
The determinant of the Jacobian matrix is
$$\det J_{g^{-1}}(z) = 1,$$
which is different from zero for any $z$ belonging to $R_Z$. The formula for the joint probability density function of $Z$ is
$$f_Z(z) = f_{XY}(g^{-1}(z)) \left| \det J_{g^{-1}}(z) \right| = f_X(z_1) f_Y(z_2 - z_1),$$
and
$$f_X(z_1) f_Y(z_2 - z_1) = \exp(-(z_2 - z_1)) \quad \text{for } z \in R_Z,$$
which implies
$$f_Z(z) = \begin{cases} \exp(-(z_2 - z_1)) & \text{if } z \in R_Z \\ 0 & \text{otherwise.} \end{cases}$$
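A density derived this way can be checked by simulation. The sketch below assumes $X \sim \text{Uniform}[0,1]$ and $Y \sim \text{Exponential}(1)$ independent, sets $Z = (X, X + Y)$, and compares the Monte Carlo probability of a small region with the exact integral of the candidate density $e^{-(z_2 - z_1)}$ over it; the sample size, seed, and region are arbitrary choices.

```python
# Simulation check of f_Z(z) = exp(-(z2 - z1)) on R_Z for Z = (X, X + Y).
import math
import numpy as np

rng = np.random.default_rng(0)
n = 200_000
x = rng.uniform(size=n)               # X ~ Uniform[0, 1]
y = rng.exponential(size=n)           # Y ~ Exponential(1), independent of X
z1, z2 = x, x + y
# P(0.1 < Z1 < 0.3, 0.2 < Z2 - Z1 < 0.5) should equal the integral of the
# candidate density over that region, which works out to 0.2 * (e^-0.2 - e^-0.5).
estimate = ((0.1 < z1) & (z1 < 0.3) & (0.2 < z2 - z1) & (z2 - z1 < 0.5)).mean()
exact = 0.2 * (math.exp(-0.2) - math.exp(-0.5))
```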

### Exercise 2

Let $X$ be a random vector with support
$$R_X = [0,1] \times [0,1]$$
and joint probability density function
$$f_X(x) = \begin{cases} 4 x_1 x_2 & \text{if } x \in R_X \\ 0 & \text{otherwise,} \end{cases}$$
where $x_1$ and $x_2$ are the two components of $x$. Define a random vector $Y$ with components $Y_1$ and $Y_2$ as follows:
$$Y_1 = X_1^2, \quad Y_2 = X_2.$$
Find the joint probability density function of the random vector $Y$.

Solution

The inverse function $g^{-1}$ is defined by
$$x_1 = \sqrt{y_1}, \quad x_2 = y_2.$$
The Jacobian matrix of $g^{-1}$ is
$$J_{g^{-1}}(y) = \begin{bmatrix} \dfrac{1}{2\sqrt{y_1}} & 0 \\ 0 & 1 \end{bmatrix}.$$
Its determinant is
$$\det J_{g^{-1}}(y) = \frac{1}{2\sqrt{y_1}},$$
which is different from zero for $y_1 > 0$. The support of $Y_1$ is $[0,1]$, the support of $Y_2$ is $[0,1]$, and the support of $Y$ is
$$R_Y = [0,1] \times [0,1].$$
For $y \in R_Y$ with $y_1 > 0$, the joint probability density function of $Y$ is
$$f_Y(y) = f_X(g^{-1}(y)) \left| \det J_{g^{-1}}(y) \right| = 4 \sqrt{y_1}\, y_2 \cdot \frac{1}{2\sqrt{y_1}} = 2 y_2,$$
while for $y \notin R_Y$, the joint probability density function is $f_Y(y) = 0$.
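This result can also be checked by simulation. The sketch below assumes $f_X(x) = 4 x_1 x_2$ on the unit square (so $X_1$ and $X_2$ are independent, each with density $2x$, sampled here by inverse transform) and $Y = (X_1^2, X_2)$; under a density $f_Y(y) = 2 y_2$, $Y_1$ is uniform on $[0,1]$ (mean $1/2$) and $Y_2$ has mean $\int_0^1 2 y_2^2 \, dy_2 = 2/3$. The sample size, seed, and tolerances are arbitrary choices.

```python
# Simulation check of f_Y(y) = 2 * y2 on the unit square for Y = (X1**2, X2).
import numpy as np

rng = np.random.default_rng(0)
n = 200_000
# Inverse-transform sampling: if U is uniform on [0, 1], then sqrt(U) has
# density 2x on [0, 1], which matches the marginals implied by f_X(x) = 4*x1*x2.
x1 = np.sqrt(rng.uniform(size=n))
x2 = np.sqrt(rng.uniform(size=n))
y1, y2 = x1**2, x2
mean_y1, mean_y2 = y1.mean(), y2.mean()   # expected: about 1/2 and 2/3
```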