# Sums of independent random variables

This lecture discusses how to derive the distribution of the sum of two independent random variables.

We explain how to derive:

- the cumulative distribution function (cdf) of the sum;
- the probability mass function (pmf) of the sum, when the summands are discrete;
- the probability density function (pdf) of the sum, when the summands are continuous;
- the distribution of the sum of more than two independent random variables.

## Cumulative distribution function of a sum

The next proposition characterizes the cumulative distribution function (cdf) of the sum.

Proposition Let $X$ and $Y$ be two independent random variables. Denote their cdfs by $F_X$ and $F_Y$. Let
$$Z = X + Y$$
and denote the cdf of $Z$ by $F_Z$. Then,
$$F_Z(z) = \operatorname{E}\left[F_X(z - Y)\right]$$
or
$$F_Z(z) = \operatorname{E}\left[F_Y(z - X)\right]$$

Proof

The first formula is derived as follows:
$$\begin{aligned}
F_Z(z) &= P(Z \le z) = P(X + Y \le z) \\
&= \operatorname{E}\left[P(X + Y \le z \mid Y)\right] \\
&= \operatorname{E}\left[P(X \le z - Y \mid Y)\right] \\
&= \operatorname{E}\left[F_X(z - Y)\right]
\end{aligned}$$
where the second equality follows from the law of iterated expectations, and the last from the independence of $X$ and $Y$ (conditioning on $Y$ does not change the distribution of $X$). The second formula is symmetric to the first.

Example Let $X$ be a uniform random variable with support $R_X = [0,1]$ and probability density function
$$f_X(x) = \begin{cases} 1 & \text{if } x \in [0,1] \\ 0 & \text{otherwise} \end{cases}$$
Let $Y$ be another uniform random variable, independent of $X$, with support $R_Y = [0,1]$ and probability density function
$$f_Y(y) = \begin{cases} 1 & \text{if } y \in [0,1] \\ 0 & \text{otherwise} \end{cases}$$
The cdf of $X$ is
$$F_X(x) = \begin{cases} 0 & \text{if } x < 0 \\ x & \text{if } x \in [0,1] \\ 1 & \text{if } x > 1 \end{cases}$$
The cdf of $Z = X + Y$ is
$$F_Z(z) = \operatorname{E}\left[F_X(z - Y)\right] = \int_0^1 F_X(z - y)\, dy$$
There are four cases to consider:

1. If $z < 0$, then $F_Z(z) = 0$, because $z - y < 0$ for every $y \in [0,1]$, so that $F_X(z - y) = 0$;

2. If $0 \le z < 1$, then
$$F_Z(z) = \int_0^z (z - y)\, dy = \left[zy - \frac{y^2}{2}\right]_0^z = \frac{z^2}{2}$$

3. If $1 \le z < 2$, then
$$F_Z(z) = \int_0^{z-1} 1\, dy + \int_{z-1}^1 (z - y)\, dy = (z - 1) + \left[zy - \frac{y^2}{2}\right]_{z-1}^1 = -\frac{z^2}{2} + 2z - 1$$

4. If $z \ge 2$, then $F_Z(z) = 1$, because $z - y \ge 1$ for every $y \in [0,1]$.

By combining these four possible cases, we obtain
$$F_Z(z) = \begin{cases} 0 & \text{if } z < 0 \\ z^2/2 & \text{if } 0 \le z < 1 \\ -z^2/2 + 2z - 1 & \text{if } 1 \le z < 2 \\ 1 & \text{if } z \ge 2 \end{cases}$$
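A piecewise cdf of this kind is easy to sanity-check by simulation. The sketch below compares the closed-form cdf of a sum of two independent uniforms on $[0,1]$ against an empirical cdf from simulated draws (the sample size and evaluation points are arbitrary choices):

```python
import random

random.seed(0)

# Simulate the sum of two independent uniform [0, 1] variables.
n = 200_000
samples = [random.random() + random.random() for _ in range(n)]

def cdf_z(z):
    """Piecewise cdf of the sum of two independent uniforms on [0, 1]."""
    if z < 0:
        return 0.0
    if z <= 1:
        return z * z / 2
    if z <= 2:
        return -z * z / 2 + 2 * z - 1
    return 1.0

# Compare the closed form with the empirical cdf at a few points.
for z in (0.5, 1.0, 1.5):
    empirical = sum(s <= z for s in samples) / n
    print(z, round(cdf_z(z), 4), round(empirical, 4))
```

The two columns should agree to roughly two decimal places at this sample size.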

## Probability mass function of a sum

When the two summands are discrete random variables, the probability mass function (pmf) of their sum can be derived as follows.

Proposition Let $X$ and $Y$ be two independent discrete random variables. Denote their respective pmfs by $p_X$ and $p_Y$, and their supports by $R_X$ and $R_Y$. Let
$$Z = X + Y$$
and denote the pmf of $Z$ by $p_Z$. Then,
$$p_Z(z) = \sum_{y \in R_Y} p_X(z - y)\, p_Y(y)$$
or
$$p_Z(z) = \sum_{x \in R_X} p_Y(z - x)\, p_X(x)$$

Proof

The first formula is derived as follows:
$$\begin{aligned}
p_Z(z) &= P(Z = z) = P(X + Y = z) \\
&= \sum_{y \in R_Y} P(X + Y = z \mid Y = y)\, P(Y = y) \\
&= \sum_{y \in R_Y} P(X = z - y)\, p_Y(y) \\
&= \sum_{y \in R_Y} p_X(z - y)\, p_Y(y)
\end{aligned}$$
where we have used the law of total probability and the independence of $X$ and $Y$. The second formula is symmetric to the first.

The two summations above are called convolutions (of two pmfs).

Example Let $X$ be a discrete random variable with support $R_X = \{0, 1\}$ and pmf
$$p_X(x) = \begin{cases} 1/2 & \text{if } x \in \{0, 1\} \\ 0 & \text{otherwise} \end{cases}$$
Let $Y$ be another discrete random variable, independent of $X$, with support $R_Y = \{0, 1\}$ and pmf
$$p_Y(y) = \begin{cases} 1/2 & \text{if } y \in \{0, 1\} \\ 0 & \text{otherwise} \end{cases}$$
Define
$$Z = X + Y$$
Its support is $R_Z = \{0, 1, 2\}$. The pmf of $Z$, evaluated at $z = 0$, is
$$p_Z(0) = \sum_{y \in R_Y} p_X(-y)\, p_Y(y) = p_X(0)\, p_Y(0) = \frac{1}{2} \cdot \frac{1}{2} = \frac{1}{4}$$
Evaluated at $z = 1$, it is
$$p_Z(1) = p_X(1)\, p_Y(0) + p_X(0)\, p_Y(1) = \frac{1}{4} + \frac{1}{4} = \frac{1}{2}$$
Evaluated at $z = 2$, it is
$$p_Z(2) = p_X(1)\, p_Y(1) = \frac{1}{4}$$
Therefore, the pmf of $Z$ is
$$p_Z(z) = \begin{cases} 1/4 & \text{if } z \in \{0, 2\} \\ 1/2 & \text{if } z = 1 \\ 0 & \text{otherwise} \end{cases}$$
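The convolution sum is easy to mechanize: loop over the joint support and accumulate probabilities by the value of the sum. Below is a minimal Python sketch of a generic pmf convolution, applied for illustration to two independent variables that each take the values 0 and 1 with probability 1/2:

```python
from collections import defaultdict

def convolve_pmf(p_x, p_y):
    """Pmf of X + Y for independent discrete X, Y, each given as a
    dict mapping support values to probabilities."""
    p_z = defaultdict(float)
    for x, px in p_x.items():
        for y, py in p_y.items():
            p_z[x + y] += px * py
    return dict(p_z)

p_x = {0: 0.5, 1: 0.5}
p_y = {0: 0.5, 1: 0.5}
p_z = convolve_pmf(p_x, p_y)
print(p_z)  # {0: 0.25, 1: 0.5, 2: 0.25}
```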

## Probability density function of a sum

When the two summands are continuous variables, the probability density function (pdf) of their sum can be derived as follows.

Proposition Let $X$ and $Y$ be two independent continuous random variables and denote their respective pdfs by $f_X$ and $f_Y$. Let
$$Z = X + Y$$
and denote the pdf of $Z$ by $f_Z$. Then,
$$f_Z(z) = \int_{-\infty}^{\infty} f_X(z - y)\, f_Y(y)\, dy$$
or
$$f_Z(z) = \int_{-\infty}^{\infty} f_Y(z - x)\, f_X(x)\, dx$$

Proof

The distribution function of a sum of independent variables is
$$F_Z(z) = \operatorname{E}\left[F_X(z - Y)\right] = \int_{-\infty}^{\infty} F_X(z - y)\, f_Y(y)\, dy$$
Differentiating both sides with respect to $z$, and using the fact that the density function is the derivative of the distribution function, we obtain
$$f_Z(z) = \int_{-\infty}^{\infty} f_X(z - y)\, f_Y(y)\, dy$$
The second formula is symmetric to the first.

The two integrals above are called convolutions (of two pdfs).

Example Let $X$ be an exponential random variable with support $R_X = [0, \infty)$ and pdf
$$f_X(x) = \begin{cases} e^{-x} & \text{if } x \ge 0 \\ 0 & \text{otherwise} \end{cases}$$
Let $Y$ be another exponential random variable, independent of $X$, with support $R_Y = [0, \infty)$ and pdf
$$f_Y(y) = \begin{cases} e^{-y} & \text{if } y \ge 0 \\ 0 & \text{otherwise} \end{cases}$$
Define
$$Z = X + Y$$
The support of $Z$ is
$$R_Z = [0, \infty)$$
When $z \in [0, \infty)$, the pdf of $Z$ is
$$f_Z(z) = \int_{-\infty}^{\infty} f_X(z - y)\, f_Y(y)\, dy = \int_0^z e^{-(z - y)}\, e^{-y}\, dy = e^{-z} \int_0^z dy = z\, e^{-z}$$
Therefore, the pdf of $Z$ is
$$f_Z(z) = \begin{cases} z\, e^{-z} & \text{if } z \ge 0 \\ 0 & \text{otherwise} \end{cases}$$
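A closed-form convolution like f_Z(z) = z e^(-z) for two unit-rate exponentials can be checked by approximating the convolution integral numerically. The sketch below uses a simple midpoint rule (the step count is an arbitrary choice):

```python
import math

def f_exp(t):
    """Pdf of a unit-rate exponential variable."""
    return math.exp(-t) if t >= 0 else 0.0

def f_z_numeric(z, steps=10_000):
    """Approximate f_Z(z) = integral of f_X(z - y) f_Y(y) dy over [0, z]
    by the midpoint rule."""
    h = z / steps
    return sum(f_exp(z - (i + 0.5) * h) * f_exp((i + 0.5) * h)
               for i in range(steps)) * h

# Compare the numeric convolution with the closed form z * exp(-z).
for z in (0.5, 1.0, 2.0):
    print(z, round(f_z_numeric(z), 6), round(z * math.exp(-z), 6))
```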

## Sum of n independent random variables

We have discussed above how to work out the distribution of the sum of two independent random variables.

How do we derive the distribution of the sum of more than two mutually independent random variables?

Suppose that $X_1$, $X_2$, ..., $X_n$ are $n$ mutually independent random variables and let $Z$ be their sum:
$$Z = \sum_{i=1}^{n} X_i$$

The distribution of $Z$ can be derived recursively, using the results for sums of two random variables given above:

1. first, define
$$Y_2 = X_1 + X_2$$
and compute the distribution of $Y_2$;

2. then, define
$$Y_3 = Y_2 + X_3$$
and compute the distribution of $Y_3$;

3. and so on, until the distribution of $Z$ can be computed from
$$Z = Y_n = Y_{n-1} + X_n$$
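For discrete variables, this recursion amounts to repeatedly convolving pmfs. A minimal Python sketch, applied to a hypothetical example (the sum of three fair six-sided dice, which is not taken from the lecture):

```python
from collections import defaultdict
from functools import reduce

def convolve(p_a, p_b):
    """Pmf of the sum of two independent discrete variables
    (dicts mapping values to probabilities)."""
    out = defaultdict(float)
    for a, pa in p_a.items():
        for b, pb in p_b.items():
            out[a + b] += pa * pb
    return dict(out)

# Y2 = X1 + X2, then Z = Y2 + X3, exactly as in the recursion above.
die = {k: 1 / 6 for k in range(1, 7)}
pmf_sum = reduce(convolve, [die, die, die])
print(pmf_sum[10])  # probability that three dice total 10
```

Each `convolve` call is one step of the recursion; `reduce` chains the steps for any number of summands.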

## Solved exercises

Below you can find some exercises with explained solutions.

### Exercise 1

Let $X$ be a uniform random variable with support $R_X = [0, 1]$ and pdf
$$f_X(x) = \begin{cases} 1 & \text{if } x \in [0, 1] \\ 0 & \text{otherwise} \end{cases}$$

Let $Y$ be an exponential random variable, independent of $X$, with support $R_Y = [0, \infty)$ and pdf
$$f_Y(y) = \begin{cases} e^{-y} & \text{if } y \ge 0 \\ 0 & \text{otherwise} \end{cases}$$

Derive the pdf of the sum
$$Z = X + Y$$

Solution

The support of $Z$ is
$$R_Z = [0, \infty)$$
When $z \in [0, \infty)$, the pdf of $Z$ is
$$f_Z(z) = \int_{-\infty}^{\infty} f_Y(z - x)\, f_X(x)\, dx = \int_0^1 f_Y(z - x)\, dx$$
If $0 \le z < 1$, the integrand is nonzero only for $x \le z$, so that
$$f_Z(z) = \int_0^z e^{-(z - x)}\, dx = 1 - e^{-z}$$
If $z \ge 1$, then
$$f_Z(z) = \int_0^1 e^{-(z - x)}\, dx = e^{-z}(e - 1)$$
Therefore, the pdf of $Z$ is
$$f_Z(z) = \begin{cases} 1 - e^{-z} & \text{if } 0 \le z < 1 \\ (e - 1)\, e^{-z} & \text{if } z \ge 1 \\ 0 & \text{otherwise} \end{cases}$$
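A quick consistency check on any derived pdf is that it integrates to 1. The sketch below does this numerically for the piecewise density 1 - e^(-z) on [0, 1) and (e - 1)e^(-z) on [1, infinity), which is the result obtained when X is uniform on [0, 1] and Y is a unit-rate exponential (the truncation point and step size are arbitrary choices):

```python
import math

def f_z(z):
    """Piecewise pdf of Z = X + Y for X ~ Uniform[0, 1], Y ~ Exp(1)."""
    if z < 0:
        return 0.0
    if z < 1:
        return 1 - math.exp(-z)
    return (math.e - 1) * math.exp(-z)

# Midpoint-rule integral over [0, 40]; the tail beyond 40 is negligible.
h = 0.001
total = sum(f_z((i + 0.5) * h) for i in range(40_000)) * h
print(round(total, 4))
```

The density is also continuous at $z = 1$, since both branches equal $1 - e^{-1}$ there, which is another easy check.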

### Exercise 2

Let $X$ be a discrete random variable with support $R_X = \{0, 1\}$ and pmf
$$p_X(x) = \begin{cases} 1/2 & \text{if } x \in \{0, 1\} \\ 0 & \text{otherwise} \end{cases}$$

Let $Y$ be another discrete random variable, independent of $X$, with support $R_Y = \{0, 1, 2\}$ and pmf
$$p_Y(y) = \begin{cases} 1/3 & \text{if } y \in \{0, 1, 2\} \\ 0 & \text{otherwise} \end{cases}$$

Derive the pmf of the sum
$$Z = X + Y$$

Solution

The support of $Z$ is
$$R_Z = \{0, 1, 2, 3\}$$
The pmf of $Z$, evaluated at $z = 0$, is
$$p_Z(0) = p_X(0)\, p_Y(0) = \frac{1}{2} \cdot \frac{1}{3} = \frac{1}{6}$$
Evaluated at $z = 1$, it is
$$p_Z(1) = p_X(1)\, p_Y(0) + p_X(0)\, p_Y(1) = \frac{1}{6} + \frac{1}{6} = \frac{1}{3}$$
Evaluated at $z = 2$, it is
$$p_Z(2) = p_X(1)\, p_Y(1) + p_X(0)\, p_Y(2) = \frac{1}{6} + \frac{1}{6} = \frac{1}{3}$$
Evaluated at $z = 3$, it is
$$p_Z(3) = p_X(1)\, p_Y(2) = \frac{1}{6}$$
Therefore, the pmf of $Z$ is
$$p_Z(z) = \begin{cases} 1/6 & \text{if } z \in \{0, 3\} \\ 1/3 & \text{if } z \in \{1, 2\} \\ 0 & \text{otherwise} \end{cases}$$
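Hand computations like these are easy to double-check by brute-force enumeration over the joint support. The sketch below assumes, for illustration, $X$ uniform on {0, 1} and $Y$ uniform on {0, 1, 2}:

```python
from itertools import product

# Assumed pmfs for the illustration (value -> probability).
p_x = {0: 1 / 2, 1: 1 / 2}
p_y = {0: 1 / 3, 1: 1 / 3, 2: 1 / 3}

# Enumerate every (x, y) pair and accumulate P(X = x) P(Y = y) by x + y.
p_z = {}
for x, y in product(p_x, p_y):
    p_z[x + y] = p_z.get(x + y, 0.0) + p_x[x] * p_y[y]

print({z: round(p, 4) for z, p in sorted(p_z.items())})
```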