This lecture discusses how to derive the distribution of the sum of two independent random variables. We first explain how to derive the distribution function of the sum, and then how to derive its probability mass function (if the summands are discrete) or its probability density function (if the summands are continuous).
The following proposition characterizes the distribution function of the sum in terms of the distribution functions of the two summands.
Proposition Let $X$ and $Y$ be two independent random variables and denote by $F_X$ and $F_Y$ their distribution functions. Let
$$Z = X + Y$$
and denote the distribution function of $Z$ by $F_Z$. The following holds:
$$F_Z(z) = \mathrm{E}\left[ F_X(z-Y) \right]$$
or
$$F_Z(z) = \mathrm{E}\left[ F_Y(z-X) \right]$$
Proof The first formula is derived as follows:
$$\begin{aligned}
F_Z(z) &= \mathrm{P}(Z \leq z) = \mathrm{P}(X + Y \leq z) \\
&= \mathrm{E}\left[ \mathrm{P}(X \leq z - Y \mid Y) \right] \\
&= \mathrm{E}\left[ F_X(z-Y) \right]
\end{aligned}$$
where the last equality follows from the independence of $X$ and $Y$: conditional on $Y$, the distribution of $X$ is unchanged, so $\mathrm{P}(X \leq z - Y \mid Y) = F_X(z-Y)$. The second formula is symmetric to the first.
Example Let $X$ be a uniform random variable with support $R_X = [0,1]$ and probability density function
$$f_X(x) = \begin{cases} 1 & \text{if } x \in [0,1] \\ 0 & \text{otherwise} \end{cases}$$
and $Y$ another uniform random variable, independent of $X$, with support $R_Y = [0,1]$ and probability density function
$$f_Y(y) = \begin{cases} 1 & \text{if } y \in [0,1] \\ 0 & \text{otherwise} \end{cases}$$
The distribution function of $X$ is
$$F_X(x) = \begin{cases} 0 & \text{if } x < 0 \\ x & \text{if } 0 \leq x \leq 1 \\ 1 & \text{if } x > 1 \end{cases}$$
The distribution function of $Z = X + Y$ is
$$F_Z(z) = \mathrm{E}\left[ F_X(z-Y) \right] = \int_0^1 F_X(z-y)\, dy$$
There are four cases to consider:

If $z < 0$, then $z - y < 0$ for all $y \in [0,1]$, so that $F_X(z-y) = 0$ and
$$F_Z(z) = 0$$

If $0 \leq z < 1$, then $F_X(z-y) = z - y$ for $0 \leq y \leq z$ and $F_X(z-y) = 0$ for $y > z$, so
$$F_Z(z) = \int_0^z (z-y)\, dy = \left[ zy - \frac{y^2}{2} \right]_0^z = \frac{z^2}{2}$$

If $1 \leq z < 2$, then $F_X(z-y) = 1$ for $0 \leq y \leq z-1$ and $F_X(z-y) = z - y$ for $z-1 < y \leq 1$, so
$$F_Z(z) = \int_0^{z-1} 1\, dy + \int_{z-1}^1 (z-y)\, dy = 2z - 1 - \frac{z^2}{2}$$

If $z \geq 2$, then $z - y \geq 1$ for all $y \in [0,1]$, so that $F_X(z-y) = 1$ and
$$F_Z(z) = 1$$

By combining these four possible cases, we obtain
$$F_Z(z) = \begin{cases} 0 & \text{if } z < 0 \\ \dfrac{z^2}{2} & \text{if } 0 \leq z < 1 \\ 2z - 1 - \dfrac{z^2}{2} & \text{if } 1 \leq z < 2 \\ 1 & \text{if } z \geq 2 \end{cases}$$
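The piecewise formula can be checked numerically. The sketch below (assuming the uniform $[0,1]$ setup of the example) approximates $\mathrm{E}[F_X(z-Y)] = \int_0^1 F_X(z-y)\, dy$ with a midpoint Riemann sum and compares it to the closed form:

```python
# Numerical check of F_Z(z) = E[F_X(z - Y)] for independent X, Y ~ Uniform[0, 1].

def F_X(x):
    """Distribution function of a Uniform[0, 1] random variable."""
    return min(max(x, 0.0), 1.0)

def F_Z_numeric(z, n=100_000):
    """Approximate E[F_X(z - Y)] = integral_0^1 F_X(z - y) dy by a midpoint sum."""
    h = 1.0 / n
    return sum(F_X(z - (i + 0.5) * h) for i in range(n)) * h

def F_Z_closed(z):
    """Piecewise closed form derived in the example above."""
    if z < 0:
        return 0.0
    if z < 1:
        return z ** 2 / 2
    if z < 2:
        return 2 * z - 1 - z ** 2 / 2
    return 1.0

# The two computations agree on all four branches.
for z in [-0.5, 0.25, 0.5, 1.0, 1.5, 2.5]:
    assert abs(F_Z_numeric(z) - F_Z_closed(z)) < 1e-6
```

A Monte Carlo simulation of $X + Y$ would work as well; the deterministic sum is used here only because it makes the agreement easy to assert.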
When the two summands are discrete random variables, the probability mass function of their sum can be derived as follows.
Proposition Let $X$ and $Y$ be two independent discrete random variables and denote by $p_X$ and $p_Y$ their respective probability mass functions and by $R_X$ and $R_Y$ their supports. Let
$$Z = X + Y$$
and denote the probability mass function of $Z$ by $p_Z$. The following holds:
$$p_Z(z) = \sum_{y \in R_Y} p_X(z-y)\, p_Y(y)$$
or
$$p_Z(z) = \sum_{x \in R_X} p_Y(z-x)\, p_X(x)$$
Proof The first formula is derived as follows:
$$\begin{aligned}
p_Z(z) &= \mathrm{P}(Z = z) = \mathrm{P}(X + Y = z) \\
&= \sum_{y \in R_Y} \mathrm{P}(X + Y = z \mid Y = y)\, \mathrm{P}(Y = y) \\
&= \sum_{y \in R_Y} \mathrm{P}(X = z - y)\, p_Y(y) = \sum_{y \in R_Y} p_X(z-y)\, p_Y(y)
\end{aligned}$$
where the third equality follows from the independence of $X$ and $Y$. The second formula is symmetric to the first.
The two summations above are called convolutions (of two probability mass functions).
Example Let $X$ be a discrete random variable with support $R_X = \{0, 1\}$ and probability mass function
$$p_X(x) = \begin{cases} 1/2 & \text{if } x \in \{0, 1\} \\ 0 & \text{otherwise} \end{cases}$$
and $Y$ another discrete random variable, independent of $X$, with support $R_Y = \{0, 1\}$ and probability mass function
$$p_Y(y) = \begin{cases} 1/2 & \text{if } y \in \{0, 1\} \\ 0 & \text{otherwise} \end{cases}$$
Define
$$Z = X + Y$$
Its support is $R_Z = \{0, 1, 2\}$. The probability mass function of $Z$, evaluated at $z = 0$, is
$$p_Z(0) = p_X(0)\, p_Y(0) = \frac{1}{2} \cdot \frac{1}{2} = \frac{1}{4}$$
Evaluated at $z = 1$, it is
$$p_Z(1) = p_X(0)\, p_Y(1) + p_X(1)\, p_Y(0) = \frac{1}{4} + \frac{1}{4} = \frac{1}{2}$$
Evaluated at $z = 2$, it is
$$p_Z(2) = p_X(1)\, p_Y(1) = \frac{1}{4}$$
Therefore, the probability mass function of $Z$ is
$$p_Z(z) = \begin{cases} 1/4 & \text{if } z \in \{0, 2\} \\ 1/2 & \text{if } z = 1 \\ 0 & \text{otherwise} \end{cases}$$
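The convolution sum is easy to mechanize. The sketch below stores each pmf as a `{value: probability}` dictionary and convolves the two pmfs of the example (`pmf_convolve` is an illustrative helper, not a library function):

```python
# Convolution of two probability mass functions for Z = X + Y, X and Y independent.

def pmf_convolve(p_x, p_y):
    """p_Z(z) = sum over y of p_X(z - y) * p_Y(y), with pmfs as {value: prob} dicts."""
    p_z = {}
    for x, px in p_x.items():
        for y, py in p_y.items():
            p_z[x + y] = p_z.get(x + y, 0.0) + px * py
    return p_z

p_x = {0: 0.5, 1: 0.5}   # pmf of X from the example
p_y = {0: 0.5, 1: 0.5}   # pmf of Y from the example
p_z = pmf_convolve(p_x, p_y)
print(p_z)  # {0: 0.25, 1: 0.5, 2: 0.25}
```

Iterating over the pairs $(x, y)$ and accumulating mass at $x + y$ is exactly the convolution sum, just organized by source point rather than by target point.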
When the two summands are continuous random variables, the probability density function of their sum can be derived as follows.
Proposition Let $X$ and $Y$ be two independent continuous random variables and denote by $f_X$ and $f_Y$ their respective probability density functions. Let
$$Z = X + Y$$
and denote the probability density function of $Z$ by $f_Z$. The following holds:
$$f_Z(z) = \int_{-\infty}^{\infty} f_X(z-y)\, f_Y(y)\, dy$$
or
$$f_Z(z) = \int_{-\infty}^{\infty} f_Y(z-x)\, f_X(x)\, dx$$
Proof The distribution function of a sum of independent variables is
$$F_Z(z) = \mathrm{E}\left[ F_X(z-Y) \right] = \int_{-\infty}^{\infty} F_X(z-y)\, f_Y(y)\, dy$$
Differentiating both sides with respect to $z$ and using the fact that the density function is the derivative of the distribution function, we obtain
$$f_Z(z) = \frac{d}{dz} \int_{-\infty}^{\infty} F_X(z-y)\, f_Y(y)\, dy = \int_{-\infty}^{\infty} f_X(z-y)\, f_Y(y)\, dy$$
The second formula is symmetric to the first.
The two integrals above are called convolutions (of two probability density functions).
Example Let $X$ be an exponential random variable with support $R_X = [0, \infty)$ and probability density function
$$f_X(x) = \begin{cases} \lambda e^{-\lambda x} & \text{if } x \geq 0 \\ 0 & \text{otherwise} \end{cases}$$
and $Y$ another exponential random variable, independent of $X$, with support $R_Y = [0, \infty)$ and probability density function
$$f_Y(y) = \begin{cases} \lambda e^{-\lambda y} & \text{if } y \geq 0 \\ 0 & \text{otherwise} \end{cases}$$
Define
$$Z = X + Y$$
The support of $Z$ is
$$R_Z = [0, \infty)$$
When $z \in R_Z$, the probability density function of $Z$ is
$$f_Z(z) = \int_{-\infty}^{\infty} f_X(z-y)\, f_Y(y)\, dy = \int_0^z \lambda e^{-\lambda(z-y)}\, \lambda e^{-\lambda y}\, dy = \lambda^2 e^{-\lambda z} \int_0^z dy = \lambda^2 z e^{-\lambda z}$$
(the integrand is zero unless $0 \leq y \leq z$). Therefore, the probability density function of $Z$ is
$$f_Z(z) = \begin{cases} \lambda^2 z e^{-\lambda z} & \text{if } z \geq 0 \\ 0 & \text{otherwise} \end{cases}$$
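The result $f_Z(z) = \lambda^2 z e^{-\lambda z}$ can be verified by evaluating the convolution integral numerically. A minimal sketch, assuming two exponential densities with a common rate (`lam = 0.5` is an arbitrary illustrative choice):

```python
import math

# Check f_Z(z) = lam^2 * z * exp(-lam * z) for Z = X + Y, with X and Y
# independent exponential(lam), against the convolution integral
# f_Z(z) = integral_0^z f_X(z - y) f_Y(y) dy  (midpoint rule).

lam = 0.5  # illustrative rate parameter

def f_exp(t):
    """Density of an exponential random variable with rate lam."""
    return lam * math.exp(-lam * t) if t >= 0 else 0.0

def f_Z_numeric(z, n=100_000):
    h = z / n
    return sum(f_exp(z - (i + 0.5) * h) * f_exp((i + 0.5) * h) for i in range(n)) * h

def f_Z_closed(z):
    return lam ** 2 * z * math.exp(-lam * z) if z >= 0 else 0.0

for z in [0.5, 1.0, 3.0]:
    assert abs(f_Z_numeric(z) - f_Z_closed(z)) < 1e-8
```

The integrand $\lambda e^{-\lambda(z-y)} \cdot \lambda e^{-\lambda y} = \lambda^2 e^{-\lambda z}$ is constant in $y$, which is why the integral collapses to $\lambda^2 z e^{-\lambda z}$ and why the numerical check agrees so tightly.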
We have discussed above how to derive the distribution of the sum of two independent random variables. How do we derive the distribution of the sum of more than two mutually independent random variables? Suppose $X_1$, $X_2$, ..., $X_n$ are $n$ mutually independent random variables and let $Z$ be their sum:
$$Z = \sum_{i=1}^{n} X_i$$
The distribution of $Z$ can be derived recursively, using the results for sums of two random variables given above:

first, define
$$Y_2 = X_1 + X_2$$
and compute the distribution of $Y_2$;

then, define
$$Y_3 = Y_2 + X_3$$
and compute the distribution of $Y_3$;

and so on, until the distribution of $Z$ can be computed from
$$Z = Y_{n-1} + X_n$$
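For discrete variables, this recursion is just a repeated two-variable convolution. The sketch below folds a pairwise convolution over a list of pmfs, using the sum of four fair six-sided dice as a hypothetical example (the dice are not from the lecture text):

```python
from functools import reduce

def pmf_convolve(p_x, p_y):
    """pmf of the sum of two independent discrete variables ({value: prob} dicts)."""
    p_z = {}
    for x, px in p_x.items():
        for y, py in p_y.items():
            p_z[x + y] = p_z.get(x + y, 0.0) + px * py
    return p_z

# Z = X_1 + X_2 + X_3 + X_4, each X_i a fair die:
# Y_2 = X_1 + X_2, then Y_3 = Y_2 + X_3, then Z = Y_3 + X_4.
die = {k: 1 / 6 for k in range(1, 7)}
p_z = reduce(pmf_convolve, [die] * 4)

assert abs(sum(p_z.values()) - 1.0) < 1e-12  # total mass is still 1
print(min(p_z), max(p_z))                    # support runs from 4 to 24
```

`reduce` applies `pmf_convolve` left to right, which mirrors the recursion $Y_2, Y_3, \dots, Z$ exactly.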
Below you can find some exercises with explained solutions.
Exercise 1 Let $X$ be a uniform random variable with support $R_X = [0,1]$ and probability density function
$$f_X(x) = \begin{cases} 1 & \text{if } x \in [0,1] \\ 0 & \text{otherwise} \end{cases}$$
and $Y$ an exponential random variable, independent of $X$, with support $R_Y = [0, \infty)$ and probability density function
$$f_Y(y) = \begin{cases} \lambda e^{-\lambda y} & \text{if } y \geq 0 \\ 0 & \text{otherwise} \end{cases}$$
Derive the probability density function of the sum
$$Z = X + Y$$

Solution The support of $Z$ is
$$R_Z = [0, \infty)$$
By the convolution formula,
$$f_Z(z) = \int_{-\infty}^{\infty} f_Y(z-x)\, f_X(x)\, dx = \int_0^1 f_Y(z-x)\, dx$$
When $0 \leq z < 1$, the integrand is nonzero only for $x \leq z$, and
$$f_Z(z) = \int_0^z \lambda e^{-\lambda(z-x)}\, dx = 1 - e^{-\lambda z}$$
When $z \geq 1$,
$$f_Z(z) = \int_0^1 \lambda e^{-\lambda(z-x)}\, dx = \left( e^{\lambda} - 1 \right) e^{-\lambda z}$$
Therefore, the probability density function of $Z$ is
$$f_Z(z) = \begin{cases} 0 & \text{if } z < 0 \\ 1 - e^{-\lambda z} & \text{if } 0 \leq z < 1 \\ \left( e^{\lambda} - 1 \right) e^{-\lambda z} & \text{if } z \geq 1 \end{cases}$$
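The two branches of the density can be checked numerically. A sketch assuming $X \sim \text{Uniform}[0,1]$ and $Y$ exponential with rate $\lambda$ (`lam = 2.0` is an arbitrary illustrative value):

```python
import math

# Check the density of Z = X + Y, X ~ Uniform[0, 1], Y ~ exponential(lam),
# against f_Z(z) = integral_0^1 f_Y(z - x) dx  (midpoint rule).

lam = 2.0  # illustrative rate parameter

def f_Y(t):
    """Density of an exponential random variable with rate lam."""
    return lam * math.exp(-lam * t) if t >= 0 else 0.0

def f_Z_numeric(z, n=100_000):
    h = 1.0 / n
    return sum(f_Y(z - (i + 0.5) * h) for i in range(n)) * h

def f_Z_closed(z):
    """Piecewise closed form: 1 - e^(-lam z) on [0, 1), (e^lam - 1) e^(-lam z) after."""
    if z < 0:
        return 0.0
    if z < 1:
        return 1 - math.exp(-lam * z)
    return (math.exp(lam) - 1) * math.exp(-lam * z)

for z in [0.25, 0.75, 1.5, 4.0]:
    assert abs(f_Z_numeric(z) - f_Z_closed(z)) < 1e-4
```

The looser tolerance (compared with the earlier checks) accounts for the jump of the integrand at $x = z$ inside the midpoint sum when $z < 1$.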
Exercise 2 Let $X$ be a discrete random variable with support $R_X = \{0, 1\}$ and probability mass function
$$p_X(x) = \begin{cases} 1/2 & \text{if } x \in \{0, 1\} \\ 0 & \text{otherwise} \end{cases}$$
and $Y$ another discrete random variable, independent of $X$, with support $R_Y = \{0, 1, 2\}$ and probability mass function
$$p_Y(y) = \begin{cases} 1/3 & \text{if } y \in \{0, 1, 2\} \\ 0 & \text{otherwise} \end{cases}$$
Derive the probability mass function of the sum
$$Z = X + Y$$

Solution The support of $Z$ is $R_Z = \{0, 1, 2, 3\}$. The probability mass function of $Z$, evaluated at $z = 0$, is
$$p_Z(0) = p_X(0)\, p_Y(0) = \frac{1}{2} \cdot \frac{1}{3} = \frac{1}{6}$$
Evaluated at $z = 1$, it is
$$p_Z(1) = p_X(0)\, p_Y(1) + p_X(1)\, p_Y(0) = \frac{1}{6} + \frac{1}{6} = \frac{1}{3}$$
Evaluated at $z = 2$, it is
$$p_Z(2) = p_X(0)\, p_Y(2) + p_X(1)\, p_Y(1) = \frac{1}{6} + \frac{1}{6} = \frac{1}{3}$$
Evaluated at $z = 3$, it is
$$p_Z(3) = p_X(1)\, p_Y(2) = \frac{1}{6}$$
Therefore, the probability mass function of $Z$ is
$$p_Z(z) = \begin{cases} 1/6 & \text{if } z \in \{0, 3\} \\ 1/3 & \text{if } z \in \{1, 2\} \\ 0 & \text{otherwise} \end{cases}$$
Please cite as:
Taboga, Marco (2017). "Sums of independent random variables", Lectures on probability theory and mathematical statistics, Third edition. Kindle Direct Publishing. Online appendix. https://www.statlect.com/fundamentals-of-probability/sums-of-independent-random-variables.