Statlect - The Digital Textbook

Exponential distribution

How much time will elapse before an earthquake occurs in a given region? How long do we need to wait before a customer enters our shop? How long will it take before a call center receives the next phone call? How long will a piece of machinery work without breaking down?

Questions such as these are often answered in probabilistic terms using the exponential distribution.

All these questions concern the time we need to wait before a given event occurs. If this waiting time is unknown, it is often appropriate to think of it as a random variable having an exponential distribution. Roughly speaking, the time X we need to wait before an event occurs has an exponential distribution if the probability that the event occurs during a certain time interval is proportional to the length of that time interval. More precisely, X has an exponential distribution if the conditional probability
$$P(t < X \le t + \Delta t \mid X > t)$$
is approximately proportional to the length $\Delta t$ of the time interval between the times $t$ and $t + \Delta t$, for any time instant $t$. In most practical situations this property is very realistic, and this is the reason why the exponential distribution is so widely used to model waiting times.

The exponential distribution is also related to the Poisson distribution. When the event can occur more than once and the time elapsed between two successive occurrences is exponentially distributed and independent of previous occurrences, the number of occurrences of the event within a given unit of time has a Poisson distribution (see the lecture entitled Poisson distribution for a more detailed explanation and an intuitive graphical representation of this fact).
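As a quick illustration of this connection (a simulation sketch, not part of the original lecture; the choice $\lambda = 3$ and the sample size are arbitrary), the code below draws exponential inter-arrival times and counts the events falling in one unit of time; the sample mean of the counts should be close to $\lambda$, as a Poisson($\lambda$) count would predict.

```python
import random

# Hedged sketch: if inter-arrival times are Exponential(lam) and
# independent, the number of events in one unit of time is Poisson(lam),
# so the sample mean of the counts should be close to lam.
random.seed(0)
lam = 3.0
counts = []
for _ in range(20000):
    t, n = 0.0, 0
    while True:
        t += random.expovariate(lam)  # exponentially distributed waiting time
        if t > 1.0:
            break
        n += 1
    counts.append(n)

mean_count = sum(counts) / len(counts)
# mean_count is close to lam = 3.0
```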


The exponential distribution is characterized as follows:

Definition Let X be an absolutely continuous random variable. Let its support be the set of positive real numbers:
$$R_X = [0, \infty)$$
Let $\lambda > 0$. We say that X has an exponential distribution with parameter $\lambda$ if its probability density function is
$$f_X(x) = \begin{cases} \lambda \exp(-\lambda x) & \text{if } x \in R_X \\ 0 & \text{otherwise} \end{cases}$$
The parameter $\lambda$ is called the rate parameter.

A random variable having an exponential distribution is also called an exponential random variable.
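In code, the density above can be sketched as follows (a minimal illustration, not part of the lecture):

```python
import math

# Density of the exponential distribution with rate parameter lam:
# f(x) = lam * exp(-lam * x) for x in the support, 0 otherwise.
def exponential_pdf(x, lam):
    if x < 0:
        return 0.0
    return lam * math.exp(-lam * x)

print(exponential_pdf(-1.0, 2.0))           # 0.0 (outside the support)
print(round(exponential_pdf(1.0, 2.0), 4))  # 2 * exp(-2) = 0.2707
```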

The following is a proof that $f_X(x)$ is a legitimate probability density function:


Non-negativity is obvious. We need to prove that the integral of $f_X(x)$ over $\mathbb{R}$ equals 1. This is proved as follows:
$$\int_{-\infty}^{\infty} f_X(x)\,dx = \int_0^{\infty} \lambda \exp(-\lambda x)\,dx = \big[-\exp(-\lambda x)\big]_0^{\infty} = 0 - (-1) = 1$$
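The same normalization can be checked numerically (a rough midpoint-rule sketch; the truncation point B and the step count are arbitrary choices):

```python
import math

# Approximate the integral of lam * exp(-lam * x) over [0, B] with a
# midpoint Riemann sum; for large B the result should be close to 1,
# since the tail beyond B is negligible.
lam, B, steps = 2.0, 50.0, 200000
dx = B / steps
total = sum(lam * math.exp(-lam * (i + 0.5) * dx) * dx for i in range(steps))
print(round(total, 4))  # 1.0
```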

To better understand the exponential distribution, you can have a look at its density plots.

The rate parameter and its interpretation

We have mentioned that the probability that the event occurs between two dates $t$ and $t + \Delta t$ is proportional to $\Delta t$ (conditional on the information that it has not occurred before $t$). The rate parameter $\lambda$ is the constant of proportionality:
$$P(t < X \le t + \Delta t \mid X > t) = \lambda \Delta t + o(\Delta t)$$
where $o(\Delta t)$ is an infinitesimal of higher order than $\Delta t$ (i.e. a function of $\Delta t$ that goes to zero more quickly than $\Delta t$ does).
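Numerically, the proportionality can be checked using the distribution function $F_X(x) = 1 - \exp(-\lambda x)$ derived later in this lecture (a small sketch; the values of $\lambda$, $t$ and $\Delta t$ are arbitrary):

```python
import math

# For small dt, P(t < X <= t + dt | X > t) = (F(t+dt) - F(t)) / (1 - F(t))
# should be approximately lam * dt, so the ratio below should be close to 1.
lam, t, dt = 2.0, 1.5, 1e-4
F = lambda x: 1.0 - math.exp(-lam * x)
cond_prob = (F(t + dt) - F(t)) / (1.0 - F(t))
print(abs(cond_prob / (lam * dt) - 1.0) < 1e-3)  # True
```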

The above proportionality condition is also sufficient to completely characterize the exponential distribution:

Proposition The proportionality condition
$$P(t < X \le t + \Delta t \mid X > t) = \lambda \Delta t + o(\Delta t)$$
is satisfied only if X has an exponential distribution.


The conditional probability can be written as
$$P(t < X \le t + \Delta t \mid X > t) = \frac{P(t < X \le t + \Delta t)}{P(X > t)}$$
Denote by $F_X(t)$ the distribution function of X, i.e.
$$F_X(t) = P(X \le t)$$
and by $S(t)$ its survival function:
$$S(t) = P(X > t) = 1 - F_X(t)$$
Then
$$P(t < X \le t + \Delta t \mid X > t) = \frac{S(t) - S(t + \Delta t)}{S(t)}$$
and the proportionality condition becomes
$$\frac{S(t) - S(t + \Delta t)}{S(t)} = \lambda \Delta t + o(\Delta t)$$
Dividing both sides by $-\Delta t$, we obtain
$$\frac{S(t + \Delta t) - S(t)}{\Delta t} \cdot \frac{1}{S(t)} = -\lambda + o(1)$$
where $o(1)$ is a quantity that tends to 0 when $\Delta t$ tends to 0. Taking limits on both sides, we obtain
$$\lim_{\Delta t \to 0} \frac{S(t + \Delta t) - S(t)}{\Delta t} \cdot \frac{1}{S(t)} = -\lambda$$
or, by the definition of derivative,
$$\frac{S'(t)}{S(t)} = -\lambda$$
This differential equation is easily solved using the chain rule:
$$\frac{d}{dt} \ln S(t) = \frac{S'(t)}{S(t)} = -\lambda$$
Taking the integral from 0 to x of both sides,
$$\int_0^x \frac{d}{dt} \ln S(t)\,dt = -\int_0^x \lambda\,dt$$
we obtain
$$\ln S(x) - \ln S(0) = -\lambda x$$
But $S(0) = P(X > 0) = 1$ (because X cannot take negative values) implies $\ln S(0) = 0$, so that
$$\ln S(x) = -\lambda x$$
Exponentiating both sides, we get
$$S(x) = \exp(-\lambda x)$$
Therefore
$$1 - F_X(x) = \exp(-\lambda x)$$
or
$$F_X(x) = 1 - \exp(-\lambda x)$$
But the density function is the first derivative of the distribution function:
$$f_X(x) = F_X'(x) = \lambda \exp(-\lambda x)$$
and the rightmost term is the density of an exponential random variable. Therefore, the proportionality condition is satisfied only if X is an exponential random variable.

Expected value

The expected value of an exponential random variable X is
$$E[X] = \frac{1}{\lambda}$$


It can be derived as follows, integrating by parts:
$$E[X] = \int_0^{\infty} x \lambda \exp(-\lambda x)\,dx = \big[-x \exp(-\lambda x)\big]_0^{\infty} + \int_0^{\infty} \exp(-\lambda x)\,dx = 0 + \Big[-\frac{1}{\lambda} \exp(-\lambda x)\Big]_0^{\infty} = \frac{1}{\lambda}$$


Variance

The variance of an exponential random variable X is
$$\mathrm{Var}[X] = \frac{1}{\lambda^2}$$


It can be derived thanks to the usual variance formula ($\mathrm{Var}[X] = E[X^2] - (E[X])^2$). Integrating by parts twice gives
$$E[X^2] = \int_0^{\infty} x^2 \lambda \exp(-\lambda x)\,dx = \frac{2}{\lambda^2}$$
so that
$$\mathrm{Var}[X] = E[X^2] - (E[X])^2 = \frac{2}{\lambda^2} - \frac{1}{\lambda^2} = \frac{1}{\lambda^2}$$
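Both moments can be checked by simulation (a sketch; the rate, sample size and seed are arbitrary choices):

```python
import random
import statistics

# Sample mean and sample variance of Exponential(lam) draws should be
# close to the theoretical values 1/lam and 1/lam**2 respectively.
random.seed(1)
lam = 4.0
sample = [random.expovariate(lam) for _ in range(200000)]
print(abs(statistics.mean(sample) - 1 / lam) < 0.01)          # True (1/lam = 0.25)
print(abs(statistics.variance(sample) - 1 / lam**2) < 0.005)  # True (1/lam^2 = 0.0625)
```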

Moment generating function

The moment generating function of an exponential random variable X is defined for any $t < \lambda$:
$$M_X(t) = \frac{\lambda}{\lambda - t}$$


Using the definition of moment generating function:
$$M_X(t) = E[\exp(tX)] = \int_0^{\infty} \exp(tx) \lambda \exp(-\lambda x)\,dx = \lambda \int_0^{\infty} \exp(-(\lambda - t)x)\,dx = \lambda \Big[-\frac{1}{\lambda - t} \exp(-(\lambda - t)x)\Big]_0^{\infty} = \frac{\lambda}{\lambda - t}$$
Of course, the above integrals converge only if $\lambda - t > 0$, i.e. only if $t < \lambda$. Therefore, the moment generating function of an exponential random variable exists for all $t < \lambda$.
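A Monte Carlo check of this formula (a sketch with arbitrary $\lambda$ and $t < \lambda$):

```python
import math
import random

# Estimate E[exp(t X)] by simulation and compare it with the closed
# form lam / (lam - t), which is valid because t < lam here.
random.seed(2)
lam, t = 3.0, 1.0
sample = [random.expovariate(lam) for _ in range(200000)]
mgf_estimate = sum(math.exp(t * x) for x in sample) / len(sample)
closed_form = lam / (lam - t)  # 1.5
print(abs(mgf_estimate - closed_form) < 0.02)  # True
```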

Characteristic function

The characteristic function of an exponential random variable X is
$$\varphi_X(t) = \frac{\lambda}{\lambda - it}$$


Using the definition of characteristic function and the fact that
$$\exp(itx) = \cos(tx) + i \sin(tx)$$
we can write
$$\varphi_X(t) = E[\exp(itX)] = \int_0^{\infty} \lambda \exp(-\lambda x) \cos(tx)\,dx + i \int_0^{\infty} \lambda \exp(-\lambda x) \sin(tx)\,dx$$
We now compute separately the two integrals. Denote them by
$$A = \int_0^{\infty} \lambda \exp(-\lambda x) \cos(tx)\,dx, \qquad B = \int_0^{\infty} \lambda \exp(-\lambda x) \sin(tx)\,dx$$
Integrating by parts, the first integral is
$$A = \big[-\exp(-\lambda x)\cos(tx)\big]_0^{\infty} - \frac{t}{\lambda} B = 1 - \frac{t}{\lambda} B$$
and, similarly, the second integral is
$$B = \big[-\exp(-\lambda x)\sin(tx)\big]_0^{\infty} + \frac{t}{\lambda} A = \frac{t}{\lambda} A$$
Substituting the second equation into the first, we obtain
$$A = 1 - \frac{t^2}{\lambda^2} A$$
which can be rearranged to yield
$$A = \frac{\lambda^2}{\lambda^2 + t^2}, \qquad B = \frac{t}{\lambda} A = \frac{\lambda t}{\lambda^2 + t^2}$$
Putting pieces together:
$$\varphi_X(t) = A + iB = \frac{\lambda^2 + i\lambda t}{\lambda^2 + t^2} = \frac{\lambda(\lambda + it)}{(\lambda - it)(\lambda + it)} = \frac{\lambda}{\lambda - it}$$

Distribution function

The distribution function of an exponential random variable X is
$$F_X(x) = \begin{cases} 0 & \text{if } x < 0 \\ 1 - \exp(-\lambda x) & \text{if } x \ge 0 \end{cases}$$


If $x < 0$, then
$$F_X(x) = P(X \le x) = 0$$
because X cannot take on negative values. If $x \ge 0$, then
$$F_X(x) = \int_0^x \lambda \exp(-\lambda t)\,dt = \big[-\exp(-\lambda t)\big]_0^x = 1 - \exp(-\lambda x)$$
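The distribution function can be sketched in code as follows (a minimal illustration):

```python
import math

# Distribution function of the exponential distribution:
# F(x) = 0 for x < 0, and 1 - exp(-lam * x) for x >= 0.
def exponential_cdf(x, lam):
    if x < 0:
        return 0.0
    return 1.0 - math.exp(-lam * x)

print(exponential_cdf(-1.0, 2.0))           # 0.0
print(round(exponential_cdf(0.5, 2.0), 4))  # 1 - exp(-1) = 0.6321
```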

More details

In the following subsections you can find more details about the exponential distribution.

Memoryless property

One of the most important properties of the exponential distribution is the memoryless property:
$$P(X > x + y \mid X > x) = P(X > y)$$
for any $x, y \ge 0$.


This is proved as follows:
$$P(X > x + y \mid X > x) = \frac{P(X > x + y, X > x)}{P(X > x)} = \frac{P(X > x + y)}{P(X > x)} = \frac{\exp(-\lambda(x + y))}{\exp(-\lambda x)} = \exp(-\lambda y) = P(X > y)$$

X is the time we need to wait before a certain event occurs. The above property says that the probability that the event happens during a time interval of length $y$ is independent of how much time $x$ has already elapsed without the event happening.
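The memoryless property can be verified directly with the survival function $P(X > z) = \exp(-\lambda z)$ (a small sketch; the values of $\lambda$, $x$ and $y$ are arbitrary):

```python
import math

# P(X > x + y | X > x) = S(x + y) / S(x) should equal P(X > y) = S(y),
# where S(z) = exp(-lam * z) is the exponential survival function.
lam, x, y = 2.0, 1.0, 0.7
S = lambda z: math.exp(-lam * z)
lhs = S(x + y) / S(x)
rhs = S(y)
print(abs(lhs - rhs) < 1e-12)  # True: waiting x units does not change the distribution
```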

The sum of exponential random variables is a Gamma random variable

Suppose $X_1$, $X_2$, ..., $X_n$ are $n$ mutually independent random variables having exponential distribution with parameter $\lambda$. Define
$$Z = \sum_{i=1}^{n} X_i$$
Then, the sum Z is a Gamma random variable with parameters $2n$ and $n/\lambda$ (in the parameterization used in the lecture on the Gamma distribution; in the more common shape-rate parameterization, Z has shape $n$ and rate $\lambda$).


This is proved using moment generating functions (remember that the moment generating function of a sum of mutually independent random variables is just the product of their moment generating functions):
$$M_Z(t) = \prod_{i=1}^{n} M_{X_i}(t) = \left(\frac{\lambda}{\lambda - t}\right)^n = \left(1 - \frac{t}{\lambda}\right)^{-n}$$
The latter is the moment generating function of a Gamma distribution with parameters $2n$ and $n/\lambda$. So Z has a Gamma distribution, because two random variables have the same distribution when they have the same moment generating function.

The random variable Z is also sometimes said to have an Erlang distribution. The Erlang distribution is just a special case of the Gamma distribution: a Gamma random variable is also an Erlang random variable when its shape parameter is a positive integer, that is, when it can be written as a sum of independent exponential random variables.
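As a sanity check (a simulation sketch, not from the lecture), the sum of $n$ independent Exponential($\lambda$) draws should have mean $n/\lambda$ and variance $n/\lambda^2$, as the Gamma/Erlang distribution with shape $n$ and rate $\lambda$ predicts:

```python
import random
import statistics

# Simulate sums of n independent Exponential(lam) variables and check
# their sample mean and variance against the Gamma/Erlang values
# n/lam and n/lam**2.
random.seed(3)
n, lam = 5, 2.0
sums = [sum(random.expovariate(lam) for _ in range(n)) for _ in range(100000)]
print(abs(statistics.mean(sums) - n / lam) < 0.05)         # True (n/lam = 2.5)
print(abs(statistics.variance(sums) - n / lam**2) < 0.05)  # True (n/lam^2 = 1.25)
```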

Parameter estimation

Maximum likelihood estimation of the rate parameter $lambda $ is covered in the lecture entitled Exponential distribution - Maximum likelihood.

Solved exercises

Below you can find some exercises with explained solutions:

  1. Exercise set 1 (computation of probabilities involving the exponential distribution).
