
Bernoulli distribution

Suppose you perform an experiment with two possible outcomes: either success or failure. Success happens with probability $p$, while failure happens with probability $1-p$. A random variable that takes value 1 in case of success and 0 in case of failure is called a Bernoulli random variable (alternatively, it is said to have a Bernoulli distribution).

Definition

Bernoulli random variables are characterized as follows:

Definition Let $X$ be a discrete random variable. Let its support be $$R_X = \{0, 1\}$$ Let $p \in (0,1)$. We say that $X$ has a Bernoulli distribution with parameter $p$ if its probability mass function is $$p_X(x) = \begin{cases} p & \text{if } x = 1 \\ 1-p & \text{if } x = 0 \\ 0 & \text{if } x \notin R_X \end{cases}$$

A random variable having a Bernoulli distribution is also called a Bernoulli random variable.

Note that, by the above definition, any indicator function is a Bernoulli random variable.

The following is a proof that $p_X(x)$ is a legitimate probability mass function:

Proof

Non-negativity is obvious. We need to prove that the sum of $p_X(x)$ over its support equals 1. This is proved as follows: $$\sum_{x \in R_X} p_X(x) = p_X(1) + p_X(0) = p + (1-p) = 1$$
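The pmf and the check above can be sketched in Python (the function name `bernoulli_pmf` and the value `p = 0.3` are illustrative choices, not part of the lecture):

```python
def bernoulli_pmf(x, p):
    """Probability mass function of a Bernoulli random variable with parameter p."""
    if x == 1:
        return p
    if x == 0:
        return 1 - p
    return 0.0  # zero outside the support {0, 1}

p = 0.3
# Summing the pmf over the support {0, 1} should give 1.
total = bernoulli_pmf(0, p) + bernoulli_pmf(1, p)
```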

Expected value

The expected value of a Bernoulli random variable $X$ is $$\mathrm{E}[X] = p$$

Proof

It can be derived as follows: $$\mathrm{E}[X] = \sum_{x \in R_X} x\, p_X(x) = 1 \cdot p + 0 \cdot (1-p) = p$$
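The same computation, written out numerically for an illustrative value of $p$:

```python
# Expected value of a Bernoulli(p) variable from the definition
# E[X] = sum over the support of x * p_X(x); p = 0.3 is an illustrative value.
p = 0.3
expected_value = 1 * p + 0 * (1 - p)  # the x = 0 term contributes nothing
```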

Variance

The variance of a Bernoulli random variable $X$ is $$\mathrm{Var}[X] = p(1-p)$$

Proof

It can be derived thanks to the usual variance formula ($\mathrm{Var}[X] = \mathrm{E}[X^2] - \mathrm{E}[X]^2$): $$\mathrm{E}[X^2] = \sum_{x \in R_X} x^2\, p_X(x) = 1^2 \cdot p + 0^2 \cdot (1-p) = p$$ $$\mathrm{Var}[X] = \mathrm{E}[X^2] - \mathrm{E}[X]^2 = p - p^2 = p(1-p)$$
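A quick numerical sketch of the variance formula (again with an illustrative $p$):

```python
# Variance of a Bernoulli(p) variable via Var[X] = E[X^2] - E[X]^2
p = 0.3
e_x = 1 * p + 0 * (1 - p)            # E[X] = p
e_x2 = 1**2 * p + 0**2 * (1 - p)     # E[X^2] = p (since 1^2 = 1 and 0^2 = 0)
variance = e_x2 - e_x**2             # p - p^2 = p * (1 - p)
```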

Moment generating function

The moment generating function of a Bernoulli random variable $X$ is defined for any $t \in \mathbb{R}$: $$M_X(t) = 1 - p + p e^{t}$$

Proof

Using the definition of moment generating function: $$M_X(t) = \mathrm{E}[e^{tX}] = \sum_{x \in R_X} e^{tx} p_X(x) = e^{t \cdot 1} \cdot p + e^{t \cdot 0} \cdot (1-p) = 1 - p + p e^{t}$$ The above expected value exists and is finite for any $t \in \mathbb{R}$, because it is a sum of finitely many finite terms.
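As an informal check, the first derivative of the mgf at $t = 0$ should equal the first moment $\mathrm{E}[X] = p$. The sketch below approximates that derivative with a central difference (the function name and the values of `p` and `h` are illustrative):

```python
import math

def bernoulli_mgf(t, p):
    """Moment generating function of a Bernoulli(p) variable: E[e^(tX)]."""
    return 1 - p + p * math.exp(t)

p = 0.4
h = 1e-6
# Central-difference approximation of M'_X(0), which should be close to E[X] = p.
deriv_at_zero = (bernoulli_mgf(h, p) - bernoulli_mgf(-h, p)) / (2 * h)
```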

Characteristic function

The characteristic function of a Bernoulli random variable $X$ is $$\varphi_X(t) = 1 - p + p e^{it}$$

Proof

Using the definition of characteristic function: $$\varphi_X(t) = \mathrm{E}[e^{itX}] = \sum_{x \in R_X} e^{itx} p_X(x) = e^{it \cdot 1} \cdot p + e^{it \cdot 0} \cdot (1-p) = 1 - p + p e^{it}$$
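Using complex arithmetic, we can sanity-check two general properties of characteristic functions on this formula: $\varphi_X(0) = 1$ and $|\varphi_X(t)| \leq 1$ (the value `p = 0.25` and the test point `t = 2.0` are illustrative):

```python
import cmath

def bernoulli_cf(t, p):
    """Characteristic function of a Bernoulli(p) variable: E[e^(itX)]."""
    return 1 - p + p * cmath.exp(1j * t)

p = 0.25
value_at_zero = bernoulli_cf(0.0, p)   # should equal 1 for any distribution
modulus = abs(bernoulli_cf(2.0, p))    # should never exceed 1
```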

Distribution function

The distribution function of a Bernoulli random variable $X$ is $$F_X(x) = \begin{cases} 0 & \text{if } x < 0 \\ 1-p & \text{if } 0 \leq x < 1 \\ 1 & \text{if } x \geq 1 \end{cases}$$

Proof

Remember the definition of distribution function: $$F_X(x) = \mathrm{P}(X \leq x)$$ and the fact that $X$ can take either value 0 or value 1. If $x < 0$, then $F_X(x) = 0$, because $X$ cannot take values strictly smaller than 0. If $0 \leq x < 1$, then $F_X(x) = 1 - p$, because 0 is the only value $X$ can take that is less than or equal to $x$, and it is taken with probability $1-p$. Finally, if $x \geq 1$, then $F_X(x) = 1$, because all values $X$ can take are smaller than or equal to 1.
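The three-piece step function above translates directly into code (function name and `p = 0.3` are illustrative):

```python
def bernoulli_cdf(x, p):
    """Distribution function of a Bernoulli(p) variable, P(X <= x)."""
    if x < 0:
        return 0.0     # X never takes values below 0
    if x < 1:
        return 1 - p   # only the value 0 is <= x on [0, 1)
    return 1.0         # both possible values are <= x

p = 0.3
values = [bernoulli_cdf(-0.5, p), bernoulli_cdf(0.5, p), bernoulli_cdf(2.0, p)]
```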

More details

Relation between the Bernoulli and the binomial distribution

A sum of independent Bernoulli random variables is a binomial random variable. This is discussed and proved in the lecture entitled Binomial distribution.
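This relation can be verified exactly for small cases: building the distribution of a sum of $n$ independent Bernoulli variables by repeated convolution reproduces the Binomial$(n,p)$ pmf $\binom{n}{k} p^k (1-p)^{n-k}$. The values `n = 4` and `p = 0.3` below are illustrative:

```python
import math

# Exact distribution of the sum of n independent Bernoulli(p) variables,
# built by repeated convolution, compared with the Binomial(n, p) pmf.
p = 0.3
n = 4
dist = {0: 1.0}  # distribution of a sum of zero variables: always 0
for _ in range(n):
    new = {}
    for s, prob in dist.items():
        new[s] = new.get(s, 0.0) + prob * (1 - p)      # failure: sum unchanged
        new[s + 1] = new.get(s + 1, 0.0) + prob * p    # success: sum increases by 1
    dist = new

binom = {k: math.comb(n, k) * p**k * (1 - p)**(n - k) for k in range(n + 1)}
```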

Solved exercises

Below you can find some exercises with explained solutions:

  1. Exercise set 1
