Suppose you perform an experiment with two possible outcomes: either success or failure. Success happens with probability $p$, while failure happens with probability $1-p$. A random variable that takes value $1$ in case of success and $0$ in case of failure is called a Bernoulli random variable (alternatively, it is said to have a Bernoulli distribution).

Bernoulli random variables are characterized as follows:

Definition
Let $X$ be a discrete random variable. Let its support be $R_X = \{0, 1\}$. Let $p \in (0, 1)$. We say that $X$ has a **Bernoulli distribution** with parameter $p$ if its probability mass function is
$$p_X(x) = \begin{cases} p & \text{if } x = 1 \\ 1 - p & \text{if } x = 0 \\ 0 & \text{otherwise} \end{cases}$$

A random variable having a Bernoulli distribution is also called a Bernoulli random variable.

Note that, by the above definition, any indicator function is a Bernoulli random variable.
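As a quick illustration, the pmf in the definition above can be sketched in Python (the function name and structure are ours, not from the text):

```python
def bernoulli_pmf(x, p):
    """Probability mass function of a Bernoulli(p) random variable."""
    if x == 1:
        return p
    if x == 0:
        return 1 - p
    return 0.0  # the pmf is zero outside the support {0, 1}

# An indicator of an event that happens with probability p has exactly this pmf:
print(bernoulli_pmf(1, 0.3))  # 0.3
print(bernoulli_pmf(0, 0.3))  # 0.7
```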

The following is a proof that $p_X(x)$ is a legitimate probability mass function:

Proof

Non-negativity is obvious. We need to prove that the sum of $p_X(x)$ over its support equals $1$. This is proved as follows:
$$\sum_{x \in R_X} p_X(x) = p_X(1) + p_X(0) = p + (1 - p) = 1$$

The expected value of a Bernoulli random variable $X$ is
$$\mathrm{E}[X] = p$$

Proof

It can be derived as follows:
$$\mathrm{E}[X] = \sum_{x \in R_X} x \, p_X(x) = 1 \cdot p + 0 \cdot (1 - p) = p$$
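The expectation sum can be checked numerically; a minimal sketch (names are ours):

```python
def bernoulli_mean(p):
    # E[X] = sum over the support of x * p_X(x) = 1*p + 0*(1-p) = p
    return sum(x * (p if x == 1 else 1 - p) for x in (0, 1))

print(bernoulli_mean(0.25))  # 0.25
```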

The variance of a Bernoulli random variable $X$ is
$$\mathrm{Var}[X] = p(1 - p)$$

Proof

It can be derived thanks to the usual variance formula ($\mathrm{Var}[X] = \mathrm{E}[X^2] - \mathrm{E}[X]^2$):
$$\mathrm{E}[X^2] = \sum_{x \in R_X} x^2 \, p_X(x) = 1^2 \cdot p + 0^2 \cdot (1 - p) = p$$
$$\mathrm{Var}[X] = \mathrm{E}[X^2] - \mathrm{E}[X]^2 = p - p^2 = p(1 - p)$$
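The same two steps can be traced in code; a sketch under our own naming:

```python
def bernoulli_variance(p):
    e_x = p           # E[X] = p
    e_x2 = p          # E[X^2] = 1^2 * p + 0^2 * (1 - p) = p
    return e_x2 - e_x ** 2  # Var[X] = E[X^2] - E[X]^2 = p(1 - p)

print(bernoulli_variance(0.5))  # 0.25, the maximum over p in (0, 1)
```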

The moment generating function of a Bernoulli random variable $X$ is defined for any $t \in \mathbb{R}$:
$$M_X(t) = 1 - p + p e^t$$

Proof

Using the definition of moment generating function:
$$M_X(t) = \mathrm{E}\left[e^{tX}\right] = e^{t \cdot 1} p_X(1) + e^{t \cdot 0} p_X(0) = p e^t + (1 - p)$$
Obviously, the above expected value exists for any $t \in \mathbb{R}$.
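The closed form can be compared against the defining expectation at any point $t$; a minimal sketch (names are ours):

```python
import math

def bernoulli_mgf(t, p):
    # closed form: E[exp(tX)] = 1 - p + p * e^t
    return 1 - p + p * math.exp(t)

# computing E[exp(tX)] directly from the pmf gives the same value
t, p = 0.7, 0.4
direct = p * math.exp(t * 1) + (1 - p) * math.exp(t * 0)
print(abs(direct - bernoulli_mgf(t, p)) < 1e-12)  # True
```

Note that $M_X(0) = 1$, as it must for any moment generating function.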

The characteristic function of a Bernoulli random variable $X$ is
$$\varphi_X(t) = 1 - p + p e^{it}$$

Proof

Using the definition of characteristic function:
$$\varphi_X(t) = \mathrm{E}\left[e^{itX}\right] = e^{it \cdot 1} p_X(1) + e^{it \cdot 0} p_X(0) = p e^{it} + (1 - p)$$

The distribution function of a Bernoulli random variable $X$ is
$$F_X(x) = \begin{cases} 0 & \text{if } x < 0 \\ 1 - p & \text{if } 0 \le x < 1 \\ 1 & \text{if } x \ge 1 \end{cases}$$

Proof

Remember the definition of distribution function:
$$F_X(x) = \mathrm{P}(X \le x)$$
and the fact that $X$ can take either value $0$ or value $1$. If $x < 0$, then $F_X(x) = 0$, because $X$ can not take values strictly smaller than $0$. If $0 \le x < 1$, then $F_X(x) = 1 - p$, because $0$ is the only value smaller than or equal to $x$ that $X$ can take, and it happens with probability $1 - p$. Finally, if $x \ge 1$, then $F_X(x) = 1$, because all values $X$ can take are smaller than or equal to $1$.
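The three cases translate directly into code; a sketch with our own naming:

```python
def bernoulli_cdf(x, p):
    """F_X(x) = P(X <= x) for X ~ Bernoulli(p)."""
    if x < 0:
        return 0.0       # X takes no value strictly smaller than 0
    if x < 1:
        return 1 - p     # only the value 0 is <= x, with probability 1 - p
    return 1.0           # both 0 and 1 are <= x

print(bernoulli_cdf(0.5, 0.3))  # 0.7
```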

A sum of independent Bernoulli random variables is a binomial random variable. This is discussed and proved in the lecture entitled Binomial distribution.
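This relationship can be verified numerically by convolving Bernoulli pmfs and comparing the result with the binomial formula $\binom{n}{k} p^k (1-p)^{n-k}$; a sketch under our own naming (the proof itself is in the lecture cited above):

```python
import math

def convolve_bernoulli(n, p):
    """pmf of a sum of n independent Bernoulli(p) variables, by convolution."""
    pmf = [1.0]  # pmf of the empty sum (the sum is 0 with probability 1)
    for _ in range(n):
        new = [0.0] * (len(pmf) + 1)
        for k, q in enumerate(pmf):
            new[k] += q * (1 - p)    # this Bernoulli turned out 0
            new[k + 1] += q * p      # this Bernoulli turned out 1
        pmf = new
    return pmf

n, p = 4, 0.3
pmf = convolve_bernoulli(n, p)
binom = [math.comb(n, k) * p**k * (1 - p)**(n - k) for k in range(n + 1)]
print(all(abs(a - b) < 1e-12 for a, b in zip(pmf, binom)))  # True
```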

