The Bernoulli distribution is a univariate discrete distribution used to model random experiments that have binary outcomes.

Suppose that you perform an experiment with two possible outcomes: either success or failure.

Success happens with probability $p$, while failure happens with probability $1-p$.

A random variable that takes value $1$ in case of success and $0$ in case of failure is called a Bernoulli random variable (alternatively, it is said to have a Bernoulli distribution).

Bernoulli random variables are characterized as follows.

Definition
Let $X$ be a discrete random variable. Let its support be
$$R_X = \{0, 1\}.$$
We say that $X$ has a **Bernoulli distribution** with parameter $p \in (0,1)$ if its probability mass function is
$$p_X(x) = \begin{cases} p & \text{if } x = 1 \\ 1-p & \text{if } x = 0 \\ 0 & \text{if } x \notin R_X \end{cases}$$

Note that, by the above definition, any indicator function is a Bernoulli random variable.
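The probability mass function above can be sketched numerically. Here is a minimal Python version (the name `bernoulli_pmf` is ours, not part of the lecture):

```python
# Sketch of the Bernoulli pmf defined above (function name is ours).
def bernoulli_pmf(x, p):
    """p_X(x): equals p at x = 1, 1 - p at x = 0, and 0 elsewhere."""
    if x == 1:
        return p
    if x == 0:
        return 1 - p
    return 0.0

p = 0.3
print(bernoulli_pmf(1, p))  # p -> 0.3
print(bernoulli_pmf(0, p))  # 1 - p -> 0.7
# The pmf sums to 1 over the support {0, 1}:
print(bernoulli_pmf(0, p) + bernoulli_pmf(1, p))  # 1.0
```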

The following is a proof that $p_X$ is a legitimate probability mass function.

Proof

Non-negativity is obvious. We need to prove that the sum of $p_X(x)$ over its support equals $1$. This is proved as follows:
$$\sum_{x \in R_X} p_X(x) = p_X(1) + p_X(0) = p + (1-p) = 1.$$

The expected value of a Bernoulli random variable $X$ is
$$\operatorname{E}[X] = p.$$

Proof

It can be derived as follows:
$$\operatorname{E}[X] = \sum_{x \in R_X} x\, p_X(x) = 1 \cdot p + 0 \cdot (1-p) = p.$$

The variance of a Bernoulli random variable $X$ is
$$\operatorname{Var}[X] = p(1-p).$$

Proof

It can be derived thanks to the usual variance formula ($\operatorname{Var}[X] = \operatorname{E}[X^2] - \operatorname{E}[X]^2$). The second moment is
$$\operatorname{E}[X^2] = \sum_{x \in R_X} x^2\, p_X(x) = 1^2 \cdot p + 0^2 \cdot (1-p) = p,$$
so that
$$\operatorname{Var}[X] = \operatorname{E}[X^2] - \operatorname{E}[X]^2 = p - p^2 = p(1-p).$$
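The mean and variance formulas can be checked directly against the pmf. A quick numerical sketch in Python (variable names are ours):

```python
# Check E[X] = p and Var[X] = p(1 - p) directly from the pmf.
p = 0.3
support = [0, 1]
pmf = {0: 1 - p, 1: p}

mean = sum(x * pmf[x] for x in support)              # E[X]
second_moment = sum(x**2 * pmf[x] for x in support)  # E[X^2]
variance = second_moment - mean**2                   # usual variance formula

print(mean)      # 0.3, i.e. p
print(variance)  # p * (1 - p) = 0.21, up to floating-point rounding
```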

The moment generating function of a Bernoulli random variable $X$ is defined for any $t \in \mathbb{R}$:
$$M_X(t) = 1 - p + p e^t.$$

Proof

Using the definition of moment generating function, we get
$$M_X(t) = \operatorname{E}[e^{tX}] = \sum_{x \in R_X} e^{tx}\, p_X(x) = e^{t \cdot 1} p + e^{t \cdot 0} (1-p) = 1 - p + p e^t.$$
Obviously, the above expected value exists for any $t \in \mathbb{R}$.
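The closed form of the moment generating function can be compared with the defining expectation computed from the pmf. A small Python sketch (names are ours):

```python
import math

# Compare the closed form 1 - p + p*e^t with E[e^{tX}] from the pmf.
p = 0.4
pmf = {0: 1 - p, 1: p}

for t in (-1.0, 0.0, 0.5, 2.0):
    closed_form = 1 - p + p * math.exp(t)
    expectation = sum(math.exp(t * x) * q for x, q in pmf.items())
    assert abs(closed_form - expectation) < 1e-12

print(1 - p + p * math.exp(0.0))  # M_X(0) = 1, as for any mgf
```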

The characteristic function of a Bernoulli random variable $X$ is
$$\varphi_X(t) = 1 - p + p e^{it}.$$

Proof

Using the definition of characteristic function, we obtain
$$\varphi_X(t) = \operatorname{E}[e^{itX}] = \sum_{x \in R_X} e^{itx}\, p_X(x) = e^{it \cdot 1} p + e^{it \cdot 0} (1-p) = 1 - p + p e^{it}.$$

The distribution function of a Bernoulli random variable $X$ is
$$F_X(x) = \begin{cases} 0 & \text{if } x < 0 \\ 1-p & \text{if } 0 \le x < 1 \\ 1 & \text{if } x \ge 1 \end{cases}$$

Proof

Remember the definition of distribution function:
$$F_X(x) = \operatorname{P}(X \le x)$$
and the fact that $X$ can take either value $0$ or value $1$. If $x < 0$, then $F_X(x) = 0$ because $X$ cannot take values strictly smaller than $0$. If $0 \le x < 1$, then $F_X(x) = 1-p$ because $0$ is the only value strictly smaller than $1$ that $X$ can take. Finally, if $x \ge 1$, then $F_X(x) = 1$ because all values $X$ can take are smaller than or equal to $1$.
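The three-case distribution function translates directly into code. A minimal Python sketch (the name `bernoulli_cdf` is ours):

```python
# Distribution function F_X of a Bernoulli variable, case by case.
def bernoulli_cdf(x, p):
    if x < 0:
        return 0.0        # X cannot take values below 0
    if x < 1:
        return 1 - p      # only the value 0 lies at or below x
    return 1.0            # both 0 and 1 lie at or below x

p = 0.25
print(bernoulli_cdf(-0.5, p))  # 0.0
print(bernoulli_cdf(0.5, p))   # 0.75
print(bernoulli_cdf(3.0, p))   # 1.0
```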

A sum of independent Bernoulli random variables, all with the same parameter $p$, is a binomial random variable. This is discussed and proved in the lecture entitled Binomial distribution.

Below you can find some exercises with explained solutions.

Let $X$ and $Y$ be two independent Bernoulli random variables with parameter $p$. Derive the probability mass function of their sum
$$Z = X + Y.$$

Solution

The probability mass function of $X$ is
$$p_X(x) = \begin{cases} p & \text{if } x = 1 \\ 1-p & \text{if } x = 0 \\ 0 & \text{otherwise} \end{cases}$$
The probability mass function of $Y$ is
$$p_Y(y) = \begin{cases} p & \text{if } y = 1 \\ 1-p & \text{if } y = 0 \\ 0 & \text{otherwise} \end{cases}$$
The support of $Z$ (the set of values $Z$ can take) is
$$R_Z = \{0, 1, 2\}.$$
The convolution formula for the probability mass function of a sum of two independent variables is
$$p_Z(z) = \sum_{y \in R_Y} p_X(z - y)\, p_Y(y)$$
where $R_Y = \{0, 1\}$ is the support of $Y$. When $z = 0$, the formula gives
$$p_Z(0) = p_X(0)\, p_Y(0) + p_X(-1)\, p_Y(1) = (1-p)(1-p) + 0 \cdot p = (1-p)^2.$$
When $z = 1$, the formula gives
$$p_Z(1) = p_X(1)\, p_Y(0) + p_X(0)\, p_Y(1) = p(1-p) + (1-p)p = 2p(1-p).$$
When $z = 2$, the formula gives
$$p_Z(2) = p_X(2)\, p_Y(0) + p_X(1)\, p_Y(1) = 0 \cdot (1-p) + p \cdot p = p^2.$$
Therefore, the probability mass function of $Z$ is
$$p_Z(z) = \begin{cases} (1-p)^2 & \text{if } z = 0 \\ 2p(1-p) & \text{if } z = 1 \\ p^2 & \text{if } z = 2 \\ 0 & \text{otherwise} \end{cases}$$
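The convolution result can be double-checked by enumerating all $(x, y)$ outcomes of the two independent variables. A short Python sketch (names are ours):

```python
# Verify the pmf of Z = X + Y by enumerating all (x, y) pairs.
p = 0.3
pmf = {0: 1 - p, 1: p}  # common pmf of X and Y

p_Z = {0: 0.0, 1: 0.0, 2: 0.0}
for x, px in pmf.items():
    for y, py in pmf.items():
        p_Z[x + y] += px * py  # independence: joint pmf factorizes

print(p_Z[0])  # (1 - p)^2
print(p_Z[1])  # 2 p (1 - p)
print(p_Z[2])  # p^2
```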

Let $X$ be a Bernoulli random variable with parameter $p$. Find its tenth moment.

Solution

The moment generating function of $X$ is
$$M_X(t) = 1 - p + p e^t.$$
The tenth moment of $X$ is equal to the tenth derivative of its moment generating function, evaluated at $t = 0$:
$$\operatorname{E}[X^{10}] = \left. \frac{d^{10}}{dt^{10}} M_X(t) \right|_{t=0}.$$
But
$$\frac{d^{10}}{dt^{10}} M_X(t) = p e^t,$$
so that
$$\operatorname{E}[X^{10}] = p e^0 = p.$$
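The mgf-based derivation can be confirmed by computing the moment from the pmf: since $X$ only takes values $0$ and $1$, $x^{10} = x$ on the support, so the tenth moment equals $p$. A one-line check in Python (names are ours):

```python
# Tenth moment of a Bernoulli variable computed directly from the pmf.
p = 0.7
pmf = {0: 1 - p, 1: p}

# On the support {0, 1}, x**10 == x, so E[X^10] = E[X] = p.
tenth_moment = sum(x**10 * q for x, q in pmf.items())
print(tenth_moment)  # 0.7, i.e. p
```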

Please cite as:

Taboga, Marco (2021). "Bernoulli distribution", Lectures on probability theory and mathematical statistics. Kindle Direct Publishing. Online appendix. https://www.statlect.com/probability-distributions/Bernoulli-distribution.
