Suppose you perform an experiment with two possible outcomes: either success or failure. Success happens with probability $p$, while failure happens with probability $1-p$. A random variable that takes value $1$ in case of success and $0$ in case of failure is called a Bernoulli random variable (alternatively, it is said to have a Bernoulli distribution).

Bernoulli random variables are characterized as follows.

Definition
Let $X$ be a discrete random variable. Let its support be $R_X=\{0,1\}$. Let $p\in(0,1)$. We say that $X$ has a **Bernoulli distribution** with parameter $p$ if its probability mass function is
$$p_X(x)=\begin{cases}p & \text{if } x=1\\ 1-p & \text{if } x=0\\ 0 & \text{otherwise}\end{cases}$$

A random variable having a Bernoulli distribution is also called a Bernoulli random variable.

Note that, by the above definition, any indicator function is a Bernoulli random variable.
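As a concrete illustration, the probability mass function above can be sketched in a few lines of Python (the function name `bernoulli_pmf` is ours, not from any library):

```python
def bernoulli_pmf(x, p):
    """Probability mass function of a Bernoulli random variable with parameter p."""
    if x == 1:
        return p          # success
    if x == 0:
        return 1 - p      # failure
    return 0.0            # zero outside the support {0, 1}

# The two probabilities on the support add up to 1:
print(bernoulli_pmf(1, 0.25) + bernoulli_pmf(0, 0.25))  # prints 1.0
```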

The following is a proof that $p_X(x)$ is a legitimate probability mass function.

Proof

Non-negativity is obvious. We need to prove that the sum of $p_X(x)$ over its support equals $1$. This is proved as follows:
$$\sum_{x\in R_X}p_X(x)=p_X(1)+p_X(0)=p+(1-p)=1$$

The expected value of a Bernoulli random variable $X$ is
$$\mathrm{E}[X]=p$$

Proof

It can be derived as follows:
$$\mathrm{E}[X]=\sum_{x\in R_X}x\,p_X(x)=1\cdot p+0\cdot(1-p)=p$$

The variance of a Bernoulli random variable $X$ is
$$\mathrm{Var}[X]=p(1-p)$$

Proof

It can be derived thanks to the usual variance formula ($\mathrm{Var}[X]=\mathrm{E}[X^2]-\mathrm{E}[X]^2$):
$$\mathrm{E}[X^2]=\sum_{x\in R_X}x^2p_X(x)=1^2\cdot p+0^2\cdot(1-p)=p$$
$$\mathrm{Var}[X]=\mathrm{E}[X^2]-\mathrm{E}[X]^2=p-p^2=p(1-p)$$
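Both moments can be checked by direct summation over the support; a minimal sketch (the function name is ours):

```python
def bernoulli_mean_var(p):
    """Mean and variance of a Bernoulli(p) variable by direct summation over {0, 1}."""
    pmf = {0: 1 - p, 1: p}
    mean = sum(x * prob for x, prob in pmf.items())                # E[X] = p
    second_moment = sum(x ** 2 * prob for x, prob in pmf.items())  # E[X^2] = p
    return mean, second_moment - mean ** 2                         # Var[X] = p(1-p)

mean, variance = bernoulli_mean_var(0.25)
print(mean, variance)  # prints 0.25 0.1875
```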

The moment generating function of a Bernoulli random variable $X$ is defined for any $t\in\mathbb{R}$:
$$M_X(t)=1-p+pe^{t}$$

Proof

Using the definition of moment generating function, we get
$$M_X(t)=\mathrm{E}\left[e^{tX}\right]=e^{t\cdot 1}p_X(1)+e^{t\cdot 0}p_X(0)=pe^{t}+(1-p)=1-p+pe^{t}$$
Obviously, the above expected value exists for any $t\in\mathbb{R}$.
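The closed form can be compared against the defining expectation computed term by term over the support; a sketch (both function names are ours):

```python
import math

def bernoulli_mgf(t, p):
    """Closed-form moment generating function: M_X(t) = 1 - p + p * e^t."""
    return 1 - p + p * math.exp(t)

def mgf_by_definition(t, p):
    """E[exp(t * X)] summed directly over the support {0, 1}."""
    return math.exp(t * 0) * (1 - p) + math.exp(t * 1) * p

# The two expressions agree for every real t:
for t in (-2.0, 0.0, 0.5, 3.0):
    assert abs(bernoulli_mgf(t, 0.25) - mgf_by_definition(t, 0.25)) < 1e-12
```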

The characteristic function of a Bernoulli random variable $X$ is
$$\varphi_X(t)=1-p+pe^{it}$$

Proof

Using the definition of characteristic function, we obtain
$$\varphi_X(t)=\mathrm{E}\left[e^{itX}\right]=e^{it\cdot 1}p_X(1)+e^{it\cdot 0}p_X(0)=1-p+pe^{it}$$

The distribution function of a Bernoulli random variable $X$ is
$$F_X(x)=\begin{cases}0 & \text{if } x<0\\ 1-p & \text{if } 0\le x<1\\ 1 & \text{if } x\ge 1\end{cases}$$

Proof

Remember the definition of distribution function:
$$F_X(x)=\mathrm{P}(X\le x)$$
and the fact that $X$ can take either value $0$ or value $1$. If $x<0$, then $F_X(x)=0$ because $X$ cannot take values strictly smaller than $0$. If $0\le x<1$, then $F_X(x)=1-p$ because $0$ is the only value smaller than or equal to $x$ that $X$ can take. Finally, if $x\ge 1$, then $F_X(x)=1$ because all values $X$ can take are smaller than or equal to $1$.
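The three cases translate directly into code; a minimal sketch (the name `bernoulli_cdf` is ours):

```python
def bernoulli_cdf(x, p):
    """Distribution function F_X(x) = P(X <= x) of a Bernoulli(p) variable."""
    if x < 0:
        return 0.0       # X cannot take values below 0
    if x < 1:
        return 1 - p     # only the value 0 is <= x
    return 1.0           # both 0 and 1 are <= x

print(bernoulli_cdf(-0.5, 0.25), bernoulli_cdf(0.5, 0.25), bernoulli_cdf(2, 0.25))
# prints 0.0 0.75 1.0
```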

A sum of independent Bernoulli random variables is a binomial random variable. This is discussed and proved in the lecture entitled Binomial distribution.

Below you can find some exercises with explained solutions.

Exercise 1

Let $X$ and $Y$ be two independent Bernoulli random variables with parameter $p$. Derive the probability mass function of their sum
$$Z=X+Y$$

Solution

The probability mass function of $X$ is
$$p_X(x)=\begin{cases}p & \text{if } x=1\\ 1-p & \text{if } x=0\\ 0 & \text{otherwise}\end{cases}$$
The probability mass function of $Y$ is
$$p_Y(y)=\begin{cases}p & \text{if } y=1\\ 1-p & \text{if } y=0\\ 0 & \text{otherwise}\end{cases}$$
The support of $Z$ (the set of values $Z$ can take) is
$$R_Z=\{0,1,2\}$$
The convolution formula for the probability mass function of a sum of two independent variables is
$$p_Z(z)=\sum_{x\in R_X}p_X(x)p_Y(z-x)$$
where $R_X$ is the support of $X$. When $z=0$, the formula gives
$$p_Z(0)=p_X(0)p_Y(0)=(1-p)^2$$
When $z=1$, the formula gives
$$p_Z(1)=p_X(0)p_Y(1)+p_X(1)p_Y(0)=(1-p)p+p(1-p)=2p(1-p)$$
When $z=2$, the formula gives
$$p_Z(2)=p_X(1)p_Y(1)=p^2$$
Therefore, the probability mass function of $Z$ is
$$p_Z(z)=\begin{cases}(1-p)^2 & \text{if } z=0\\ 2p(1-p) & \text{if } z=1\\ p^2 & \text{if } z=2\\ 0 & \text{otherwise}\end{cases}$$
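The same convolution can be carried out numerically for any pair of discrete pmfs; a sketch under our own naming, checked against the Bernoulli case:

```python
def convolve_pmf(pmf_x, pmf_y):
    """pmf of Z = X + Y for independent discrete X and Y (dicts: value -> probability)."""
    pmf_z = {}
    for x, px in pmf_x.items():
        for y, py in pmf_y.items():
            pmf_z[x + y] = pmf_z.get(x + y, 0.0) + px * py
    return pmf_z

p = 0.25
bern = {0: 1 - p, 1: p}
pmf_z = convolve_pmf(bern, bern)
print(pmf_z)  # {0: 0.5625, 1: 0.375, 2: 0.0625}, i.e. (1-p)^2, 2p(1-p), p^2
```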

Exercise 2

Let $X$ be a Bernoulli random variable with parameter $p$. Find its tenth moment.

Solution

The moment generating function of $X$ is
$$M_X(t)=1-p+pe^{t}$$
The tenth moment of $X$ is equal to the tenth derivative of its moment generating function, evaluated at $t=0$:
$$\mathrm{E}\left[X^{10}\right]=\left.\frac{d^{10}M_X(t)}{dt^{10}}\right|_{t=0}$$
But
$$\frac{d^{10}M_X(t)}{dt^{10}}=pe^{t}$$
so that
$$\mathrm{E}\left[X^{10}\right]=pe^{0}=p$$
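The result also follows from the fact that $X$ only takes values $0$ and $1$, so $X^k=X$ for every $k\ge 1$ and every such moment equals $p$; direct summation confirms this (the function name is ours):

```python
def bernoulli_moment(k, p):
    """k-th moment E[X^k] of a Bernoulli(p) variable, by direct summation over {0, 1}."""
    return (0 ** k) * (1 - p) + (1 ** k) * p

# Since X^k = X whenever k >= 1, every moment of order k >= 1 equals p:
print(bernoulli_moment(10, 0.25))  # prints 0.25
```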
