Suppose you perform an experiment with two possible outcomes: either success or failure. Success happens with probability $p$, while failure happens with probability $1-p$. A random variable that takes value $1$ in case of success and value $0$ in case of failure is called a Bernoulli random variable (alternatively, it is said to have a Bernoulli distribution).
Bernoulli random variables are characterized as follows. Let $X$ be a discrete random variable with support $R_X=\{0,1\}$ and let $p\in(0,1)$. We say that $X$ has a Bernoulli distribution with parameter $p$ if its probability mass function is
$$p_X(x)=\begin{cases}p & \text{if }x=1\\ 1-p & \text{if }x=0\\ 0 & \text{if }x\notin R_X\end{cases}$$
A random variable having a Bernoulli distribution is also called a Bernoulli random variable.
Note that, by the above definition, any indicator function is a Bernoulli random variable.
The following is a proof that $p_X(x)$ is a legitimate probability mass function. Non-negativity is obvious. We need to prove that the sum of $p_X(x)$ over its support equals $1$. This is proved as follows:
$$\sum_{x\in R_X}p_X(x)=p_X(1)+p_X(0)=p+(1-p)=1$$
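As a quick numerical sketch of the check above (in Python; the helper name `bernoulli_pmf` is ours, not from any library), the two-point pmf sums to one over the support:

```python
def bernoulli_pmf(x, p):
    """Probability mass function of a Bernoulli(p) random variable."""
    if x == 1:
        return p
    if x == 0:
        return 1 - p
    return 0.0  # zero outside the support {0, 1}

# Non-negativity holds for 0 <= p <= 1, and the pmf sums to 1 over {0, 1}:
p = 0.3
total = bernoulli_pmf(0, p) + bernoulli_pmf(1, p)
print(total)
```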
The expected value of a Bernoulli random variable $X$ is
$$\mathrm{E}[X]=p$$
It can be derived as follows:
$$\mathrm{E}[X]=\sum_{x\in R_X}x\,p_X(x)=1\cdot p+0\cdot(1-p)=p$$
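The same two-term sum can be written out directly (a minimal sketch; the function name is ours):

```python
def bernoulli_mean(p):
    # E[X] = 1 * p + 0 * (1 - p) = p
    return 1 * p + 0 * (1 - p)

print(bernoulli_mean(0.25))  # 0.25
```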
The variance of a Bernoulli random variable $X$ is
$$\mathrm{Var}[X]=p(1-p)$$
It can be derived thanks to the usual variance formula ($\mathrm{Var}[X]=\mathrm{E}[X^2]-\mathrm{E}[X]^2$):
$$\mathrm{E}[X^2]=1^2\cdot p+0^2\cdot(1-p)=p$$
$$\mathrm{Var}[X]=\mathrm{E}[X^2]-\mathrm{E}[X]^2=p-p^2=p(1-p)$$
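A short sketch of the same computation, step by step (helper name ours):

```python
def bernoulli_variance(p):
    # E[X^2] = 1^2 * p + 0^2 * (1 - p) = p
    second_moment = p
    # Var[X] = E[X^2] - E[X]^2 = p - p^2 = p * (1 - p)
    return second_moment - p ** 2

print(bernoulli_variance(0.5))  # 0.25
```

Note that the variance is largest at $p=1/2$, where the outcome is most uncertain.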
The moment generating function of a Bernoulli random variable $X$ is defined for any $t\in\mathbb{R}$:
$$M_X(t)=1-p+pe^{t}$$
Using the definition of moment generating function:
$$M_X(t)=\mathrm{E}[e^{tX}]=\sum_{x\in R_X}e^{tx}p_X(x)=e^{t\cdot 0}(1-p)+e^{t\cdot 1}p=1-p+pe^{t}$$
Obviously, the above expected value exists for any $t\in\mathbb{R}$.
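A numerical sketch (function name ours): the mgf evaluates to $1$ at $t=0$, and its derivative at $t=0$ recovers the mean $p$, which we can check with a finite difference:

```python
import math

def bernoulli_mgf(t, p):
    # M(t) = E[exp(t X)] = (1 - p) + p * exp(t), finite for every real t
    return (1 - p) + p * math.exp(t)

p = 0.4
print(bernoulli_mgf(0.0, p))  # 1.0

# Central finite difference approximating M'(0) = p:
h = 1e-6
derivative = (bernoulli_mgf(h, p) - bernoulli_mgf(-h, p)) / (2 * h)
print(round(derivative, 6))  # 0.4
```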
The characteristic function of a Bernoulli random variable $X$ is
$$\varphi_X(t)=1-p+pe^{it}$$
Using the definition of characteristic function:
$$\varphi_X(t)=\mathrm{E}[e^{itX}]=\sum_{x\in R_X}e^{itx}p_X(x)=e^{it\cdot 0}(1-p)+e^{it\cdot 1}p=1-p+pe^{it}$$
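The characteristic function is complex-valued; a minimal sketch using Python's `cmath` (function name ours) checks the standard properties $\varphi_X(0)=1$ and $|\varphi_X(t)|\le 1$:

```python
import cmath

def bernoulli_cf(t, p):
    # phi(t) = E[exp(i t X)] = (1 - p) + p * exp(i t)
    return (1 - p) + p * cmath.exp(1j * t)

p = 0.5
print(bernoulli_cf(0.0, p))  # (1+0j)
print(abs(bernoulli_cf(2.0, p)) <= 1.0)  # True
```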
The distribution function of a Bernoulli random variable $X$ is
$$F_X(x)=\begin{cases}0 & \text{if }x<0\\ 1-p & \text{if }0\le x<1\\ 1 & \text{if }x\ge 1\end{cases}$$
Remember the definition of distribution function:
$$F_X(x)=\mathrm{P}(X\le x)$$
and the fact that $X$ can take either value $0$ or value $1$. If $x<0$, then $F_X(x)=0$, because $X$ cannot take values strictly smaller than $0$. If $0\le x<1$, then $F_X(x)=1-p$, because $0$ is the only value smaller than or equal to $x$ that $X$ can take. Finally, if $x\ge 1$, then $F_X(x)=1$, because all values $X$ can take are smaller than or equal to $1$.
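The three-piece step function translates directly into code (a sketch; the function name is ours):

```python
def bernoulli_cdf(x, p):
    # F(x) = P(X <= x): a step function with jumps at 0 and 1
    if x < 0:
        return 0.0    # X never takes values below 0
    if x < 1:
        return 1 - p  # only the value 0 is <= x
    return 1.0        # both 0 and 1 are <= x

p = 0.5
print(bernoulli_cdf(-0.5, p), bernoulli_cdf(0.5, p), bernoulli_cdf(2.0, p))  # 0.0 0.5 1.0
```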
A sum of mutually independent Bernoulli random variables, all with the same parameter $p$, is a binomial random variable. This is discussed and proved in the lecture entitled Binomial distribution.
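This connection can be illustrated with a small simulation (parameter values are ours): summing $n$ independent Bernoulli($p$) draws and checking that the sample mean of the sums is close to the binomial mean $np$:

```python
import random

random.seed(0)
p, n, trials = 0.4, 5, 20_000

# Each entry of `sums` is a sum of n independent Bernoulli(p) draws,
# which has a Binomial(n, p) distribution.
sums = [sum(1 if random.random() < p else 0 for _ in range(n))
        for _ in range(trials)]

sample_mean = sum(sums) / trials
print(sample_mean)  # close to n * p = 2.0
```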