This lecture explains how to derive the maximum likelihood estimator (MLE) of the parameter of a Poisson distribution. Before reading this lecture, you might want to revise the lectures about maximum likelihood estimation and about the Poisson distribution.

We assume to observe $n$ independent draws from a Poisson distribution. In more formal terms, we observe the first $n$ terms of an IID sequence of Poisson random variables. Thus, the probability mass function of a generic term of the sequence is
$$p_X(x_j;\lambda_0)=\frac{1}{x_j!}\,\lambda_0^{x_j}\exp(-\lambda_0)\quad\text{for }x_j\in R_X$$
where $R_X$ is the support of the distribution and $\lambda_0$ is the parameter of interest (for which we want to derive the MLE). Remember that the support of the Poisson distribution is the set of non-negative integer numbers:
$$R_X=\{0,1,2,\ldots\}$$
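As a quick numerical sanity check, the probability mass function above can be evaluated directly. The helper below is a sketch (the function name is mine, not part of the lecture) that truncates the infinite support at 100 terms, which is more than enough for small $\lambda_0$:

```python
import math

def poisson_pmf(x: int, lam: float) -> float:
    """Probability mass function of a Poisson(lam) random variable at x."""
    if x < 0:
        return 0.0  # outside the support {0, 1, 2, ...}
    return (lam ** x) * math.exp(-lam) / math.factorial(x)

# The probabilities over the support sum to 1 (numerically, over a truncation).
total = sum(poisson_pmf(x, 3.0) for x in range(100))
```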

To keep things simple, we do not verify the regularity conditions needed for the consistency and asymptotic normality of the maximum likelihood estimator of $\lambda_0$; we simply assume that they are satisfied.

The likelihood function is
$$L(\lambda;x_1,\ldots,x_n)=\exp(-n\lambda)\,\frac{\lambda^{\sum_{j=1}^{n}x_j}}{\prod_{j=1}^{n}x_j!}$$

Proof

The $n$ observations are independent. As a consequence, the likelihood function is equal to the product of their probability mass functions:
$$L(\lambda;x_1,\ldots,x_n)=\prod_{j=1}^{n}p_X(x_j;\lambda)$$
Furthermore, the observed values necessarily belong to the support $R_X$. So, we have
$$L(\lambda;x_1,\ldots,x_n)=\prod_{j=1}^{n}\frac{1}{x_j!}\,\lambda^{x_j}\exp(-\lambda)=\exp(-n\lambda)\,\frac{\lambda^{\sum_{j=1}^{n}x_j}}{\prod_{j=1}^{n}x_j!}$$

The log-likelihood function is
$$\ell(\lambda;x_1,\ldots,x_n)=-n\lambda+\ln(\lambda)\sum_{j=1}^{n}x_j-\sum_{j=1}^{n}\ln(x_j!)$$

Proof

By taking the natural logarithm of the likelihood function derived above, we get the log-likelihood:
$$\ell(\lambda;x_1,\ldots,x_n)=\ln L(\lambda;x_1,\ldots,x_n)=-n\lambda+\ln(\lambda)\sum_{j=1}^{n}x_j-\ln\Big(\prod_{j=1}^{n}x_j!\Big)=-n\lambda+\ln(\lambda)\sum_{j=1}^{n}x_j-\sum_{j=1}^{n}\ln(x_j!)$$
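The equality between the log-likelihood and the logarithm of the likelihood can be verified numerically. The sketch below (function names and the sample are mine) uses `math.lgamma(x + 1)` for $\ln(x!)$, which avoids overflow for large observations:

```python
import math

def log_likelihood(lam, xs):
    """Poisson log-likelihood: -n*lam + ln(lam)*sum(xs) - sum(ln(x!))."""
    n = len(xs)
    return -n * lam + math.log(lam) * sum(xs) - sum(math.lgamma(x + 1) for x in xs)

def likelihood(lam, xs):
    """Product of the individual Poisson probability mass functions."""
    out = 1.0
    for x in xs:
        out *= (lam ** x) * math.exp(-lam) / math.factorial(x)
    return out

sample = [2, 0, 3, 1, 1]
ll = log_likelihood(2.0, sample)
```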

The maximum likelihood estimator of $\lambda_0$ is
$$\widehat{\lambda}=\frac{1}{n}\sum_{j=1}^{n}x_j$$

Proof

The MLE is the solution of the following maximization problem:
$$\widehat{\lambda}=\arg\max_{\lambda}\ \ell(\lambda;x_1,\ldots,x_n)$$
The first-order condition for a maximum is
$$\frac{\partial}{\partial\lambda}\ell(\lambda;x_1,\ldots,x_n)=0$$
The first derivative of the log-likelihood with respect to the parameter is
$$\frac{\partial}{\partial\lambda}\ell(\lambda;x_1,\ldots,x_n)=-n+\frac{1}{\lambda}\sum_{j=1}^{n}x_j$$
By imposing that the first derivative be equal to zero, we get
$$\widehat{\lambda}=\frac{1}{n}\sum_{j=1}^{n}x_j$$

Therefore, the estimator $\widehat{\lambda}$ is just the sample mean of the $n$ observations in the sample. This makes intuitive sense because the expected value of a Poisson random variable is equal to its parameter $\lambda_0$, and the sample mean is an unbiased estimator of the expected value.
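The claim that the sample mean maximizes the log-likelihood can be checked numerically. The grid search below is a sketch (the sample and grid are my choices, not part of the lecture); the maximizer found on the grid coincides with the sample mean:

```python
import math

def log_likelihood(lam, xs):
    """Poisson log-likelihood of the sample xs at parameter value lam."""
    n = len(xs)
    return -n * lam + math.log(lam) * sum(xs) - sum(math.lgamma(x + 1) for x in xs)

sample = [4, 2, 7, 3, 4, 5, 3]
mle = sum(sample) / len(sample)  # closed-form MLE: the sample mean (28 / 7 = 4.0)

# Grid search over candidate values of lambda; the log-likelihood is strictly
# concave, so the best grid point should be the one closest to the sample mean.
grid = [0.01 * k for k in range(1, 1001)]  # 0.01, 0.02, ..., 10.00
best = max(grid, key=lambda lam: log_likelihood(lam, sample))
```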

The estimator $\widehat{\lambda}$ is asymptotically normal with asymptotic mean equal to $\lambda_0$ and asymptotic variance equal to
$$V=\lambda_0$$

Proof

The score is
$$s(\lambda;x_j)=\frac{\partial}{\partial\lambda}\ln p_X(x_j;\lambda)=-1+\frac{x_j}{\lambda}$$
The Hessian is
$$H(\lambda;x_j)=\frac{\partial^2}{\partial\lambda^2}\ln p_X(x_j;\lambda)=-\frac{x_j}{\lambda^{2}}$$
The information equality implies that
$$I(\lambda_0)=-\mathbb{E}\left[H(\lambda_0;X_j)\right]=\frac{\mathbb{E}\left[X_j\right]}{\lambda_0^{2}}=\frac{\lambda_0}{\lambda_0^{2}}=\frac{1}{\lambda_0}$$
where we have used the fact that the expected value of a Poisson random variable with parameter $\lambda_0$ is equal to $\lambda_0$. Finally, the asymptotic variance is
$$V=\frac{1}{I(\lambda_0)}=\lambda_0$$
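The information equality invoked in this proof can itself be checked numerically for a fixed parameter value: the expected squared score and the negative expected Hessian should both equal $1/\lambda_0$. The sketch below (variable names are mine) sums over a truncation of the support:

```python
import math

lam0 = 3.0

def pmf(x, lam):
    """Poisson(lam) probability mass function at x."""
    return (lam ** x) * math.exp(-lam) / math.factorial(x)

# E[score^2], with score(x) = -1 + x/lam0; should equal 1/lam0.
e_score_sq = sum(((-1 + x / lam0) ** 2) * pmf(x, lam0) for x in range(100))

# -E[Hessian], with Hessian(x) = -x/lam0^2; should also equal 1/lam0.
e_neg_hess = sum((x / lam0 ** 2) * pmf(x, lam0) for x in range(100))
```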

Thus, the distribution of the maximum likelihood estimator $\widehat{\lambda}$ can be approximated by a normal distribution with mean $\lambda_0$ and variance $\lambda_0/n$.
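This normal approximation can be illustrated by simulation: draw many Poisson samples, compute the MLE (the sample mean) for each, and compare the empirical mean and variance of the estimates with $\lambda_0$ and $\lambda_0/n$. The sketch below uses Knuth's inversion method to generate Poisson variates; the parameter, sample size, and number of replications are my choices:

```python
import math
import random
import statistics

random.seed(42)

def draw_poisson(lam):
    """Draw one Poisson(lam) variate (Knuth's multiplication method)."""
    threshold = math.exp(-lam)
    k, p = 0, 1.0
    while True:
        p *= random.random()
        if p <= threshold:
            return k
        k += 1

lam0, n, reps = 3.0, 200, 2000
mles = [statistics.mean(draw_poisson(lam0) for _ in range(n)) for _ in range(reps)]

mean_of_mles = statistics.mean(mles)      # should be close to lam0 = 3.0
var_of_mles = statistics.pvariance(mles)  # should be close to lam0 / n = 0.015
```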
