In this lecture, we derive the maximum likelihood estimator of the parameter of an exponential distribution. The theory needed to understand this lecture is explained in the lecture entitled Maximum likelihood.
We observe the first $n$ terms of an IID sequence $X_1, \dots, X_n$ of random variables having an exponential distribution. A generic term $X_j$ of the sequence has probability density function
$$f_X(x_j; \lambda) = \begin{cases} \lambda \exp(-\lambda x_j) & \text{if } x_j \in R_X \\ 0 & \text{otherwise} \end{cases}$$
where $R_X = [0, \infty)$ is the support of the distribution and the rate parameter $\lambda$ is the parameter that needs to be estimated.
We assume that the regularity conditions needed for the consistency and asymptotic normality of maximum likelihood estimators are satisfied.
The likelihood function is
$$L(\lambda; x_1, \dots, x_n) = \lambda^n \exp\left(-\lambda \sum_{j=1}^{n} x_j\right)$$
Since the terms of the sequence are independent, the likelihood function is equal to the product of their densities:
$$L(\lambda; x_1, \dots, x_n) = \prod_{j=1}^{n} f_X(x_j; \lambda)$$
Because the observed values $x_1, \dots, x_n$ can only belong to the support of the distribution, we can write
$$L(\lambda; x_1, \dots, x_n) = \prod_{j=1}^{n} \lambda \exp(-\lambda x_j) = \lambda^n \exp\left(-\lambda \sum_{j=1}^{n} x_j\right)$$
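As a quick numerical sanity check of this factorization, the following sketch (with a small hypothetical sample and an arbitrary rate, both chosen only for illustration) verifies that the product of the individual densities equals the compact form $\lambda^n \exp(-\lambda \sum_j x_j)$:

```python
import numpy as np

# Hypothetical observations and rate parameter, for illustration only.
x = np.array([0.5, 1.2, 0.3, 2.1, 0.8])
lam = 1.5
n = len(x)

# Likelihood as the product of the individual densities lambda * exp(-lambda * x_j).
product_form = np.prod(lam * np.exp(-lam * x))

# Compact form: lambda^n * exp(-lambda * sum of the observations).
compact_form = lam**n * np.exp(-lam * x.sum())

print(np.isclose(product_form, compact_form))  # True
```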
The log-likelihood function is
$$\ell(\lambda; x_1, \dots, x_n) = n \ln(\lambda) - \lambda \sum_{j=1}^{n} x_j$$
This is obtained by taking the natural logarithm of the likelihood function:
$$\ell(\lambda; x_1, \dots, x_n) = \ln L(\lambda; x_1, \dots, x_n) = \ln\left[\lambda^n \exp\left(-\lambda \sum_{j=1}^{n} x_j\right)\right] = n \ln(\lambda) - \lambda \sum_{j=1}^{n} x_j$$
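The same check can be repeated on the log scale: summing the logarithms of the individual densities should agree with the closed form $n \ln(\lambda) - \lambda \sum_j x_j$. The sample and rate below are hypothetical, used only to exercise the identity:

```python
import numpy as np

# Hypothetical observations and rate parameter, for illustration only.
x = np.array([0.5, 1.2, 0.3, 2.1, 0.8])
lam = 1.5
n = len(x)

# Log-likelihood computed term by term: sum of log(lambda * exp(-lambda * x_j)).
term_by_term = np.sum(np.log(lam * np.exp(-lam * x)))

# Closed form: n * ln(lambda) - lambda * sum of the observations.
closed_form = n * np.log(lam) - lam * x.sum()

print(np.isclose(term_by_term, closed_form))  # True
```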
The maximum likelihood estimator of $\lambda$ is
$$\widehat{\lambda} = \frac{n}{\sum_{j=1}^{n} x_j}$$
The estimator is obtained as a solution of the maximization problem
$$\widehat{\lambda} = \arg\max_{\lambda} \, \ell(\lambda; x_1, \dots, x_n)$$
The first order condition for a maximum is
$$\frac{d}{d\lambda} \ell(\lambda; x_1, \dots, x_n) = 0$$
The derivative of the log-likelihood is
$$\frac{d}{d\lambda} \ell(\lambda; x_1, \dots, x_n) = \frac{n}{\lambda} - \sum_{j=1}^{n} x_j$$
By setting it equal to zero, we obtain
$$\widehat{\lambda} = \frac{n}{\sum_{j=1}^{n} x_j}$$
Note that the division by $\sum_{j=1}^{n} x_j$ is legitimate because exponentially distributed random variables can take on only positive values (and strictly so with probability 1).
Therefore, the estimator is just the reciprocal of the sample mean:
$$\widehat{\lambda} = \frac{1}{\bar{x}_n} \quad \text{where} \quad \bar{x}_n = \frac{1}{n} \sum_{j=1}^{n} x_j$$
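The estimator is easy to verify numerically. In the sketch below, an exponential sample is drawn with an assumed true rate (a value chosen only for this simulation), and the MLE is computed as the reciprocal of the sample mean; with a large sample it should land close to the true rate:

```python
import numpy as np

rng = np.random.default_rng(0)
true_lam = 2.0  # assumed true rate, for illustration only

# numpy parameterizes the exponential by its scale, which is 1 / rate.
x = rng.exponential(scale=1.0 / true_lam, size=100_000)

# Maximum likelihood estimator: reciprocal of the sample mean.
lam_hat = 1.0 / x.mean()

print(lam_hat)  # close to 2.0
```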
The estimator $\widehat{\lambda}$ is asymptotically normal with asymptotic mean equal to the true parameter $\lambda_0$ and asymptotic variance equal to
$$V = \lambda_0^2$$
In other words,
$$\sqrt{n}\left(\widehat{\lambda} - \lambda_0\right) \xrightarrow{d} N\left(0, \lambda_0^2\right)$$
The score is
$$s(\lambda; x_j) = \frac{d}{d\lambda} \ln f_X(x_j; \lambda) = \frac{1}{\lambda} - x_j$$
The Hessian is
$$H(\lambda; x_j) = \frac{d}{d\lambda} s(\lambda; x_j) = -\frac{1}{\lambda^2}$$
By the information equality, we have that
$$I(\lambda_0) = -E\left[H(\lambda_0; X_j)\right] = \frac{1}{\lambda_0^2}$$
Finally, the asymptotic variance is
$$V = \left[I(\lambda_0)\right]^{-1} = \lambda_0^2$$
This means that the distribution of the maximum likelihood estimator $\widehat{\lambda}$ can be approximated by a normal distribution with mean $\lambda_0$ and variance $\lambda_0^2 / n$.
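The normal approximation can be checked by Monte Carlo simulation. The sketch below (with an assumed true rate, sample size, and number of replications, all chosen only for the experiment) computes the MLE on many independent samples and compares the empirical mean and variance of the estimates with $\lambda_0$ and $\lambda_0^2 / n$:

```python
import numpy as np

rng = np.random.default_rng(0)
true_lam, n, reps = 2.0, 500, 2000  # assumed simulation settings

# Draw `reps` independent samples of size n and compute the MLE on each.
samples = rng.exponential(scale=1.0 / true_lam, size=(reps, n))
estimates = 1.0 / samples.mean(axis=1)

# The estimates should be centered near lambda_0 with variance near lambda_0^2 / n.
theoretical_var = true_lam**2 / n
print(estimates.mean(), estimates.var(), theoretical_var)
```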