This lecture deals with maximum likelihood estimation of the parameters of the normal distribution. Before reading this lecture, you might want to revise the lecture entitled Maximum likelihood, which presents the basics of maximum likelihood estimation.
We observe a sample of $n$ independent draws $x_1, \dots, x_n$ from a normal distribution. The mean $\mu$ and the variance $\sigma^2$ are the two parameters that need to be estimated.
The likelihood function is
$$L(\mu, \sigma^2 ; x_1, \dots, x_n) = \prod_{j=1}^{n} f_X(x_j ; \mu, \sigma^2)$$
Given the assumption that the observations from the sample are IID, the likelihood function can be written as
$$L(\mu, \sigma^2 ; x_1, \dots, x_n) = \prod_{j=1}^{n} \left(2\pi\sigma^2\right)^{-1/2} \exp\left(-\frac{(x_j - \mu)^2}{2\sigma^2}\right) = \left(2\pi\sigma^2\right)^{-n/2} \exp\left(-\frac{1}{2\sigma^2}\sum_{j=1}^{n}(x_j - \mu)^2\right)$$
The log-likelihood function is
$$\ell(\mu, \sigma^2 ; x_1, \dots, x_n) = \ln L(\mu, \sigma^2 ; x_1, \dots, x_n)$$
By taking the natural logarithm of the likelihood function, we get
$$\ell(\mu, \sigma^2 ; x_1, \dots, x_n) = -\frac{n}{2}\ln(2\pi) - \frac{n}{2}\ln(\sigma^2) - \frac{1}{2\sigma^2}\sum_{j=1}^{n}(x_j - \mu)^2$$
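As a quick numerical sanity check, a short Python sketch (the sample values are made up for illustration) can confirm that the log-likelihood formula above agrees with the sum of the logged densities:

```python
import math

def normal_log_likelihood(x, mu, sigma2):
    """Log-likelihood of an IID normal sample, using the closed form:
    -n/2*ln(2*pi) - n/2*ln(sigma2) - sum((x_j - mu)^2) / (2*sigma2)."""
    n = len(x)
    ss = sum((xj - mu) ** 2 for xj in x)
    return (-0.5 * n * math.log(2 * math.pi)
            - 0.5 * n * math.log(sigma2)
            - ss / (2 * sigma2))

# Hypothetical sample; compare against summing the log of each density directly.
sample = [1.2, -0.5, 0.3, 2.1]
direct = sum(math.log((2 * math.pi * 1.0) ** -0.5
                      * math.exp(-(xj - 0.0) ** 2 / 2.0))
             for xj in sample)
print(abs(normal_log_likelihood(sample, 0.0, 1.0) - direct) < 1e-9)
```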
The maximum likelihood estimators of the mean and the variance are
$$\widehat{\mu} = \frac{1}{n}\sum_{j=1}^{n} x_j, \qquad \widehat{\sigma}^2 = \frac{1}{n}\sum_{j=1}^{n}\left(x_j - \widehat{\mu}\right)^2$$
We need to solve the following maximization problem
$$\max_{\mu, \sigma^2} \ell(\mu, \sigma^2 ; x_1, \dots, x_n)$$
The first order conditions for a maximum are
$$\frac{\partial \ell}{\partial \mu} = 0, \qquad \frac{\partial \ell}{\partial \sigma^2} = 0$$
The partial derivative of the log-likelihood with respect to the mean is
$$\frac{\partial \ell}{\partial \mu} = \frac{1}{\sigma^2}\sum_{j=1}^{n}\left(x_j - \mu\right)$$
which is equal to zero only if
$$\sum_{j=1}^{n}\left(x_j - \mu\right) = 0$$
Therefore, the first of the two first-order conditions implies
$$\widehat{\mu} = \frac{1}{n}\sum_{j=1}^{n} x_j$$
The partial derivative of the log-likelihood with respect to the variance is
$$\frac{\partial \ell}{\partial \sigma^2} = -\frac{n}{2\sigma^2} + \frac{1}{2\sigma^4}\sum_{j=1}^{n}\left(x_j - \mu\right)^2$$
which, if we rule out $\sigma^2 = 0$, is equal to zero only if
$$\sigma^2 = \frac{1}{n}\sum_{j=1}^{n}\left(x_j - \mu\right)^2$$
Thus, the system of first order conditions is solved by
$$\widehat{\mu} = \frac{1}{n}\sum_{j=1}^{n} x_j, \qquad \widehat{\sigma}^2 = \frac{1}{n}\sum_{j=1}^{n}\left(x_j - \widehat{\mu}\right)^2$$
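The closed-form solutions above can be verified numerically: at $(\widehat{\mu}, \widehat{\sigma}^2)$ both partial derivatives of the log-likelihood should vanish. A short sketch in Python (with a made-up sample):

```python
# Hypothetical sample; mu_hat and sigma2_hat implement the closed-form
# maximum likelihood estimators derived above.
sample = [1.2, -0.5, 0.3, 2.1, 0.8]
n = len(sample)

mu_hat = sum(sample) / n                                 # sample mean
sigma2_hat = sum((x - mu_hat) ** 2 for x in sample) / n  # unadjusted sample variance

# Both first-order conditions should hold (up to floating-point error)
# when evaluated at the estimates:
d_mu = sum(x - mu_hat for x in sample) / sigma2_hat
d_sigma2 = (-n / (2 * sigma2_hat)
            + sum((x - mu_hat) ** 2 for x in sample) / (2 * sigma2_hat ** 2))
print(abs(d_mu) < 1e-9, abs(d_sigma2) < 1e-9)
```

Note that $\widehat{\sigma}^2$ divides by $n$, not $n-1$: the maximum likelihood estimator of the variance is the unadjusted sample variance.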
The vector
$$\begin{bmatrix} \widehat{\mu} \\ \widehat{\sigma}^2 \end{bmatrix}$$
is asymptotically normal with asymptotic mean equal to
$$\begin{bmatrix} \mu \\ \sigma^2 \end{bmatrix}$$
and asymptotic covariance matrix equal to
$$V = \begin{bmatrix} \sigma^2 & 0 \\ 0 & 2\sigma^4 \end{bmatrix}$$
The first entry of the score vector is
$$\frac{\partial \ln f_X(x ; \mu, \sigma^2)}{\partial \mu} = \frac{x - \mu}{\sigma^2}$$
The second entry of the score vector is
$$\frac{\partial \ln f_X(x ; \mu, \sigma^2)}{\partial \sigma^2} = -\frac{1}{2\sigma^2} + \frac{(x - \mu)^2}{2\sigma^4}$$
In order to compute the Hessian we need to compute all second order partial derivatives. We have
$$\frac{\partial^2 \ln f_X(x ; \mu, \sigma^2)}{\partial \mu^2} = -\frac{1}{\sigma^2}$$
and
$$\frac{\partial^2 \ln f_X(x ; \mu, \sigma^2)}{\partial (\sigma^2)^2} = \frac{1}{2\sigma^4} - \frac{(x - \mu)^2}{\sigma^6}$$
Finally,
$$\frac{\partial^2 \ln f_X(x ; \mu, \sigma^2)}{\partial \mu \, \partial \sigma^2} = -\frac{x - \mu}{\sigma^4}$$
which, as you might want to check, is also equal to the other cross-partial derivative $\frac{\partial^2 \ln f_X}{\partial \sigma^2 \, \partial \mu}$. Therefore, the Hessian is
$$H = \begin{bmatrix} -\dfrac{1}{\sigma^2} & -\dfrac{x - \mu}{\sigma^4} \\ -\dfrac{x - \mu}{\sigma^4} & \dfrac{1}{2\sigma^4} - \dfrac{(x - \mu)^2}{\sigma^6} \end{bmatrix}$$
By the information equality, we have that (using $\mathbb{E}[x - \mu] = 0$ and $\mathbb{E}[(x - \mu)^2] = \sigma^2$)
$$I(\mu, \sigma^2) = -\mathbb{E}[H] = \begin{bmatrix} \dfrac{1}{\sigma^2} & 0 \\ 0 & \dfrac{1}{2\sigma^4} \end{bmatrix}$$
As a consequence, the asymptotic covariance matrix is
$$V = I(\mu, \sigma^2)^{-1} = \begin{bmatrix} \sigma^2 & 0 \\ 0 & 2\sigma^4 \end{bmatrix}$$
In other words, the distribution of the vector $\begin{bmatrix} \widehat{\mu} & \widehat{\sigma}^2 \end{bmatrix}^{\top}$ can be approximated, for large $n$, by a multivariate normal distribution with mean $\begin{bmatrix} \mu & \sigma^2 \end{bmatrix}^{\top}$ and covariance matrix
$$\frac{1}{n}\begin{bmatrix} \sigma^2 & 0 \\ 0 & 2\sigma^4 \end{bmatrix}$$
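The asymptotic approximation can be illustrated with a small Monte Carlo experiment, sketched in Python (the true parameters, sample size, and number of replications are arbitrary choices): the empirical variances of $\widehat{\mu}$ and $\widehat{\sigma}^2$ across replications should be close to $\sigma^2/n$ and $2\sigma^4/n$.

```python
import random

random.seed(0)

mu, sigma2 = 0.0, 1.0   # true parameters (hypothetical choice)
n, reps = 50, 20000     # sample size and number of Monte Carlo replications

mu_hats, s2_hats = [], []
for _ in range(reps):
    x = [random.gauss(mu, sigma2 ** 0.5) for _ in range(n)]
    m = sum(x) / n                                  # ML estimate of the mean
    mu_hats.append(m)
    s2_hats.append(sum((xj - m) ** 2 for xj in x) / n)  # ML estimate of the variance

def var(v):
    """Unadjusted sample variance of a list of replications."""
    mean = sum(v) / len(v)
    return sum((vi - mean) ** 2 for vi in v) / len(v)

# Asymptotic theory predicts var(mu_hat) ~ sigma2/n and var(sigma2_hat) ~ 2*sigma2^2/n
print(var(mu_hats), sigma2 / n)            # the two numbers should be close
print(var(s2_hats), 2 * sigma2 ** 2 / n)   # likewise
```

With $n = 50$ the approximation is already quite good; the agreement improves as $n$ grows.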