
Maximum likelihood - Hypothesis testing

by Marco Taboga, PhD

This lecture discusses how to perform tests of hypotheses about parameters that have been estimated by maximum likelihood.


Testing framework

We are going to assume that an unknown parameter $\theta_0$ has been estimated by maximum likelihood methods, that the parameter belongs to a parameter space $\Theta \subseteq \mathbb{R}^{p}$, and that we want to test the null hypothesis $$H_{0}: \theta_0 \in \Theta_{R},$$ where $\Theta_{R}$ is a proper subset of $\Theta$, i.e., $$\Theta_{R} \subset \Theta.$$

Three popular tests

There are three popular methods to carry out tests of this kind of restriction:

- the Wald test, which is based on the unrestricted estimate of the parameter;

- the score test (also called Lagrange multiplier test), which is based on the restricted estimate of the parameter;

- the likelihood ratio test, which is based on both the unrestricted and the restricted estimates.

Thus, the parameter estimate used to carry out the test is the main respect in which these tests differ from one another: an unrestricted estimate for the Wald test, a restricted estimate for the score test, and both estimates for the likelihood ratio test.

The following sections will cover the main aspects of these tests and will refer the reader to other lectures that contain more detailed explanations.

Assumptions

In the remainder of this lecture it will be assumed that the sample $\xi_n$ consists of $n$ observations from an IID sequence and that the log-likelihood function satisfies all the conditions used in previous lectures (see Maximum likelihood) to derive the asymptotic distribution of the maximum likelihood estimator.

It will also be assumed that the restriction being tested can be written as $$g(\theta_0) = 0,$$ where $g: \Theta \rightarrow \mathbb{R}^{r}$ is a vector-valued function, $r \leq p$, all the entries of $g$ are continuously differentiable with respect to their arguments, and the Jacobian $\nabla_{\theta} g(\theta)$, i.e., the $r \times p$ matrix of partial derivatives of the entries of $g$ with respect to the entries of $\theta$, has rank $r$.

Examples of restrictions being tested

As the above definition of a restriction may seem a little bit abstract, we provide some examples below.

Example Let the parameter space be the set of all $2$-dimensional vectors, i.e., $\Theta = \mathbb{R}^{2}$. Suppose we want to test the restriction $\theta_{0,2} = 1$, where $\theta_{0,2}$ denotes the second component of $\theta_0$. Then the function $g$ is a function $g: \mathbb{R}^{2} \rightarrow \mathbb{R}$ defined by $$g(\theta) = \theta_{2} - 1.$$ In this case, $r = 1$. The Jacobian of $g$ is $$\nabla_{\theta} g(\theta) = \begin{bmatrix} 0 & 1 \end{bmatrix},$$ which obviously has rank $r = 1$.

Example Let the parameter space be the set of all $3$-dimensional vectors, i.e., $\Theta = \mathbb{R}^{3}$. Suppose we want to test the two restrictions $$\theta_{0,1} = \theta_{0,2} \quad \text{and} \quad \theta_{0,3} = 0,$$ where $\theta_{0,j}$ denotes the $j$-th component of $\theta_0$. Then the function $g$ is a function $g: \mathbb{R}^{3} \rightarrow \mathbb{R}^{2}$ defined by $$g(\theta) = \begin{bmatrix} \theta_{1} - \theta_{2} \\ \theta_{3} \end{bmatrix}.$$ In this case, $r = 2$. The Jacobian of $g$ is $$\nabla_{\theta} g(\theta) = \begin{bmatrix} 1 & -1 & 0 \\ 0 & 0 & 1 \end{bmatrix},$$ which has rank $r = 2$ because its two rows are linearly independent.
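As a quick sanity check, the following Python sketch encodes the two example restriction functions and verifies the rank condition on their Jacobians numerically. The specific forms of $g$ are the ones reconstructed in the examples above, so this is an illustration under those assumptions rather than code from the lecture.

```python
import numpy as np

# Example 1: Theta = R^2, restriction theta_{0,2} = 1, so g(theta) = theta_2 - 1.
def g1(theta):
    return np.array([theta[1] - 1.0])

J1 = np.array([[0.0, 1.0]])           # Jacobian of g1 (constant 1x2 matrix)
print(np.linalg.matrix_rank(J1))      # 1, so the rank condition holds with r = 1

# Example 2: Theta = R^3, restrictions theta_{0,1} = theta_{0,2} and theta_{0,3} = 0.
def g2(theta):
    return np.array([theta[0] - theta[1], theta[2]])

J2 = np.array([[1.0, -1.0, 0.0],
               [0.0,  0.0, 1.0]])     # Jacobian of g2 (constant 2x3 matrix)
print(np.linalg.matrix_rank(J2))      # 2, so the rank condition holds with r = 2

# At parameter values satisfying the restrictions, g evaluates to zero:
print(g1(np.array([0.5, 1.0])), g2(np.array([2.0, 2.0, 0.0])))
```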

Wald test

Let $\widehat{\theta}_n$ be the estimate of a $p \times 1$ parameter $\theta_0$, obtained by maximizing the log-likelihood over the whole parameter space $\Theta$: $$\widehat{\theta}_n = \arg\max_{\theta \in \Theta} \ell(\theta; \xi_n).$$ The Wald test is based on the following test statistic: $$W_n = n \, g(\widehat{\theta}_n)^{\top} \left[ \nabla_{\theta} g(\widehat{\theta}_n) \, \widehat{V}_n \, \nabla_{\theta} g(\widehat{\theta}_n)^{\top} \right]^{-1} g(\widehat{\theta}_n),$$ where $n$ is the sample size and $\widehat{V}_n$ is a consistent estimate of the asymptotic covariance matrix of $\widehat{\theta}_n$ (see the lecture entitled Maximum likelihood - Covariance matrix estimation).

Under the null hypothesis, that is, under the hypothesis that $g(\theta_0) = 0$, the Wald statistic $W_n$ converges in distribution to a Chi-square distribution with $r$ degrees of freedom.

The test is performed by fixing a critical value $z$ and by rejecting the null hypothesis if $$W_n > z.$$

The size of the test can be approximated by its asymptotic value $$P(W_n > z) \approx 1 - F(z),$$ where $F(z)$ is the distribution function of a Chi-square random variable with $r$ degrees of freedom. We can choose $z$ so as to achieve a pre-determined size $\alpha$, as follows: $$z = F^{-1}(1 - \alpha).$$
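To make the procedure concrete, here is a minimal Python sketch of the whole Wald recipe for a simple case that is not taken from the lecture: an IID exponential sample with rate parameter $\lambda$, testing the single restriction $\lambda_0 = 1$, so that $g(\lambda) = \lambda - 1$ and $r = 1$. Under this assumed model, the MLE is $\widehat{\lambda}_n = 1/\bar{x}$ and $\widehat{V}_n = \widehat{\lambda}_n^{2}$ estimates the asymptotic variance.

```python
import numpy as np
from scipy.stats import chi2

rng = np.random.default_rng(0)
x = rng.exponential(scale=1.0, size=500)   # simulated sample; true rate is 1, so H0 holds
n = x.size

lam_hat = 1.0 / x.mean()        # unrestricted MLE of the exponential rate
V_hat = lam_hat ** 2            # estimate of the asymptotic variance of sqrt(n)*(lam_hat - lambda_0)

g = lam_hat - 1.0               # restriction g(lambda) = lambda - 1, so r = 1
J = 1.0                         # Jacobian of g (a scalar here)
W = n * g * (1.0 / (J * V_hat * J)) * g    # Wald statistic

alpha = 0.05
z = chi2.ppf(1.0 - alpha, df=1)            # critical value z = F^{-1}(1 - alpha), r = 1
print(W, z, W > z)                         # reject H0 when W > z
```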

More details about the Wald test, including a detailed derivation of its asymptotic distribution, can be found in the lecture entitled Wald test.

Score test

Let $\widehat{\theta}_n^R$ be the estimate of a $p \times 1$ parameter $\theta_0$, obtained by maximizing the log-likelihood over the restricted parameter space $\Theta_R$: $$\widehat{\theta}_n^R = \arg\max_{\theta \in \Theta_R} \ell(\theta; \xi_n).$$ The score test (also called Lagrange multiplier test) is based on the following test statistic: $$LM_n = \frac{1}{n} \, s(\widehat{\theta}_n^R)^{\top} \, \widehat{V}_n \, s(\widehat{\theta}_n^R),$$ where $n$ is the sample size, $\widehat{V}_n$ is a consistent estimate of the asymptotic covariance matrix of the maximum likelihood estimator, and $$s(\theta) = \nabla_{\theta} \ell(\theta; \xi_n)$$ is the score, that is, the gradient of the log-likelihood function.

Under the null hypothesis that $g(\theta_0) = 0$, the score statistic $LM_n$ converges in distribution to a Chi-square distribution with $r$ degrees of freedom.

Once the test statistic has been computed, the test is carried out following the same procedure described above for the Wald test.
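Continuing the hypothetical exponential example used for the Wald test above, the sketch below evaluates the gradient of the log-likelihood at the restricted estimate, which here is pinned at $\lambda = 1$ because the null hypothesis fixes the parameter completely.

```python
import numpy as np
from scipy.stats import chi2

rng = np.random.default_rng(0)
x = rng.exponential(scale=1.0, size=500)   # same simulated sample as before
n = x.size

lam_R = 1.0                     # restricted estimate: H0 fixes lambda at 1
score = n / lam_R - x.sum()     # score of the exponential log-likelihood at lam_R
V_hat = lam_R ** 2              # asymptotic variance estimate, evaluated at the restricted estimate

LM = (1.0 / n) * score * V_hat * score     # score (Lagrange multiplier) statistic
z = chi2.ppf(0.95, df=1)                   # critical value, r = 1
print(LM, z, LM > z)                       # reject H0 when LM > z
```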

More details about the score test, including a detailed derivation of its asymptotic distribution, can be found in the lecture entitled Score test.

Likelihood ratio test

Let $\widehat{\theta}_n$ be the unrestricted estimate of a $p \times 1$ parameter $\theta_0$, obtained by solving $$\widehat{\theta}_n = \arg\max_{\theta \in \Theta} \ell(\theta; \xi_n),$$ and let $\widehat{\theta}_n^R$ be the restricted estimate obtained by solving $$\widehat{\theta}_n^R = \arg\max_{\theta \in \Theta_R} \ell(\theta; \xi_n).$$ The likelihood ratio test is based on the following test statistic: $$LR_n = 2 \left[ \ell(\widehat{\theta}_n; \xi_n) - \ell(\widehat{\theta}_n^R; \xi_n) \right].$$ In other words, the test statistic is equal to two times the difference between the log-likelihood corresponding to the unrestricted estimate $\widehat{\theta}_n$ and the log-likelihood corresponding to the restricted estimate $\widehat{\theta}_n^R$.

Under the null hypothesis that $g(\theta_0) = 0$, the likelihood ratio statistic $LR_n$ converges in distribution to a Chi-square distribution with $r$ degrees of freedom.

Once the test statistic has been computed, the test is carried out following the same procedure described above for the Wald test.
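For the same hypothetical exponential example, the likelihood ratio statistic only requires the two maximized log-likelihoods, as the following sketch shows; the restricted maximum is again attained at $\lambda = 1$.

```python
import numpy as np
from scipy.stats import chi2

rng = np.random.default_rng(0)
x = rng.exponential(scale=1.0, size=500)   # same simulated sample as before
n = x.size

def loglik(lam):
    # exponential log-likelihood: n*log(lambda) - lambda * sum(x)
    return n * np.log(lam) - lam * x.sum()

lam_hat = 1.0 / x.mean()                     # unrestricted MLE
LR = 2.0 * (loglik(lam_hat) - loglik(1.0))   # twice the log-likelihood difference

z = chi2.ppf(0.95, df=1)                     # critical value, r = 1
print(LR, z, LR > z)                         # reject H0 when LR > z
```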

More details about the likelihood ratio test, including a detailed derivation of its asymptotic distribution, can be found in the lecture entitled Likelihood ratio test.

How to cite

Please cite as:

Taboga, Marco (2021). "Maximum likelihood - Hypothesis testing", Lectures on probability theory and mathematical statistics. Kindle Direct Publishing. Online appendix. https://www.statlect.com/fundamentals-of-statistics/maximum-likelihood-hypothesis-testing.
