The Wald test is a test of hypothesis usually performed on parameters that have been estimated by maximum likelihood.
Before reading this lecture, the reader is strongly advised to read the lecture entitled Maximum likelihood - Hypothesis testing, which introduces the basics of hypothesis testing in a maximum likelihood (ML) framework.
In what follows, we are going to assume that an unknown parameter $\theta_0$ has been estimated by ML, that it belongs to a parameter space $\Theta \subseteq \mathbb{R}^{p}$, and that we want to test the null hypothesis
$$H_0: g(\theta_0) = 0,$$
where $g: \Theta \rightarrow \mathbb{R}^{r}$ is a vector-valued function, with $r \leq p$. The aforementioned lecture on hypothesis testing gives some examples of common null hypotheses that can be written in this form.
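As a simple illustration of our own (not one of the examples from that lecture), suppose that $\Theta \subseteq \mathbb{R}^{2}$ and that we want to test the hypothesis that the two components of $\theta_0$ are equal. This hypothesis can be written in the above form by setting
$$g(\theta) = \theta_1 - \theta_2,$$
so that $g(\theta_0) = 0$ if and only if the two components coincide, and the number of restrictions is $r = 1$.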
We are going to assume that the following technical conditions are satisfied: (i) for each $\theta \in \Theta$, the entries of $g(\theta)$ are continuously differentiable with respect to all entries of $\theta$; (ii) the $r \times p$ matrix of the partial derivatives of the entries of $g(\theta)$ with respect to the entries of $\theta$, called the Jacobian of $g$ and denoted by $J_g(\theta)$, has rank $r$.
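For the illustrative restriction $g(\theta) = \theta_1 - \theta_2$ introduced above, both conditions are easily verified: $g$ is linear, hence continuously differentiable, and its Jacobian is the constant $1 \times 2$ row vector
$$J_g(\theta) = \begin{bmatrix} 1 & -1 \end{bmatrix},$$
which has rank $r = 1$ for every $\theta \in \Theta$.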
Let $\widehat{\theta}_n$ be the estimate of the parameter $\theta_0$ obtained by maximizing the log-likelihood over the whole parameter space $\Theta$:
$$\widehat{\theta}_n = \operatorname*{arg\,max}_{\theta \in \Theta} \ln L(\theta; \xi_n),$$
where $L(\theta; \xi_n)$ is the likelihood function and $\xi_n$ is the sample.
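As a minimal sketch of this maximization step, assuming purely for illustration an exponential model with unknown rate (none of these names or numbers come from the text), the log-likelihood can be maximized numerically:

```python
import numpy as np
from scipy.optimize import minimize_scalar

# Hypothetical i.i.d. sample playing the role of xi_n
rng = np.random.default_rng(0)
xi_n = rng.exponential(scale=2.0, size=500)

# Log-likelihood of an exponential model with rate lam > 0
def log_likelihood(lam, x):
    return np.sum(np.log(lam) - lam * x)

# theta_hat = argmax of the log-likelihood over the parameter space (0, +inf)
res = minimize_scalar(lambda lam: -log_likelihood(lam, xi_n),
                      bounds=(1e-8, 100.0), method="bounded")
theta_hat = res.x
print(theta_hat, 1 / xi_n.mean())  # the two values should nearly coincide
```

In this model the maximizer has a closed form (the reciprocal of the sample mean), which is why the last line prints two nearly identical numbers.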
Let us assume that the sample and the likelihood function satisfy some set of conditions that are sufficient to guarantee the consistency and asymptotic normality of $\widehat{\theta}_n$ (see the lecture on maximum likelihood for a set of such conditions).
The test statistic employed in the Wald test is
$$W_n = n \, g(\widehat{\theta}_n)^{\top} \left[ J_g(\widehat{\theta}_n) \, \widehat{V}_n \, J_g(\widehat{\theta}_n)^{\top} \right]^{-1} g(\widehat{\theta}_n),$$
where $n$ is the sample size and $\widehat{V}_n$ is a consistent estimate of the asymptotic covariance matrix of $\widehat{\theta}_n$ (see the lecture entitled Maximum likelihood - Covariance matrix estimation).
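To make the formula concrete, the following is a minimal sketch of how the statistic can be computed once the estimate, a consistent estimate of its asymptotic covariance matrix, the restriction function and its Jacobian are available (the function name and arguments below are our own, not a standard API):

```python
import numpy as np

def wald_statistic(theta_hat, V_hat, n, g, jac_g):
    """Compute W_n = n * g(theta_hat)' [J V_hat J']^{-1} g(theta_hat).

    theta_hat : (p,) array, ML estimate
    V_hat     : (p, p) array, consistent estimate of the asymptotic covariance matrix
    n         : sample size
    g         : callable returning the (r,) vector of restrictions
    jac_g     : callable returning the (r, p) Jacobian of g
    """
    g_val = np.atleast_1d(g(theta_hat))      # g(theta_hat), shape (r,)
    J = np.atleast_2d(jac_g(theta_hat))      # Jacobian, shape (r, p)
    middle = J @ V_hat @ J.T                 # (r, r) matrix to be inverted
    return float(n * g_val @ np.linalg.solve(middle, g_val))
```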
Asymptotically, the test statistic has a Chi-square distribution, as stated by the following proposition.
Proposition
Under the null hypothesis that $g(\theta_0) = 0$, the Wald statistic $W_n$ converges in distribution to a Chi-square distribution with $r$ degrees of freedom.
Proof
We have assumed that $\widehat{\theta}_n$ is consistent and asymptotically normal, which implies that $\sqrt{n}\,(\widehat{\theta}_n - \theta_0)$ converges in distribution to a multivariate normal random variable with mean $0$ and asymptotic covariance matrix $V$, that is,
$$\sqrt{n}\,(\widehat{\theta}_n - \theta_0) \xrightarrow{d} N(0, V).$$
Now, by the delta method, we have that
$$\sqrt{n}\,\left[ g(\widehat{\theta}_n) - g(\theta_0) \right] \xrightarrow{d} N\!\left(0, \, J_g(\theta_0)\, V\, J_g(\theta_0)^{\top}\right).$$
But $g(\theta_0) = 0$ under the null hypothesis, so that
$$\sqrt{n}\, g(\widehat{\theta}_n) \xrightarrow{d} N\!\left(0, \, J_g(\theta_0)\, V\, J_g(\theta_0)^{\top}\right).$$
We have assumed that $\widehat{V}_n$ and $\widehat{\theta}_n$ are consistent estimators, that is,
$$\widehat{V}_n \xrightarrow{P} V, \qquad \widehat{\theta}_n \xrightarrow{P} \theta_0,$$
where $\xrightarrow{P}$ denotes convergence in probability. Therefore, since $J_g$ is continuous, by the continuous mapping theorem we have that
$$J_g(\widehat{\theta}_n)\, \widehat{V}_n\, J_g(\widehat{\theta}_n)^{\top} \xrightarrow{P} J_g(\theta_0)\, V\, J_g(\theta_0)^{\top}.$$
Thus, we can write the Wald statistic as a sequence of quadratic forms
$$W_n = \left[\sqrt{n}\, g(\widehat{\theta}_n)\right]^{\top} \left[ J_g(\widehat{\theta}_n)\, \widehat{V}_n\, J_g(\widehat{\theta}_n)^{\top} \right]^{-1} \left[\sqrt{n}\, g(\widehat{\theta}_n)\right],$$
where $\sqrt{n}\, g(\widehat{\theta}_n)$ converges in distribution to a normal random vector with mean zero and covariance matrix $J_g(\theta_0)\, V\, J_g(\theta_0)^{\top}$, and $\left[ J_g(\widehat{\theta}_n)\, \widehat{V}_n\, J_g(\widehat{\theta}_n)^{\top} \right]^{-1}$ converges in probability to $\left[ J_g(\theta_0)\, V\, J_g(\theta_0)^{\top} \right]^{-1}$. By a standard result (see Exercise 2 in the lecture on Slutsky's theorem), such a sequence of quadratic forms converges in distribution to a Chi-square random variable with a number of degrees of freedom equal to $r$, the dimension of the vector $g(\widehat{\theta}_n)$.
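The proposition can also be checked by simulation. The sketch below uses a setup of our own choosing (i.i.d. normal data with unknown mean and variance, and the single restriction that the mean is zero, so that $r = 1$): under the null hypothesis, the empirical rejection rate at the $95\%$ quantile of a Chi-square distribution with $1$ degree of freedom should be close to $5\%$.

```python
import numpy as np
from scipy.stats import chi2

rng = np.random.default_rng(1)
n, n_sim = 500, 2000
stats = np.empty(n_sim)

# Restriction g(theta) = theta_1 (the mean is zero); its Jacobian is constant
J = np.array([[1.0, 0.0]])

for s in range(n_sim):
    x = rng.normal(loc=0.0, scale=2.0, size=n)   # H0 is true in the simulated data
    mu_hat, sigma2_hat = x.mean(), x.var()       # ML estimates of mean and variance
    g_val = np.array([mu_hat])                   # g(theta_hat)
    # Consistent estimate of the asymptotic covariance of sqrt(n)*(theta_hat - theta_0)
    V_hat = np.diag([sigma2_hat, 2 * sigma2_hat ** 2])
    middle = J @ V_hat @ J.T
    stats[s] = float(n * g_val @ np.linalg.solve(middle, g_val))

print(np.mean(stats > chi2.ppf(0.95, df=1)))     # should be close to 0.05
```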
In the Wald test, the null hypothesis is rejected if
$$W_n > z,$$
where $z$ is a pre-determined critical value. The size of the test can be approximated by its asymptotic value
$$\lim_{n \rightarrow \infty} P(W_n > z) = 1 - F(z),$$
where $F$ is the distribution function of a Chi-square random variable with $r$ degrees of freedom. The critical value $z$ can be chosen so as to achieve a pre-determined size $\alpha$, as follows:
$$z = F^{-1}(1 - \alpha).$$
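In practice, the critical value (and the asymptotic p-value) can be computed as in the following sketch; the values of the statistic, of $r$ and of the size below are placeholders to be replaced with those of the problem at hand.

```python
from scipy.stats import chi2

W_n, r, alpha = 2.7, 1, 0.05          # placeholder statistic, restrictions, size

z = chi2.ppf(1 - alpha, df=r)          # critical value z = F^{-1}(1 - alpha)
p_value = chi2.sf(W_n, df=r)           # asymptotic p-value, 1 - F(W_n)

print(f"critical value = {z:.4f}, p-value = {p_value:.4f}")
print("reject H0" if W_n > z else "do not reject H0")
```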
The following example shows how to use the Wald test to test a simple linear restriction.
Example
Let the parameter space be the set of all $2$-dimensional vectors, i.e., $\Theta = \mathbb{R}^{2}$. Suppose that, from a sample of size $n$, we have obtained an estimate $\widehat{\theta}_n$ of the parameter and an estimate $\widehat{V}_n$ of the asymptotic covariance matrix. We want to test a single linear restriction involving $\theta_1$ and $\theta_2$, the first and second components of $\theta$. Then, the function $g$ is a function $g: \mathbb{R}^{2} \rightarrow \mathbb{R}$, so that in this case $r = 1$. Because the restriction is linear, the $1 \times 2$ Jacobian of $g$ is a constant row vector: it has rank $1$ and does not depend on $\theta$. Plugging $\widehat{\theta}_n$, $\widehat{V}_n$ and $J_g(\widehat{\theta}_n)$ into the formula above, we obtain the value of the Wald statistic $W_n$. Our test statistic has a Chi-square distribution with $1$ degree of freedom. Suppose we want our test to have size $5\%$. Then, our critical value is
$$z = F^{-1}(0.95) \approx 3.84,$$
where $F$ is the distribution function of a Chi-square random variable with $1$ degree of freedom; the value of $F^{-1}(0.95)$ can be calculated with any statistical software (for example, in MATLAB with the command chi2inv(0.95,1)). In this example, the test statistic does not exceed the critical value ($W_n \leq 3.84$), and we do not reject the null hypothesis.
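Because the numerical values used in the example are not reproduced above, the following sketch works through a hypothetical version of it: the sample size, the estimates, the covariance matrix and the particular restriction ($\theta_1 = \theta_2$) are all invented for illustration only, chosen so that, as in the example, the statistic does not exceed the critical value.

```python
import numpy as np
from scipy.stats import chi2

# Hypothetical inputs (not taken from the text)
n = 100
theta_hat = np.array([1.0, 1.2])                 # ML estimate of a 2-dimensional parameter
V_hat = np.array([[2.0, 0.5],                    # estimated asymptotic covariance matrix
                  [0.5, 1.0]])

g_val = np.array([theta_hat[0] - theta_hat[1]])  # g(theta) = theta_1 - theta_2
J = np.array([[1.0, -1.0]])                      # Jacobian: constant, rank 1

W_n = float(n * g_val @ np.linalg.solve(J @ V_hat @ J.T, g_val))
z = chi2.ppf(0.95, df=1)                         # critical value at 5% size, about 3.84

print(f"W_n = {W_n:.2f}, critical value = {z:.2f}")   # here W_n = 2.00 < 3.84
print("reject H0" if W_n > z else "do not reject H0")
```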
Please cite as:
Taboga, Marco (2021). "Wald test", Lectures on probability theory and mathematical statistics. Kindle Direct Publishing. Online appendix. https://www.statlect.com/fundamentals-of-statistics/Wald-test.