
Delta method

by Marco Taboga, PhD

The delta method is a theorem that can be used to derive the asymptotic distribution of a function of an asymptotically normal random variable.

It is often used to derive standard errors and confidence intervals for functions of parameters whose estimators are asymptotically normal.


The starting point: an asymptotically normal sequence

Let $\left\{ \widehat{\theta}_{n}\right\}$ be a sequence of random variables such that$$\sqrt{n}\left( \widehat{\theta}_{n}-\theta _{0}\right) \xrightarrow{d}N\left( 0,V\right)$$where $\theta _{0}$ is a constant, $N\left( 0,V\right)$ is a normal distribution with mean $0$ and variance $V$, and $\xrightarrow{d}$ indicates convergence in distribution.

The sequence $\left\{ \widehat{\theta}_{n}\right\}$ is said to be asymptotically normal, $\theta _{0}$ is called the asymptotic mean of $\widehat{\theta}_{n}$, and $V$ its asymptotic variance.

For example, $\left\{ \widehat{\theta}_{n}\right\}$ could be a sequence of sample means that are asymptotically normal because a Central Limit Theorem applies. Or it could be a sequence of maximum likelihood estimators satisfying a set of conditions that are sufficient for asymptotic normality.
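
As a concrete instance of the first case (a standard Lindeberg-Lévy Central Limit Theorem statement, added here for illustration): if $X_{1},X_{2},\ldots$ are i.i.d. random variables with mean $\mu$ and finite variance $\sigma ^{2}$, and $\overline{X}_{n}$ denotes the sample mean of the first $n$ of them, then$$\sqrt{n}\left( \overline{X}_{n}-\mu \right) \xrightarrow{d}N\left( 0,\sigma ^{2}\right),$$so $\widehat{\theta}_{n}=\overline{X}_{n}$ is asymptotically normal with asymptotic mean $\theta _{0}=\mu$ and asymptotic variance $V=\sigma ^{2}$.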

The transformation

Now, consider the sequence $\left\{ g\left( \widehat{\theta}_{n}\right) \right\}$, where $g$ is a function.

The delta method allows us to derive, under appropriate conditions, the asymptotic distribution of $\left\{ g\left( \widehat{\theta}_{n}\right) \right\}$ from the asymptotic distribution of $\left\{ \widehat{\theta}_{n}\right\}$.

The proposition

A formal statement of the delta method is given in the following proposition.

Proposition Let $\left\{ \widehat{\theta}_{n}\right\}$ be a sequence of random variables such that$$\sqrt{n}\left( \widehat{\theta}_{n}-\theta _{0}\right) \xrightarrow{d}N\left( 0,V\right).$$Let $g:\mathbb{R}\rightarrow \mathbb{R}$ be a continuously differentiable function. Then,$$\sqrt{n}\left( g\left( \widehat{\theta}_{n}\right) -g\left( \theta _{0}\right) \right) \xrightarrow{d}N\left( 0,\left( \frac{dg\left( \theta _{0}\right) }{d\theta }\right) ^{2}V\right).$$

Proof

By the mean value theorem, there exists a point $\overline{\theta}_{n}$ lying between $\widehat{\theta}_{n}$ and $\theta _{0}$ such that$$g\left( \widehat{\theta}_{n}\right) =g\left( \theta _{0}\right) +\frac{dg\left( \overline{\theta}_{n}\right) }{d\theta }\left( \widehat{\theta}_{n}-\theta _{0}\right).$$By subtracting $g\left( \theta _{0}\right)$ from both sides and multiplying by $\sqrt{n}$, we obtain$$\sqrt{n}\left( g\left( \widehat{\theta}_{n}\right) -g\left( \theta _{0}\right) \right) =\frac{dg\left( \overline{\theta}_{n}\right) }{d\theta }\sqrt{n}\left( \widehat{\theta}_{n}-\theta _{0}\right).$$Since $\widehat{\theta}_{n}$ converges in probability to $\theta _{0}$ (note that $\widehat{\theta}_{n}-\theta _{0}=n^{-1/2}\cdot \sqrt{n}\left( \widehat{\theta}_{n}-\theta _{0}\right)$ converges in probability to zero, being the product of a sequence converging to zero and a sequence converging in distribution), and $\overline{\theta}_{n}$ lies between $\widehat{\theta}_{n}$ and $\theta _{0}$, also $\overline{\theta}_{n}$ converges in probability to $\theta _{0}$. Because the derivative $dg/d\theta$ is continuous, by the continuous mapping theorem$$\frac{dg\left( \overline{\theta}_{n}\right) }{d\theta }\xrightarrow{p}\frac{dg\left( \theta _{0}\right) }{d\theta },$$where $\xrightarrow{p}$ denotes convergence in probability. Therefore, the first term of the product$$\frac{dg\left( \overline{\theta}_{n}\right) }{d\theta }\sqrt{n}\left( \widehat{\theta}_{n}-\theta _{0}\right)$$converges in probability to a constant. By assumption, the second term converges in distribution to a normal random variable $Z$ having mean $0$ and variance $V$. As a consequence, Slutsky's theorem applies and the product converges in distribution to$$\frac{dg\left( \theta _{0}\right) }{d\theta }Z.$$By elementary rules on linear transformations of normal random variables, this has a normal distribution with mean $0$ and variance$$\left( \frac{dg\left( \theta _{0}\right) }{d\theta }\right) ^{2}V.$$

Example

In this example we show how the delta method can be applied.

Suppose that a sequence $\left\{ \widehat{\theta}_{n}\right\}$ is asymptotically normal with asymptotic mean $\theta _{0}=1$ and asymptotic variance $V=1$, that is,$$\sqrt{n}\left( \widehat{\theta}_{n}-1\right) \xrightarrow{d}N\left( 0,1\right).$$

We want to derive the asymptotic distribution of the sequence [eq28].

The function [eq29] is continuously differentiable, so we can apply the delta method.

The asymptotic mean of the transformed sequence is [eq30]

In order to compute the asymptotic variance, we need to take the first derivative of the function $g$, which is [eq32], and evaluate it at $\theta _{0}=1$: [eq33]

Therefore, the asymptotic variance is [eq34] and we can write [eq35]
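
As a complement, here is a minimal Monte Carlo check of the delta method in Python, under assumed inputs that are not taken from the example above: the estimator is taken to be the sample mean of i.i.d. Exponential(1) draws, so that $\theta _{0}=1$ and $V=1$ as in the example, and the transformation is assumed to be $g\left( \theta \right) =\theta ^{2}$, whose squared first derivative at $\theta _{0}=1$ gives a delta-method asymptotic variance of $4$.

import numpy as np

# Monte Carlo sanity check of the delta method (illustrative sketch).
# Assumptions, not taken from the example above: theta_hat is the sample mean
# of n i.i.d. Exponential(1) draws (so theta_0 = 1 and V = 1, matching the
# values stated in the example) and the transformation is g(t) = t**2,
# whose derivative at theta_0 = 1 equals 2, giving asymptotic variance 4.

rng = np.random.default_rng(0)
n = 1_000        # sample size
reps = 20_000    # Monte Carlo replications

samples = rng.exponential(scale=1.0, size=(reps, n))
theta_hat = samples.mean(axis=1)            # approximately N(1, 1/n)

z = np.sqrt(n) * (theta_hat ** 2 - 1.0)     # sqrt(n) * (g(theta_hat) - g(theta_0))

print("empirical variance   :", z.var())    # should be close to 4
print("delta-method variance:", (2.0 * 1.0) ** 2 * 1.0)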

Multivariate generalization

The delta method also generalizes to multivariate settings, as stated by the following proposition.

Proposition Let $\left\{ \widehat{\theta}_{n}\right\}$ be a sequence of $K\times 1$ random vectors such that$$\sqrt{n}\left( \widehat{\theta}_{n}-\theta _{0}\right) \xrightarrow{d}N\left( 0,V\right)$$where $N\left( 0,V\right)$ is a multivariate normal distribution with mean $0$ and covariance matrix $V$, $\theta _{0}$ is a constant $K\times 1$ vector, and $\xrightarrow{d}$ indicates convergence in distribution. Let $g:\mathbb{R}^{K}\rightarrow \mathbb{R}^{L}$. If all the $L$ entries of $g$ have continuous partial derivatives with respect to $\theta$, then$$\sqrt{n}\left( g\left( \widehat{\theta}_{n}\right) -g\left( \theta _{0}\right) \right) \xrightarrow{d}N\left( 0,J_{g}\left( \theta _{0}\right) VJ_{g}\left( \theta _{0}\right) ^{\intercal }\right)$$where $J_{g}\left( \theta _{0}\right)$ is the Jacobian of $g$ evaluated at $\theta _{0}$, i.e., the $L\times K$ matrix of partial derivatives of the entries of $g$ with respect to the entries of $\theta$.

The next example shows how the multivariate delta method can be applied.

Example Suppose that a sequence of $2\times 1$ random vectors $\left\{ \widehat{\theta}_{n}\right\}$ satisfies$$\sqrt{n}\left( \widehat{\theta}_{n}-\theta _{0}\right) \xrightarrow{d}N\left( 0,V\right)$$where the asymptotic mean is [eq45] and the asymptotic covariance matrix is [eq46]. Denote the two components of $\widehat{\theta}_{n}$ by $\widehat{\theta}_{1n}$ and $\widehat{\theta}_{2n}$. We want to derive the asymptotic distribution of the sequence [eq50]. The function [eq51] is continuously differentiable, so we can apply the delta method. The asymptotic mean of the transformed sequence is [eq52]. In order to compute the asymptotic covariance matrix, we need to compute the Jacobian of the function $g$, which is [eq54], and evaluate it at $\theta _{0}$: [eq55]. Therefore, the asymptotic covariance matrix is [eq56] and we can write [eq57].
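
As an additional illustration of the multivariate formula (with a function chosen here for concreteness, not the one used in the example above), take $K=2$, $L=1$ and the ratio $g\left( \theta _{1},\theta _{2}\right) =\theta _{1}/\theta _{2}$, assuming $\theta _{02}\neq 0$. Its Jacobian is the $1\times 2$ row vector$$J_{g}\left( \theta \right) =\begin{pmatrix} \dfrac{1}{\theta _{2}} & -\dfrac{\theta _{1}}{\theta _{2}^{2}} \end{pmatrix},$$so the delta method gives$$\sqrt{n}\left( \frac{\widehat{\theta}_{1n}}{\widehat{\theta}_{2n}}-\frac{\theta _{01}}{\theta _{02}}\right) \xrightarrow{d}N\left( 0,J_{g}\left( \theta _{0}\right) VJ_{g}\left( \theta _{0}\right) ^{\intercal }\right),$$where, writing $V=\begin{pmatrix} V_{11} & V_{12} \\ V_{12} & V_{22} \end{pmatrix}$, the asymptotic variance equals$$\frac{V_{11}}{\theta _{02}^{2}}-\frac{2\theta _{01}V_{12}}{\theta _{02}^{3}}+\frac{\theta _{01}^{2}V_{22}}{\theta _{02}^{4}}.$$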

Solved exercises

Below you can find some exercises with explained solutions.

Exercise 1

Let $\left\{ \widehat{\theta}_{n}\right\}$ be an asymptotically normal sequence with asymptotic mean $\theta _{0}=0$ and asymptotic variance $V=1$, that is,$$\sqrt{n}\,\widehat{\theta}_{n}\xrightarrow{d}N\left( 0,1\right).$$

Derive the asymptotic distribution of the sequence [eq60].

Solution

The function [eq61] is continuously differentiable, so we can apply the delta method. The asymptotic mean of the transformed sequence is [eq62]. In order to compute the asymptotic variance, we need to take the first derivative of the function $g$, which is [eq64], and evaluate it at $\theta _{0}=0$: [eq65]. Therefore, the asymptotic variance is [eq66] and we can write [eq67].
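
For reference, with $\theta _{0}=0$ and $V=1$ the proposition specializes, for any continuously differentiable transformation $g$, to$$\sqrt{n}\left( g\left( \widehat{\theta}_{n}\right) -g\left( 0\right) \right) \xrightarrow{d}N\left( 0,\left( \frac{dg\left( 0\right) }{d\theta }\right) ^{2}\right),$$so the asymptotic variance of the transformed sequence is simply the squared first derivative of $g$ evaluated at $0$.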

Exercise 2

Let $\left\{ \widehat{\theta}_{n}\right\}$ be a sequence of $2\times 1$ random vectors satisfying$$\sqrt{n}\left( \widehat{\theta}_{n}-\theta _{0}\right) \xrightarrow{d}N\left( 0,V\right)$$where the asymptotic mean is [eq70] and the asymptotic covariance matrix is [eq71].

Denote the two entries of $\widehat{\theta}_{n}$ by $\widehat{\theta}_{1n}$ and $\widehat{\theta}_{2n}$.

Derive the asymptotic distribution of the sequence of products $\left\{ \widehat{\theta}_{1n}\widehat{\theta}_{2n}\right\}$.

Solution

We can apply the delta method because the function$$g\left( \theta _{1},\theta _{2}\right) =\theta _{1}\theta _{2}$$is continuously differentiable. The asymptotic mean of the transformed sequence is [eq77]. The Jacobian of the function $g$ is$$J_{g}\left( \theta \right) =\begin{pmatrix} \theta _{2} & \theta _{1} \end{pmatrix}.$$By evaluating it at [eq70] we obtain [eq81]. Therefore, the asymptotic covariance matrix is [eq82] and we can write [eq83].
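
The product $J_{g}\left( \theta _{0}\right) VJ_{g}\left( \theta _{0}\right) ^{\intercal }$ can also be computed numerically. The Python snippet below does so for the product function, using an assumed mean vector and covariance matrix (the ones in the exercise are not reproduced here), so the numbers are purely illustrative.

import numpy as np

# Delta-method asymptotic variance for g(theta_1, theta_2) = theta_1 * theta_2,
# computed as J V J'. The mean vector and covariance matrix below are assumed
# placeholders, not the values used in the exercise.

theta0 = np.array([2.0, 3.0])           # assumed asymptotic mean vector
V = np.array([[1.0, 0.5],
              [0.5, 2.0]])              # assumed asymptotic covariance matrix

J = np.array([[theta0[1], theta0[0]]])  # Jacobian of g at theta0: (theta_02, theta_01)

asym_mean = theta0[0] * theta0[1]       # asymptotic mean of the product
asym_var = J @ V @ J.T                  # delta-method asymptotic variance

print(asym_mean)  # 6.0
print(asym_var)   # [[23.]]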

Exercise 3

Let $\left\{ \widehat{\theta}_{n}\right\}$ be a sequence of $2\times 1$ random vectors satisfying$$\sqrt{n}\left( \widehat{\theta}_{n}-\theta _{0}\right) \xrightarrow{d}N\left( 0,V\right)$$where the asymptotic mean is [eq86] and the asymptotic covariance matrix is [eq87].

Denote the two entries of $\widehat{\theta}_{n}$ by $\widehat{\theta}_{1n}$ and $\widehat{\theta}_{2n}$.

Derive the asymptotic distribution of the sequence of $2\times 1$ vectors [eq91] where the two entries of [eq92] satisfy [eq93]

Solution

We can apply the delta method because the functions [eq94] are continuously differentiable. The asymptotic mean of the transformed sequence is a $2\times 1$ vector $\eta _{0}$ whose entries are [eq95]. The Jacobian of the function $g$ is [eq97]. By evaluating it at [eq86] we obtain [eq99]. As a consequence, the asymptotic covariance matrix is [eq100]. Thus, [eq101] where $\eta _{0}$ and $V_{\eta }$ have been calculated above.

How to cite

Please cite as:

Taboga, Marco (2021). "Delta method", Lectures on probability theory and mathematical statistics. Kindle Direct Publishing. Online appendix. https://www.statlect.com/asymptotic-theory/delta-method.
