Assume we have data from a sequence of random variables $X_1, \dots , X_n$, from which we build an estimator $\hat \Theta_n$ with a known asymptotic variance $\sigma^2$. Via the Delta method we can approximate the asymptotic variance of a function of that estimator, $g(\hat \Theta_n)$.
**Assumptions:**
- The function $g$ needs to be continuously differentiable at $\theta$, with $g^\prime(\theta) \neq 0$ (otherwise the limit distribution degenerates).
- The original estimator $\hat \Theta_n$ needs to be asymptotically normal.
Let us choose the sample average $\bar X_n$ as the estimator. For this estimator we know that it is [[Properties of an Estimator#Key Properties of an Estimator|asymptotically normal]], with variance $\sigma^2$.
$ \sqrt n(\bar X_n-\theta) \xrightarrow[]{(d)}\mathcal N(0, \sigma^2) $
We could now use the [[Combining Limits#Continuous Mapping Theorem|Continuous Mapping Theorem]] to wrap both sides of the convergence statement with $g$.
- When $g$ is linear, then we know that $g(\mathcal N)$ is just an [[Gaussian Distribution#Properties of the Gaussian Distribution|affine transformation]] of a Gaussian, and therefore Gaussian again.
- When $g$ is non-linear, it is not clear what the limit distribution $g(\mathcal N)$ looks like; in general it is no longer Gaussian. In this case we need the Delta method.
$ g\left(\sqrt n(\bar X_n-\theta)\right) \xrightarrow[]{(d)}g\left(\mathcal N(0, \sigma^2)\right) $
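A quick simulation illustrates why the non-linear case needs more care. The sketch below (using $g(x) = x^2$ purely as an arbitrary nonlinear choice) pushes draws from a centered Gaussian through a linear and a nonlinear $g$; the linear image stays symmetric like a Gaussian, while the nonlinear image is clearly skewed:

```python
import numpy as np

rng = np.random.default_rng(0)
sigma = 2.0
z = rng.normal(0.0, sigma, size=100_000)  # draws from the limiting N(0, sigma^2)

# Linear g: an affine map of a Gaussian stays Gaussian (still centered here).
g_lin = 3.0 * z
# Nonlinear g, e.g. g(x) = x^2: the result is no longer Gaussian.
g_sq = z ** 2

# Sample skewness: a centered Gaussian has (near-)zero skewness,
# while z^2 has a heavily right-skewed, chi-square-like shape.
def skew(x):
    c = x - x.mean()
    return (c ** 3).mean() / (c ** 2).mean() ** 1.5

print(round(skew(g_lin), 2))  # close to 0
print(round(skew(g_sq), 2))   # clearly positive
```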
## Derivation
The Delta method approximates the function $g(\bar X_n)$ via a first-order [[Taylor Series]]. This is done by evaluating the function at another point $\theta$ and adding its slope $g^\prime(\theta)$ times the distance between the two points $\bar X_n$ and $\theta$.
![[delta-method.png|center|350]]
>[!note]
>Point $\bar X_n$ needs to be close to $\theta$ so that the linear approximation is still precise enough. Fortunately, since our estimator $\bar X_n$ is consistent, we know that $\bar X_n \xrightarrow[]{\mathbf P} \theta$.
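The note above can be checked numerically. The sketch below (with $g = \exp$ and $\theta = 1$ as purely arbitrary example choices) shows that the first-order Taylor error shrinks roughly quadratically as the evaluation point approaches $\theta$, which is why consistency of $\bar X_n$ makes the linearization valid:

```python
import numpy as np

g = np.exp        # an arbitrary smooth choice for g
g_prime = np.exp  # its derivative
theta = 1.0

# First-order Taylor approximation of g around theta.
def taylor1(x):
    return g(theta) + g_prime(theta) * (x - theta)

# Shrinking dx by 10x shrinks the approximation error by roughly 100x.
for dx in (0.5, 0.05, 0.005):
    x = theta + dx
    err = abs(g(x) - taylor1(x))
    print(f"dx={dx}: error ~ {err:.2e}")
```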
$
\begin{align}
g(\bar X_n) &\simeq g(\theta) + g^\prime(\theta) *(\bar X_n-\theta) \\[6pt]
g(\bar X_n) - g(\theta) &\simeq g^\prime(\theta) * (\bar X_n - \theta) \\[6pt]
\sqrt n *\Big(g(\bar X_n) - g(\theta)\Big) &\simeq g^\prime(\theta) * (\bar X_n - \theta) * \sqrt n \\[6pt]
&\simeq g'(\theta)* \mathcal N(0, \sigma^2) \\[8pt]
&\simeq \mathcal N(0, g^\prime(\theta)^2*\sigma^2) \end{align}
$
where:
- (3) Multiplying both sides by $\sqrt n$.
- (4) We know that $(\bar X_n- \theta)*\sqrt n$ converges in distribution to a Gaussian.
- (5) Since $g^\prime(\theta)$ is a constant, it just scales the Gaussian. Since $\mathcal N$ is centered around $0$, the mean stays unaffected, while the variance is multiplied by the square of the factor.
**Delta Method Formula:**
$ \sqrt n *\Big(g(\bar X_n) - g(\theta)\Big) \xrightarrow[]{(d)} \mathcal N(0, g^\prime(\theta)^2*\sigma^2) $
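As a sanity check, the formula can be verified by Monte Carlo simulation. The sketch below uses exponential samples with mean $\theta = 2$ (so $\sigma^2 = 4$) and $g(x) = x^2$, purely as arbitrary example choices; the Delta method then predicts an asymptotic standard deviation of $|g^\prime(\theta)|*\sigma = 4 \cdot 2 = 8$:

```python
import numpy as np

rng = np.random.default_rng(1)
theta, sigma = 2.0, 2.0      # Exponential with mean 2 has variance 4
n, reps = 1_000, 10_000      # sample size per replication, number of replications

def g(x):
    return x ** 2            # arbitrary smooth g

def g_prime(x):
    return 2 * x

# For each replication: draw n samples, form sqrt(n) * (g(mean) - g(theta)).
samples = rng.exponential(scale=theta, size=(reps, n))
xbar = samples.mean(axis=1)
stat = np.sqrt(n) * (g(xbar) - g(theta))

# Delta-method prediction: |g'(theta)| * sigma = 8.0
predicted_sd = abs(g_prime(theta)) * sigma
print(round(stat.std(), 2), predicted_sd)  # empirical sd should be close to 8
```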