We can break down the MSE into two components, namely the [[Variance]] and the [[Properties of an Estimator#Key Properties of an Estimator|Bias]].
$
\begin{align}
\mathrm{Var}(Z) &= \mathbb{E}[Z^2] - (\mathbb{E}[Z])^2 \tag{1}\\[8pt]
\mathbb{E}[Z^2] &= \mathrm{Var}(Z)+(\mathbb{E}[Z])^2 \tag{2}\\[6pt]
\mathbb{E}\big[(\hat \Theta - \theta)^2 \big] &= \mathrm{Var}(\hat \Theta - \theta)+ \big(\mathbb{E}[\hat \Theta - \theta]\big)^2 \tag{3}\\[6pt]
\mathrm{MSE}(\hat \Theta) &=\mathrm{Var}(\hat \Theta) + (\text{bias})^2 \tag{4}
\end{align}
$
where:
- (1) [[Variance#^559043|Variance]] expressed in the method of moments form.
- (3) Substitute $Z = \hat \Theta - \theta$ into (2), i.e. express $Z$ as the deviation of the estimator $\hat \Theta$ from the true unknown $\theta$.
- (4) Since $\theta$ is a constant, it does not affect the variance, so $\mathrm{Var}(\hat \Theta - \theta) = \mathrm{Var}(\hat \Theta)$. The last term is the squared bias: the expectation of the deviation quantifies whether the estimator $\hat \Theta$ is consistently above or below the true $\theta$.
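The decomposition in (4) can be verified numerically. A minimal sketch, assuming normal data with illustrative values for $\theta$, $\sigma$, and $n$ (all names are hypothetical):

```python
import numpy as np

# Assumed setup: estimate theta via the sample mean of n observations,
# repeated over many simulated trials.
rng = np.random.default_rng(0)
theta, sigma, n, trials = 5.0, 2.0, 25, 200_000

estimates = rng.normal(theta, sigma, size=(trials, n)).mean(axis=1)

mse = np.mean((estimates - theta) ** 2)   # E[(theta_hat - theta)^2]
var = np.var(estimates)                   # Var(theta_hat)
bias = np.mean(estimates) - theta         # E[theta_hat - theta]

# The identity MSE = Var + bias^2 holds exactly (up to floating point)
print(mse, var + bias**2)
```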
**Comparing Estimators:**
| | MSE | Comment |
| -------------- | ------------------------------------------------- | --------------------------------------------------------------------------------------------------------------------- |
| Sample Mean    | $\mathrm{MSE}(\hat \Theta)=\frac{\sigma^2}{n}+ 0$ | The sample mean has a certain variance, but $0$ bias, as it is an unbiased estimator.                                  |
| Zero Estimator | $\mathrm{MSE}(\hat\Theta)=0+\theta^2$ | The zero-estimator always estimates a $0$. Thereby it has no variance, but a bias that is equal to the true $\theta$. |
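Both rows of the table can be checked by simulation. A sketch under assumed parameter values (normal data, $\theta = 0.5$, $\sigma = 1$, $n = 10$):

```python
import numpy as np

rng = np.random.default_rng(1)
theta, sigma, n, trials = 0.5, 1.0, 10, 100_000

samples = rng.normal(theta, sigma, size=(trials, n))

# Sample mean: MSE should be close to sigma^2 / n = 0.1
mse_mean = np.mean((samples.mean(axis=1) - theta) ** 2)

# Zero estimator: MSE is exactly theta^2 = 0.25 (no variance, pure bias)
mse_zero = np.mean((0.0 - theta) ** 2)

print(mse_mean, mse_zero)
```

Note that for $\theta$ close to $0$ the zero estimator would actually have the lower MSE; the sample mean wins as soon as $\theta^2 > \sigma^2/n$.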
![[mean-squared-error.png|center|400]]
The bigger $n$ gets, the lower the turquoise MSE line becomes.
**Standard error:** Taking $\sqrt{\mathrm{Var}(\hat \Theta)}$ gives a more interpretable value for the spread of the estimator, since it is on the same scale as the data. It indicates by how much the estimator (in this case the sample mean) is off from the true $\theta$ on "average".
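For the sample mean the standard error is $\sigma/\sqrt{n}$, which can be compared against the empirical spread of simulated estimates. A minimal sketch with assumed values ($\sigma = 2$, $n = 100$):

```python
import numpy as np

rng = np.random.default_rng(2)
theta, sigma, n, trials = 5.0, 2.0, 100, 50_000

estimates = rng.normal(theta, sigma, size=(trials, n)).mean(axis=1)

se_theory = sigma / np.sqrt(n)    # sigma / sqrt(n) = 0.2
se_empirical = estimates.std()    # spread of the simulated sample means

print(se_theory, se_empirical)
```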