Decomposition of Mean Squared Error

Theorem
Let $\theta$ be a population parameter of some statistical model.

Let $\hat \theta$ be an estimator of $\theta$.

We then have:


 * $ \map{\operatorname{MSE}} {\hat \theta} = \var {\hat \theta} + \paren {\map{\operatorname{bias}} {\hat \theta} }^2 $

where:


 * $\map{\operatorname{MSE}} {\hat \theta}$ denotes the Mean Squared Error of $\hat \theta$.


 * $\var {\hat \theta}$ denotes the variance of $\hat \theta$.


 * $\map{\operatorname{bias}} {\hat \theta}$ denotes the bias of $\hat \theta$.

Proof
Let $\delta = \hat \theta - \theta$.

By the definition of mean squared error:


 * $ \expect {\delta ^2} = \map{\operatorname{MSE}} {\hat \theta}$

and by the definition of bias:


 * $\expect \delta = \map{\operatorname{bias}} {\hat \theta}$

By Variance as Expectation of Square minus Square of Expectation:


 * $\var \delta = \expect {\delta^2} - \paren {\expect \delta}^2$

Since $\theta$ is a constant, $\var \delta = \var {\hat \theta - \theta} = \var {\hat \theta}$.

Therefore:


 * $\expect {\delta^2} = \var \delta + \paren {\expect \delta}^2 = \var {\hat \theta} + \paren {\map{\operatorname{bias}} {\hat \theta} }^2$

Or, equivalently:


 * $\map{\operatorname{MSE} } {\hat \theta} = \var {\hat \theta} + \paren {\map{\operatorname{bias} } {\hat \theta} }^2$
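The decomposition can be checked numerically. The sketch below is an illustration, not part of the proof: it assumes, as a worked example, that we estimate the mean $\theta$ of a normal population with the deliberately biased estimator $\hat \theta = \bar X + \tfrac 1 2$, and compares the empirical mean squared error against the empirical variance plus squared bias.

```python
import random

# Numerical illustration of MSE(theta_hat) = Var(theta_hat) + bias(theta_hat)^2.
# Assumed setup: theta is the mean of a N(theta, 1) population, and the
# estimator theta_hat = sample mean + 0.5 is deliberately biased by 0.5.

random.seed(0)
theta = 2.0            # true parameter
n, trials = 10, 50_000  # sample size and number of Monte Carlo replications

estimates = []
for _ in range(trials):
    sample = [random.gauss(theta, 1.0) for _ in range(n)]
    estimates.append(sum(sample) / n + 0.5)  # biased estimator of theta

mean_est = sum(estimates) / trials
# Empirical MSE: average squared deviation of the estimator from theta
mse = sum((e - theta) ** 2 for e in estimates) / trials
# Empirical variance: average squared deviation from the estimator's own mean
var = sum((e - mean_est) ** 2 for e in estimates) / trials
# Empirical bias: mean of the estimator minus theta (should be near 0.5)
bias = mean_est - theta

print(mse, var + bias ** 2)  # the two quantities agree
```

Note that for empirical moments the identity holds exactly (up to floating-point error), since it is the same algebraic rearrangement as in the proof above.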