Let $X$ and $Y$ be two random variables. $E[ X | Y = y]$ is a function of $y$, while $E[ X|Y ]$ is a function of the random variable $Y$; its distribution is determined by the distribution of $Y$.
This is essentially a reformulation of the total expectation theorem.
Since $E[ X | Y ]$ is a random variable, it has an expectation $E[ E[ X|Y ]] $ of its own, which can be calculated using the expected value rule:
$$E[ E[ X | Y ]] = \begin{cases} \sum_{ y } E[ X | Y = y ] p_Y(y), & Y\text{ discrete} \\ \int_{ -\infty }^{ +\infty } E[ X | Y = y ] f_Y(y) \, dy, & Y\text{ continuous} \end{cases}$$
which leads us to the law of iterated expectations
$$E[ E[ X | Y ]] = E[ X ] $$
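The discrete case of the law of iterated expectations can be checked exactly on a small example. The joint PMF below is an arbitrary illustrative choice, not taken from the text; a minimal sketch in Python:

```python
# Exact check of E[ E[X|Y] ] = E[X] on a small discrete joint PMF.
# The PMF values are an arbitrary illustrative choice.
from fractions import Fraction as F

p = {(0, 0): F(1, 8), (0, 1): F(3, 8),   # p(x, y) on {0,1} x {0,1}
     (1, 0): F(1, 4), (1, 1): F(1, 4)}

# marginal PMF p_Y(y)
p_Y = {y: sum(pr for (x, yy), pr in p.items() if yy == y) for y in (0, 1)}

def E_X_given(y):
    """E[X | Y = y] = sum_x x * p(x, y) / p_Y(y)."""
    return sum(x * pr for (x, yy), pr in p.items() if yy == y) / p_Y[y]

lhs = sum(E_X_given(y) * p_Y[y] for y in (0, 1))  # E[ E[X|Y] ]
rhs = sum(x * pr for (x, _), pr in p.items())     # E[X]
assert lhs == rhs
```

Using `Fraction` keeps the arithmetic exact, so the identity holds with `==` rather than up to floating-point tolerance.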
Let $X$ and $Y$ be two random variables. For any function $g$, we have
$$E[ X g(Y) | Y ] = g(Y) E[ X | Y ]$$
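This "taking out what is known" property can be checked pointwise, $E[ X g(Y) | Y = y ] = g(y) E[ X | Y = y ]$, on the same kind of toy distribution. The joint PMF and the function $g$ below are arbitrary illustrative choices:

```python
# Pointwise check of E[ X g(Y) | Y = y ] = g(y) * E[ X | Y = y ].
# The joint PMF and g are arbitrary illustrative choices.
from fractions import Fraction as F

p = {(0, 0): F(1, 8), (0, 1): F(3, 8),
     (1, 0): F(1, 4), (1, 1): F(1, 4)}
p_Y = {y: sum(pr for (x, yy), pr in p.items() if yy == y) for y in (0, 1)}

def g(y):                     # any function of Y works here
    return 3 * y - 2

for y in (0, 1):
    # direct computation of E[ X g(Y) | Y = y ]
    lhs = sum(x * g(yy) * pr for (x, yy), pr in p.items() if yy == y) / p_Y[y]
    # E[ X | Y = y ], with g(y) taken out front
    E_X_y = sum(x * pr for (x, yy), pr in p.items() if yy == y) / p_Y[y]
    assert lhs == g(y) * E_X_y
```

The point is that, conditioned on $Y = y$, the factor $g(Y)$ is the known constant $g(y)$ and can be pulled out of the expectation.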
Let $X$, $Y$ be two random variables.
$$\hat X = E[X|Y] $$
is called the estimator of $X$ given $Y$. As noted before, the estimator is a function of $Y$, and hence itself a random variable.
The estimation error is a random variable defined as
$$\tilde X = \hat X - X $$
$\tilde X$ is a random variable satisfying
$$E[ \tilde X | Y ] = 0$$
That is,
$$\forall y, E[ \tilde X | Y = y ] = 0 $$
Using the law of iterated expectations, we also have
$$E[ \tilde X ] = E[ E[ \tilde X | Y ]] = 0 $$
The estimation error $\tilde X$ is uncorrelated with the estimator $\hat X$:
$$E[ \hat X \tilde X ] = 0 $$
Since $E[ \tilde X ] = 0$, this gives
$$\cov{ \hat X} { \tilde X } = E[ \hat X \tilde X ] - E[ \hat X ] E[ \tilde X ] = 0$$
It follows that
$$\var{ X } = \var{ \tilde X } + \var{ \hat X } $$
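Both the orthogonality $E[ \hat X \tilde X ] = 0$ and the variance split can be verified exactly on a small discrete example. The joint PMF below is an arbitrary illustrative choice:

```python
# Exact check that tilde X = hat X - X is orthogonal to hat X = E[X|Y],
# and that var(X) = var(tilde X) + var(hat X), on an illustrative PMF.
from fractions import Fraction as F

p = {(0, 0): F(1, 8), (0, 1): F(3, 8),
     (1, 0): F(1, 4), (1, 1): F(1, 4)}
p_Y = {y: sum(pr for (x, yy), pr in p.items() if yy == y) for y in (0, 1)}
E_X_given = {y: sum(x * pr for (x, yy), pr in p.items() if yy == y) / p_Y[y]
             for y in (0, 1)}

EX = sum(x * pr for (x, _), pr in p.items())

# E[ hat X * tilde X ]: at outcome (x, y), hat X = E[X|Y=y] and
# tilde X = E[X|Y=y] - x
cross = sum(E_X_given[y] * (E_X_given[y] - x) * pr for (x, y), pr in p.items())
assert cross == 0

var_X = sum((x - EX) ** 2 * pr for (x, _), pr in p.items())
# E[hat X] = E[X] by iterated expectations, and E[tilde X] = 0
var_hat = sum((E_X_given[y] - EX) ** 2 * p_Y[y] for y in (0, 1))
var_tilde = sum((E_X_given[y] - x) ** 2 * pr for (x, y), pr in p.items())
assert var_X == var_tilde + var_hat
```

Note that `var_tilde` is computed as $E[ \tilde X^2 ]$, which equals $\var{ \tilde X }$ precisely because $E[ \tilde X ] = 0$.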
We introduce the random variable
$$\var{X|Y} = E [ (X - E [ X|Y ])^2 | Y ] = E [ \tilde X^2 | Y ] $$
This is a function of $Y$ whose value, when $Y$ takes the value $y$, is the conditional variance of $X$ given $Y = y$:
$$\var{ X|Y = y } = E [ \tilde X^2 | Y = y ] $$
Using the fact $E [ \tilde X ] = 0 $ and the law of iterated expectations, we can write the variance of estimation error as
$$\var{ \tilde X } = E [ \tilde X ^2 ] = E [ E [ \tilde X^2 | Y ] ] = E [ \var{ X|Y } ] $$
and rewrite the equation $\var{ X } = \var{ \tilde X } + \var{ \hat X } $ as follows.
Let $X$ and $Y$ be two random variables. We have
$$\var{X} = E [ \var{X|Y} ] + \var{ E [X|Y] } $$
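This law of total variance can likewise be checked exactly on a small discrete example; the joint PMF below is an arbitrary illustrative choice, not taken from the text:

```python
# Exact check of var(X) = E[ var(X|Y) ] + var( E[X|Y] ) on an
# illustrative discrete joint PMF.
from fractions import Fraction as F

p = {(0, 0): F(1, 8), (0, 1): F(3, 8),
     (1, 0): F(1, 4), (1, 1): F(1, 4)}
p_Y = {y: sum(pr for (x, yy), pr in p.items() if yy == y) for y in (0, 1)}
E_X_given = {y: sum(x * pr for (x, yy), pr in p.items() if yy == y) / p_Y[y]
             for y in (0, 1)}
# conditional variance var(X | Y = y) = E[ (X - E[X|Y=y])^2 | Y = y ]
var_X_given = {y: sum((x - E_X_given[y]) ** 2 * pr
                      for (x, yy), pr in p.items() if yy == y) / p_Y[y]
               for y in (0, 1)}

EX = sum(x * pr for (x, _), pr in p.items())
var_X = sum((x - EX) ** 2 * pr for (x, _), pr in p.items())

E_cond_var = sum(var_X_given[y] * p_Y[y] for y in (0, 1))   # E[ var(X|Y) ]
var_cond_mean = sum((E_X_given[y] - EX) ** 2 * p_Y[y]       # var( E[X|Y] )
                    for y in (0, 1))
assert var_X == E_cond_var + var_cond_mean
```

The two summands are exactly the $\var{ \tilde X }$ and $\var{ \hat X }$ terms of the earlier decomposition, now expressed through the conditional variance and the conditional mean.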