Definition:Conditional Expectation

Definition
Let $\left({\Omega, \Sigma, \Pr}\right)$ be a probability space.

Let $X$ be a discrete random variable on $\left({\Omega, \Sigma, \Pr}\right)$.

Let $B$ be an event in $\left({\Omega, \Sigma, \Pr}\right)$ such that $\Pr \left({B}\right) > 0$.

The conditional expectation of $X$ given $B$ is written $E \left({X \mid B}\right)$ and is defined as:
 * $\displaystyle E \left({X \mid B}\right) = \sum_{x \mathop \in \operatorname{Im} \left({X}\right)} x \Pr \left({X = x \mid B}\right)$

whenever this sum converges absolutely.

Note that $\Pr \left({X = x \mid B}\right)$ denotes the conditional probability that $X = x$ given $B$.
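
As a simple worked illustration (not part of the definition itself), let $X$ be the score on a fair six-sided die and let $B$ be the event that the score is even. Then $\Pr \left({X = x \mid B}\right) = \dfrac 1 3$ for each $x \in \left\{ {2, 4, 6} \right\}$, and so:
 * $\displaystyle E \left({X \mid B}\right) = 2 \cdot \frac 1 3 + 4 \cdot \frac 1 3 + 6 \cdot \frac 1 3 = 4$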

Also see
Compare with expectation.