Definition:Conditional Expectation

Let $$\left({\Omega, \Sigma, \Pr}\right)$$ be a probability space.

Let $$X$$ be a discrete random variable on $$\left({\Omega, \Sigma, \Pr}\right)$$.

Let $$B$$ be an event in $$\left({\Omega, \Sigma, \Pr}\right)$$ such that $$\Pr \left({B}\right) > 0$$.

The conditional expectation of $$X$$ given $$B$$ is written $$E \left({X \mid B}\right)$$ and defined as:
 * $$E \left({X \mid B}\right) = \sum_{x \in \operatorname{Im} \left({X}\right)} x \Pr \left({X = x \mid B}\right)$$

whenever this sum converges absolutely.
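The definition above can be illustrated numerically. The following is a minimal sketch in Python on a finite sample space (two fair dice, with $$X$$ the sum and $$B$$ the event that the first die is even); the names `omega`, `pr`, and `conditional_expectation` are illustrative, not part of the definition:

```python
from fractions import Fraction

# Finite sample space: ordered pairs of two fair dice, uniform measure.
omega = [(i, j) for i in range(1, 7) for j in range(1, 7)]
pr = {w: Fraction(1, 36) for w in omega}

X = lambda w: w[0] + w[1]                    # discrete random variable: the sum
B = {w for w in omega if w[0] % 2 == 0}      # event: first die shows an even number

pr_B = sum(pr[w] for w in B)                 # Pr(B), required to be > 0

def conditional_expectation(X, B):
    # E(X | B) = sum over x in Im(X) of x * Pr(X = x | B),
    # where Pr(X = x | B) = Pr({X = x} intersect B) / Pr(B).
    im_X = {X(w) for w in omega}
    total = Fraction(0)
    for x in im_X:
        pr_joint = sum(pr[w] for w in B if X(w) == x)
        total += x * pr_joint / pr_B
    return total

print(conditional_expectation(X, B))  # 15/2
```

Here the sum converges trivially because $$\operatorname{Im} \left({X}\right)$$ is finite; the absolute convergence condition only bites when the image is countably infinite.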

Compare with the definition of expectation, which is recovered when $$B = \Omega$$.