Definition:Conditional Expectation

Definition
Let $\struct {\Omega, \Sigma, \Pr}$ be a probability space.

Let $X$ be a discrete random variable on $\struct {\Omega, \Sigma, \Pr}$.

Let $B$ be an event in $\struct {\Omega, \Sigma, \Pr}$ such that $\Pr \paren B > 0$.

The conditional expectation of $X$ given $B$ is written $\operatorname E \paren {X \mid B}$ and is defined as:
 * $\displaystyle \operatorname E \paren {X \mid B} = \sum_{x \mathop \in \image X} x \Pr \paren {X = x \mid B}$

where:
 * $\Pr \paren {X = x \mid B}$ denotes the conditional probability that $X = x$ given $B$

whenever this sum converges absolutely.
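The defining sum can be illustrated with a small worked computation. The following sketch (the die example and variable names are illustrative, not part of the definition) takes $X$ to be the value shown by a fair six-sided die and $B$ the event that the roll is even, and evaluates $\sum_x x \Pr \paren {X = x \mid B}$ directly:

```python
from fractions import Fraction

# Fair six-sided die: X(ω) = ω, with uniform probability on {1, ..., 6}.
outcomes = range(1, 7)
pr = {w: Fraction(1, 6) for w in outcomes}

# B: the event "the roll is even"; Pr(B) = 1/2 > 0, so conditioning is valid.
B = {w for w in outcomes if w % 2 == 0}
pr_B = sum(pr[w] for w in B)

# E(X | B) = Σ_x x · Pr(X = x | B),
# where Pr(X = x | B) = Pr({X = x} ∩ B) / Pr(B).
# Since X(ω) = ω here, only x ∈ B contributes a nonzero term.
e_X_given_B = sum(w * (pr[w] / pr_B) for w in B)

print(e_X_given_B)  # 4
```

Each even outcome has conditional probability $\paren {1/6} / \paren {1/2} = 1/3$, so the sum is $\paren {2 + 4 + 6} / 3 = 4$, whereas the unconditional expectation is $7/2$. The sum here is finite, so absolute convergence holds trivially.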

Also see

 * Compare with expectation.