Definition:Random Variable/Discrete

Definition
Let $\mathcal E$ be an experiment with a probability space $\left({\Omega, \Sigma, \Pr}\right)$.

A discrete random variable on $\left({\Omega, \Sigma, \Pr}\right)$ is a mapping $X: \Omega \to \R$ such that:
 * $(1): \quad$ The image of $X$ is a countable subset of $\R$
 * $(2): \quad$ $\forall x \in \R: \left\{{\omega \in \Omega: X \left({\omega}\right) = x}\right\} \in \Sigma$

Equivalently, the second condition can be written as:
 * $(2)': \quad$ $\forall x \in \R: X^{-1} \left({x}\right) \in \Sigma$

where $X^{-1} \left({x}\right)$ denotes the preimage of $x$.

The image $\operatorname{Im} \left({X}\right)$ of $X$ is often denoted $\Omega_X$.

Note that if $x \in \R$ is not the image of any elementary event $\omega$, then $X^{-1} \left({x}\right) = \varnothing$, which lies in $\Sigma$ by definition of the event space $\Sigma$ as a sigma-algebra.

Note that a discrete random variable also fulfils the conditions for it to be a random variable: since the image of $X$ is countable, for each $x \in \R$ the preimage $X^{-1} \left({\left({-\infty, x}\right]}\right)$ is a countable union of sets of the form given in $(2)$, and hence lies in $\Sigma$.
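As an illustration (not part of the formal definition), the two conditions can be checked concretely on a hypothetical example: $\Omega$ the $36$ outcomes of rolling two fair dice, $\Sigma$ the power set of $\Omega$, and $X$ the sum of the two dice. The names `Omega`, `X` and `preimage` below are choices for this sketch only.

```python
from itertools import product

# Hypothetical sample space: ordered pairs of two fair dice.
# Sigma is taken to be the power set of Omega, so every subset is an event.
Omega = list(product(range(1, 7), repeat=2))

def X(omega):
    """A discrete random variable: the sum of the two dice."""
    return omega[0] + omega[1]

# Condition (1): the image of X is a countable (here finite) subset of R.
Omega_X = {X(omega) for omega in Omega}
assert Omega_X == set(range(2, 13))

def preimage(x):
    """X^{-1}(x): the set of outcomes mapped to x."""
    return {omega for omega in Omega if X(omega) == x}

# Condition (2): each preimage is a subset of Omega, hence an event,
# since Sigma is the power set.
assert preimage(7) == {(1, 6), (2, 5), (3, 4), (4, 3), (5, 2), (6, 1)}

# An x outside the image has empty preimage, which is also in Sigma.
assert preimage(13) == set()
```

With $\Sigma$ the power set, condition $(2)$ holds trivially; the sketch merely exhibits the preimages that the condition requires to be events.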

Discussion
The purpose of condition $(2)$ can be explained as follows:

Suppose $X$ is a discrete random variable. Then it takes values in $\R$.

We wish to ensure that for any $x \in \R$ it is possible to assign a probability that $X$ will take on the value $x$ in any given experiment $\mathcal E$.

To do this, we note that $X$ takes on the value $x$ for $\mathcal E$ iff the outcome of $\mathcal E$ lies in the subset of $\Omega$ which is mapped to $x$, namely $X^{-1} \left({x}\right)$.

$(2)$ ensures that this subset is an element of $\Sigma$, and thus has a probability.
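To make this concrete, here is a hypothetical sketch (continuing the two-dice example, with a uniform measure assigning $\frac 1 {36}$ to each elementary outcome) of how $\Pr \left({X = x}\right)$ is obtained as the probability of the preimage $X^{-1} \left({x}\right)$. The names `prob_X_equals` and `Pr_point` are illustrative only.

```python
from itertools import product
from fractions import Fraction

# Hypothetical setup: two fair dice, uniform probability measure.
Omega = list(product(range(1, 7), repeat=2))
Pr_point = Fraction(1, 36)  # probability of each elementary outcome

def X(omega):
    """A discrete random variable: the sum of the two dice."""
    return omega[0] + omega[1]

def prob_X_equals(x):
    """Pr(X = x) = Pr(X^{-1}(x)): sum the measure over the preimage."""
    return sum((Pr_point for omega in Omega if X(omega) == x), Fraction(0))

assert prob_X_equals(7) == Fraction(1, 6)
assert prob_X_equals(2) == Fraction(1, 36)

# The probabilities over the image Omega_X sum to 1.
assert sum(prob_X_equals(x) for x in range(2, 13)) == 1

# A value outside the image has empty preimage, hence probability 0.
assert prob_X_equals(1) == 0
```

Because condition $(2)$ guarantees that each preimage is an event, each such sum is the probability of a genuine element of $\Sigma$.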