Definition:Probability Mass Function


Definition

Let $\struct {\Omega, \Sigma, \Pr}$ be a probability space.

Let $X: \Omega \to \R$ be a discrete random variable on $\struct {\Omega, \Sigma, \Pr}$.


Then the (probability) mass function of $X$ is the (real-valued) function $p_X: \R \to \closedint 0 1$ defined as:

$\forall x \in \R: \map {p_X} x = \begin{cases} \map \Pr {\set {\omega \in \Omega: \map X \omega = x} } & : x \in \Omega_X \\ 0 & : x \notin \Omega_X \end{cases}$

where $\Omega_X$ is defined as $\Img X$, the image of $X$.

That is, $\map {p_X} x$ is the probability that the discrete random variable $X$ takes the value $x$.


$\map {p_X} x$ can also be written:

$\map \Pr {X = x}$
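
As an illustration, let $X$ be the score shown on a single throw of a fair six-sided die, so that $\Omega_X = \set {1, 2, 3, 4, 5, 6}$. Then:

$\map {p_X} x = \begin {cases} \dfrac 1 6 & : x \in \set {1, 2, 3, 4, 5, 6} \\ 0 & : x \notin \set {1, 2, 3, 4, 5, 6} \end {cases}$

For instance, $\map {p_X} 3 = \map \Pr {X = 3} = \dfrac 1 6$, while $\map {p_X} {3.5} = 0$ because $3.5 \notin \Omega_X$.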


Note that for any discrete random variable $X$, the following applies:

$\displaystyle \sum_{x \mathop \in \Omega_X} \map {p_X} x = \map \Pr {\bigcup_{x \mathop \in \Omega_X} \set {\omega \in \Omega: \map X \omega = x} } = \map \Pr \Omega = 1$

where the first equality holds by the definition of probability measure, as the events $\set {\omega \in \Omega: \map X \omega = x}$ are pairwise disjoint.

This is usually written:

$\displaystyle \sum_{x \mathop \in \R} \map {p_X} x = 1$
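
For the fair die illustrated above, this can be checked directly:

$\displaystyle \sum_{x \mathop \in \R} \map {p_X} x = \sum_{x \mathop = 1}^6 \dfrac 1 6 = 6 \times \dfrac 1 6 = 1$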


Thus it can be seen that, by definition, a probability mass function is an example of a normalized weight function.


The set of probability mass functions on a finite set $Z$ is sometimes denoted $\map \Delta Z$.
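
For instance, if $Z = \set {a, b}$, then any $p \in \map \Delta Z$ satisfies $\map p a + \map p b = 1$, and so is completely determined by the single value $\map p a \in \closedint 0 1$.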


Joint Probability Mass Function

Let $X: \Omega \to \R$ and $Y: \Omega \to \R$ both be discrete random variables on $\struct {\Omega, \Sigma, \Pr}$.


Then the joint (probability) mass function of $X$ and $Y$ is the (real-valued) function $p_{X, Y}: \R^2 \to \closedint 0 1$ defined as:

$\forall \tuple {x, y} \in \R^2: \map {p_{X, Y} } {x, y} = \begin {cases} \map \Pr {\set {\omega \in \Omega: \map X \omega = x \land \map Y \omega = y} } & : x \in \Omega_X \text { and } y \in \Omega_Y \\ 0 & : \text {otherwise} \end {cases}$

That is, $\map {p_{X, Y} } {x, y}$ is the probability that the discrete random variable $X$ takes the value $x$ at the same time that the discrete random variable $Y$ takes the value $y$.
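
As an illustration, let $X$ and $Y$ each record, as $1$ for heads and $0$ for tails, the outcome of one of two independent tosses of a fair coin. Then:

$\map {p_{X, Y} } {x, y} = \begin {cases} \dfrac 1 4 & : \tuple {x, y} \in \set {0, 1}^2 \\ 0 & : \text {otherwise} \end {cases}$

For instance, $\map {p_{X, Y} } {1, 0} = \map \Pr {X = 1 \land Y = 0} = \dfrac 1 4$.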


General Definition

Let $X = \set {X_1, X_2, \ldots, X_n}$ be a set of discrete random variables on $\struct {\Omega, \Sigma, \Pr}$.

Then the joint (probability) mass function of $X$ is the (real-valued) function $p_X: \R^n \to \closedint 0 1$ defined as:

$\forall x = \tuple {x_1, x_2, \ldots, x_n} \in \R^n: \map {p_X} x = \map \Pr {X_1 = x_1, X_2 = x_2, \ldots, X_n = x_n}$

The properties of the two-variable case carry over appropriately.
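
As an illustration of the general case, suppose (as an additional assumption, not required by the definition) that $X_1, X_2, \ldots, X_n$ each record the outcome of one of $n$ independent tosses of a fair coin, as above. Then $\map {p_X} x = 2^{-n}$ for each of the $2^n$ tuples $x \in \set {0, 1}^n$, and $\map {p_X} x = 0$ otherwise, so that the sum of $\map {p_X} x$ over all $x$ is again $1$.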


Also known as

A (probability) mass function is often seen abbreviated as p.m.f., pmf or PMF.

