Definition:Expectation/Continuous

From ProofWiki

Definition

Let $X$ be a continuous random variable over the probability space $\struct {\Omega, \Sigma, \Pr}$.

Let $F = \map \Pr {X < x}$ be the cumulative probability function of $X$.


The expectation of $X$ is written $\expect X$, and is defined as the integral with respect to the probability measure:

$\expect X := \displaystyle \int_{x \mathop \in \Omega} x \rd F$

whenever the integral is absolutely convergent, i.e. when:

$\displaystyle \int_{x \mathop \in \Omega} \size x \rd F < \infty$
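The absolute convergence condition is not vacuous: the standard Cauchy distribution, with density $\dfrac 1 {\map \pi {1 + x^2} }$, is the classic example where it fails, so its expectation does not exist. A numerical sketch (the truncation points and step sizes are my own choices for illustration):

```python
import numpy as np

# Standard Cauchy density: f(x) = 1 / (pi * (1 + x^2))
def cauchy_pdf(x):
    return 1.0 / (np.pi * (1.0 + x ** 2))

# Truncated integrals of |x| f(x) over [-a, a]: these grow like
# (1/pi) * log(1 + a^2) as a increases, so the full integral diverges
# and the expectation is undefined.
totals = []
for a in [10, 100, 1000]:
    x = np.linspace(-a, a, 400001)
    dx = x[1] - x[0]
    totals.append(np.sum(np.abs(x) * cauchy_pdf(x)) * dx)
    print(a, totals[-1])
```

The printed values keep growing without bound instead of settling, which is exactly what the condition $\displaystyle \int \size x \rd F < \infty$ rules out.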



Also, from the definition of the probability density function $f_X$ of $X$, it can equivalently be written as an integral over the image $\Omega_X$ of $X$:

$\displaystyle \expect X := \int_{x \mathop \in \Omega_X} x \, \map {f_X} x \rd x$
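As a concrete check of the density form, the integral $\displaystyle \int x \, \map {f_X} x \rd x$ can be approximated numerically. A sketch for the exponential distribution with rate $2$, whose expectation is known to be $\dfrac 1 2$ (the truncation point and grid are my own choices):

```python
import numpy as np

# Exponential density with rate lam: f(x) = lam * exp(-lam * x) for x >= 0
lam = 2.0
x = np.linspace(0.0, 50.0, 1_000_001)   # truncate the upper limit; the tail is negligible
dx = x[1] - x[0]
f = lam * np.exp(-lam * x)

# E[X] = integral of x * f(x) dx, approximated here by a Riemann sum
expectation = np.sum(x * f) * dx
print(expectation)   # close to 1 / lam = 0.5
```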


Also known as

The expectation of a random variable $X$ is also called the expected value of $X$ or the mean of $X$, and (for a given random variable) is often denoted $\mu$.

The terminology is appropriate, as it can be seen that an expectation is an example of a normalized weighted mean.

This follows from the fact that a probability density function is a normalized weight function.
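The weighted-mean analogy is easiest to see in the discrete case: with weights $w_i = \map \Pr {X = x_i}$, which sum to $1$, the weighted mean $\sum_i w_i x_i$ is exactly the expectation. A minimal sketch (the fair die is my own illustrative example):

```python
# Expectation of a fair six-sided die as a normalized weighted mean:
# the weights are the probabilities 1/6, which already sum to 1.
values = [1, 2, 3, 4, 5, 6]
weights = [1 / 6] * 6

assert abs(sum(weights) - 1.0) < 1e-12        # weights are normalized
mean = sum(w * x for w, x in zip(weights, values))
print(mean)   # 3.5
```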


Various forms of $E$ are used to denote expectation:

$\map E X$
$\map {\operatorname E} X$
$\mathop {\mathbb E} \sqbrk X$

and so on.


Also see

The expectation of a continuous random variable is its first moment.


Technical Note

The $\LaTeX$ code for \(\expect {X}\) is \expect {X}.

When the argument is a single character, it is usual to omit the braces:

\expect X

