Definition:Random Sample (Probability Theory)

From ProofWiki

Definition

Let $X_i$ be a random variable with $\Img {X_i} = \Omega$, for all $1 \le i \le n$.

Let $F_i$ be the cumulative distribution function of $X_i$ for all $1 \le i \le n$.


We say that $X_1, X_2, \ldots, X_n$ form a random sample of size $n$ if:

$X_i$ and $X_j$ are independent if $i \ne j$
$\map {F_1} x = \map {F_i} x$ for all $x \in \Omega$

for all $1 \le i, j \le n$.
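As an illustrative sketch (not part of the definition), the two conditions above say that the $X_i$ are independent draws from one common distribution. A minimal Python example, using the standard library and an arbitrarily chosen uniform distribution on $[0, 1)$:

```python
import random

def random_sample(n, draw=random.random):
    """Draw a random sample of size n: each value comes from the same
    distribution (here uniform on [0, 1)) and the draws are independent."""
    return [draw() for _ in range(n)]

sample = random_sample(5)
```

Any other single-argument-free draw function could be substituted; what makes the result a random sample is that every value is produced by the same mechanism, independently.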


Continuous Distribution

Let $X$ be a continuous random variable with a known probability distribution.

Let $\map f X$ be the frequency function of $X$.


A random sample of values of $X$ is obtained by selecting values $s$ such that:

$\map \Pr {s \in \openint x {x + \delta x} } = \map f x \delta x$

That is, the probability that $s$ lies in a given small interval is proportional both to the value of the frequency function at that point and to the length of the interval.
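One standard way to obtain such values by computer is inverse-transform sampling, sketched below for the exponential distribution with rate $\lambda$ (an assumed example; its frequency function is $\map f x = \lambda e^{-\lambda x}$). If $U$ is uniform on $[0, 1)$, then $-\ln (1 - U) / \lambda$ has this distribution:

```python
import math
import random

def sample_exponential(lam, n, rng=random.random):
    """Inverse-transform sampling: map a uniform variate U on [0, 1)
    to -ln(1 - U)/lam, which follows the exponential distribution
    with frequency function f(x) = lam * exp(-lam * x)."""
    return [-math.log(1.0 - rng()) / lam for _ in range(n)]

random.seed(0)  # fixed seed so the sketch is reproducible
sample = sample_exponential(lam=2.0, n=100_000)

# The sample mean should approximate the distribution mean 1/lam = 0.5.
mean = sum(sample) / len(sample)
```

The closeness of the sample mean to $1/\lambda$ is exactly the sense in which the selected values respect the known probability distribution.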


Discrete Distribution

Let $X$ be a discrete random variable with a known probability distribution.

Let $\map f X$ be the frequency function of $X$.


A random sample of values of $X$ is obtained by selecting values $s$ such that:

$\map \Pr {s = x_i} = \map f {x_i}$

That is, the probability that $s$ is a given value is determined completely by the frequency function of $X$ at that value.
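A minimal sketch of the discrete case, using a hypothetical distribution over three values (the values and weights here are invented for illustration). Python's `random.choices` selects each value $x_i$ with probability proportional to its weight, which is exactly the condition $\map \Pr {s = x_i} = \map f {x_i}$ when the weights sum to $1$:

```python
import random

# A hypothetical discrete distribution: values x_i with frequency
# function f(x_i) given as weights summing to 1.
values = [1, 2, 3]
freqs = [0.2, 0.5, 0.3]

random.seed(0)  # fixed seed so the sketch is reproducible
sample = random.choices(values, weights=freqs, k=100_000)

# The empirical relative frequencies should approximate f.
rel_freq = {v: sample.count(v) / len(sample) for v in values}
```

With a large sample, the empirical relative frequency of each value converges to its probability under $\map f {x_i}$.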


Methods of Generation

Tables of random samples from certain probability distributions are available commercially.

However, it is usual nowadays for simulations or Monte Carlo models to use a computer to generate effectively random samples.
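Computer-generated samples of this kind come from a pseudo-random number generator: a deterministic stream that behaves, for statistical purposes, like a sequence of independent draws. A small sketch with Python's standard generator, showing that seeding makes a simulation reproducible:

```python
import random

# Two generators with the same seed produce the identical stream:
# "effectively random" for statistical purposes, yet reproducible,
# which is what simulations and Monte Carlo models require.
g1 = random.Random(42)
g2 = random.Random(42)
a = [g1.random() for _ in range(5)]
b = [g2.random() for _ in range(5)]
```

Reproducibility is the practical advantage over published tables: the same seed regenerates the same sample on demand.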


Also known as

If $X_1, X_2, \ldots, X_n$ form a random sample, they are said to be independent and identically distributed, commonly abbreviated i.i.d.

