Definition talk:Expectation

I have mostly encountered $\mathrm E$, $\mathbb E$ or $\mathbf E$ for $E$, and I like them better. Thoughts? --Lord_Farin (talk) 09:36, 16 October 2012 (UTC)


 * All my notes use $E$. It's easier. Please don't use $\mathbb E$, I would prefer if bb was reserved for sets of numbers. --Jshflynn (talk) 12:43, 16 October 2012 (UTC)


 * While $E$ is easier, it obfuscates the distinction between variables and operators, always a dangerous thing to do IMO. --Lord_Farin (talk) 12:50, 16 October 2012 (UTC)


 * I have yet to encounter $E$ being used as a variable (at least off the top of my head). I vote bf should you wish to carry this out. It has also occurred to me lately that the theorem section of many theorem pages contains sentences such as "where blah denotes the blah of blah". This is obviously primarily because symbols outnumber concepts, but I wonder whether it would be a good idea on the new German ProofWiki to have a "notation setup" section within each theorem page so that the theorem statement is clearer. Thoughts? --Jshflynn (talk) 13:22, 16 October 2012 (UTC)


 * Good call. I checked my source work and it actually has $\mathsf E$, which would work for me. Completely agree with Jshf about bb, and I don't like bf as that sounds like a vector / matrix to me. Similarly $\mathsf {VAR}$ would be used for variance. --prime mover (talk) 19:44, 16 October 2012 (UTC)


 * The mathsf font is fine with me as well. --Lord_Farin (talk) 20:09, 16 October 2012 (UTC)


 * It took long enough, but we now have the custom $\LaTeX$ command \expect, which expands to $\expect X$, using the mathsf font as required. Similarly for variance. --prime mover (talk) 12:09, 12 February 2020 (EST)

It seems like this page is missing something. Let's say, for example, we have $X \sim \operatorname U[0 \,.\,.\, 1]$, and a function $f$ as follows: $$\displaystyle f(x) = \begin{cases} x & x \lt \frac 1 2 \\ \frac 3 4 & x \ge \frac 1 2 \end{cases}$$ Now the distribution of $f(X)$ is neither discrete nor continuous, but intuitively it does have an expectation of $\frac 1 2$. Plokmijnuhby (talk)
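(For the record, the claimed value can be checked directly by splitting the expectation over the two branches of $f$; this is only a sketch, using the continuous definition on the first branch and the single atom at $\frac 3 4$ on the second:)

$$\displaystyle \expect {f(X)} = \int_0^{1/2} x \rd x + \frac 3 4 \Pr \left({X \ge \frac 1 2}\right) = \frac 1 8 + \frac 3 8 = \frac 1 2$$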


 * And so exactly why should that missing thing be on this page in particular? --prime mover (talk) 09:58, 9 March 2019 (EST)


 * This particular distribution was meant as an example of a general case, to show that there are many distributions that aren't continuous or discrete. The page gives no way to calculate the expectation for them, which I believe it should. --Plokmijnuhby (talk) 13:02, 9 March 2019 (EST)


 * I've had an idea of how to do this.

"Let $X$ be a random variable.

Let $X_1, X_2, \ldots$ be a sequence of independent random variables, each with the same distribution as $X$.

Let:
 * $\displaystyle S_n = \sum_{i \mathop = 1}^n X_i$

Now a real number $\mu$ is the expectation of $X$ if and only if:
 * $\displaystyle \frac {S_n} n \xrightarrow D \mu$ as $n \to \infty$

that is, $\dfrac {S_n} n$ converges in distribution to a random variable that always takes the value $\mu$."
 * This would also have the effect of joining the two existing definitions into one. It's likely that some pages which link to this one would need adjustment, though. Plokmijnuhby (talk) 15:58, 9 March 2019 (EST)
 * On second thought, maybe "converges in distribution" is too hand-wavy. Here's an improvement to the last section:

Now a real number $\mu$ is the expectation of $X$ if and only if:
 * $\displaystyle \forall \epsilon \in \R : \epsilon > 0 : \lim_{n \to \infty} \Pr\left(\left| \frac {S_n} n - \mu \right| > \epsilon \right) = 0$
 * Plokmijnuhby (talk) 17:35, 9 March 2019 (EST)
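A quick numerical sanity check of this convergence-in-probability formulation (a sketch only: the test distribution is the $f(X)$ example from earlier in the thread, $\mu = \frac 1 2$ is taken from the worked value there, and the function names are mine, not part of the proposal):

```python
import random

# Sketch: estimate Pr(|S_n / n - mu| > eps) by simulation, for the
# earlier example where X ~ U[0, 1] and f(x) = x if x < 1/2, else 3/4.
# The proposed definition says this probability tends to 0 for every
# eps > 0 precisely when mu is the expectation of f(X).

def sample_fX():
    x = random.random()           # X ~ U[0, 1)
    return x if x < 0.5 else 0.75

def deviation_probability(n, mu, eps, trials=2000):
    """Monte Carlo estimate of Pr(|S_n / n - mu| > eps)."""
    bad = 0
    for _ in range(trials):
        s_n = sum(sample_fX() for _ in range(n))
        if abs(s_n / n - mu) > eps:
            bad += 1
    return bad / trials

random.seed(0)
p_small = deviation_probability(10, 0.5, 0.05)
p_large = deviation_probability(1000, 0.5, 0.05)
print(p_small, p_large)  # the deviation probability shrinks as n grows
```

With $n = 1000$ the estimated deviation probability is essentially zero, while at $n = 10$ it is still substantial, which is what the proposed definition predicts for $\mu = \frac 1 2$.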


 * This particular line of thought could be very good. There has been some work in establishing the foundations of probability theory, but admittedly this page has not been included.


 * If such a rewrite/generalisation is undertaken, we should take care that the "simple" cases of discrete and continuous RV remain available.


 * To ensure the validity of the material, it would also be helpful to have one or more source works that use this definition of expectation. Bonus points for proofs that the "simple" cases arise by applying the definition. --Lord_Farin (talk) 04:52, 10 March 2019 (EDT)

Measure Theoretic Definition
I think I can fix this page. My idea is to limit the continuous rv definition to the case where the rv is Riemann integrable and then add a third definition for full generality using Folland's definition. Is that a reasonable approach? What should I name the measure theoretic definition?

If $X$ is a $\Sigma$-measurable function on the probability space $\struct {\Omega, \Sigma, \Pr}$, then:

$\displaystyle \expect X := \int_\Omega X \rd \Pr$

--GFauxPas (talk) 21:40, 11 February 2020 (EST)
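(As a sanity check of the above, and not taken from any source work: for a discrete $X$ taking countably many values $x_1, x_2, \ldots$, the integral reduces to the familiar weighted sum, so the measure theoretic definition would recover Definition:Expectation/Discrete:)

$\displaystyle \int_\Omega X \rd \Pr = \sum_i x_i \Pr \left({X = x_i}\right)$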


 * You mean this page: Definition:Expectation/Continuous? --prime mover (talk) 04:31, 12 February 2020 (EST)


 * Yes. That uses notation that I don't know how to fix, as called out by the {explain} template. (I deleted a comment about Lebesgue integration, ignore what I said about that, it's not enough for all cases). --GFauxPas (talk) 08:53, 12 February 2020 (EST)


 * I don't know enough about this area of maths to be able to make a definitive statement, but it appears that as a probability distribution is just a measure with some further structure, we could do worse than use the language of measure theory. I have a few books on the subject but have never had the patience to concentrate on them for long enough to get to the meat of what it's all about; it takes too long to fumble through the preliminaries. One day, though. --prime mover (talk) 12:13, 12 February 2020 (EST)


 * Indeed a probability distribution is a particular type of measure. The "correct" definition is the measure theoretical one, but for pedagogical reasons there's an advantage to separately defining the case where the reader only needs to know Riemann integration. --GFauxPas (talk) 15:09, 12 February 2020 (EST)