Definition talk:Entropy of Finite Partition


Yes, I was aware of Definition:Entropy (Probability Theory) and Definition:Uncertainty, but had no better idea. Perhaps it is not bad to have separate pages for similar concepts unless they are identical.

That's not how we do it on $\mathsf{Pr} \infty \mathsf{fWiki}$.

But if you can merge them meaningfully into one page, feel free to do so. What is important for me is that the base is not fixed to be $2$. Although the choice of base isn't a real restriction in entropy theory, choosing $2$ as the base implies that we are thinking about bits in information theory. In dynamical systems it is more common to choose $e$, since we also study symbolic shifts with more than two symbols: dyadic, $n$-adic, and so on. So taking $2$ as the base of entropy is painful for me. --Usagiop (talk) 18:28, 10 June 2022 (UTC)
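
(For the record, a sketch of why the base only changes a constant factor, using the standard formula for the entropy of a finite partition $\xi = \{A_1, \ldots, A_n\}$ of a probability space $(X, \Sigma, \mu)$; the notation here is only for illustration:

$H_b(\xi) = -\displaystyle \sum_{i = 1}^n \mu(A_i) \log_b \mu(A_i) = \frac 1 {\ln b} \left( -\sum_{i = 1}^n \mu(A_i) \ln \mu(A_i) \right) = \frac {H_e(\xi)} {\ln b}$

so the entropy to base $2$ is just $\dfrac 1 {\ln 2}$ times the entropy to base $e$.)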

I'm not doing it, for the simple fact that I know enough about this subject to know that I don't know enough about it to know what I'm doing.
On the other hand:
a) Having two different definitions for the same thing is not an option here;
b) Merely leaving a statement on a talk page saying that the base of the logarithm doesn't really matter is also suboptimal. It is essential to implement a page stating and proving it.
So this definitely needs to be attended to at some stage.
It is of course appreciated that applied mathematicians and practical scientists working in the field of reality are not really concerned with the details of proofs as such, and presumably ergodic theory fits into that category, but the aim of $\mathsf{Pr} \infty \mathsf{fWiki}$ is to be rigorous. --prime mover (talk) 07:44, 11 June 2022 (UTC)
OK, if I get a better idea for cleaning up these things, I will come back to this point. I leave some additional remarks for the future:
a) Ergodic theory is usually seen as a field of pure mathematics and it is a part of probability theory.
b) In probability theory and mathematical statistics, I have never seen entropy defined with base $2$. IMHO the base should be $e$ in mathematics; base $2$ is for computer science.
c) As long as we consider entropy (uncertainty), the base does not matter in the axiomatic sense: the entropy (uncertainty) can be defined by axioms, and it is then a theorem that it is given by a formula which is unique up to a positive multiplicative factor (see the sketch below this list).
d) But in subsequent studies, if we start with entropy to base $2$ instead of base $e$, we have to consider, e.g., $2^f$ instead of $e^f$ in some dual arguments, so many things look unnatural.
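
To make c) concrete (a sketch only; the constant $c$ here merely labels the free multiplicative factor): the usual axiomatizations determine the entropy of a probability vector $(p_1, \ldots, p_n)$ only up to the form

$H(p_1, \ldots, p_n) = -c \displaystyle \sum_{i = 1}^n p_i \ln p_i$ for some constant $c > 0$

where $c = 1$ corresponds to base $e$ and $c = \dfrac 1 {\ln 2}$ to base $2$.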