Definition talk:Entropy of Finite Partition

Yes, I was aware of Definition:Entropy (Probability Theory) and Definition:Uncertainty but had no better idea. Perhaps it is not bad to have separate pages for similar concepts, as long as they are not identical.
 * That's not how we do it here.

But if you can merge them meaningfully into one page, feel free to do so. What is important to me is that the base is not fixed to be $2$. Although the choice of base isn't a real restriction in entropy theory, choosing $2$ as the base implies that we are thinking about bits in information theory. In dynamical systems it is more common to choose $e$, since we also study symbolic shifts with more than two symbols, like dyadic, $n$-adic, and so on. So taking $2$ for the base of entropy is painful for me. --Usagiop (talk) 18:28, 10 June 2022 (UTC)
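The point that the base is not a real restriction comes down to the change-of-base rule for logarithms: entropies computed in different bases differ only by a constant factor, e.g. $\map {H_e} \xi = \map {H_2} \xi \cdot \ln 2$. A minimal sketch illustrating this (the function `entropy` and the example probabilities are mine, not taken from any of the pages under discussion):

```python
import math

def entropy(probs, base=math.e):
    """Entropy of a finite partition whose cells have probabilities `probs`.

    Cells of probability zero contribute nothing, by the usual
    convention 0 * log 0 = 0.
    """
    return -sum(p * math.log(p, base) for p in probs if p > 0)

# Example partition with cell probabilities 1/2, 1/4, 1/4.
probs = [0.5, 0.25, 0.25]

h_bits = entropy(probs, base=2)   # entropy in bits: 1.5
h_nats = entropy(probs)           # entropy in nats

# Changing the base only rescales by a constant factor:
# H_e = H_2 * ln 2, regardless of the partition.
assert abs(h_nats - h_bits * math.log(2)) < 1e-12
```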


 * I'm not doing it, for the simple fact that I know enough about this subject to know that I don't know enough about it to know what I'm doing.
 * On the other hand:
 * a) Having two different definitions for the same thing is not an option here;
 * b) Merely leaving a statement on a talk page saying that the base of the logarithm doesn't really matter is also suboptimal. It is essential to implement a page stating and proving that fact.
 * So this definitely needs to be attended to at some stage.


 * It is of course appreciated that applied mathematicians and practical scientists working in the field of reality are not really concerned with the details of proofs as such, and presumably ergodic theory fits into that category, but the aim of this site is to be rigorous. --prime mover (talk) 07:44, 11 June 2022 (UTC)