Definition:Information (Estimation Theory)
Definition
Let $L$ be the logarithm of the likelihood function of a parameter $\theta$.
The amount of information about $\theta$ is given by:
- $I := \expect {\paren {\dfrac {\partial L} {\partial \theta} }^2}$
where $\expect {\, \cdot \,}$ denotes the expectation taken with respect to the distribution of the observations.
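As a concrete illustration, the expectation in the definition can be evaluated exactly for a single Bernoulli observation with success probability $p$: there $L = x \ln p + \paren {1 - x} \ln \paren {1 - p}$, and the information works out to $\dfrac 1 {p \paren {1 - p} }$. The sketch below (function names are illustrative, not from any source) computes the expectation by summing over the two outcomes and compares it with that closed form:

```python
def score(x, p):
    """Derivative of the log-likelihood
    L = x ln(p) + (1 - x) ln(1 - p)
    with respect to p, for one Bernoulli observation x in {0, 1}."""
    return x / p - (1 - x) / (1 - p)

def fisher_information(p):
    """I = E[(dL/dp)^2], the expectation taken over the two
    Bernoulli outcomes x = 1 (probability p) and x = 0 (probability 1 - p)."""
    return sum(prob * score(x, p) ** 2 for x, prob in ((1, p), (0, 1 - p)))

p = 0.3
print(fisher_information(p))   # agrees with the closed form below
print(1 / (p * (1 - p)))
```

Since the observation is discrete with only two outcomes, the expectation is a finite weighted sum, so the computed value matches $\dfrac 1 {p \paren {1 - p} }$ up to floating-point rounding.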
Also see
- Results about information in the context of estimation theory can be found here.
Sources
- 1998: David Nelson: The Penguin Dictionary of Mathematics (2nd ed.) ... (previous) ... (next): information: 2.
- 2008: David Nelson: The Penguin Dictionary of Mathematics (4th ed.) ... (previous) ... (next): information: 2.