# Definition:Entropy (Physics)


## Definition

**Entropy** is a property of a thermodynamic system.

It quantifies the number $\Omega$ of microstates that are consistent with the macroscopic quantities that characterize the system.

The **entropy** of a system is the expected value of the quantity:

- $k \ln P$

where:

- $k$ is a constant (Boltzmann's constant) which relates the mean kinetic energy of the particles of the system to its absolute temperature
- $P$ is the coefficient of probability of the system.
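The definitions above can be illustrated numerically. The following is a minimal sketch (the constant name `K_B` and the function names are illustrative, not from the source) of the standard Boltzmann form $S = k \ln \Omega$ for $\Omega$ equally likely microstates, together with the Gibbs form $S = -k \sum_i p_i \ln p_i$, which is the expectation of $-k \ln p$ over the microstate probabilities; for a uniform distribution the two forms coincide:

```python
import math

# Boltzmann's constant in J/K (CODATA 2018 exact value)
K_B = 1.380649e-23

def boltzmann_entropy(omega: int) -> float:
    """Entropy S = k ln(Omega) for Omega equally likely microstates."""
    return K_B * math.log(omega)

def gibbs_entropy(probs: list[float]) -> float:
    """Gibbs entropy S = -k sum(p ln p), the expected value of -k ln p."""
    return -K_B * sum(p * math.log(p) for p in probs if p > 0)

# For a uniform distribution over Omega microstates the two forms agree:
omega = 10
uniform = [1.0 / omega] * omega
```

With `omega = 10`, `gibbs_entropy(uniform)` and `boltzmann_entropy(omega)` return the same value, and a single microstate (`omega = 1`) gives zero entropy.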


## Sources

- 1989: Ephraim J. Borowski and Jonathan M. Borwein: *Dictionary of Mathematics* ... (previous) ... (next): Entry: **entropy**: **2.**
- 1992: Frederick W. Byron, Jr. and Robert W. Fuller: *Mathematics of Classical and Quantum Physics* ... (previous) ... (next): Volume One: Chapter $1$ Vectors in Classical Physics: $1.1$ Geometric and Algebraic Definitions of a Vector