Summarizing Statistical Interpretation of Entropy
Summary
- Disorder is far more likely than order: a disordered macrostate corresponds to vastly more microstates than an ordered one, so it is statistically favored.
- The entropy of a system in a given state (a macrostate) can be written as
\(S = k \ln W,\)
where \(k = 1.38 \times 10^{-23}\ \text{J/K}\) is Boltzmann’s constant, and \(\ln W\) is the natural logarithm of the number of microstates \(W\) corresponding to the given macrostate.
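The formula \(S = k \ln W\) can be made concrete with a coin-toss model, a minimal sketch not taken from this summary: for \(N\) tossed coins, the macrostate "\(n\) heads" comprises \(W = \binom{N}{n}\) microstates (the particular sequences of heads and tails). The most even split has the most microstates and therefore the highest entropy.

```python
import math

# Boltzmann's constant in J/K (value given in the text)
k = 1.38e-23

def entropy(W: int) -> float:
    """Entropy S = k ln W of a macrostate with W microstates."""
    return k * math.log(W)

# Illustrative example: toss N = 100 coins.
# The macrostate "n heads" has W = C(N, n) microstates.
N = 100
for n_heads in (100, 60, 50):
    W = math.comb(N, n_heads)
    print(f"{n_heads} heads: W = {W:.3e}, S = {entropy(W):.3e} J/K")
```

Note that the perfectly ordered macrostate (100 heads) has exactly one microstate, so its entropy is \(k \ln 1 = 0\), while the 50/50 macrostate has the largest \(W\) and the largest entropy.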
Glossary
macrostate
an overall, macroscopic property of a system
microstate
one specific arrangement (sequence) of a system’s components consistent with a given macrostate
statistical analysis
using statistics to examine data, such as counting microstates and macrostates