Entropy and Microstates
Following the work of Carnot and Clausius, Ludwig Boltzmann developed a molecular-scale statistical model that related the entropy of a system to the number of microstates possible for the system. A microstate is a specific configuration of the locations and energies of the atoms or molecules that comprise a system. The relationship between a system's entropy and the number of possible microstates (W) is:
\(S = k\,\ln W\)
Here k is the Boltzmann constant and has a value of \(1.38 \times {10}^{-23}\) J/K.
As for other state functions, the change in entropy for a process is the difference between its final (Sf) and initial (Si) values:
\(\Delta S = S_{\text{f}} - S_{\text{i}} = k\,\ln W_{\text{f}} - k\,\ln W_{\text{i}} = k\,\ln\cfrac{W_{\text{f}}}{W_{\text{i}}}\)
For processes involving an increase in the number of microstates, Wf > Wi, the entropy of the system increases, ΔS > 0. Conversely, processes that reduce the number of microstates, Wf < Wi, yield a decrease in system entropy, ΔS < 0. This molecular-scale interpretation of entropy provides a link to the probability that a process will occur as illustrated in the next paragraphs.
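Once the microstate counts are known, the entropy change is simple to evaluate numerically. The following is a minimal Python sketch, not part of the original text; the function name delta_S and the example microstate counts are chosen purely for illustration:

```python
import math

K_B = 1.38e-23  # Boltzmann constant, J/K

def delta_S(w_initial, w_final):
    """Entropy change, in J/K, between states with the given microstate counts."""
    return K_B * math.log(w_final / w_initial)

# A process that doubles the number of accessible microstates:
print(delta_S(1, 2))   # positive, about 9.57e-24 J/K
# The reverse process halves the count and has the opposite sign:
print(delta_S(2, 1))   # negative, about -9.57e-24 J/K
```

Note that only the ratio Wf/Wi enters the calculation, so doubling the number of microstates contributes the same k ln 2 to the entropy regardless of the absolute counts.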
Consider the general case of a system comprised of N particles distributed among n boxes. The number of microstates possible for such a system is \(n^N\). For example, distributing four particles among two boxes will result in \(2^4 = 16\) different microstates, as illustrated in the figure below.
Microstates with equivalent particle arrangements (not considering individual particle identities) are grouped together and are called distributions. The probability that a system will exist with its components in a given distribution is proportional to the number of microstates within the distribution. Since entropy increases logarithmically with the number of microstates, the most probable distribution is therefore the one of greatest entropy.
The sixteen microstates associated with placing four particles in two boxes are shown. The microstates are collected into five distributions—(a), (b), (c), (d), and (e)—based on the numbers of particles in each box.
For this system, the most probable configuration is one of the six microstates associated with distribution (c) where the particles are evenly distributed between the boxes, that is, a configuration of two particles in each box. The probability of finding the system in this configuration is \(\cfrac{6}{16}\) or \(\cfrac{3}{8}.\) The least probable configuration of the system is one in which all four particles are in one box, corresponding to distributions (a) and (e), each with a probability of \(\cfrac{1}{16}.\)
The probability of finding all particles in only one box (either the left box or right box) is then \((\cfrac{1}{16}\phantom{\rule{0.2em}{0ex}}+\phantom{\rule{0.2em}{0ex}}\cfrac{1}{16})\phantom{\rule{0.2em}{0ex}}=\phantom{\rule{0.2em}{0ex}}\cfrac{2}{16}\) or \(\cfrac{1}{8}.\)
As you add more particles to the system, the number of possible microstates increases exponentially (\(2^N\)). A macroscopic (laboratory-sized) system would typically consist of moles of particles (\(N \sim 10^{23}\)), and the corresponding number of microstates would be staggeringly huge. Regardless of the number of particles in the system, however, the distributions in which roughly equal numbers of particles are found in each box are always the most probable configurations.
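These counts can be checked by brute-force enumeration. The short Python sketch below is an illustration rather than part of the text; it lists all \(2^N\) microstates for N labeled particles in two boxes and groups them into distributions. With N = 4 it reproduces the 1, 4, 6, 4, 1 pattern of distributions (a) through (e) and the probabilities quoted above:

```python
from itertools import product
from collections import Counter

def distributions(n_particles, n_boxes=2):
    """Tally microstates per distribution for labeled particles in boxes."""
    counts = Counter()
    # Each microstate assigns every particle to one box.
    for microstate in product(range(n_boxes), repeat=n_particles):
        # A distribution ignores particle identity: only box occupancies matter.
        occupancy = tuple(microstate.count(box) for box in range(n_boxes))
        counts[occupancy] += 1
    return counts

counts = distributions(4)
total = sum(counts.values())                      # 2**4 = 16 microstates
print(counts[(2, 2)], "of", total)                # 6 of 16 for the even split
print((counts[(4, 0)] + counts[(0, 4)]) / total)  # 0.125, all particles in one box
```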
The previous description of an ideal gas expanding into a vacuum (see the image below) is a macroscopic example of this particle-in-a-box model. For this system, the most probable distribution is confirmed to be the one in which the matter is most uniformly dispersed or distributed between the two flasks. The spontaneous process whereby the gas contained initially in one flask expands to fill both flasks equally therefore yields an increase in entropy for the system.
An isolated system consists of an ideal gas in one flask that is connected by a closed valve to a second flask containing a vacuum. Once the valve is opened, the gas spontaneously becomes evenly distributed between the flasks.
A similar approach may be used to describe the spontaneous flow of heat. Consider a system consisting of two objects, each containing two particles, and two units of energy (represented as “*”) in the figure below. The hot object is comprised of particles A and B and initially contains both energy units. The cold object is comprised of particles C and D and initially contains no energy units.
Distribution (a) shows the three microstates possible for the initial state of the system, with both units of energy contained within the hot object. If one of the two energy units is transferred, the result is distribution (b) consisting of four microstates. If both energy units are transferred, the result is distribution (c) consisting of three microstates.
And so, we may describe this system by a total of ten microstates. The probability that the heat does not flow when the two objects are brought into contact, that is, that the system remains in distribution (a), is \(\cfrac{3}{10}.\) More likely is the flow of heat to yield one of the other two distributions, the combined probability being \(\cfrac{7}{10}.\)
The most likely result is the flow of heat to yield the uniform dispersal of energy represented by distribution (b), the probability of this configuration being \(\cfrac{4}{10}.\) As for the previous example of matter dispersal, extrapolating this treatment to macroscopic collections of particles dramatically increases the probability of the uniform distribution relative to the other distributions. This supports the common observation that placing hot and cold objects in contact results in spontaneous heat flow that ultimately equalizes the objects’ temperatures. And, again, this spontaneous process is also characterized by an increase in system entropy.
This shows a microstate model describing the flow of heat from a hot object to a cold object. (a) Before the heat flow occurs, the object comprised of particles A and B contains both units of energy, as represented by a distribution of three microstates. (b) If the heat flow results in an even dispersal of energy (one energy unit transferred), a distribution of four microstates results. (c) If both energy units are transferred, the resulting distribution has three microstates.
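The ten microstates of this two-object model can also be enumerated directly. The sketch below is illustrative only; it treats the two energy units as indistinguishable and allows a single particle to hold both units, which is the counting implied by the figure:

```python
from itertools import combinations_with_replacement
from collections import Counter

HOT = {"A", "B"}  # particles of the hot object; C and D form the cold object

# Each microstate places two indistinguishable energy units on particles A-D;
# a single particle may hold both units.
microstates = list(combinations_with_replacement("ABCD", 2))
print(len(microstates))  # 10 microstates in total

# Group microstates by how many energy units remain on the hot object.
tally = Counter(sum(particle in HOT for particle in m) for m in microstates)
print(tally[2], tally[1], tally[0])  # 3, 4, 3 -> distributions (a), (b), (c)
print(tally[1] / len(microstates))   # 0.4, probability of the uniform dispersal
```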
Example
Determination of ΔS
Consider the system shown here. What is the change in entropy for a process that converts the system from distribution (a) to (c)?
Solution
We are interested in the change from distribution (a) to distribution (c). The initial number of microstates is one, the final number is six:
\(\Delta S = k\,\ln\cfrac{W_{\text{c}}}{W_{\text{a}}} = 1.38 \times {10}^{-23}\,\text{J/K} \times \ln\cfrac{6}{1} = 2.47 \times {10}^{-23}\,\text{J/K}\)
The sign of this result is consistent with expectation; since there are more microstates possible for the final state than for the initial state, the change in entropy should be positive.
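As a quick numerical check, the same result can be reproduced in a line of Python (again an illustration, not part of the original solution):

```python
import math
print(1.38e-23 * math.log(6 / 1))  # approximately 2.47e-23 J/K
```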