Disorder in a Gas

The fantastic growth in the odds favoring disorder that we see in going from 5 to 100 coins continues as the number of entities in the system increases. Let us now imagine applying this approach to a small sample of gas. Because counting microstates and macrostates involves statistics, this approach is called statistical analysis. The macrostates of a gas correspond to its macroscopic properties, such as volume, temperature, and pressure; its microstates correspond to the detailed description of the positions and velocities of its atoms. Even a small amount of gas has a huge number of atoms: \(1.0\,\text{cm}^{3}\) of an ideal gas at 1.0 atm and \(0º\text{C}\) has \(2.7\times 10^{19}\) atoms. So each macrostate has an immense number of microstates. In plain language, this means that there are an immense number of ways in which the atoms in a gas can be arranged while still having the same pressure, temperature, and so on.
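This kind of microstate counting can be made concrete with the coin-toss analogy from the preceding discussion. As a sketch (not part of the original text), the number of microstates \(W\) for a macrostate of \(n\) coins with \(h\) heads is the binomial coefficient \(C(n, h)\):

```python
from math import comb

# Each macrostate of n coin tosses (h heads, n - h tails) has
# W = C(n, h) microstates -- the number of distinct orderings
# that produce the same head count.
n = 100
for h in (100, 60, 50):
    W = comb(n, h)
    print(f"{h} heads, {n - h} tails: W = {W:.2e}")
```

The 50/50 macrostate has about \(1.0\times 10^{29}\) microstates while the all-heads macrostate has exactly one, which is why the random, "disordered" outcome overwhelmingly dominates.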

The most likely conditions (or macrostates) for a gas are those we see all the time—a random distribution of atoms in space with a Maxwell-Boltzmann distribution of speeds in random directions, as predicted by kinetic theory. This is the most disorderly and least structured condition we can imagine. In contrast, one type of very orderly and structured macrostate has all of the atoms in one corner of a container with identical velocities. There are very few ways to accomplish this (very few microstates corresponding to it), and so it is exceedingly unlikely ever to occur. (See figure (b) below.) Indeed, it is so unlikely that we have a law saying that it is impossible, which has never been observed to be violated—the second law of thermodynamics.

The disordered condition is one of high entropy, and the ordered one has low entropy. With a transfer of energy from another system, we could force all of the atoms into one corner and have a local decrease in entropy, but at the cost of an overall increase in entropy of the universe. If the atoms start out in one corner, they will quickly disperse and become uniformly distributed and will never return to the orderly original state (see figure (b) above). Entropy will increase. With such a large sample of atoms, it is possible—but unimaginably unlikely—for entropy to decrease. Disorder is vastly more likely than order.

The arguments that disorder and high entropy are the most probable states are quite convincing. The great Austrian physicist Ludwig Boltzmann (1844–1906)—who, along with Maxwell, made so many contributions to kinetic theory—proved that the entropy of a system in a given state (a macrostate) can be written as

\(S = k\ln W,\)

where \(k = 1.38\times 10^{-23}\,\text{J/K}\) is Boltzmann’s constant, and \(\ln W\) is the natural logarithm of the number of microstates \(W\) corresponding to the given macrostate. \(W\) is proportional to the probability that the macrostate will occur. Thus entropy is directly related to the probability of a state—the more likely the state, the greater its entropy. Boltzmann proved that this expression for \(S\) is equivalent to the definition \(\Delta S=Q/T\), which we have used extensively.
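Boltzmann's formula is a one-liner to evaluate. In this illustrative sketch (the \(W \approx 1.0\times 10^{29}\) value is the microstate count for the 50/50 macrostate of 100 coins):

```python
import math

k = 1.38e-23  # Boltzmann's constant, J/K

def entropy(W):
    """Entropy of a macrostate with W microstates: S = k ln W."""
    return k * math.log(W)

# 50/50 macrostate of 100 coins has W ~ 1.0e29 microstates
print(f"S = {entropy(1.0e29):.2e} J/K")  # roughly 9.2e-22 J/K
```

Note that a macrostate with a single microstate (\(W = 1\)) has \(S = k\ln 1 = 0\): perfect order corresponds to zero entropy on this definition.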

Thus the second law of thermodynamics is explained on a very basic level: entropy either remains the same or increases in every process. A decrease in entropy has an extraordinarily small probability, because states of greater entropy correspond to vastly larger numbers of microstates. Entropy can decrease, but for any macroscopic system, this outcome is so unlikely that it will never be observed.

Example: Entropy Increases in a Coin Toss

Suppose you toss 100 coins starting with 60 heads and 40 tails, and you get the most likely result, 50 heads and 50 tails. What is the change in entropy?

Strategy

Noting that the number of microstates is labeled \(W\) in this table for the 100-coin toss, we can use \(\Delta S = S_{\text{f}} - S_{\text{i}} = k\ln W_{\text{f}} - k\ln W_{\text{i}}\) to calculate the change in entropy.

Solution

The change in entropy is

\(\Delta S = S_{\text{f}} - S_{\text{i}} = k\ln W_{\text{f}} - k\ln W_{\text{i}},\)

where the subscript i stands for the initial 60 heads and 40 tails state, and the subscript f for the final 50 heads and 50 tails state. Substituting the values for \(W\) from this table gives

\(\begin{array}{lll}\Delta S& =& (1.38\times {10}^{-23}\,\text{J/K})\,[\ln(1.0\times {10}^{29}) - \ln(1.4\times {10}^{28})]\\ & =& 2.7\times {10}^{-23}\,\text{J/K}.\end{array}\)
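The arithmetic above can be checked numerically. This short sketch uses the \(W\) values quoted from the table:

```python
import math

k = 1.38e-23   # Boltzmann's constant, J/K
W_i = 1.4e28   # microstates for 60 heads, 40 tails (initial)
W_f = 1.0e29   # microstates for 50 heads, 50 tails (final)

# Delta S = k ln(W_f) - k ln(W_i) = k ln(W_f / W_i)
delta_S = k * (math.log(W_f) - math.log(W_i))
print(f"Delta S = {delta_S:.1e} J/K")  # ~ 2.7e-23 J/K
```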

Discussion

This increase in entropy means we have moved to a less orderly situation. It is not impossible for further tosses to produce the initial state of 60 heads and 40 tails, but it is less likely. There is about a 1 in 90 chance for that decrease in entropy (\(-2.7\times {10}^{-23}\,\text{J/K}\)) to occur. If we calculate the decrease in entropy to move to the most orderly state, we get \(\Delta S = -92\times {10}^{-23}\,\text{J/K}\). There is about a 1 in \({10}^{30}\) chance of this change occurring. So while very small decreases in entropy are unlikely, slightly greater decreases are impossibly unlikely. These probabilities imply, again, that for a macroscopic system, a decrease in entropy is impossible.
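The quoted odds can be recovered directly by dividing each macrostate's microstate count by the \(2^{100}\) equally likely microstates of 100 fair coins (a rough check, assuming fair coins):

```python
from math import comb

total = 2 ** 100                    # all equally likely microstates
p_60_40 = comb(100, 60) / total     # probability of exactly 60 heads
p_ordered = comb(100, 100) / total  # probability of 100 heads

print(f"60/40: about 1 in {1 / p_60_40:.0f}")        # ~ 1 in 92, the text's "about 1 in 90"
print(f"all heads: about 1 in {1 / p_ordered:.0e}")  # ~ 1 in 1e30
```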

For example, for heat transfer to occur spontaneously from 1.00 kg of \(0º\text{C}\) ice to its \(0º\text{C}\) environment, there would be a decrease in entropy of \(1.22\times {10}^{3}\,\text{J/K}\). Given that a \(\Delta S\) of \({10}^{-21}\,\text{J/K}\) corresponds to about a 1 in \({10}^{30}\) chance, a decrease of this size (\({10}^{3}\,\text{J/K}\)) is an utter impossibility. It is impossible even for a milligram of melted ice to refreeze spontaneously.
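The \(1.22\times {10}^{3}\,\text{J/K}\) figure follows from \(\Delta S = Q/T\). In this sketch, the latent heat of fusion of water (334 kJ/kg) is a standard textbook constant, not a value stated in this passage:

```python
m = 1.00     # mass of ice, kg
L_f = 334e3  # latent heat of fusion of water, J/kg (standard value)
T = 273.15   # 0 degrees C in kelvin

Q = m * L_f        # heat the environment would have to give up
delta_S = Q / T    # magnitude of the entropy decrease
print(f"|Delta S| = {delta_S:.3g} J/K")  # ~ 1.22e3 J/K
```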

Problem-Solving Strategies for Entropy

  1. Examine the situation to determine if entropy is involved.
  2. Identify the system of interest and draw a labeled diagram of the system showing energy flow.
  3. Identify exactly what needs to be determined in the problem (identify the unknowns). A written list is useful.
  4. Make a list of what is given or can be inferred from the problem as stated (identify the knowns). You must carefully identify the heat transfer, if any, and the temperature at which the process takes place. It is also important to identify the initial and final states.
  5. Solve the appropriate equation for the quantity to be determined (the unknown). Note that the change in entropy can be determined between any states by calculating it for a reversible process.
  6. Substitute the known values along with their units into the appropriate equation, and obtain numerical solutions complete with units.
  7. Check the answer to see if it is reasonable: does it make sense? For example, total entropy should increase for any real process or be constant for a reversible process. Disordered states should be more probable and have greater entropy than ordered states.
