In statistical mechanics, Boltzmann's equation is a probability equation relating the entropy $S$ of an ideal gas to the quantity $W$, the number of real microstates corresponding to the gas's macrostate:

$$S = k_\mathrm{B} \ln W \qquad (1)$$

where $k_\mathrm{B}$ is the Boltzmann constant (also written simply as $k$), which is equal to $1.38065 \times 10^{-23}$ J/K.
In short, the Boltzmann formula shows the relationship between entropy and the number of ways the atoms or molecules of a thermodynamic system can be arranged. In 1934, the Swiss physical chemist Werner Kuhn used Boltzmann's formula to derive a thermal equation of state for rubber molecules, a result that has since come to be known as the entropy model of rubber.
History
The equation was originally formulated by Ludwig Boltzmann between 1872 and 1875, but later put into its current form by Max Planck in about 1900. To quote Planck, "the logarithmic connection between entropy and probability was first stated by L. Boltzmann in his kinetic theory of gases".
The value of $W$ was originally intended to be proportional to the Wahrscheinlichkeit (German for probability) of a macroscopic state: the collection of unobservable ways in which the observable thermodynamic state of a system can be realized by assigning different positions and momenta to the various molecules. Boltzmann's paradigm was an ideal gas of $N$ identical particles, of which $N_i$ are in the $i$-th microscopic condition (range) of position and momentum. In this case $W$ can be counted using the formula for permutations

$$W = \frac{N!}{\prod_i N_i!} \qquad (2)$$

where $i$ ranges over all possible molecular conditions and $!$ denotes factorial.
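As a concrete illustration, both formulas can be evaluated directly for a small system. The following is a minimal sketch in Python; the occupation numbers are hypothetical, chosen only for the example, and for realistic particle numbers one would work with $\ln W$ directly (e.g. via Stirling's approximation), since $W$ itself becomes astronomically large.

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def multiplicity(occupations):
    """W = N! / (N_1! * N_2! * ...): the number of microstates
    consistent with the given occupation numbers (equation 2)."""
    n = sum(occupations)
    w = math.factorial(n)
    for n_i in occupations:
        w //= math.factorial(n_i)
    return w

def boltzmann_entropy(w):
    """S = k_B ln W (equation 1)."""
    return K_B * math.log(w)

# Hypothetical example: 10 particles distributed over three
# molecular conditions as (5, 3, 2).
W = multiplicity([5, 3, 2])
print(W)                     # 2520 microstates
print(boltzmann_entropy(W))  # ~1.08e-22 J/K
```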
Generalization
Boltzmann's formula applies to microstates of the universe as a whole, each possible microstate of which is presumed to be equally probable.
But in thermodynamics it is important to be able to make the approximation of dividing the universe into a system of interest, plus its surroundings, and then to be able to identify the entropy of the system with the system entropy of classical thermodynamics. The microstates of such a thermodynamic system are not equally probable: for example, high-energy microstates are less probable than low-energy microstates for a thermodynamic system kept at a fixed temperature by contact with a heat bath. For thermodynamic systems where the microstates may not have equal probabilities, the appropriate generalization, called the Gibbs entropy, is:

$$S = -k_\mathrm{B} \sum_i p_i \ln p_i \qquad (3)$$

This reduces to equation (1) if the probabilities $p_i$ are all equal.
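To see the reduction explicitly: if all $W$ microstates are equally probable, then $p_i = 1/W$ for each $i$, and

$$S = -k_\mathrm{B} \sum_{i=1}^{W} \frac{1}{W} \ln \frac{1}{W} = k_\mathrm{B} \ln W,$$

which is equation (1).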
Boltzmann used a $\rho \ln \rho$ formula as early as 1866. He interpreted $\rho$ as a density in phase space, without mentioning probability, but since this density satisfies the axiomatic definition of a probability measure it can retrospectively be interpreted as a probability anyway.
Boltzmann himself used an expression equivalent to (3) in his later work and recognized it as more general than equation (1). That is, equation (1) is a corollary of equation (3): in every situation where equation (1) is valid, equation (3) is valid also, and not vice versa.
Boltzmann entropy excludes statistical dependencies
The term Boltzmann entropy is also sometimes used to indicate entropies calculated based on the approximation that the overall probability can be factored into an identical separate term for each particle—i.e., assuming each particle has an identical independent probability distribution, and ignoring interactions and correlations between the particles. This is exact for an ideal gas of identical particles, and may or may not be a good approximation for other systems.
The Boltzmann entropy is obtained if one assumes one can treat all the component particles of a thermodynamic system as statistically independent. The probability distribution of the system as a whole then factorises into the product of $N$ separate identical terms, one term for each particle, and the Gibbs entropy simplifies to the Boltzmann entropy

$$S_\mathrm{B} = -N k_\mathrm{B} \sum_i p_i \ln p_i$$

where the summation is taken over each possible state in the 6-dimensional phase space of a single particle (rather than the $6N$-dimensional phase space of the system as a whole).
This reflects the original statistical entropy function introduced by Ludwig Boltzmann in 1872. For the special case of an ideal gas it exactly corresponds to the proper thermodynamic entropy.
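As a minimal sketch of this simplification (Python; the three-state single-particle distribution below is invented purely for illustration), the Boltzmann entropy is simply $N$ times the single-particle Gibbs sum:

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def boltzmann_entropy(single_particle_probs, n_particles):
    """S_B = -N * k_B * sum_i p_i ln p_i, where p_i is the probability
    of a single particle occupying state i."""
    s_one = -K_B * sum(p * math.log(p) for p in single_particle_probs if p > 0)
    return n_particles * s_one

# Hypothetical three-state single-particle distribution.
p = [0.5, 0.3, 0.2]
print(boltzmann_entropy(p, n_particles=100))  # ~1.42e-21 J/K
```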
However, for anything but the most dilute of real gases, it leads to increasingly wrong predictions of entropies and physical behaviours, by ignoring the interactions and correlations between different molecules. Instead one must follow Gibbs, and consider the ensemble of states of the system as a whole, rather than single particle states.
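A small numerical illustration of that failure, using an invented joint distribution for two correlated particles (entropies reported in units of $k_\mathrm{B}$): treating the particles as independent and factorizing overestimates the entropy relative to the Gibbs entropy of the system as a whole.

```python
import math

def gibbs_entropy(probs):
    """S / k_B = -sum p ln p over the listed (joint) states."""
    return -sum(p * math.log(p) for p in probs if p > 0)

# Hypothetical joint distribution for two correlated particles,
# each with states {0, 1}: the particles prefer to agree.
joint = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}

# True Gibbs entropy of the system as a whole.
s_joint = gibbs_entropy(joint.values())

# Factorized (Boltzmann-style) treatment: use the single-particle
# marginal and multiply by N = 2, discarding the correlation.
marginal = [0.5, 0.5]  # each particle is 0 or 1 with equal probability
s_factorized = 2 * gibbs_entropy(marginal)

print(s_joint)       # ~1.193 (units of k_B)
print(s_factorized)  # ~1.386 (units of k_B), an overestimate
```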