Playing with Marbles: Structural and Thermodynamic Properties of Hard-Sphere Systems
Abstract
These lecture notes present an overview of equilibrium statistical mechanics of classical fluids, with special applications to the structural and thermodynamic properties of systems made of particles interacting via the hard-sphere potential or closely related model potentials. The exact statistical-mechanical properties of one-dimensional systems, the issue of thermodynamic (in)consistency among different routes in the context of several approximate theories, and the construction of analytical or semi-analytical approximations for the structural properties are also addressed.
1 Introduction
Hard-sphere systems represent a favorite playground in statistical mechanics, both in and out of equilibrium, as they constitute the simplest models of many-body systems of interacting particles M08 ().
Apart from their academic or pedagogical values, hard-sphere models are also important from a more practical point of view. In real fluids, especially at high temperatures and moderate and high densities, the structural and thermodynamic properties are mainly governed by the repulsive forces among molecules, and in this context hard-core fluids are very useful as reference systems BH76 (); S13 ().
Moreover, the use of the hard-sphere model in the realm of soft condensed matter has become increasingly popular L01 (). For instance, the effective interaction among (sterically stabilized) colloidal particles can be tuned to match almost perfectly the hard-sphere model PZVSPC09 ().
As a very imperfect measure of the impact of the hard-sphere model on current research, Fig. 1.1 shows the number of papers per year published in the ten-year period 2003–2012 (according to Thomson Reuters’ Web of Knowledge) that include the words “hard” and “sphere” as a topic (that is, in the title, in the abstract, or as a keyword). It can be observed that the number is rather stabilized, fluctuating around 700 papers/year. If one constrains the search criterion to papers including “hard” and “sphere” in the title, about 100 papers/year are found.
Despite the title of this work and the preceding paragraphs, the main aim of these lecture notes is neither restricted to hard-sphere fluids nor focused on the “state of the art” of the field. Instead, the notes attempt to present an introduction to the equilibrium statistical mechanics of liquids and nonideal gases at a graduate-student textbook level, with emphasis on the basics and fundamentals of the topic. The treatment uses classical (i.e., non-quantum) mechanics and no special prerequisites are required, apart from standard statistical-mechanical ensembles. Most of the content applies to any (short-range) interaction potential, any dimensionality, and (in general) any number of components. On the other hand, some specific applications deal with the properties of fluids made of particles interacting via the hard-sphere potential or related potentials. The approach is unavoidably biased toward those aspects the author is more familiar with. Thus, important topics such as inhomogeneous fluids and density functional theory E92 (); TCM08 (); E10 (); L10b (); L10 (); R10 (), metastable glassy states PZ10 (); BB11 (); K12 (), and perturbation theories BH76 (); S13 () are not represented in these notes.
Apart from a brief concluding remark, the remainder of these lecture notes is split into the following sections:
2. A Brief Survey of Thermodynamic Potentials
3. A Brief Survey of Equilibrium Statistical Ensembles
4. Reduced Distribution Functions
5. Thermodynamics from the Radial Distribution Function
6. One-Dimensional Systems. Exact Solution for Nearest-Neighbor Interactions
7. Density Expansion of the Radial Distribution Function
8. Ornstein–Zernike Relation and Approximate Integral Equation Theories
9. Some Thermodynamic Consistency Relations in Approximate Theories
10. Exact Solution of the Percus–Yevick Equation for Hard Spheres …and Beyond
The core of the notes is made of Sects. 4, 5, 7, and 8. These start with the definition of the reduced distribution functions and, in particular, of the radial distribution function $g(r)$ (Sect. 4), and continue with the derivation of the main thermodynamic quantities in terms of $g(r)$ (Sect. 5). This includes the chemical-potential route, usually forgotten in textbooks. Sections 7 and 8 are more technical. They have to do with the expansion of $g(r)$ and the pressure in powers of density, the definition of the direct correlation function $c(r)$, and the construction of approximate equations of state and integral-equation theories. Both sections make extensive use of diagrams, but several needed theorems and lemmas are justified by simple examples without formal proofs.
In addition to the four core sections mentioned above, there are five more sections that can be seen as optional. Sections 2 and 3 are included to make the notes as self-contained as possible and to unify the notation, but otherwise they can be skipped by the knowledgeable reader. Sections 6, 9, and 10 are “side dishes.” Whereas one-dimensional systems can be seen as rather artificial, the derivation of their exact structural and thermophysical quantities is undoubtedly important from pedagogical and illustrative perspectives, and this is the purpose of Sect. 6. Section 9 presents three examples related to the problem of thermodynamic consistency among different routes when an approximate $g(r)$ is employed. Finally, Sect. 10 derives the exact solution of the Percus–Yevick integral equation for hard spheres as the simplest implementation of a more general class of approximations.
2 A Brief Survey of Thermodynamic Potentials
Just to fix the notation, this section provides a summary of some of the most important thermodynamic relations.
2.1 Isolated Systems. Entropy
In a reversible process, the first and second laws of thermodynamics in a fluid mixture can be combined as Z81 (); c85 ()
\[ \mathrm{d}E = T\,\mathrm{d}S - p\,\mathrm{d}V + \sum_\alpha \mu_\alpha\,\mathrm{d}N_\alpha \tag{2.1} \]
where $S$ is the entropy, $E$ is the internal energy, $V$ is the volume of the fluid, and $N_\alpha$ is the number of particles of species $\alpha$. All these quantities are extensive, i.e., they scale with the size of the system. The coefficients of the differentials in (2.1) are the conjugate intensive quantities: the absolute temperature ($T$), the pressure ($p$), and the chemical potentials ($\mu_\alpha$).
Equation (2.1) shows that the natural variables of the entropy are $E$, $V$, and $\{N_\alpha\}$, i.e., $S=S(E,V,\{N_\alpha\})$. This implies that $S$ is the right thermodynamic potential in isolated systems: at given $E$, $V$, and $\{N_\alpha\}$, $S$ is maximal in equilibrium. The respective partial derivatives give the intensive quantities:
\[ \frac{\partial S}{\partial E}\bigg|_{V,\{N_\alpha\}}=\frac{1}{T}\,,\qquad \frac{\partial S}{\partial V}\bigg|_{E,\{N_\alpha\}}=\frac{p}{T}\,,\qquad \frac{\partial S}{\partial N_\alpha}\bigg|_{E,V,\{N_{\gamma\neq\alpha}\}}=-\frac{\mu_\alpha}{T} \tag{2.2} \]
The extensive nature of $S$, $E$, $V$, and $N_\alpha$ implies the extensivity condition $S(\lambda E,\lambda V,\{\lambda N_\alpha\})=\lambda S(E,V,\{N_\alpha\})$. Application of Euler’s homogeneous function theorem yields
\[ S = E\,\frac{\partial S}{\partial E} + V\,\frac{\partial S}{\partial V} + \sum_\alpha N_\alpha\,\frac{\partial S}{\partial N_\alpha} \tag{2.3} \]
Using (2.2), we obtain the identity
\[ E = TS - pV + \sum_\alpha \mu_\alpha N_\alpha \tag{2.4} \]
This is the so-called fundamental equation of thermodynamics. Differentiating (2.4) and subtracting (2.1), one arrives at the Gibbs–Duhem relation
\[ S\,\mathrm{d}T - V\,\mathrm{d}p + \sum_\alpha N_\alpha\,\mathrm{d}\mu_\alpha = 0 \tag{2.5} \]
Equation (2.1) also shows that $S$, $V$, and $\{N_\alpha\}$ are the natural variables of the internal energy $E$, so that
\[ T=\frac{\partial E}{\partial S}\bigg|_{V,\{N_\alpha\}}\,,\qquad p=-\frac{\partial E}{\partial V}\bigg|_{S,\{N_\alpha\}}\,,\qquad \mu_\alpha=\frac{\partial E}{\partial N_\alpha}\bigg|_{S,V,\{N_{\gamma\neq\alpha}\}} \tag{2.6} \]
2.2 Closed Systems. Helmholtz Free Energy
From a practical point of view, it is usually more convenient to choose the temperature instead of the internal energy or the entropy as a control variable. In that case, the adequate thermodynamic potential is no longer either the entropy or the internal energy, respectively, but the Helmholtz free energy $F$. It is defined from $E$ through the Legendre transformation
\[ F \equiv E - TS = -pV + \sum_\alpha \mu_\alpha N_\alpha \tag{2.7} \]
where in the last step use has been made of (2.4). From (2.1) we obtain
\[ \mathrm{d}F = -S\,\mathrm{d}T - p\,\mathrm{d}V + \sum_\alpha \mu_\alpha\,\mathrm{d}N_\alpha \tag{2.8} \]
so that
\[ S=-\frac{\partial F}{\partial T}\bigg|_{V,\{N_\alpha\}}\,,\qquad p=-\frac{\partial F}{\partial V}\bigg|_{T,\{N_\alpha\}}\,,\qquad \mu_\alpha=\frac{\partial F}{\partial N_\alpha}\bigg|_{T,V,\{N_{\gamma\neq\alpha}\}} \tag{2.9} \]
The Helmholtz free energy is the adequate thermodynamic potential in a closed system, that is, a system that cannot exchange mass with the environment but can exchange energy. At fixed $T$, $V$, and $\{N_\alpha\}$, $F$ is minimal in equilibrium.
2.3 Isothermal-Isobaric Systems. Gibbs Free Energy
If, instead of the volume, the independent thermodynamic variable is the pressure, we need to perform a Legendre transformation from $F$ to define the Gibbs free energy (or free enthalpy) $G$ as
\[ G \equiv F + pV = \sum_\alpha \mu_\alpha N_\alpha \tag{2.10} \]
The second equality shows that the chemical potential can be interpreted as the contribution of each particle of species to the total Gibbs free energy. The differential relations now become
\[ \mathrm{d}G = -S\,\mathrm{d}T + V\,\mathrm{d}p + \sum_\alpha \mu_\alpha\,\mathrm{d}N_\alpha \tag{2.11} \]
\[ S=-\frac{\partial G}{\partial T}\bigg|_{p,\{N_\alpha\}}\,,\qquad V=\frac{\partial G}{\partial p}\bigg|_{T,\{N_\alpha\}}\,,\qquad \mu_\alpha=\frac{\partial G}{\partial N_\alpha}\bigg|_{T,p,\{N_{\gamma\neq\alpha}\}} \tag{2.12} \]
Needless to say, $G$ is minimal in equilibrium if one fixes $T$, $p$, and $\{N_\alpha\}$.
2.4 Open Systems. Grand Potential
In an open system, not only energy but also particles can be exchanged with the environment. In that case, we need to replace $\{N_\alpha\}$ by $\{\mu_\alpha\}$ as independent variables and define the grand potential $\Omega$ from $F$ via a new Legendre transformation:
\[ \Omega \equiv F - \sum_\alpha \mu_\alpha N_\alpha = -pV \tag{2.13} \]
Interestingly, the second equality shows that $-\Omega/V$ is nothing but the pressure, except that it must be seen as a function of temperature and the chemical potentials. Now we have
\[ \mathrm{d}\Omega = -S\,\mathrm{d}T - p\,\mathrm{d}V - \sum_\alpha N_\alpha\,\mathrm{d}\mu_\alpha \tag{2.14} \]
\[ S=-\frac{\partial \Omega}{\partial T}\bigg|_{V,\{\mu_\alpha\}}\,,\qquad p=-\frac{\partial \Omega}{\partial V}\bigg|_{T,\{\mu_\alpha\}}\,,\qquad N_\alpha=-\frac{\partial \Omega}{\partial \mu_\alpha}\bigg|_{T,V,\{\mu_{\gamma\neq\alpha}\}} \tag{2.15} \]
2.5 Response Functions
We have seen that the thermodynamic variables $(T,S)$ [or $(T,E)$], $(p,V)$, and $(\mu_\alpha,N_\alpha)$ appear as conjugate pairs. Depending on the thermodynamic potential of interest, one of the members of the pair acts as independent variable and the other one is obtained by differentiation. If an additional derivative is taken, one obtains the so-called response functions. For example, the heat capacities at constant volume and at constant pressure are defined as
\[ C_V \equiv \frac{\partial E}{\partial T}\bigg|_{V,\{N_\alpha\}} = T\,\frac{\partial S}{\partial T}\bigg|_{V,\{N_\alpha\}} \tag{2.16} \]
\[ C_p \equiv \frac{\partial (E+pV)}{\partial T}\bigg|_{p,\{N_\alpha\}} = T\,\frac{\partial S}{\partial T}\bigg|_{p,\{N_\alpha\}} \tag{2.17} \]
Analogously, it is convenient to define the isothermal compressibility
\[ \kappa_T \equiv -\frac{1}{V}\frac{\partial V}{\partial p}\bigg|_{T,\{N_\alpha\}} \tag{2.18} \]
and the thermal expansivity
\[ \alpha_p \equiv \frac{1}{V}\frac{\partial V}{\partial T}\bigg|_{p,\{N_\alpha\}} = \frac{1}{V}\frac{\partial^2 G}{\partial T\,\partial p} = -\frac{1}{V}\frac{\partial S}{\partial p}\bigg|_{T,\{N_\alpha\}} \tag{2.19} \]
The equivalence between the second and fourth terms in (2.19) is an example of a Maxwell relation.
3 A Brief Survey of Equilibrium Statistical Ensembles
In this section a summary of the main equilibrium ensembles is presented, essentially to fix part of the notation that will be needed later on. For simplicity, we will restrict this section to one-component systems, although the extension to mixtures is straightforward.
Let us consider a classical system made of $N$ identical point particles in $d$ dimensions. In classical mechanics, the dynamical state of the system is characterized by the $N$ vector positions $\mathbf{r}_1,\ldots,\mathbf{r}_N$ and the $N$ vector momenta $\mathbf{p}_1,\ldots,\mathbf{p}_N$. In what follows, we will employ the following shorthand notation:
– $\mathbf{r}^N \equiv \{\mathbf{r}_1,\ldots,\mathbf{r}_N\}$, $\mathbf{p}^N \equiv \{\mathbf{p}_1,\ldots,\mathbf{p}_N\}$,
– $x_i \equiv \{\mathbf{r}_i,\mathbf{p}_i\}$, $i=1,\ldots,N$,
– $x^N \equiv \{x_1,\ldots,x_N\} = \{\mathbf{r}^N,\mathbf{p}^N\}$.
Thus, the whole microscopic state of the system (microstate) is represented by a single point $x^N$ in the $2dN$-dimensional phase space (see Fig. 3.1). The time evolution of the microstate is governed by the Hamiltonian of the system $H_N(x^N)$ through the classical Hamilton’s equations GSP13 ().
Given the practical impossibility of describing the system at a microscopic level, a statistical description is needed. Thus, we define the phase-space probability distribution function $\rho_N(x^N;t)$ such that $\rho_N(x^N;t)\,\mathrm{d}x^N$ is the probability that the microstate of the system lies inside an infinitesimal (hyper)volume $\mathrm{d}x^N$ around the phase-space point $x^N$.
3.1 Gibbs Entropy
The concept of a phase-space probability distribution function is valid both out of equilibrium (where, in general, it changes with time according to the Liouville theorem B74b (); R80 ()) and in equilibrium (where it is stationary). In the latter case, $\rho_N$ can be obtained for isolated, closed, open, …systems by following logical steps and starting from the equal a priori probability postulate for isolated systems. Here we follow an alternative (but equivalent) method based on information-theory arguments R80 (); SW71 (); BN08 ().
Let us define the Gibbs entropy functional
\[ S[\rho_N] = -k_B \int \mathrm{d}x^N\, \rho_N(x^N) \ln\!\left[C_N\,\rho_N(x^N)\right] \tag{3.1} \]
where $k_B$ is the Boltzmann constant and
\[ C_N \equiv N!\,h^{dN} \tag{3.2} \]
In (3.2) the coefficient $h^{dN}$ ($h$ being the Planck constant) is introduced to comply with Heisenberg’s uncertainty principle and preserve the nondimensional character of the argument of the logarithm, while the factorial $N!$ accounts for the fact that two apparently different microstates which only differ in the particle labels are physically the same microstate (thus avoiding Gibbs’s paradox).
Equation (3.1) applies to systems with a fixed number of particles $N$. On the other hand, if the system is allowed to exchange particles with the environment, microstates with different $N$ exist, so that one needs to define a phase-space density $\rho_N$ for each $N$. In that case, the entropy functional becomes
\[ S[\{\rho_N\}] = -k_B \sum_{N} \int \mathrm{d}x^N\, \rho_N(x^N) \ln\!\left[C_N\,\rho_N(x^N)\right] \tag{3.3} \]
Now, the basic postulate consists of asserting that, out of all possible phase-space probability distribution functions consistent with given constraints (which define the ensemble of accessible microstates), the equilibrium function $\rho_N^{\mathrm{eq}}$ is the one that maximizes the entropy functional $S[\rho_N]$. Once $\rho_N^{\mathrm{eq}}$ is known, connection with thermodynamics is made through the identification of $S[\rho_N^{\mathrm{eq}}]$ as the equilibrium entropy.
3.2 Microcanonical Ensemble (Isolated System)
The microcanonical ensemble describes an isolated system and thus it is characterized by fixed values of $N$, $V$, and $E$ (the latter with a tolerance $\delta E$, in accordance with the uncertainty principle). Therefore, the basic constraint is the normalization condition
\[ \int_{E < H_N(x^N) < E+\delta E} \mathrm{d}x^N\, \rho_N(x^N) = 1 \tag{3.4} \]
Maximization of the entropy functional just says that $\rho_N = \text{const}$ for all the accessible microstates. Thus,
\[ \rho_N^{\mathrm{eq}}(x^N) = \frac{1}{C_N\,W(E,V,N)}\,,\qquad E < H_N(x^N) < E+\delta E \tag{3.5} \]
The normalization function
\[ W(E,V,N) \equiv \frac{1}{C_N} \int_{E < H_N(x^N) < E+\delta E} \mathrm{d}x^N \tag{3.6} \]
is (apart from the constant $C_N$) the phase-space volume comprised between the hypersurfaces $H_N=E$ and $H_N=E+\delta E$. By insertion of (3.5) into (3.1) one immediately sees that $W$ is directly related to the equilibrium entropy as
\[ S = k_B \ln W(E,V,N) \tag{3.7} \]
In this expression the specific value of $\delta E$ becomes irrelevant in the thermodynamic limit (as long as $\delta E \ll E$).
3.3 Canonical Ensemble (Closed System)
Now the system can have any value of the total energy $H_N(x^N)$. However, we are free to prescribe a given value of the average energy $\langle H_N\rangle = E$. Therefore, the constraints in the canonical ensemble are
\[ \int \mathrm{d}x^N\, \rho_N(x^N) = 1\,,\qquad \int \mathrm{d}x^N\, \rho_N(x^N)\,H_N(x^N) = \langle H_N\rangle = E \tag{3.8} \]
The maximization of the entropy functional subject to the constraints (3.8) can be carried out through the Lagrange multiplier method with the result
\[ \rho_N^{\mathrm{eq}}(x^N) = \frac{e^{-\beta H_N(x^N)}}{C_N\,Z_N(\beta,V)} \tag{3.9} \]
where $\beta$ is the Lagrange multiplier associated with the energy constraint and the partition function $Z_N$ is determined from the normalization condition as
\[ Z_N(\beta,V) = \frac{1}{C_N} \int \mathrm{d}x^N\, e^{-\beta H_N(x^N)} \tag{3.10} \]
Substitution of (3.9) into (3.1) and use of (3.8) yields
\[ S = k_B\left(\beta\langle H_N\rangle + \ln Z_N\right) \tag{3.11} \]
Comparison with (2.7) (where now the internal energy is represented by $\langle H_N\rangle$) allows one to identify
\[ \beta = \frac{1}{k_B T}\,,\qquad F = -k_B T \ln Z_N(\beta,V) \tag{3.12} \]
Therefore, in the canonical ensemble the connection with thermodynamics is conveniently established via the Helmholtz free energy rather than via the entropy.
As an average of a phase-space dynamical variable, the internal energy can be directly obtained from $Z_N$ as
\[ E = \langle H_N\rangle = -\frac{\partial \ln Z_N}{\partial \beta}\bigg|_{V} \tag{3.13} \]
Moreover, we can obtain the energy fluctuations:
\[ \langle H_N^2\rangle - \langle H_N\rangle^2 = \frac{\partial^2 \ln Z_N}{\partial \beta^2}\bigg|_{V} = -\frac{\partial E}{\partial \beta}\bigg|_{V} = k_B T^2 C_V \tag{3.14} \]
In the last step, use has been made of (2.16).
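The fluctuation relation (3.14) can be verified by direct sampling; a minimal sketch for the ideal gas (an assumed test system, with $k_B=m=1$): momenta are drawn from the canonical (Maxwell) distribution and the measured energy variance is compared with $k_BT^2C_V = (d/2)N(k_BT)^2$.

```python
import numpy as np

# Sample momenta from the canonical distribution of an ideal gas and compare
# the energy variance with kB T^2 C_V = (d/2) N (kB T)^2 (kB = m = 1 units).
rng = np.random.default_rng(0)
d, N, T = 3, 10, 1.5
nsamples = 100_000

p = rng.normal(0.0, np.sqrt(T), size=(nsamples, N, d))   # var(p) = m kB T
E = 0.5 * np.sum(p**2, axis=(1, 2))                      # H = sum_i p_i^2 / 2m

var_E = E.var()
predicted = 0.5 * d * N * T**2     # kB T^2 C_V for the ideal gas
print(var_E / predicted)           # ratio close to 1
```

The ratio approaches 1 as the number of samples grows, illustrating that the canonical energy fluctuations are indeed controlled by the heat capacity.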
3.4 Grand Canonical Ensemble (Open System)
In an open system neither the energy nor the number of particles is determined but we can choose to fix their average values. As a consequence, the constraints are
\[ \sum_{N=0}^{\infty}\int \mathrm{d}x^N\, \rho_N(x^N) = 1\,,\qquad \sum_{N=0}^{\infty}\int \mathrm{d}x^N\, \rho_N(x^N)\,H_N(x^N) = \langle H_N\rangle \tag{3.15} \]
\[ \sum_{N=0}^{\infty}\int \mathrm{d}x^N\, \rho_N(x^N)\,N = \langle N\rangle \tag{3.16} \]
The solution to the maximization problem is
\[ \rho_N^{\mathrm{eq}}(x^N) = \frac{e^{-\beta\left[H_N(x^N)-\mu N\right]}}{C_N\,\Xi(\beta,V,\mu)} \tag{3.17} \]
where $\beta$ and $\beta\mu$ are the Lagrange multipliers associated with the constraints (3.15) and (3.16), respectively, and the grand partition function is
\[ \Xi(\beta,V,\mu) = \sum_{N=0}^{\infty} e^{\beta\mu N}\, Z_N(\beta,V) \tag{3.18} \]
In this case the equilibrium entropy becomes
\[ S = k_B\left(\beta\langle H_N\rangle - \beta\mu\langle N\rangle + \ln \Xi\right) \tag{3.19} \]
From comparison with the first equality of (2.13) we can identify
\[ \Omega = -k_B T \ln \Xi(\beta,V,\mu) \tag{3.20} \]
The average and fluctuation relations are
\[ \langle N\rangle = \frac{\partial \ln \Xi}{\partial(\beta\mu)}\bigg|_{\beta,V} \tag{3.21} \]
\[ \langle H_N\rangle = -\frac{\partial \ln \Xi}{\partial \beta}\bigg|_{\beta\mu,V} \tag{3.22} \]
\[ \langle N^2\rangle - \langle N\rangle^2 = \frac{\partial^2 \ln \Xi}{\partial(\beta\mu)^2}\bigg|_{\beta,V} = \langle N\rangle\, n k_B T \kappa_T \tag{3.23} \]
The second equality of (3.23), in which $n \equiv \langle N\rangle/V$ is the number density, requires the use of thermodynamic relations and mathematical properties of partial derivatives.
3.5 Isothermal-Isobaric Ensemble
In this ensemble the volume is a fluctuating quantity and only its average value $\langle V\rangle$ is fixed. Thus, similarly to the grand canonical ensemble, the constraints are
\[ \int_0^{\infty}\mathrm{d}V\int \mathrm{d}x^N\, \rho_N(x^N;V) = 1\,,\qquad \int_0^{\infty}\mathrm{d}V\int \mathrm{d}x^N\, \rho_N(x^N;V)\,H_N(x^N) = \langle H_N\rangle \tag{3.24} \]
\[ \int_0^{\infty}\mathrm{d}V\int \mathrm{d}x^N\, \rho_N(x^N;V)\,V = \langle V\rangle \tag{3.25} \]
Not surprisingly, the solution is
\[ \rho_N^{\mathrm{eq}}(x^N;V) = \frac{e^{-\beta\left[H_N(x^N)+pV\right]}}{C_N\,V_0\,\Delta_N(\beta,p)} \tag{3.26} \]
where $V_0$ is an arbitrary volume scale factor (needed to keep the correct dimensions), $\beta$ and $\beta p$ are again Lagrange multipliers, and the isothermal-isobaric partition function is
\[ \Delta_N(\beta,p) = \frac{1}{V_0}\int_0^{\infty}\mathrm{d}V\, e^{-\beta p V}\, Z_N(\beta,V) \tag{3.27} \]
As expected, the entropy becomes
\[ S = k_B\left(\beta\langle H_N\rangle + \beta p\langle V\rangle + \ln \Delta_N\right) \tag{3.28} \]
From comparison with (2.10) we conclude that
\[ G = -k_B T \ln \Delta_N(\beta,p) \tag{3.29} \]
The main average and fluctuation relations are
\[ \langle V\rangle = -\frac{\partial \ln \Delta_N}{\partial(\beta p)}\bigg|_{\beta} \tag{3.30} \]
\[ \langle H_N\rangle = -\frac{\partial \ln \Delta_N}{\partial \beta}\bigg|_{\beta p} \tag{3.31} \]
\[ \langle V^2\rangle - \langle V\rangle^2 = \frac{\partial^2 \ln \Delta_N}{\partial(\beta p)^2}\bigg|_{\beta} = k_B T\,\langle V\rangle\,\kappa_T \tag{3.32} \]
Equations (3.23) and (3.32) are equivalent. Both show that the density fluctuations are proportional to the isothermal compressibility and decrease as the size of the system increases. In (3.23) the volume is constant, so that the density fluctuations are due to fluctuations in the number of particles, while the opposite happens in (3.32).
3.6 Ideal Gas
The exact evaluation of the normalization functions (3.6), (3.10), (3.18), and (3.27) is in general a formidable task due to the involved dependence of the Hamiltonian on the coordinates of the particles. However, in the case of noninteracting particles (ideal gas), the Hamiltonian depends only on the momenta:
\[ H_N^{\mathrm{id}}(x^N) = \sum_{i=1}^{N}\frac{|\mathbf{p}_i|^2}{2m} \tag{3.33} \]
where $m$ is the mass of a particle. In this case the $N$-body Hamiltonian is just the sum over all the particles of the one-body Hamiltonian $|\mathbf{p}_i|^2/2m$ and the exact statistical-mechanical results can be easily obtained. The expressions for the normalization function, the thermodynamic potential, and the first derivatives of the latter for each one of the four ensembles considered above are the following ones:
Microcanonical ensemble
\[ W = \frac{V^N}{N!\,h^{dN}}\,\frac{(2\pi m E)^{dN/2}}{\Gamma(dN/2+1)}\,,\qquad S = k_B \ln W \tag{3.34} \]
\[ \frac{p}{T}=\frac{\partial S}{\partial V}\bigg|_{E,N}=\frac{Nk_B}{V}\;\Rightarrow\; pV = Nk_BT \tag{3.35} \]
\[ \frac{1}{T}=\frac{\partial S}{\partial E}\bigg|_{V,N}=\frac{dN}{2}\frac{k_B}{E}\;\Rightarrow\; E=\frac{d}{2}Nk_BT \tag{3.36} \]
Canonical ensemble
\[ Z_N = \frac{\xi^N}{N!}\,,\qquad \xi \equiv \frac{V}{\lambda^d}\,,\qquad F = -k_BT\ln Z_N \tag{3.37} \]
\[ p = k_BT\,\frac{\partial \ln Z_N}{\partial V}\bigg|_{\beta} = \frac{Nk_BT}{V} \tag{3.38} \]
\[ E = -\frac{\partial \ln Z_N}{\partial \beta}\bigg|_{V} = \frac{d}{2}Nk_BT \tag{3.39} \]
Grand canonical ensemble
\[ \Xi = e^{zV/\lambda^d}\,,\qquad \Omega = -k_BT\,\frac{zV}{\lambda^d} \tag{3.40} \]
\[ \langle N\rangle = \frac{zV}{\lambda^d}\;\Rightarrow\; pV = \langle N\rangle k_BT \tag{3.41} \]
\[ \langle H_N\rangle = \frac{d}{2}\,\langle N\rangle k_BT \tag{3.42} \]
Isothermal-isobaric ensemble
\[ \Delta_N = \frac{(\beta p \lambda^d)^{-N}}{\beta p V_0}\,,\qquad G = -k_BT\ln \Delta_N \simeq Nk_BT\ln(\beta p \lambda^d) \tag{3.43} \]
\[ \langle V\rangle = \frac{(N+1)k_BT}{p} \simeq \frac{Nk_BT}{p} \tag{3.44} \]
\[ \langle H_N\rangle = \frac{d}{2}Nk_BT \tag{3.45} \]
In (3.37) $\xi \equiv V/\lambda^d$ is the one-particle partition function and $\lambda \equiv h/\sqrt{2\pi m k_B T}$ is the thermal de Broglie wavelength. In (3.40) $z \equiv e^{\beta\mu}$ is the fugacity. Note that (3.35), (3.38), (3.41), and (3.44) are equivalent. Likewise, (3.36), (3.39), (3.42), and (3.45) are also equivalent. This is a manifestation of the ensemble equivalence in the thermodynamic limit, the only difference lying in the choice of independent and dependent variables.
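A complementary way of seeing the equivalence is to note that, for the ideal gas, the grand canonical probability of finding $N$ particles is Poissonian, so that $\langle N^2\rangle - \langle N\rangle^2 = \langle N\rangle$, in agreement with (3.23) once the ideal-gas value $\kappa_T = 1/(nk_BT)$ is used. A small sampling sketch (the mean value below is an arbitrary choice):

```python
import numpy as np

# For the ideal gas P(N) is Poisson with mean <N> = z V / lambda^d, so the
# variance-to-mean ratio estimates n kB T kappa_T, which equals 1 here.
rng = np.random.default_rng(1)
mean_N = 1000.0
samples = rng.poisson(mean_N, size=500_000)

ratio = samples.var() / samples.mean()   # estimates <dN^2>/<N>
print(ratio)                             # close to 1 for the ideal gas
```

Any deviation of this ratio from unity in an interacting system directly measures the isothermal susceptibility, which is the content of the compressibility route discussed in Sect. 5.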
3.7 Interacting Systems
Of course, particles do interact in real systems, so the Hamiltonian has the form
\[ H_N(x^N) = \sum_{i=1}^{N}\frac{|\mathbf{p}_i|^2}{2m} + \Phi(\mathbf{r}^N) \tag{3.46} \]
where $\Phi(\mathbf{r}^N)$ denotes the total potential energy. As a consequence, the partition function factorizes into its ideal and nonideal parts:
\[ Z_N = Z_N^{\mathrm{id}}\,\frac{Q_N}{V^N}\,,\qquad Q_N(\beta,V) \equiv \int \mathrm{d}\mathbf{r}^N\, e^{-\beta\Phi(\mathbf{r}^N)} \tag{3.47} \]
The nonideal part $Q_N$ is the configuration integral. In the canonical ensemble, $Q_N$ is responsible for the excess contributions $F^{\mathrm{ex}}$, $p^{\mathrm{ex}}$, $\mu^{\mathrm{ex}}$, …:
\[ F^{\mathrm{ex}} = -k_BT\ln\frac{Q_N}{V^N}\,,\qquad p^{\mathrm{ex}} = k_BT\,\frac{\partial}{\partial V}\ln\frac{Q_N}{V^N}\bigg|_{\beta}\,,\qquad \mu^{\mathrm{ex}} = \frac{\partial F^{\mathrm{ex}}}{\partial N}\bigg|_{\beta,V} \tag{3.48} \]
The grand partition function does not factorize but can be written as
\[ \Xi = \sum_{N=0}^{\infty} \frac{\zeta^N}{N!}\, Q_N \tag{3.49} \]
where
\[ \zeta \equiv \frac{e^{\beta\mu}}{\lambda^d} = \frac{z}{\lambda^d} \tag{3.50} \]
is a sort of modified fugacity, and we have taken into account that $Q_0 = 1$. Thus, the configuration integrals $Q_N$ are related to the coefficients in the expansion of the grand partition function in powers of the quantity $\zeta$.
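A sketch of the expansion (3.49) for the ideal gas, where $Q_N = V^N$ and the series sums to $e^{\zeta V}$ (the values of $\zeta$ and $V$ below are arbitrary illustrative choices); the numerical log-derivative recovers $\langle N\rangle = \zeta V$, consistent with (3.21) and (3.41):

```python
import math

# Ideal gas: Q_N = V^N, so Xi = sum_N (zeta V)^N / N! = exp(zeta V).
zeta, V, nmax = 0.5, 10.0, 60          # arbitrary illustrative values

Xi = sum((zeta * V)**N / math.factorial(N) for N in range(nmax + 1))
assert abs(Xi - math.exp(zeta * V)) / math.exp(zeta * V) < 1e-12

def lnXi(z):                            # truncated log of the series
    return math.log(sum((z * V)**N / math.factorial(N) for N in range(nmax + 1)))

h = 1e-6                                # numerical log-derivative for <N>
avg_N = zeta * (lnXi(zeta + h) - lnXi(zeta - h)) / (2 * h)
print(avg_N)                            # close to zeta*V
```

For an interacting system the same truncated series, with the exact $Q_N$ replaced by low-order configuration integrals, is the starting point of the density expansions of Sect. 7.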
4 Reduced Distribution Functions
The $N$-body probability distribution function $\rho_N(x^N)$ contains all the statistical-mechanical information about the system. On the other hand, partial information embedded in marginal few-body distributions is usually enough for the most relevant quantities. Moreover, it is much simpler to introduce useful approximations at the level of the marginal distributions than at the $N$-body level.
Let us introduce the $s$-body reduced distribution function $f_s(x_1,\ldots,x_s)$ such that $f_s(x_1,\ldots,x_s)\,\mathrm{d}x_1\cdots\mathrm{d}x_s$ is the (average) number of groups of $s$ particles such that one particle lies inside a volume $\mathrm{d}x_1$ around the (one-body) phase-space point $x_1$, another particle lies inside a volume $\mathrm{d}x_2$ around the (one-body) phase-space point $x_2$, …and so on (see Fig. 4.1 for $s=2$). More explicitly,
\[ f_s(x_1,\ldots,x_s) = \frac{N!}{(N-s)!}\int \mathrm{d}x_{s+1}\cdots\mathrm{d}x_N\, \rho_N(x^N) \tag{4.1} \]
In most situations it is enough to take $s=2$ and integrate out the momenta. Thus, we define the configurational two-body distribution function as
\[ n_2(\mathbf{r}_1,\mathbf{r}_2) \equiv \int \mathrm{d}\mathbf{p}_1\,\mathrm{d}\mathbf{p}_2\, f_2(x_1,x_2) \tag{4.2} \]
Obviously, its normalization condition is
\[ \int \mathrm{d}\mathbf{r}_1\,\mathrm{d}\mathbf{r}_2\, n_2(\mathbf{r}_1,\mathbf{r}_2) = N(N-1) \tag{4.3} \]
The importance of $n_2(\mathbf{r}_1,\mathbf{r}_2)$ arises especially when one is interested in evaluating the average of a dynamical variable of the form
\[ A(\mathbf{r}^N) = \sum_{i\neq j} a(\mathbf{r}_i,\mathbf{r}_j) \tag{4.4} \]
In that case, it is easy to obtain
\[ \langle A\rangle = \int \mathrm{d}\mathbf{r}_1\,\mathrm{d}\mathbf{r}_2\, n_2(\mathbf{r}_1,\mathbf{r}_2)\, a(\mathbf{r}_1,\mathbf{r}_2) \tag{4.5} \]
The quantities (4.1) and (4.2) can be defined both out of and in equilibrium. In the latter case, however, we can benefit from the (formal) knowledge of $\rho_N^{\mathrm{eq}}$. In particular, in the canonical ensemble [see (3.9) and (3.47)] one has
\[ n_2(\mathbf{r}_1,\mathbf{r}_2) = N(N-1)\,\frac{\int \mathrm{d}\mathbf{r}_3\cdots\mathrm{d}\mathbf{r}_N\, e^{-\beta\Phi(\mathbf{r}^N)}}{Q_N} \tag{4.6} \]
In the absence of interactions ($\Phi = 0$),
\[ n_2(\mathbf{r}_1,\mathbf{r}_2) = \frac{N(N-1)}{V^2} \xrightarrow{N\to\infty} n^2\,,\qquad n \equiv \frac{N}{V} \tag{4.7} \]
In the grand canonical ensemble the equations equivalent to (4.3), (4.6), and (4.7) are
\[ \int \mathrm{d}\mathbf{r}_1\,\mathrm{d}\mathbf{r}_2\, n_2(\mathbf{r}_1,\mathbf{r}_2) = \langle N(N-1)\rangle = \langle N^2\rangle - \langle N\rangle \tag{4.8} \]
\[ n_2(\mathbf{r}_1,\mathbf{r}_2) = \frac{1}{\Xi}\sum_{N\geq 2}\frac{\zeta^N}{(N-2)!}\int \mathrm{d}\mathbf{r}_3\cdots\mathrm{d}\mathbf{r}_N\, e^{-\beta\Phi(\mathbf{r}^N)} \tag{4.9} \]
\[ n_2(\mathbf{r}_1,\mathbf{r}_2) = n^2 \tag{4.10} \]
4.1 Radial Distribution Function
Taking into account (4.7) and (4.10), we define the pair correlation function $g(\mathbf{r}_1,\mathbf{r}_2)$ by
\[ n_2(\mathbf{r}_1,\mathbf{r}_2) \equiv n^2\, g(\mathbf{r}_1,\mathbf{r}_2) \tag{4.11} \]
Thus, according to (4.6),
\[ g(\mathbf{r}_1,\mathbf{r}_2) = \frac{N(N-1)}{n^2}\,\frac{\int \mathrm{d}\mathbf{r}_3\cdots\mathrm{d}\mathbf{r}_N\, e^{-\beta\Phi(\mathbf{r}^N)}}{Q_N} \tag{4.12} \]
in the canonical ensemble.
Now, taking into account the translational invariance property of the system, one has $g(\mathbf{r}_1,\mathbf{r}_2) = g(\mathbf{r}_2-\mathbf{r}_1)$. Moreover, a fluid is rotationally invariant, so that (assuming central forces) $g(\mathbf{r}_2-\mathbf{r}_1) = g(r_{12})$, where $r_{12} \equiv |\mathbf{r}_2-\mathbf{r}_1|$ is the distance between the points $\mathbf{r}_1$ and $\mathbf{r}_2$. In such a case, the function $g(r)$ is called the radial distribution function and will play a very important role henceforth.
An interesting normalization relation holds in the grand canonical ensemble. Inserting (4.11) into (4.8) we get
\[ 1 + n\int \mathrm{d}\mathbf{r}\,\left[g(r)-1\right] = \frac{\langle N^2\rangle - \langle N\rangle^2}{\langle N\rangle} \tag{4.13} \]
In the thermodynamic limit ($\langle N\rangle \to \infty$ and $V\to\infty$ with $n = \langle N\rangle/V = \text{const}$), we know that the ratio $(\langle N^2\rangle - \langle N\rangle^2)/\langle N\rangle$ remains finite [see (3.23)] (except near the critical point, where $\kappa_T$ diverges). This implies that the integral $\int \mathrm{d}\mathbf{r}\,[g(r)-1]$ is also finite, meaning that $g(r)-1 \approx 0$ for macroscopic distances $r$, which are those dominating the value of the integral. In other words, $\lim_{r\to\infty} g(r) = 1$.
Apart from the formal definition provided by (4.11) and (4.12), it is important to have a more intuitive physical interpretation of $g(r)$. Two simple equivalent interpretations are:
– $g(r)$ is the probability of finding a particle at a distance $r$ away from a given reference particle, relative to the same probability for an ideal gas;
– if a given reference particle is taken to be at the origin, then the local average density at a distance $r$ from that particle is $n\,g(r)$.
Figure 4.2 illustrates the meaning of $g(r)$ and depicts the typical shape of the function for a (three-dimensional) fluid of particles interacting via the Lennard-Jones (LJ) potential
\[ \phi_{\mathrm{LJ}}(r) = 4\varepsilon\left[\left(\frac{\sigma}{r}\right)^{12} - \left(\frac{\sigma}{r}\right)^{6}\right] \tag{4.14} \]
at a given reduced temperature $k_BT/\varepsilon$ and reduced density $n\sigma^3$. The Lennard-Jones potential is characterized by a length scale $\sigma$ and a well depth $\varepsilon$, and is repulsive for $r < 2^{1/6}\sigma$ and attractive for $r > 2^{1/6}\sigma$. As we see from Fig. 4.2, $g(r)$ is practically zero in the region $r \lesssim \sigma$ (due to the strongly repulsive force exerted by the reference particle at those distances), presents a very high peak at $r \approx \sigma$, oscillates thereafter, and eventually tends to unity for distances long as compared with $\sigma$. Thus, a liquid may present a strong structure captured by $g(r)$.
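The quoted features of the LJ potential (4.14) are easy to confirm numerically; a minimal sketch in reduced units ($\sigma = \varepsilon = 1$):

```python
import numpy as np

# Lennard-Jones potential in reduced units: phi(sigma) = 0, minimum -epsilon
# at r = 2^(1/6), the boundary between repulsion and attraction.
def phi(r):
    return 4.0 * (r**-12 - r**-6)

r = np.linspace(0.95, 3.0, 20_001)
rmin = r[np.argmin(phi(r))]          # location of the potential minimum

print(phi(1.0))                      # 0 at r = sigma
print(rmin, phi(rmin))               # ~ (2^(1/6), -1)
```

The sign change of the force at $r = 2^{1/6}\sigma \approx 1.122\sigma$ is what separates the repulsive core from the attractive well.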
Some functions related to the radial distribution function can be defined. The first one is simply the so-called total correlation function
\[ h(r) \equiv g(r) - 1 \tag{4.15} \]
Its Fourier transform
\[ \widetilde{h}(k) = \int \mathrm{d}\mathbf{r}\, e^{-\mathrm{i}\mathbf{k}\cdot\mathbf{r}}\, h(r) \tag{4.16} \]
is directly connected to the (static) structure factor:
\[ S(k) = 1 + n\,\widetilde{h}(k) \tag{4.17} \]
The typical shape of $S(k)$ at several densities is illustrated in Fig. 4.3 for the hard-sphere (HS) potential M08 ()
\[ \phi_{\mathrm{HS}}(r) = \begin{cases} \infty\,, & r < \sigma\,,\\ 0\,, & r > \sigma\,, \end{cases} \tag{4.18} \]
where $\sigma$ is the diameter of the spheres.
The structure factor is a very important quantity because it is experimentally accessible by elastic scattering of radiation (x-rays or neutrons) by the fluid B74b (); HM06 (). Thus, while $g(r)$ can be measured directly in simulations (either Monte Carlo or molecular dynamics) AT87 (); FS02 (), it can only be obtained indirectly in experiments, from a numerical inverse Fourier transform of the measured $S(k)$.
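For an isotropic three-dimensional fluid, the transform (4.16) reduces to a one-dimensional radial integral, $\widetilde{h}(k) = (4\pi/k)\int_0^\infty \mathrm{d}r\, r\sin(kr)\,h(r)$. A sketch using, as an analytically checkable toy model, the zero-density hard-sphere form $h(r) = -1$ for $r < \sigma$ (the density value is an arbitrary choice):

```python
import numpy as np

# Radial 3D Fourier transform of h(r), then S(k) = 1 + n h~(k) [(4.16)-(4.17)].
# Toy model: zero-density hard-sphere limit h(r) = -1 for r < sigma = 1.
n, sigma = 0.1, 1.0
r = np.linspace(0.0, sigma, 10_000)
dr = r[1] - r[0]

def S_of_k(k):
    integrand = r * np.sin(k * r) * (-1.0)     # h(r) = -1 inside the core
    h_tilde = 4.0 * np.pi / k * ((integrand[:-1] + integrand[1:]).sum() * dr / 2)
    return 1.0 + n * h_tilde                   # trapezoid-rule quadrature

# exact transform of the core step:
# h~(k) = -(4 pi / k^3) [sin(k sigma) - k sigma cos(k sigma)]
k = 2.0
exact = 1.0 + n * (-4.0 * np.pi / k**3) * (np.sin(k) - k * np.cos(k))
print(S_of_k(k), exact)
```

The same quadrature, applied to a measured or theoretical $h(r)$, is how structure factors are routinely obtained in practice.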
5 Thermodynamics from the Radial Distribution Function
As shown by (3.7), (3.12), (3.20), and (3.29), the knowledge of any of the ensemble normalization functions allows one to obtain the full thermodynamic information about the system. But now imagine that, instead of the normalization function (for instance, the partition function $Z_N$ in the canonical ensemble), we are given (from experimental measurements, computer simulations, or a certain theory) the radial distribution function $g(r)$. Can we have access to thermodynamics directly from $g(r)$? As we will see in this section, the answer is affirmative in the case of pairwise interactions.
5.1 Compressibility Route
The most straightforward route to thermodynamics from $g(r)$ is provided by choosing the grand canonical ensemble and simply combining (3.23) and (4.13) to obtain
\[ \chi_T = 1 + n\int \mathrm{d}\mathbf{r}\, h(r) = S(0) \tag{5.1} \]
where $\chi_T \equiv n k_B T \kappa_T$ is the isothermal susceptibility; we recall that the total correlation function $h(r)$ is defined by (4.15), and in the last step use has been made of (4.17). Therefore, the zero wavenumber limit of the structure factor (see Fig. 4.3) is directly related to the isothermal compressibility.
Equation (5.1) is usually known as the compressibility equation of state or the compressibility route to thermodynamics.
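A low-density sketch of the compressibility route: assuming the zero-density closure $g(r) \approx e^{-\beta\phi(r)}$ for hard spheres (so that $h(r) = -1$ inside the core and $0$ outside), the integral in (5.1) gives $\chi_T = 1 - 8\eta$, with packing fraction $\eta = (\pi/6)n\sigma^3$. Numerically:

```python
import numpy as np

# Hard spheres at low density: h(r) = -1 for r < sigma, 0 beyond, so
# chi = 1 + n Int dr h(r) = 1 - (4 pi/3) n sigma^3 = 1 - 8 eta.
sigma, eta = 1.0, 0.05
n = 6.0 * eta / (np.pi * sigma**3)

r = np.linspace(0.0, sigma, 10_001)
shell = 4.0 * np.pi * r**2 * (-1.0)                    # h(r) = -1 in the core
dr = r[1] - r[0]
integral = (shell[:-1] + shell[1:]).sum() * dr / 2.0   # trapezoid rule

chi = 1.0 + n * integral
print(chi, 1.0 - 8.0 * eta)    # both ~ 0.6 at eta = 0.05
```

The value $\chi_T < 1$ simply reflects that excluded volume suppresses density fluctuations relative to the ideal gas.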
5.2 Energy Route
Equation (5.1) applies regardless of the specific form of the potential energy function $\Phi(\mathbf{r}^N)$. From now on, however, we assume that the interaction is pairwise additive, i.e., $\Phi$ can be expressed as a sum over all pairs of a certain function $\phi(r)$ (interaction potential) that depends on the distance between the two particles of the pair. In mathematical terms,
\[ \Phi(\mathbf{r}^N) = \sum_{i<j} \phi(r_{ij})\,,\qquad r_{ij} \equiv |\mathbf{r}_j - \mathbf{r}_i| \tag{5.2} \]
We have previously encountered two particular examples [see (4.14) and (4.18)] of interaction potentials.
The pairwise additivity condition (5.2) implies that $\Phi$ is a dynamical variable of the form (4.4). As a consequence, we can apply the property (4.5) to the average potential energy $\langle\Phi\rangle$:
\[ \langle\Phi\rangle = \frac{1}{2}\int \mathrm{d}\mathbf{r}_1\,\mathrm{d}\mathbf{r}_2\, n_2(\mathbf{r}_1,\mathbf{r}_2)\,\phi(r_{12}) \tag{5.3} \]
Adding the ideal-gas term $\frac{d}{2}Nk_BT$ [see (3.39)] and taking into account (4.11), we finally obtain
\[ \frac{E}{N} = \frac{d}{2}k_BT + \frac{n}{2}\int \mathrm{d}\mathbf{r}\,\phi(r)\,g(r) = \frac{d}{2}k_BT + \frac{n}{2}\,\Omega_d\int_0^\infty \mathrm{d}r\, r^{d-1}\phi(r)\,g(r) \tag{5.4} \]
where we have used the general property $\int \mathrm{d}\mathbf{r}\, F(|\mathbf{r}|) = \Omega_d \int_0^\infty \mathrm{d}r\, r^{d-1} F(r)$, $F$ being an arbitrary function and $\Omega_d \equiv 2\pi^{d/2}/\Gamma(d/2)$ the total solid angle in $d$ dimensions.
Equation (5.4) defines the energy route to thermodynamics. It can be equivalently written in terms of the so-called cavity function $y(r) \equiv e^{\beta\phi(r)}g(r)$.
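As an illustration of the energy route, one can evaluate the excess energy per particle of the LJ fluid using the same zero-density approximation $g(r) \approx e^{-\beta\phi(r)}$ invoked above (an assumption adopted here only to have an explicit $g(r)$; the state point is an arbitrary choice, in reduced units $\sigma = \varepsilon = k_B = 1$):

```python
import numpy as np

# Energy route (5.4) for d = 3 with the low-density closure g ~ exp(-phi/T):
#   E_ex/N = (n/2) Int_0^inf 4 pi r^2 phi(r) exp(-phi(r)/T) dr
T, n = 2.0, 0.05            # arbitrary illustrative state point

def phi(r):
    return 4.0 * (r**-12 - r**-6)

r = np.linspace(0.5, 20.0, 200_001)       # integrand negligible outside range
integrand = 4.0 * np.pi * r**2 * phi(r) * np.exp(-phi(r) / T)
dr = r[1] - r[0]
e_ex = 0.5 * n * (integrand[:-1] + integrand[1:]).sum() * dr / 2.0

print(e_ex)    # negative: the attractive well dominates the low-density energy
```

With an accurate $g(r)$ from theory or simulation in place of the low-density closure, the same quadrature yields the full internal energy of the fluid.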