Thermodynamic derivation and use of a nonequilibrium canonical ensemble

Maarten H. P. Ambaum, Department of Meteorology, University of Reading, U.K.
July 20, 2019

A thermodynamic expression for the analog of the canonical ensemble for nonequilibrium systems is described, based on a purely information-theoretical interpretation of entropy. As an application, it is shown that this nonequilibrium canonical distribution implies some important results from nonequilibrium thermodynamics, specifically the fluctuation theorem and the Jarzynski equality.


In this letter we demonstrate that the information-theoretical definition of entropy implies some important results in non-equilibrium thermodynamics, such as the fluctuation theorem and the Jarzynski equation. The central tenet is that for two states a and b of a system, defined by two sets of macroscopic parameters, the ratio of the probabilities for the system to be in either state is

\[
\frac{p_a}{p_b} = e^{(S_a - S_b)/k_B},
\tag{1}
\]

with S_a − S_b the difference in entropy between the states a and b. This is essentially the Boltzmann definition of entropy, combined with the observation that in the absence of any other information, each state is assumed to have equal probability. In an information-theoretical setting, the latter assumption is equivalent to the principle of indifference: the absence of any distinguishing information is equivalent to equal prior (prior to obtaining additional information) probabilities [1].

Following Boltzmann, we define the entropy as the logarithm of the number of states accessible to a system under given macroscopic constraints. For an isolated system, the entropy is related to the size Ω of the accessible phase space,

\[
S = k_B \ln \Omega.
\tag{2}
\]
For a classical system, the phase space size Ω is the hyper-area of the energy shell, and it defines the usual microcanonical ensemble. The hyper-area is non-dimensionalised such that Ω dU is proportional to the number of states between energies U and U + dU. We will not consider the other factors required to make the argument of the logarithm non-dimensional; these contribute an additive entropy constant which is of no interest to us here. Note also that the microcanonical ensemble does not include a notion of equilibrium: the system is assumed to be insulated, so it cannot equilibrate with an external system. It just moves around on the energy shell, and the assumed ergodicity implies that all states, however improbable from a macroscopic point of view, are members of the ensemble. Of course, for macroscopic systems the number of unusual states (say, with non-uniform macroscopic density) is much lower than the number of regular states (say, with uniform macroscopic density). Only for small systems does the distinction become important, but it does not invalidate the above formal definition of entropy. This definition of entropy also ensures that entropy is an extensive property: for two independent systems considered together, the total entropy is the sum of the individual entropies, S = S_1 + S_2. The Boltzmann constant k_B ensures dimensional compatibility with the classical thermodynamic entropy when the usual equilibrium assumptions are made [2, 3].
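The additivity of entropy for independent systems follows directly from the multiplicativity of state counts; a minimal numerical sketch (the state counts below are arbitrary, made-up numbers):

```python
import math

# Two independent systems: joint microstates are all pairs of individual
# microstates, so the accessible state counts multiply.
# The counts below are arbitrary illustrative numbers.
W1, W2 = 1200, 3500
S1 = math.log(W1)            # entropy of system 1, in units of k_B
S2 = math.log(W2)            # entropy of system 2, in units of k_B
S_total = math.log(W1 * W2)  # entropy of the combined system

# log(W1 * W2) = log(W1) + log(W2), so S_total = S1 + S2.
```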

The hyper-area of the energy shell, and thus the entropy, can be a function of several variables which are set as external constraints, such as the total energy U, system volume V, or particle number N. For the canonical ensemble we consider a system that can exchange energy with some reservoir. We consider here only a theoretical canonical ensemble, in that we take the coupling between the two systems to be weak.

First, we need to define what a reservoir is. Following equilibrium thermodynamics, we formally define an ‘inverse temperature’ β as

\[
k_B \beta = \frac{\partial S}{\partial U}.
\tag{3}
\]

We make no claim about the equality of β and the classical equilibrium inverse temperature 1/k_B T; β is the expansivity of phase space with energy and as such can be defined for any system, whether it is in thermodynamic equilibrium or not. When an isolated system is prepared far from equilibrium (for example, when it has a local equilibrium temperature which varies over the system), then β is still uniquely defined for the system as a non-local property of the energy shell that the system resides on. Because both energy and entropy in the weak coupling limit are extensive quantities, β must be an intensive quantity.

Now consider a large isolated system with total (internal) energy U_0. Let this system receive energy Q from the environment. By expanding in powers of Q, we can then write the entropy of this large system as

\[
S_r(U_0 + Q) = S_r(U_0) + k_B \beta Q + \tfrac{1}{2} k_B \beta' Q^2 + \ldots
\tag{4}
\]

with β′ = ∂β/∂U. We see that for finite β, 1/β′ has to be an extensive quantity. But that means that β′ ∼ β²/N for a very large system, where N is a measure of the size of the system (such as particle number). For a classical thermodynamic system β′ = −k_B β²/C_V, with C_V the heat capacity at constant volume. We conclude that for a very large system (N → ∞), the entropy equals

\[
S_r(U_0 + Q) = S_r(U_0) + k_B \beta Q
\tag{5}
\]

for all relevant, finite energy exchanges Q. This expression for the entropy defines a reservoir. The size of the energy shell accessible to the reservoir is, for all relevant energy exchanges Q, exactly proportional to exp(βQ), with β an intensive and constant property of the reservoir. We do not require the reservoir to be in thermodynamic equilibrium. A change of energy in the reservoir pushes the reservoir to a different energy shell; the functional dependence of the size of the energy shell on energy defines the ‘inverse temperature’ β, as in Eq. 3. However, it is not assured that a small and fast thermometer would measure an inverse temperature equal to β at some point in the reservoir; only if the reservoir is allowed to equilibrate is its inverse temperature everywhere equal to β. Of course, this is precisely how the temperature of a reservoir is determined in practice.

Now suppose a system of interest has energy U_0. We then allow it to exchange heat with a reservoir. If the system has energy U_0 + Q, the reservoir must have given up energy Q. We can write the hyper-area of the energy shell of the system as a function of Q. The total entropy of the system plus reservoir can then be written as

\[
S_{tot}(Q) = S(Q) + S_r(U_{r,0}) - k_B \beta Q,
\tag{6}
\]

with S(Q) = k_B ln Ω(U_0 + Q). The number of states at each level of exchange energy therefore is proportional to

\[
p(Q) \propto \Omega(U_0 + Q)\, e^{-\beta Q},
\tag{7}
\]

where we omitted proportionality constants related to the additive entropy constants. Nowhere do we assume that the system is in equilibrium with the reservoir. This means that p(Q) is the relevant measure with which to construct an ensemble average for the system, even for far-from-equilibrium systems. Even the reservoir can be out of equilibrium, as discussed above. We have also made no reference to the size of the system of interest, as long as it is much smaller than the reservoir. However, in contrast to systems in thermodynamic equilibrium, there is no guarantee that the extensive variables, such as U, V, or N, define the state of the system in any reproducible sense. To fully define an out-of-equilibrium system we need to introduce order parameters that can describe the non-equilibrium aspects of the system.

The above density is an integrated version of the usual canonical distribution. The size of the energy shell of the system of interest, Ω(U), can be written as an integral over states x such that

\[
\Omega(U) = \int \delta(U - H(x))\, \mathrm{d}x,
\tag{8}
\]

with H(x) the Hamiltonian of the system of interest. With this definition, the density in Eq. 7 reduces to the usual canonical distribution p(x) ∝ exp(−βH(x)) for states x. We will not make further use of this microscopic version of the density.
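As a concrete check of this correspondence, the following sketch uses a small toy system (a hypothetical open chain of four spins, chosen purely for illustration): summing the microscopic canonical weights over each energy shell reproduces the integrated density, i.e. the shell state count times exp(−βU), up to normalisation.

```python
import itertools
import math
from collections import Counter

# Toy check: the integrated canonical density (states on the energy shell
# times exp(-beta*U)) equals the microscopic canonical distribution summed
# shell by shell. The 4-spin Hamiltonian below is hypothetical.
beta = 0.8
states = list(itertools.product([-1, 1], repeat=4))

def H(s):
    # Open chain of 4 spins with nearest-neighbour coupling.
    return -sum(s[i] * s[i + 1] for i in range(3))

# Microscopic canonical weights, accumulated per energy shell.
Z = sum(math.exp(-beta * H(s)) for s in states)
p_micro = Counter()
for s in states:
    p_micro[H(s)] += math.exp(-beta * H(s)) / Z

# Shell state counts, then the integrated canonical density.
shell_count = Counter(H(s) for s in states)
norm = sum(n * math.exp(-beta * U) for U, n in shell_count.items())
p_shell = {U: n * math.exp(-beta * U) / norm for U, n in shell_count.items()}
```

The two distributions over shell energies agree to machine precision, since the shell count simply collects equal Boltzmann factors.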

The canonical density in Eq. 7 can be expanded by parametrizing each energy shell with some continuous coordinate X so that every part of phase space has coordinates (U, X). At each value of U the differential w(U, X) dU dX is proportional to the number of states between coordinate values U and U + dU, and X and X + dX, and it is normalised such that

\[
\int w(U, X)\, \mathrm{d}X = \Omega(U).
\tag{9}
\]

The parametrisation is arbitrary at this point and can be chosen so as to divide the phase space into as fine a structure as desired for a given application. We can again define an entropy as the logarithm of the number of available states for the system of interest corresponding to a point (U, X),

\[
S(U, X) = k_B \ln w(U, X).
\tag{10}
\]
Now consider a process that occurs on the energy shell where some variable changes from X to X′. On the parametrized energy shell this corresponds to a coordinate shift from X to X′. The number of corresponding states changes from w(U, X) to w(U, X′). We can use detailed balance to express the ratio of the probability of making this transition to the probability of making the reverse transition as the ratio of the number of states at X′ to the number of states at X:

\[
\frac{P(X \to X')}{P(X' \to X)} = \frac{w(U, X')}{w(U, X)} = e^{\Delta S / k_B},
\tag{11}
\]

where ΔS = S(U, X′) − S(U, X). The Liouville theorem implies that the local phase space volume is conserved when a system moves on the energy shell, so that ΔS has to vanish for every realisable transformation. However, this is not the case anymore if the system moves between energy shells. If, in addition, during the process the energy of the system of interest changes from U to U + Q through exchange with the reservoir, then the above ratio of probabilities can still be expressed as exp(ΔS_tot/k_B), but now with

\[
\Delta S_{tot} = S(U + Q, X') - S(U, X) - k_B \beta Q.
\tag{12}
\]
We can always write the entropy change of the system of interest as the sum of the entropy change due to heat exchange with the reservoir and an irreversible entropy change associated with uncompensated heat [4, 5], viz. ΔS = k_B β Q + ΔS_irr. We thus conclude that ΔS_tot = ΔS_irr; that is, the relevant entropy change in Eq. 11 equals the irreversible entropy change of the system of interest. So for processes that occur either on or across energy shells, we have

\[
\frac{P(X \to X')}{P(X' \to X)} = e^{\Delta S_{irr} / k_B},
\tag{13}
\]

with ΔS_irr the irreversible entropy change of the system in a process X → X′. The right-hand side of this equation depends only on the irreversible entropy change between the two states of the system of interest. So this equation must be true for any pair of states that are related by the same irreversible entropy change. We thus arrive at the fluctuation theorem [6, 7],

\[
\frac{P(\Delta S_{irr})}{P(-\Delta S_{irr})} = e^{\Delta S_{irr} / k_B},
\tag{14}
\]

with P(ΔS_irr) the probability that the system of interest makes a transition with irreversible entropy change ΔS_irr, and P(−ΔS_irr) the probability for the opposite change.
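The transition-probability ratio above can be checked numerically with a simple Metropolis chain, whose stationary weights play the role of the state counts w; the weights below are hypothetical, chosen only for illustration.

```python
import numpy as np

# Metropolis dynamics satisfies detailed balance, so the empirical ratio of
# forward to reverse transition probabilities between coordinates X and X'
# should approach w(X')/w(X) = exp(dS/k_B). Weights are illustrative only.
rng = np.random.default_rng(1)
w = np.array([1.0, 2.0, 5.0])   # state counts at three coordinate values
n = len(w)

def metropolis_step(x):
    """Propose a uniformly chosen different coordinate; accept with min(1, w'/w)."""
    proposal = (x + rng.integers(1, n)) % n
    if rng.random() < min(1.0, w[proposal] / w[x]):
        return proposal
    return x

# Count conditional transition frequencies over a long chain.
counts = np.zeros((n, n))
x = 0
for _ in range(400_000):
    x_next = metropolis_step(x)
    counts[x, x_next] += 1
    x = x_next

T = counts / counts.sum(axis=1, keepdims=True)  # empirical transition matrix
ratio_01 = T[0, 1] / T[1, 0]   # should approach w[1]/w[0] = 2
ratio_02 = T[0, 2] / T[2, 0]   # should approach w[2]/w[0] = 5
```

The ratios converge to the ratios of state counts, i.e. to exp(ΔS/k_B), as the detailed-balance argument requires.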

The fluctuation theorem applies to spontaneous processes that occur in thermostatted but otherwise isolated systems. We next consider processes that occur when we modify the system of interest by changing some external macroscopic parameter. The entropy of the energy shell is then also a function of some parameter λ, viz., S(U, X; λ). Without loss of generality we set λ = 0 at the start and λ = 1 at the end of the process. In this case the irreversible entropy change in Eq. 13 is

\[
\Delta S_{irr} = S(U + Q, X'; 1) - S(U, X; 0) - k_B \beta Q.
\tag{15}
\]

Apart from this, there is no change in the considerations leading to the fluctuation theorem. By definition, thermostatted systems that receive work W from their environment have an irreversible entropy change equal to

\[
\Delta S_{irr} = k_B \beta\,(W - \Delta F),
\tag{16}
\]
with ΔF the change in free energy in going from λ = 0 to λ = 1. Recognising that the right-hand side is again only a function of the difference between the two states, we arrive at the Crooks fluctuation theorem [8],

\[
\frac{P_F(W)}{P_R(-W)} = e^{\beta (W - \Delta F)},
\tag{17}
\]

with P_F(W) the probability that the system absorbs work W when λ changes from 0 to 1, and P_R(−W) the probability that the system performs work W when λ changes in reverse from 1 to 0. Because the transition probabilities can be normalised with respect to the exchanged work, it is straightforward to use this equation to show that the expectation value of exp(−β(W − ΔF)) equals unity, or equivalently,

\[
\left\langle e^{-\beta W} \right\rangle = e^{-\beta \Delta F}.
\tag{18}
\]

This is the Jarzynski equation [9].
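The Jarzynski equality above can be verified exactly for the simplest protocol, an instantaneous switch of λ, where the work is just the energy change at fixed microstate. The discrete energy levels below are hypothetical, chosen for illustration.

```python
import numpy as np

# Instantaneous switching: sample initial states from the canonical ensemble
# at lambda = 0, then the work is W = E1[x] - E0[x]. For this protocol
# <exp(-beta*W)> = Z1/Z0 = exp(-beta*dF) holds exactly, so a Monte Carlo
# average should converge to it.
rng = np.random.default_rng(0)
beta = 1.3
E0 = np.array([0.0, 0.7, 1.5, 2.2])   # levels at lambda = 0 (illustrative)
E1 = np.array([0.3, 0.1, 1.9, 2.8])   # levels at lambda = 1 (illustrative)

Z0 = np.exp(-beta * E0).sum()
Z1 = np.exp(-beta * E1).sum()
dF = -np.log(Z1 / Z0) / beta           # exact free-energy change

p0 = np.exp(-beta * E0) / Z0           # initial canonical distribution
x = rng.choice(len(E0), size=200_000, p=p0)
work = E1[x] - E0[x]

jarzynski_lhs = np.exp(-beta * work).mean()  # estimates exp(-beta*dF)
mean_work = work.mean()                      # second law: <W> >= dF
```

Note that the exponential average is dominated by rare low-work realisations, which is why the mean work exceeds ΔF while the Jarzynski average still recovers it.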

The consistency of the above argument is strengthened by the following independent route to calculating free energy changes. The phase space measure can be normalised with the partition function Z(λ),

\[
Z(\lambda) = \int w(U, X; \lambda)\, e^{-\beta U}\, \mathrm{d}U\, \mathrm{d}X,
\tag{19}
\]

where w(U, X; λ) dU dX is proportional to the number of accessible states of the isolated system of interest when the external parameter is set to λ. The equilibrium free energy for the thermostatted system is

\[
F(\lambda) = -\beta^{-1} \ln Z(\lambda).
\tag{20}
\]
Next we consider what happens to the equilibrium free energy of the system when we vary λ from 0 to 1. The partition function at λ = 1 satisfies

\[
Z(1) = \int w(U', X'; 1)\, e^{-\beta U'}\, \mathrm{d}U'\, \mathrm{d}X' = Z(0)\, \left\langle e^{\Delta S / k_B - \beta\, \Delta U} \right\rangle_0,
\tag{21}
\]

where ⟨·⟩₀ denotes an ensemble average over the initial ensemble, ΔS = S(U′, X′; 1) − S(U, X; 0), and ΔU = U′ − U. As before, the entropy change ΔS can be written as the sum of the entropy change due to heat exchange with the reservoir and the irreversible entropy change due to uncompensated heat. Because the system plus the reservoir are thermally insulated, any heat given to the reservoir must be compensated by work performed by the external parameter change. The entropy change above can therefore be written as ΔS = k_B β (ΔU − W) + ΔS_irr, so that we find

\[
Z(1) = Z(0)\, \left\langle e^{\Delta S_{irr}/k_B - \beta W} \right\rangle_0.
\tag{22}
\]

Because Eq. 16 is true for any microscopic realisation of the process, the right-hand side of the above equation is the same for every realisation and is equal to exp(−βΔF). This is consistent with the equilibrium expression for the free energy, Eq. 20, from which it follows that Z(1)/Z(0) = exp(−βΔF). The above equation is only apparently in contradiction with the Jarzynski equation, Eq. 18. To arrive at the Jarzynski equation we recognise that Eq. 16 implies that ⟨exp(−βW)⟩ = exp(−βΔF) ⟨exp(−ΔS_irr/k_B)⟩ = exp(−βΔF), where the last equality follows from integrating the fluctuation theorem over all values of ΔS_irr.
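For completeness, the chain of identities in this closing argument can be written out in one line; this is a restatement of the steps described in the text (the decomposition of ΔS and the definition of ΔS_irr for a thermostatted system), not new material:

```latex
\frac{Z(1)}{Z(0)}
  = \left\langle e^{\Delta S/k_B - \beta\,\Delta U} \right\rangle_0
  = \left\langle e^{\Delta S_{\mathrm{irr}}/k_B - \beta W} \right\rangle_0
  = e^{-\beta\,\Delta F},
\qquad
\left\langle e^{-\beta W} \right\rangle
  = e^{-\beta\,\Delta F}
    \left\langle e^{-\Delta S_{\mathrm{irr}}/k_B} \right\rangle
  = e^{-\beta\,\Delta F}.
```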


  • (1) E. T. Jaynes, IEEE Transactions on Systems Science and Cybernetics 4, 227 (1968)
  • (2) E. T. Jaynes, Physical Review 106, 620 (1957)
  • (3) E. T. Jaynes, American Journal of Physics 33, 391 (1965)
  • (4) D. Kondepudi and I. Prigogine, Modern Thermodynamics (J. Wiley & Sons, Chichester, 1998)
  • (5) M. H. P. Ambaum, Thermal Physics of the Atmosphere (Wiley–Blackwell, Chichester, 2010)
  • (6) D. J. Evans, E. G. D. Cohen, and G. P. Morriss, Physical Review Letters 71, 2401 (1993)
  • (7) D. J. Evans and D. J. Searles, Advances in Physics 51, 1529 (2002)
  • (8) G. E. Crooks, Physical Review E 60, 2721 (1999)
  • (9) C. Jarzynski, Physical Review Letters 78, 2690 (1997)