Uncertainty scars and the distance from equilibrium

Schuyler B. Nicholson schuyler.nicholson@umb.edu Department of Chemistry, University of Massachusetts Boston, Boston, MA 02125
July 14, 2019
Abstract

With a statistical measure of distance, we derive a classical uncertainty relation for processes traversing nonequilibrium states both transiently and irreversibly. This relationship stems from the link between the Fisher information and an entropic acceleration. Our measure of uncertainty is a lower bound for the rate of change in the entropy production and flow, which in turn can be interpreted as an average local change in information. A consequence of these results is that, while the uncertainty about the nonstationary path to equilibrium (or to a steady state) is generally positive, there exist paths with a constant entropic acceleration and zero uncertainty. These findings demonstrate that the shortest distance to equilibrium does not necessarily correlate with the uncertainty about the path taken by the system. For example, in a driven version of Onsager's three-state model, we show that a set of high-uncertainty initial conditions, some of which are near equilibrium, scar the state space.

Introduction.– Myriad phenomena generate structures and patterns that are unique outside of thermodynamic equilibrium. Efforts to understand these processes stretch back to the very beginnings of thermodynamics – a pinnacle of physics that encapsulates the quantitative understanding of energy transfer and transformations Callen (1985). One interesting approach to studying thermodynamic processes is through classical uncertainty principles Mandelbrot (1956); Schlögl (1988); Uffink and van Lith (1999), principles more traditionally considered the cornerstone of quantum mechanics. These thermal uncertainty relations bear a strong resemblance to their quantum counterparts and rest on the foundations of equilibrium statistical mechanics. The recent introduction of nonequilibrium uncertainty relations Barato and Seifert (2015); Gingrich et al. (2016) has generated a flurry of activity Maes (2017); Horowitz and Gingrich (2017); Pietzonka et al. (2017); Shiraishi et al. (2016); Shiraishi (2017); Proesmans and Van den Broeck (2017); Dechant and Sasa (2017), but these results are largely restricted to nonequilibrium steady states. They leave open the question of whether there are uncertainty relations for processes that are transient and nonstationary. We address this question here.

There is a growing link between thermodynamics and information through discoveries, such as fluctuation theorems Jarzynski (1997); Seifert (2012); Kawai et al. (2007); A. E. Allahverdyan and Mahler (2009), that place bounds Hartich et al. (2014); Horowitz and Esposito (2014); Yamamoto et al. (2016) on entropy changes J. M. R. Parrondo and Sagawa (2015). In prototypical systems, such as Maxwell’s demon, the system can execute a computation. These computations may also serve a thermodynamic function and, so, perform useful work and dissipate heat Boyd and Crutchfield (2016). Since information is a probabilistic quantity Cover and Thomas (2006), having a firm grasp on the underlying distributions sampled by nonequilibrium processes is an important foundation from which to elucidate links between information and thermodynamics. For non-stationary systems, a natural route is to study how distributions change as the control parameters of the system are varied. These ideas are formalized in information geometry Amari and Nagaoka (2007); Brody and Rivier (1995); Heseltine and Kim (2016); Nicholson and Kim (2016a); Oizumi et al. (2016). Through this lens, the system dynamics traces trajectories across a manifold of probability distributions. The structure of the manifold and the distance and velocity across it are the primary focus. Though often presented in a general setting Amari and Nagaoka (2007), information geometry has connections to thermodynamics Weinhold (1975); Crooks (2007); Ruppeiner (1979); Sivak and Crooks (2012); Lahiri et al. (2016). For non-stationary irreversible processes, however, there are few results, and our understanding of the relationship between thermodynamics and information geometry remains incomplete.

A significant challenge to the development of a statistical-mechanical theory for non-stationary processes is that there are few restrictions on the possible nonequilibrium distributions over paths or states. Here, connecting the acceleration of the entropy to the Fisher information enables us to bring the mathematical machinery of information geometry to bear on the problem. This step leads naturally to the quantification of the uncertainty about the path a system may take through successive nonequilibrium states. We show that this uncertainty is a lower bound on the average cumulative rate of information change, a quantity that is in turn driven by the entropic acceleration. Relating the uncertainty of a system to the change in information content sheds new light on the connection between thermodynamics and information for far-from-equilibrium systems. Most recent work focuses on nonequilibrium steady states; less is known about the non-stationary processes we focus on here. The theory we present reveals trajectories that have zero uncertainty, despite being non-stationary and irreversible. These paths are significant because they are created when the system follows the geodesic connecting any initial and final distribution.

Notation and setting.– At the ensemble level, a path is the set of probability distributions a system samples as it evolves over a finite time interval. We define the set of probability distributions $\mathcal{P} \equiv \{ p : p_i \geq 0,\ \sum_i p_i = 1 \}$. A subset of these distributions belong to the manifold $\mathcal{M} \subseteq \mathcal{P}$, whose points are labeled by $\lambda_t$, the time-dependent control parameters Amari and Nagaoka (2007). These control parameters determine the path across the manifold and the transformation of the initial distribution, $p(0)$, into the final distribution, $p(\tau)$, over the time interval $\tau$. Here, the system dynamics are given by the master equation

$\dot{p}_i(t) = \sum_j W_{ij}(\lambda_t)\, p_j(t),$   (1)

where $\dot{p}_i \equiv dp_i/dt$ is the time derivative of $p_i$ and $W_{ij}$ is the transition rate from state $j$ to state $i$. The occupation probability for state $i$, $p_i \equiv p(i\,|\,\lambda_t)$, is conditional on the control parameters $\lambda_t$. The rate matrix also depends on $\lambda_t$ and follows the usual conventions: $W_{ij} \geq 0$ for $i \neq j$, $W_{jj} \leq 0$ for $i = j$, and $\sum_i W_{ij} = 0$ so that probability is conserved. For convenience, we will suppress this dependence, writing $p_i$ and $W_{ij}$, in our notation.
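To make the setup concrete, here is a minimal numerical sketch (ours, not from the paper) that builds a rate matrix obeying these conventions and integrates the master equation; the random rates are illustrative placeholders.

```python
import numpy as np
from scipy.integrate import solve_ivp

def random_rate_matrix(n, rng):
    """Rate matrix with W[i, j] >= 0 for i != j and columns that sum to zero."""
    W = rng.random((n, n))
    np.fill_diagonal(W, 0.0)
    W -= np.diag(W.sum(axis=0))  # sets W[j, j] = -sum_{i != j} W[i, j]
    return W

rng = np.random.default_rng(0)
W = random_rate_matrix(3, rng)
p0 = np.array([0.7, 0.2, 0.1])

sol = solve_ivp(lambda t, p: W @ p, (0.0, 10.0), p0)
p_final = sol.y[:, -1]
print(p_final, p_final.sum())  # relaxes toward the stationary state; total stays 1
```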

A system satisfies detailed balance if the currents (or thermodynamic fluxes)

$J_{ij} \equiv W_{ij}\, p_j - W_{ji}\, p_i$   (2)

are zero for all pairs of states $i$ and $j$. Otherwise, the existence of current implies the system is undergoing an irreversible process Schnakenberg (1976). The current is related to the master equation through $\dot{p}_i = \sum_j J_{ij}$, but it does not satisfy the requirements of a metric. Loosely speaking, it cannot be used directly to quantify a path across $\mathcal{M}$. However, it is well known that the Fisher information can be used to form a metric Rao (1945).
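As a concrete illustration (our own, with placeholder rates), the currents can be assembled directly from $W$ and $p$; they are antisymmetric and recover the master equation when summed over the second index:

```python
import numpy as np

# Illustrative rate matrix: off-diagonals >= 0, columns sum to zero.
W = np.array([[-3.0,  1.0,  2.0],
              [ 2.0, -2.0,  1.0],
              [ 1.0,  1.0, -3.0]])
p = np.array([0.5, 0.3, 0.2])

J = W * p[None, :] - W.T * p[:, None]     # J[i, j] = W_ij p_j - W_ji p_i
assert np.allclose(J, -J.T)               # antisymmetry of the fluxes
assert np.allclose(J.sum(axis=1), W @ p)  # dp_i/dt = sum_j J_ij
print(J)
```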

To form a link between changes in information of the system and uncertainty, we define

$\overline{W}_{ij} \equiv \tfrac{1}{2}\left( W_{ij} + W^{\dagger}_{ij} \right).$   (3)

The time-reversed dynamics are defined by $W^{\dagger}_{ij} \equiv W_{ji}\, p_i / p_j$. If the system satisfies detailed balance, $W^{\dagger}$ is equivalent to $W$. Even when the current is non-zero, this matrix satisfies a detailed balance condition, $\overline{W}_{ij}\, p_j = \overline{W}_{ji}\, p_i$. It is also similar to a symmetric matrix and, thus, has a complete set of eigenvectors and real eigenvalues Horn and Johnson (2012). Matrices with a similar form and function are known for discrete-time, discrete-state Markov chains Nicholson et al. (2013); Nicholson and Kim (2016b), but not for continuous-time Markovian dynamics. As we will show, $\overline{W}$ allows us to connect the Fisher information (from information geometry) to the entropic acceleration, which in turn is represented as an average change in information of the system.
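A short sketch of our reading of this construction. We assume, based on the description above, that the matrix is the symmetrization $\overline{W} = (W + W^{\dagger})/2$ with $W^{\dagger}_{ij} = W_{ji}\, p_i / p_j$; under that assumption, the code below checks the detailed-balance condition and the similarity to a symmetric matrix:

```python
import numpy as np

W = np.array([[-3.0,  1.0,  2.0],
              [ 2.0, -2.0,  1.0],
              [ 1.0,  1.0, -3.0]])
p = np.array([0.5, 0.3, 0.2])

Wdag = W.T * p[:, None] / p[None, :]   # Wdag[i, j] = W[j, i] p_i / p_j (time reversal)
Wbar = 0.5 * (W + Wdag)                # our assumed form of the symmetrized generator

# Detailed balance with respect to p: Wbar_ij p_j = Wbar_ji p_i
assert np.allclose(Wbar * p[None, :], (Wbar * p[None, :]).T)

# Similar to a symmetric matrix, so the eigenvalues are real
S = np.diag(p**-0.5) @ Wbar @ np.diag(p**0.5)
assert np.allclose(S, S.T)
print(np.linalg.eigvalsh(S))
```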

Fisher information and thermodynamics.– The Fisher matrix Amari and Nagaoka (2007),

$g_{\mu\nu}(\lambda_t) = \sum_i p_i\, \frac{\partial \ln p_i}{\partial \lambda^{\mu}} \frac{\partial \ln p_i}{\partial \lambda^{\nu}},$   (4)

is a metric tensor that gives a statistical measure of distance over a manifold of probability distributions,

$ds^2 = \sum_{\mu,\nu} g_{\mu\nu}\, d\lambda^{\mu} d\lambda^{\nu}.$   (5)

The Fisher information, $I_F$, relates how probability distributions change with respect to a set of control parameters Frieden (2004). When parametrized by time, it is

$I_F(t) = \sum_i p_i \left( \frac{d \ln p_i}{dt} \right)^{2} = \sum_i \frac{\dot{p}_i^{\,2}}{p_i}.$   (6)
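For a time-independent generator this quantity is easy to evaluate numerically; a sketch (with the same placeholder rates as above):

```python
import numpy as np
from scipy.integrate import solve_ivp

W = np.array([[-3.0,  1.0,  2.0],
              [ 2.0, -2.0,  1.0],
              [ 1.0,  1.0, -3.0]])
p0 = np.array([0.7, 0.2, 0.1])
t = np.linspace(0.0, 5.0, 200)

sol = solve_ivp(lambda s, p: W @ p, (0.0, 5.0), p0, t_eval=t, rtol=1e-10)
pdot = W @ sol.y                          # dp/dt at each sampled time
I_F = np.sum(pdot**2 / sol.y, axis=0)     # Eq. (6) along the path
print(I_F[0], I_F[-1])                    # decays to zero as the state becomes stationary
```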

As defined, the Fisher information is nonnegative and simply a mathematical construction. To help elucidate the connection between information and thermodynamics, we will connect the Fisher information to the average rate of information change, which will in turn be related to the entropy production and the entropy exchanged with the environment. Start by considering the rate of change of the Shannon entropy

$\dot{S} = -\sum_i \dot{p}_i \ln p_i.$   (7)

This average is equivalent to an average over the current (up to a factor of $1/2$), $\dot{S} = \tfrac{1}{2}\sum_{i,j} J_{ij} \ln (p_j/p_i)$ Esposito and Van den Broeck (2010). The connection to nonequilibrium thermodynamics comes from decomposing the change in entropy,

$\dot{S} = \dot{S}_i + \dot{S}_e,$   (8)

into the entropy production rate of sources within the system, $\dot{S}_i \geq 0$, and the rate of entropy exchange with the environment, $\dot{S}_e$ Seifert (2005). The entropy production can be written as an average of the generalized forces, $\dot{S}_i = \tfrac{1}{2}\sum_{i,j} J_{ij} F_{ij}$ with $F_{ij} \equiv \ln\!\left( W_{ij} p_j / W_{ji} p_i \right)$ Esposito and Van den Broeck (2010). Multiplied by Boltzmann's constant $k_B$, these forces are the thermodynamic affinities Schnakenberg (1976); here we work with $k_B = 1$.
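A numerical sketch of this decomposition at a single instant (our construction, using the placeholder rates above and $k_B = 1$):

```python
import numpy as np

W = np.array([[-3.0,  1.0,  2.0],
              [ 2.0, -2.0,  1.0],
              [ 1.0,  1.0, -3.0]])
p = np.array([0.5, 0.3, 0.2])
off = ~np.eye(3, dtype=bool)                  # off-diagonal mask

flux = W * p[None, :]                         # W_ij p_j
J = flux - flux.T                             # currents, Eq. (2)
F = np.log(flux[off] / flux.T[off])           # generalized forces F_ij

Sdot   = -np.sum((W @ p) * np.log(p))         # dS/dt, Eq. (7)
Sdot_i = 0.5 * np.sum(J[off] * F)             # entropy production rate
Sdot_e = -0.5 * np.sum(J[off] * np.log(W[off] / W.T[off]))  # exchange with environment

assert Sdot_i >= 0.0
assert np.isclose(Sdot, Sdot_i + Sdot_e)      # Eq. (8)
print(Sdot, Sdot_i, Sdot_e)
```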

To provide a link between the Fisher information and thermodynamics that is valid for nonstationary, irreversible Markovian processes, we proceed in three steps. First, note that by conservation of probability, $\langle 1 | W | p \rangle = 0$, where $\langle 1 | = (1, 1, \ldots, 1)$ and $| p \rangle$ is the vector of occupation probabilities in the standard bra and ket notation. Taking the time derivative gives

$\langle 1 | \dot{W} | p \rangle + \langle 1 | W | \dot{p} \rangle = 0.$   (9)

Second, the matrix $\overline{W}$ can be written in terms of the generalized thermodynamic forces

$\overline{W}_{ij} = \frac{W_{ij}}{2}\left( 1 + e^{-F_{ij}} \right)$   (10)

using $W^{\dagger}_{ij} = W_{ij}\, e^{-F_{ij}}$. Finally, taking the time derivative of the generalized forces connects Eq. (9) and Eq. (10) and leads to our first main result

$I_F(t) = \sum_{i \neq j} W_{ij}\, p_j\, \frac{d}{dt} \ln \frac{p_i}{p_j}.$   (11)

Shannon Shannon (1948) showed that $I_i \equiv -\ln p_i$ is the information associated with state $i$. The difference $I_j - I_i = \ln (p_i/p_j)$ is then the local difference in information between states $j$ and $i$. The Fisher information can then be seen as an average over the rate of information change in the system,

$I_F(t) = \left\langle \frac{d}{dt}\left( I_j - I_i \right) \right\rangle,$   (12)

or, in terms of thermodynamics, the average entropic acceleration, where the average is taken over the probability flux $W_{ij}\, p_j$. The latter average can be decomposed into contributions from the system and the environment. This direct relationship between the Fisher information and the entropic acceleration, independently derived in Ito (2017), is valid at every moment in time. The non-negativity of $I_F$ requires the average rate of information change to be non-negative. By introducing a measure of uncertainty over a path across $\mathcal{M}$, and combining it with Eq. (12), we will show next that the cumulative rate of information change is bounded from below by the uncertainty of the path.
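The identity in Eqs. (11) and (12), as we have reconstructed it, can be checked numerically; the sketch below verifies that the flux-weighted average of the rate of local information change reproduces $I_F$ (placeholder rates again):

```python
import numpy as np

W = np.array([[-3.0,  1.0,  2.0],
              [ 2.0, -2.0,  1.0],
              [ 1.0,  1.0, -3.0]])
p = np.array([0.5, 0.3, 0.2])

pdot = W @ p
dlogp = pdot / p                           # d/dt ln p_i
I_F = np.sum(pdot**2 / p)                  # Eq. (6)

rate = dlogp[:, None] - dlogp[None, :]     # d/dt (I_j - I_i) = d/dt ln(p_i / p_j)
avg = np.sum((W * p[None, :]) * rate)      # flux-weighted average, Eq. (11)
assert np.isclose(avg, I_F)
print(avg, I_F)
```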

Uncertainty and deviations from the geodesic.– As shown by Rao Rao (1945), the Fisher information matrix satisfies the requirements of a metric and, so, the Fisher information relates to the line element between two distributions infinitesimally displaced from one another, $ds^2 = I_F(t)\, dt^2$. The length, $\mathcal{L}$, of a path on the manifold can then be measured with the statistical distance Wootters (1981)

$\mathcal{L}(\tau) = \int_0^{\tau} \sqrt{I_F(t)}\; dt.$   (13)

The Cauchy-Schwarz inequality yields the statistical divergence

$\mathcal{J}(\tau) \equiv \tau \int_0^{\tau} I_F(t)\, dt \geq \mathcal{L}(\tau)^2.$   (14)

Previous work has shown that $\mathcal{J} - \mathcal{L}^2$ is a temporal variance Heseltine and Kim (2016); Nicholson and Kim (2016a), and that, in one representation, it can measure cumulative fluctuations in the rate coefficients for irreversible decay processes Flynn et al. (2014); Nichols et al. (2015).

In the current context, the Cauchy-Schwarz difference $\mathcal{J} - \mathcal{L}^2$ is the variance or “uncertainty” of the path connecting $p(0)$ and $p(\tau)$. To see this interpretation, we define the time average for a function, $f(t)$, as $\overline{f} \equiv \tau^{-1} \int_0^{\tau} f(t)\, dt$. Taking the time average of $I_F$ and the squared time average of $\sqrt{I_F}$ over the path gives

$\overline{I_F} = \frac{\mathcal{J}}{\tau^2}, \qquad \left( \overline{\sqrt{I_F}} \right)^{2} = \frac{\mathcal{L}^2}{\tau^2}.$   (15)

Subtracting these terms gives the time-averaged variance

$\sigma_{\tau}^{2} \equiv \overline{I_F} - \left( \overline{\sqrt{I_F}} \right)^{2} = \frac{\mathcal{J} - \mathcal{L}^2}{\tau^2} \geq 0.$   (16)

The variance, $\sigma_{\tau}^{2}$, will depend on the path, as well as the initial and final distributions. We expect it to be nonzero for most irreversible processes. One notable exception is the set of paths following the geodesic that connects two distributions. These paths correspond to the condition $\mathcal{J} = \mathcal{L}^2$ Wootters (1981) and a variance of zero. These “certain” paths are irreversible, non-stationary paths with zero uncertainty in the sense used here.
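A sketch of these path functionals for the relaxation used in the earlier snippets: the statistical length, divergence, and time-averaged variance are simple quadratures of $I_F(t)$.

```python
import numpy as np
from scipy.integrate import solve_ivp

W = np.array([[-3.0,  1.0,  2.0],
              [ 2.0, -2.0,  1.0],
              [ 1.0,  1.0, -3.0]])
p0 = np.array([0.7, 0.2, 0.1])
tau = 5.0
t = np.linspace(0.0, tau, 4000)

sol = solve_ivp(lambda s, p: W @ p, (0.0, tau), p0, t_eval=t, rtol=1e-10)
I_F = np.sum((W @ sol.y)**2 / sol.y, axis=0)

L = np.trapz(np.sqrt(I_F), t)        # statistical length, Eq. (13)
Jdiv = tau * np.trapz(I_F, t)        # divergence, Eq. (14)
var = (Jdiv - L**2) / tau**2         # path uncertainty, Eq. (16)
print(L, Jdiv >= L**2, var)          # Cauchy-Schwarz guarantees var >= 0
```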

It has previously been shown that measuring cumulative deviations from the geodesic amounts to measuring the cumulative fluctuations in nonequilibrium observables Flynn et al. (2014); Nichols et al. (2015); Sivak and Crooks (2012). Past work has also used statistical distances (though with other metrics) to measure the dissipation associated with quasistatic transformations Salamon and Berry (1983). These results, however, do not connect thermodynamic quantities such as the entropy production and flow to the Fisher information for general non-stationary irreversible processes as we do here.

Figure 1: The probability manifold, with color indicating the information-uncertainty ratio $\mathcal{R}$ for each initial condition at (a) the initial time, (b) an intermediate time, where domains of initial conditions have a large information-to-uncertainty ratio, and (c) the minimum time for all initial conditions to be within a small tolerance of the stationary distribution $p^{ss}$ (open circle).

Our second main result, and the physical insight into the certain paths, is a bound on the entropic acceleration in terms of the uncertainty of the path. Start from the variance

$\sigma_{\tau}^{2} = \overline{I_F} - \left( \overline{\sqrt{I_F}} \right)^{2} = \frac{\mathcal{J} - \mathcal{L}^2}{\tau^2} \leq \overline{I_F}.$   (17)

The last inequality uses $\mathcal{L}^2 \geq 0$ and the non-negativity of the variance. From this inequality and Eq. (12), we have

$\frac{1}{\tau} \int_0^{\tau} dt\, \left\langle \frac{d}{dt}\left( I_j - I_i \right) \right\rangle \geq \sigma_{\tau}^{2}.$   (18)

Defining the information-uncertainty ratio $\mathcal{R} \equiv \overline{I_F} / \sigma_{\tau}^{2}$, this relation can be more simply written as

$\mathcal{R} \geq 1.$   (19)

This information-uncertainty ratio is valid for non-stationary irreversible paths over any finite time interval between arbitrary probability distributions: the uncertainty then places a lower bound on the cumulative rate of information change. Different paths across $\mathcal{M}$ connecting two distributions can only lower $\mathcal{R}$ so much, before the bound in Eq. (19) requires any increase in uncertainty to be matched by a corresponding increase in the cumulative rate of information change. The increase in uncertainty is caused by a larger deviation from the geodesic connecting the initial and final distributions.
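The bound itself is the statement that a time average of a non-negative function dominates the square of the time average of its square root; a short numerical check on arbitrary positive samples standing in for $I_F(t)$:

```python
import numpy as np

rng = np.random.default_rng(1)
f = rng.random(10_000) + 1e-3            # stand-in samples of I_F(t) > 0
var = f.mean() - np.sqrt(f).mean()**2    # time-averaged variance, Eq. (16)
R = f.mean() / var                       # information-uncertainty ratio
print(R, R >= 1.0)                       # Eq. (19)
```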

Several cases are of immediate interest. In the steady state, we have $I_F = 0$, since $\dot{p}_i = 0$ for all $i$. From Eq. (12), we get the special case that the rate of entropy production and entropy flow are in balance, $\dot{S}_i = -\dot{S}_e$. If the average rate of information change is constant (i.e., the path is certain),

$I_F(t) = \overline{I_F} = \left( \mathcal{L}/\tau \right)^{2},$   (20)

meaning the entropic acceleration is constant, independent of time or of the distance from the final distribution.

Uncertainty scarring in a single-cycle chemical reaction.– To illustrate the theory, we adapt the kinetic scheme originally used by Onsager to demonstrate the reciprocal relations of irreversible thermodynamics Onsager (1931). The model consists of three states with kinetics driven by time-dependent rate coefficients whose driving enters through an inverse tangent of time. The inverse tangent ensures that as $t$ becomes large, every path reaches the same stationary distribution, $p^{ss}$, and we can localize the effects of the initial condition on the uncertainty about the nonequilibrium path. For a three-state system, we can visualize the path using the transformation $x_i = \sqrt{p_i}$: the trajectory of $x$ will travel across the positive octant of a sphere. Fig. (1) shows the information-uncertainty ratio $\mathcal{R}$ starting from a sample of initial conditions, each of which evolves to the stationary distribution, $p^{ss}$ (circle). Our criterion for a path to “reach” the stationary distribution is that each initial condition must evolve to be within a small tolerance of $p^{ss}$. From our numerical testing, we find the bound $\mathcal{R} \geq 1$ is satisfied along every sampled path.
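A sketch in the spirit of this numerical experiment; the arctangent-driven rates below are our own placeholders (the paper's specific coefficients are not recoverable from this text), but they reproduce the qualitative setup: a driven three-state cycle whose rates saturate so that all paths share one final stationary distribution.

```python
import numpy as np
from scipy.integrate import solve_ivp

def rates(t):
    a = 1.0 + np.arctan(t)          # hypothetical "forward" rate around the cycle
    b = 1.0                         # hypothetical "backward" rate
    W = np.array([[0.0, b, a],
                  [a, 0.0, b],
                  [b, a, 0.0]])
    W -= np.diag(W.sum(axis=0))     # columns sum to zero
    return W

def info_uncertainty_ratio(p0, tau=10.0, n=4000):
    t = np.linspace(0.0, tau, n)
    sol = solve_ivp(lambda s, p: rates(s) @ p, (0.0, tau), p0,
                    t_eval=t, rtol=1e-10)
    pdot = np.stack([rates(s) @ sol.y[:, k] for k, s in enumerate(t)], axis=1)
    I_F = np.sum(pdot**2 / sol.y, axis=0)
    mean_IF = np.trapz(I_F, t) / tau
    var = mean_IF - (np.trapz(np.sqrt(I_F), t) / tau) ** 2
    return mean_IF / var            # R >= 1, Eq. (19)

# In sqrt(p) coordinates each path lives on the positive octant of the unit sphere.
for p0 in ([0.8, 0.1, 0.1], [0.1, 0.8, 0.1], [0.2, 0.3, 0.5]):
    print(p0, info_uncertainty_ratio(np.array(p0)))
```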

The Fisher information over a manifold of probability distributions provides a well-defined measure of distance, Eq. (13). What we find for this three-state system is that the distance from the stationary state says little about the uncertainty, the rate of information change, or their ratio, Fig. (1). Each initial condition starts with roughly the same ratio of information to uncertainty, Fig. (1a). Because the rate of information change and the variance are path dependent, the dynamics quickly generate a wide range of ratios, Fig. (1b). One prominent feature is the two (yellow) regions of initial conditions that have a large ratio relative to the rest of the state space. The time series of these initial conditions are shown in Fig. (2) (yellow), where we see that it is predominantly the small uncertainty that is generating the large ratio. We also see that this large ratio disappears as the path approaches $p^{ss}$. Fig. (1c) shows the ratio at the (minimum) time it takes for all initial conditions to come within the set tolerance of $p^{ss}$. Also prominent is a “scar” where initial conditions quickly close in on $p^{ss}$ at the cost of high cumulative entropic accelerations (changes in information). Fig. (2) shows the time series for one of these initial conditions (green solid/dashed lines). To contrast paths originating from the scar, the blue lines in Fig. (2) correspond to an initial condition that is equidistant from $p^{ss}$. Yet, the path followed from this starting point has a lower cumulative rate of information change and a lower uncertainty, showing that the shortest distance from the stationary point is a poor indicator of the behavior of the process.

Figure 2: Time profiles of the cumulative rate of information change (solid lines) and the uncertainty (dashed lines) for four different initial conditions. Colors correspond to the regions in Fig. (1). Profiles in blue (outside the scar) and green (inside the scar) are for initial conditions equidistant from the stationary state.

Conclusions.– Recently, there has been a resurgence of interest in thermodynamic uncertainty relations, particularly for systems far from equilibrium. These advances place bounds on the current fluctuations of processes in nonequilibrium steady states. Here, we showed that the Fisher information can be directly equated to the rate of information change. This rate is driven by the acceleration of the entropy production and entropy flow. Building on this connection, we quantified the uncertainty about the path connecting any two arbitrary distributions whose evolution is governed by a time-inhomogeneous Markovian dynamics, making the result applicable to a broad class of nonequilibrium processes. It is clear that, even for the most fundamental of kinetics, the classical single-cycle system, the proximity to the stationary state is a poor indicator of uncertainty. Initial conditions that are statistically equidistant from the stationary state can have dramatically different path uncertainties, uncertainties that we showed bound the cumulative rate of information change. We expect these results to be usefully applied to other far-from-equilibrium processes, such as (bio)chemical reactions Barato et al. (2014); Bo et al. (2015); McGrath et al. (2017), and to further expand the understanding of processes away from equilibrium, both “near” and “far”.

Acknowledgments.– This material is based upon work supported by the U.S. Army Research Laboratory and the U.S. Army Research Office under grant number W911NF-14-1-0359. S. B. N. acknowledges financial support from the Office of Global Programs, University of Massachusetts Boston. We thank Sosuke Ito for pointing out the relation between the generalized forces and the matrix $\overline{W}$. We also thank Tamiki Komatsuzaki for valuable discussions and comments on an early version of the manuscript.

References

  • Callen (1985) H. B. Callen, Thermodynamics and an Introduction to Thermostatistics, 2nd ed. (John Wiley & Sons, Inc., 1985).
  • Mandelbrot (1956) B. Mandelbrot, IRE Transactions on Information Theory 2, 190 (1956).
  • Schlögl (1988) F. Schlögl, J. Phys. Chem. Solids 49, 679 (1988).
  • Uffink and van Lith (1999) J. Uffink and J. van Lith, Found. Phys. 29, 655 (1999).
  • Barato and Seifert (2015) A. C. Barato and U. Seifert, Phys. Rev. Lett. 114, 158101 (2015).
  • Gingrich et al. (2016) T. R. Gingrich, J. Horowitz, N. Perunov,  and J. England, Phys. Rev. Lett. 116, 120601 (2016).
  • Maes (2017) C. Maes, arXiv:1705.07412  (2017).
  • Horowitz and Gingrich (2017) J. M. Horowitz and T. R. Gingrich, Phys. Rev. E 96, 020103 (2017).
  • Pietzonka et al. (2017) P. Pietzonka, F. Ritort,  and U. Seifert, arXiv:1702.07699  (2017).
  • Shiraishi et al. (2016) N. Shiraishi, K. Saito,  and H. Tasaki, Phys. Rev. Lett. 117, 190601 (2016).
  • Shiraishi (2017) N. Shiraishi, arXiv:1706.00892  (2017).
  • Proesmans and Van den Broeck (2017) K. Proesmans and C. Van den Broeck, EPL 119, 20001 (2017).
  • Dechant and Sasa (2017) A. Dechant and S. Sasa, arXiv:1708.08653  (2017).
  • Jarzynski (1997) C. Jarzynski, Phys. Rev. Lett. 78, 2690 (1997).
  • Seifert (2012) U. Seifert, Rep. Prog. Phys. 75, 126001 (2012).
  • Kawai et al. (2007) R. Kawai, J. M. R. Parrondo,  and C. Van den Broeck, Phys. Rev. Lett. 98, 080602 (2007).
  • A. E. Allahverdyan and Mahler (2009) A. E. Allahverdyan, D. Janzing, and G. Mahler, J. Stat. Mech. 2009, P09011 (2009).
  • Hartich et al. (2014) D. Hartich, A. Barato,  and U. Seifert, J. Stat. Mech. 2014, P02016 (2014).
  • Horowitz and Esposito (2014) J. M. Horowitz and M. Esposito, Phys. Rev. X 4, 031015 (2014).
  • Yamamoto et al. (2016) S. Yamamoto, S. Ito, N. Shiraishi,  and T. Sagawa, Phys. Rev. E 94, 052121 (2016).
  • J. M. R. Parrondo and Sagawa (2015) J. M. R. Parrondo, J. M. Horowitz, and T. Sagawa, Nat. Phys. 11, 131 (2015).
  • Boyd and Crutchfield (2016) A. B. Boyd and J. P. Crutchfield, New J. Phys. 18, 023049 (2016).
  • Cover and Thomas (2006) T. M. Cover and J. A. Thomas, Elements of Information Theory, 2nd ed. (Wiley, 2006).
  • Amari and Nagaoka (2007) S. Amari and H. Nagaoka, Methods of information geometry, Vol. 191 (American Mathematical Soc., 2007).
  • Brody and Rivier (1995) D. Brody and N. Rivier, Phys. Rev. E 51, 1006 (1995).
  • Heseltine and Kim (2016) J. Heseltine and E. Kim, J. Phys. A 49, 175002 (2016).
  • Nicholson and Kim (2016a) S. B. Nicholson and E. Kim, Entropy 18, 258 (2016a).
  • Oizumi et al. (2016) M. Oizumi, N. Tsuchiya,  and S. Amari, Proc. Natl. Acad. Sci. U.S.A. 113, 14817 (2016).
  • Weinhold (1975) F. Weinhold, J. Chem. Phys. 63, 2479 (1975).
  • Crooks (2007) G. E. Crooks, Phys. Rev. Lett. 99, 100602 (2007).
  • Ruppeiner (1979) G. Ruppeiner, Phys. Rev. A. 20, 1608 (1979).
  • Sivak and Crooks (2012) D. A. Sivak and G. E. Crooks, Phys. Rev. Lett. 108, 190602 (2012).
  • Lahiri et al. (2016) S. Lahiri, J. Sohl-Dickstein,  and S. Ganguli, arXiv:1603.07758  (2016).
  • Schnakenberg (1976) J. Schnakenberg, Rev. Mod. Phys. 48, 571 (1976).
  • Rao (1945) C. R. Rao, J. Royal Statistics Society B10, 159 (1945).
  • Horn and Johnson (2012) R. A. Horn and C. R. Johnson, Matrix Analysis (Cambridge University Press, 2012).
  • Nicholson et al. (2013) S. B. Nicholson, L. S. Schulman,  and E. Kim, Phys. Lett. A 377, 1810 (2013).
  • Nicholson and Kim (2016b) S. B. Nicholson and E. Kim, Physica Scripta 91, 044006 (2016b).
  • Frieden (2004) B. R. Frieden, Science from Fisher Information, Vol. 2 (Cambridge University Press, 2004).
  • Esposito and Van den Broeck (2010) M. Esposito and C. Van den Broeck, Phys. Rev. E 82, 011143 (2010).
  • Seifert (2005) U. Seifert, Phys. Rev. Lett. 95, 040602 (2005).
  • Shannon (1948) C. E. Shannon, The Bell System Technical Journal 27, 623 (1948).
  • Ito (2017) S. Ito, arXiv:1712.04311  (2017).
  • Wootters (1981) W. K. Wootters, Phys. Rev. D 23, 357 (1981).
  • Flynn et al. (2014) S. W. Flynn, H. C. Zhao,  and J. R. Green, J. Chem. Phys. 141, 104107 (2014).
  • Nichols et al. (2015) J. W. Nichols, S. W. Flynn,  and J. R. Green, J. Chem. Phys. 142, 064113 (2015).
  • Salamon and Berry (1983) P. Salamon and R. S. Berry, Phys. Rev. Lett. 51, 1127 (1983).
  • Onsager (1931) L. Onsager, Phys. Rev. 37, 405 (1931).
  • Barato et al. (2014) A. C. Barato, D. Hartich,  and U. Seifert, New J. Phys. 16, 103024 (2014).
  • Bo et al. (2015) S. Bo, M. D. Giudice,  and A. Celani, J. Stat. Mech. 2015, P01014 (2015).
  • McGrath et al. (2017) T. McGrath, N. S. Jones, P. R. ten Wolde,  and T. E. Ouldridge, Phys. Rev. Lett. 118, 028101 (2017).