The Cavity Approach to Parallel Dynamics of Ising Spins on a Graph
We use the cavity method to study parallel dynamics of disordered Ising models on a graph. In particular, we derive a set of recursive equations for the single-site probabilities of paths propagating along the edges of the graph. These equations are analogous to the cavity equations for equilibrium models and are exact on a tree.
On graphs with exclusively directed edges we find an exact expression for the stationary distribution of the spins. We present the phase diagrams for an Ising model on an asymmetric Bethe lattice and for a neural network with Hebbian interactions on an asymmetric scale-free graph.
For graphs with a nonzero fraction of symmetric edges the equations can be solved for a finite number of time steps. Theoretical predictions are confirmed by simulation results.
Using a heuristic method, the cavity equations are extended to a set of equations that determine the marginals of the stationary distribution of Ising models on graphs with a nonzero fraction of symmetric edges. The results of this method are discussed and compared with simulations.
Many problems in different research fields are based upon the interaction of units through some underlying graph. Some examples are: gene expression in Boolean networks, agents competing for some limited resources, interactions between the decoding variables in low-density parity-check codes, interactions between humans on a social network, and the analysis of phase transitions of a spin glass.
To calculate statistical quantities on a given graph instance, one can use the cavity method. This method is based on the assumption that for sparse graphs the neighbouring spins depend on each other only through their direct interactions. A similar method, known as the sum-product algorithm, is used in information theory and artificial intelligence; tutorial treatments are available in the literature. Examples of problems investigated with the cavity method are: the characterisation of the set of solutions of optimisation problems on random graphs, the calculation of the eigenvalue spectrum of sparse random matrices, and the solution of the minimum-weight Steiner tree problem.
For many problems the stationary distribution of the spins is not known, e.g., for neural networks with asymmetric couplings or the minority game. One would like to generalize the cavity method to treat the internal dynamics of these models. One of the standard approaches here is generating functional analysis. The stationary solution of the minority game was found using this method. In work in progress, the parallel dynamics on graphs is studied using the generating functional method. The same authors used this method to analyse the evolution of a decoding algorithm on a sparse graph. Another successful method is dynamical replica analysis, which has been applied to finitely connected systems. Recently, the sequential dynamics of an Ising spin glass on a Bethe lattice with binary couplings was solved using a cavity-like approach.
In this work we apply the cavity method to solve the parallel dynamics of models on random graphs. The aim is twofold. First, we want to generalize the effective dynamical equations that solve the dynamics on a Poissonian graph to random graphs with a given degree distribution. This generalisation is important when we want to solve the dynamics of models on a given graph structure. We use these equations to solve the dynamics and to find the stationary distribution of an Ising model and of a neural network model, both on a fully asymmetric graph. We discuss how the correlations between the indegrees and outdegrees influence the performance of the neural network. The second purpose of the paper is to extend the cavity equations to graphs with both symmetric and asymmetric couplings. For this we need to find the stationary state of the dynamics. This is possible when we neglect the correlations in time of the stationary distribution. We discuss how well this approach predicts macroscopic observables of Ising models with bond disorder or with fluctuating connectivities.
This paper is organized as follows: In Section 2 we define the necessary quantities and derive the effective dynamical equations for the single-site marginals on a given graph instance. We average these equations over a graph ensemble in Section 3. In Section 4 we specify the dynamics of the spin models. We derive the equations for the distributions of single-site marginals in Section 5. The evolution of macroscopic observables obtained from the theory is compared with simulations in Section 6. We discuss the phase diagrams for an Ising model on an asymmetric Bethe lattice and for a neural network with Hebbian interactions on a scale-free graph in Sections 7 and 8 respectively. In Section 9 we derive an algorithm that calculates the single-site marginals of the stationary distribution on graphs with arbitrary symmetry. A discussion is given in Section 10.
2 Dynamics on a given graph instance
2.1 Some Definitions and Notations
We consider models defined on a given graph instance $G = (V, E)$, with $V$ and $E$ respectively the set of vertices (or sites) and the set of edges. We limit ourselves to simple directed graphs determined by a connectivity matrix $C$, with elements $c_{ij} \in \{0, 1\}$. When $c_{ij} = 1$ and $c_{ji} = 0$ the graph has a directed edge from the $j$-th site to the $i$-th site. When $c_{ij} = c_{ji} = 1$ there is an undirected edge between $i$ and $j$, and when $c_{ij} = c_{ji} = 0$ there are no edges between them. The sets of directed, reverse-directed and undirected edges are defined accordingly through the entries of $C$. We study the evolution of Ising-like models of $n$-replicated Ising variables $s_i^t \in \{-1, 1\}^n$, with $i \in V$ and $t$ the corresponding discrete time step. The dynamics in discrete time is defined by a transition probability $W(s^{t+1}|s^t)$, from the state $s^t$ on the $t$-th time step to the state $s^{t+1}$ on the $(t+1)$-th time step. We consider transition probabilities of the form:
The $n$-dimensional local field is defined through
where the field $h_{ij}$ quantifies the influence of the spin on site $j$ on the spin on site $i$, and $\theta_i$ is an external field. We write $\partial_i$ for the neighbourhood of $i$: all the vertices that influence $i$ directly, i.e. $\partial_i = \{ j : c_{ij} = 1 \}$. We will also use the analogous notations for the directed and undirected parts of this neighbourhood. The probability to have the path $(s^0, s^1, \ldots, s^t)$, from time step $0$ to time step $t$, is given by
with $p_0(s^0)$ the probability distribution of the spins at time step $0$.
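The path measure can be made concrete in a small numerical sketch. The code below is ours, not the paper's: it assumes the Glauber-type transition probability specified later in Section 4, a factorised uniform initial distribution, and hypothetical variable names; it evaluates the probability of one path as the product of one-step transition weights and checks that the measure over all paths is normalised.

```python
import itertools
import numpy as np

def path_probability(path, J, theta, beta):
    """Probability of the spin path (s^0, ..., s^T): the uniform initial
    weight 2**(-N) times the product of one-step transition probabilities
    W(s^{t+1} | s^t) = prod_i (1 + s_i^{t+1} * tanh(beta * h_i^t)) / 2,
    with local field h^t = J @ s^t + theta."""
    prob = 2.0 ** (-len(path[0]))                  # factorised uniform p_0
    for t in range(len(path) - 1):
        h = J @ np.array(path[t], dtype=float) + theta
        prob *= np.prod(0.5 * (1.0 + np.array(path[t + 1]) * np.tanh(beta * h)))
    return prob

# tiny fully asymmetric system: 3 spins, paths of length T + 1 = 3
J = np.array([[0.0, 1.0, 0.0],
              [0.0, 0.0, -1.0],
              [1.0, 0.0, 0.0]])
theta = np.zeros(3)
states = list(itertools.product([-1, 1], repeat=3))
total = sum(path_probability(p, J, theta, beta=1.0)
            for p in itertools.product(states, repeat=3))
# total sums to 1: the path measure is normalised
```

The normalisation check works for any transition kernel whose single-spin weights sum to one, which is exactly the factorisation property used throughout the derivation.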
2.2 Dynamical Version of the Cavity Equations
Using the cavity method it is possible to solve the parallel dynamics on graphs. The cavity graph $G^{(i)}$ is the subgraph of $G$ where the $i$-th vertex and all of the interactions with its neighbours are removed. We write the following relationship between a path probability on the graph $G$ and the probability on its related cavity graph $G^{(i)}$:
In equation (Equation 3) we introduced an extra field representing the influence of the $i$-th spin on its neighbours $j \in \partial_i$:
The prefactor determines whether the edge is symmetric or not: it equals one for undirected edges and zero for directed edges. We took a factorised initial distribution $p_0(s^0) = \prod_{i \in V} p_0^{(i)}(s_i^0)$. The single-site marginal is obtained by summing in (Equation 3) over the paths of all spins other than the $i$-th one. In general, we will use the notations
with the subscript a set of indices whose cardinality gives the number of paths involved. Within this notation we can write the joint probability of the paths on the neighbours of $i$. When we sum over the paths of all the remaining spins on the left-hand and right-hand sides of (Equation 3), we get
In the sequel we drop the subscript in the argument of the path probabilities. Now we make the Bethe-Peierls approximation, i.e., we assume that the spins in the neighbourhood of $i$ become independent when we remove the $i$-th spin:
for the path probability on the cavity graph $G^{(i)}$, with $j \in \partial_i$. To derive (Equation 6) we used the relation between the probabilities on the graph and on the cavity graph. The set of equations (Equation 6) determines the cavity path probabilities at time step $t+1$ as a function of the cavity path probabilities at the previous time step $t$. In equation (Equation 6) we only need to take the product over the neighbours because the fields depend only on them. We call the equations (Equation 6) the dynamical cavity equations, analogous to the static equations (Equation 84). The main difference is that (Equation 6) are recursive equations for probabilities of paths propagating along the graph, while in (Equation 84) messages are propagated that determine the marginal probabilities of the stationary distribution. Just like the static equations, the set of equations (Equation 6) is exact on a tree. To find the marginal distributions on the original graph from the cavity distributions, we need to combine equations (Equation 4) and (Equation 5):
Equations (Equation 7) are the dynamical versions of the set of equations (Equation 85). The initial problem of finding the single-site marginals from the $N$-site path probability has a computational complexity that is exponential in the system size. The set of equations (Equation 6) and (Equation 7) has a complexity that is linear in the system size and exponential in time, which makes the dynamics solvable for a finite number of time steps.
The cavity equations simplify considerably when the graph is fully asymmetric. In this case we can set the extra field to zero in equation (Equation 6). Therefore, the equations only have to be solved for a vanishing cavity field. Moreover, the self-coupling disappears in (Equation 6). We can thus sum the left-hand and right-hand sides of (Equation 6) over all spins of the path except the final one, to get
Equation (Equation 8) describes a Markovian dynamics.
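This Markovian recursion is simple enough to iterate directly. The following is a minimal sketch, with names of our choosing, assuming the Glauber kernel of Section 4 and a purely directed (fully asymmetric) graph: each single-time marginal at $t+1$ is obtained by exact enumeration over the configurations of the in-neighbours at time $t$.

```python
import itertools
import numpy as np

def cavity_step(p, J, beta):
    """One step of the Markovian recursion on a fully asymmetric graph:
    each marginal at t+1, stored as p[i] = P(s_i = +1), averages the
    Glauber kernel over the factorised marginals of the in-neighbours."""
    p_new = np.empty(len(p))
    for i in range(len(p)):
        nbrs = np.nonzero(J[i])[0]                  # in-neighbourhood of i
        p_up = 0.0
        for config in itertools.product([-1.0, 1.0], repeat=len(nbrs)):
            s = np.asarray(config)
            weight = np.prod(np.where(s > 0, p[nbrs], 1.0 - p[nbrs]))
            p_up += weight * 0.5 * (1.0 + np.tanh(beta * np.dot(J[i, nbrs], s)))
        p_new[i] = p_up
    return p_new

rng = np.random.default_rng(1)
N = 30
C = rng.random((N, N)) < 3.0 / N                    # directed edges only
np.fill_diagonal(C, False)
J = np.where(C, 1.0, 0.0)                           # ferromagnetic couplings
p = np.full(N, 0.9)                                 # biased initial condition
for t in range(20):
    p = cavity_step(p, J, beta=1.5)
```

The enumeration costs $2^{k}$ per site with in-degree $k$, which is affordable on sparse graphs and mirrors the linear-in-$N$ complexity noted above.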
3 The Ensemble Averaged Distribution of Paths
We calculate the average of equation (Equation 6) over all links in the graph. The graph is drawn from an ensemble of graphs. We look at ensembles where the typical graphs have a local tree structure and the degrees on different sites are uncorrelated. An example is the Poissonian ensemble defined in (Equation 73) of Appendix A. The degree distribution is defined through a histogram as
In equation (Equation 9) we use the following notations: the indegree, the outdegree and the symmetric degree of a site. For $N \to \infty$ the dynamics of Ising models on typical graphs drawn from such ensembles depends on the degree distribution (Equation 9). We define the link average of the path probabilities over all directed edges of $G$:
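The three degrees entering (Equation 9) can be read off directly from the connectivity matrix; here is a small sketch (names are ours) that builds the empirical joint histogram, using the convention of Section 2 that $c_{ij} = 1$ denotes an edge from site $j$ to site $i$.

```python
import numpy as np
from collections import Counter

def degree_histogram(C):
    """Empirical joint distribution of (indegree, outdegree, symmetric
    degree): c_ij = c_ji = 1 counts as an undirected (symmetric) edge,
    and the in/out degrees count the strictly directed edges."""
    C = np.asarray(C, dtype=int)
    sym = (C * C.T).sum(axis=1)        # undirected edges at each site
    k_in = C.sum(axis=1) - sym         # strictly incoming edges
    k_out = C.sum(axis=0) - sym        # strictly outgoing edges
    counts = Counter(zip(k_in.tolist(), k_out.tolist(), sym.tolist()))
    return {k: v / C.shape[0] for k, v in counts.items()}

rng = np.random.default_rng(7)
N = 1000
C = (rng.random((N, N)) < 2.0 / N).astype(int)     # directed test graph
np.fill_diagonal(C, 0)
hist = degree_histogram(C)
```

For $N \to \infty$ this histogram converges to the ensemble degree distribution, which is the only graph information the averaged equations retain.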
The average probability mass function for symmetric links is defined as the average of the path probabilities over all links belonging to an undirected edge
When we use the property that the spins in the neighbourhood of a site are uncorrelated, we can write
It is useful to focus on a specific example. We consider fields of the type $h_{ij} = J_{ij} s_j$, where the interaction strengths $J_{ij}$ are i.i.d. random variables drawn from a distribution $p(J)$. When we take the average of the update equations (Equation 6) according to the definitions (Equation 10) and (Equation 11), and use (Equation 12), we find the recursive equations for the averaged probability mass function of paths. These recursive equations are given by:
We introduced the average connectivities of the directed and the undirected links.
The averaged probability mass function of the marginals, defined through the corresponding average over all sites, can be calculated from (Equation 7):
The Markovian dynamics of the spins defined in (Equation 1) is thus reduced to an effective non-Markovian dynamics of one single spin, given by the recursive equations (Equation 13), (Equation 14) and (Equation 15). Equations analogous to (Equation 13) and (Equation 14) were derived in the context of LDGM channel coding using the generating functional analysis.
For fully asymmetric graphs, see (Equation 8), the cavity path probabilities equal the path probabilities on the original graph, but the averages over, respectively, the links and the sites are different. Indeed:
with and .
4 Examples of Dynamics
In this section we define the type of dynamics we study by specifying the form of the transition probabilities used in equation (Equation 1).
We consider Glauber dynamics for an Ising model with $n = 1$. Every spin evolves under the influence of its local field with a transition probability defined through:
The parameter $\beta$ is the inverse of the temperature. It is possible to implement the dynamics defined by (Equation 17) and (Equation 1) with the heat-bath algorithm. When the graph is fully symmetric, detailed balance is satisfied and the Hamiltonian is given by equation (Equation 78).
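The heat-bath implementation of this parallel Glauber dynamics is a one-liner per sweep. The following is a minimal sketch with hypothetical names, on an assumed directed Poissonian-like test graph with ferromagnetic couplings.

```python
import numpy as np

def heat_bath_sweep(s, J, beta, rng):
    """Synchronous heat-bath update: each spin is set to +1 with
    probability (1 + tanh(beta * h_i)) / 2, where h_i is the local field
    produced by the previous configuration."""
    h = J @ s
    return np.where(rng.random(len(s)) < 0.5 * (1.0 + np.tanh(beta * h)), 1, -1)

rng = np.random.default_rng(0)
N = 400
C = (rng.random((N, N)) < 4.0 / N).astype(float)   # directed test graph
np.fill_diagonal(C, 0.0)
J = C                                              # ferromagnetic couplings J_ij = 1
s = rng.choice([-1, 1], size=N)
mags = []
for t in range(100):
    s = heat_bath_sweep(s, J, beta=2.0, rng=rng)
    mags.append(np.mean(s))
```

The magnetisation trace `mags` is the kind of macroscopic observable compared against the theory in Section 6.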
One parameter of the ensemble controls the number of edges while a second controls the fraction of symmetric edges in the graph. The $\delta$'s are Kronecker delta functions. For $N \to \infty$, the typical degree distribution of a graph drawn from this ensemble is given by:
For the Poissonian ensemble equation (Equation 77) is valid. We find equation (Equation 20), which is identical to the main result of the reference derived by calculating the generating function. Hence, we conclude that the equations (Equation 13), (Equation 14) and (Equation 15) are consistent with the results found there.
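A graph with a tunable fraction of symmetric edges can be sampled pairwise. The construction and parameter names below (`c` for the mean connectivity, `eps` for the symmetric fraction) are our assumptions, since the ensemble's defining equation did not survive extraction; the sketch only illustrates the two roles the text assigns to the ensemble parameters.

```python
import numpy as np

def sample_graph(N, c, eps, rng):
    """Sample a connectivity matrix in which every ordered link is present
    with probability c/N and a fraction eps of the links is symmetric
    (c_ij = c_ji = 1)."""
    C = np.zeros((N, N), dtype=int)
    p_sym = c * eps / N                    # undirected edge on a pair
    p_dir = c * (1.0 - eps) / N            # each single direction on a pair
    for i in range(N):
        for j in range(i + 1, N):
            r = rng.random()
            if r < p_sym:
                C[i, j] = C[j, i] = 1
            elif r < p_sym + p_dir:
                C[i, j] = 1
            elif r < p_sym + 2.0 * p_dir:
                C[j, i] = 1
    return C

C = sample_graph(1000, 4.0, 0.5, np.random.default_rng(5))
mean_degree = C.sum() / 1000                 # close to c = 4
sym_fraction = (C * C.T).sum() / C.sum()     # close to eps = 0.5
```

Setting `eps = 0` recovers the fully asymmetric graphs of Section 2, and `eps = 1` the fully symmetric case where detailed balance holds.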
We define a dynamics of two sets of spins under the influence of the same thermal noise. We thus have $n = 2$, i.e. the dynamic variables are pairs $s_i^t = (\sigma_i^t, \tau_i^t)$. The spins $\sigma_i$ and $\tau_i$ feel only the influence of their respective neighbouring spins $\sigma_j$ and $\tau_j$, with $j \in \partial_i$, through their respective local fields. The spins evolve according to:
where $\theta$ is the Heaviside step function and the weights are given by
Equation (Equation 21) can be simulated using a heat-bath algorithm where at each time step we choose the same random numbers for both sets of spins. A more compact form of (Equation 21) is:
When the thermal average of the distance between the paths of the two replicas does not converge to zero for $t \to \infty$, even when the initial distance between them is very small, the system is in a chaotic phase. We use the transition probability (Equation 22) to determine the phase transitions to this chaotic phase. Chaotic behaviour has been studied for spin glasses and for neural networks. The coupled dynamics (Equation 22) cannot satisfy detailed balance.
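The damage-spreading measurement behind this chaos criterion can be sketched directly: two replicas evolve under identical random numbers, and the Hamming distance tracks the spread of an initial single-spin damage. The code is ours, assuming the heat-bath form of the transition probability and a $\pm J$ test graph.

```python
import numpy as np

def coupled_step(s1, s2, J, beta, rng):
    """Synchronous heat-bath update of two replicas driven by the *same*
    uniform random numbers: replicas seeing similar fields make identical
    moves, so any surviving distance reflects the dynamics, not the noise."""
    u = rng.random(len(s1))                     # shared thermal noise
    h1, h2 = J @ s1, J @ s2
    s1 = np.where(u < 0.5 * (1.0 + np.tanh(beta * h1)), 1, -1)
    s2 = np.where(u < 0.5 * (1.0 + np.tanh(beta * h2)), 1, -1)
    return s1, s2

rng = np.random.default_rng(3)
N = 500
C = rng.random((N, N)) < 4.0 / N                # directed test graph
np.fill_diagonal(C, False)
J = np.where(C, rng.choice([-1.0, 1.0], size=(N, N)), 0.0)
s1 = rng.choice([-1, 1], size=N)
s2 = s1.copy()
s2[0] *= -1                                     # single-spin damage
for t in range(30):
    s1, s2 = coupled_step(s1, s2, J, beta=0.5, rng=rng)
distance = np.mean(s1 != s2)
```

A stationary `distance` bounded away from zero signals the chaotic phase; a distance that decays to zero signals the regular phase.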
5 The Path Entropy and the Distribution of the Probability Distributions of Paths
The fluctuations of the path probabilities over all links are given by the distribution of the probabilities of the paths. These distributions determine quantities like the average path entropy. On the basis of the recursive equations for the distributions we discuss, in Section 9, the stationary solutions of the dynamics.
The average path entropy is defined as
where the bar denotes the average over the quenched variables. With the cavity method, we can write
The first quantity is the increment in the entropy when the $i$-th site is added to the graph:
The second quantity is minus the entropy difference when we remove the link from the graph
The summation over the sites in equation (Equation 24) can be done when we know the distributions of the probabilities of paths on the graph.
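The bookkeeping behind (Equation 24) is the standard Bethe/cavity decomposition; written out with generic symbols (the paper's own notation did not survive extraction), it reads:

```latex
% Bethe-type decomposition of the averaged path entropy:
% site increments minus link increments, cf. (Equation 24)-(Equation 26)
\overline{S} \;=\; \sum_{i \in V} \Delta S_{i} \;-\; \sum_{(ij) \in E} \Delta S_{(ij)} ,
```

with $\Delta S_{i}$ the entropy increment upon adding site $i$ and $\Delta S_{(ij)}$ the increment associated with the link $(ij)$; subtracting the link terms removes the double counting that the site terms introduce, exactly as in the static cavity free energy.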
We define the following distributions