Unstable Dynamics, Nonequilibrium Phases and Criticality in Networked Excitable Media

Here we numerically study models of excitable media, namely, networks with occasionally quiet nodes and with connection weights that vary with activity on a short time scale. The network's global activity shows spontaneous (i.e., even in the absence of stimuli) unstable dynamics and nonequilibrium phases —including one in which the global activity wanders irregularly among attractors— and noise as the system falls into the most irregular behavior. A net result is a resilient yet efficient search of the model's attractor space, which may explain the origin of similar behavior in neural, genetic and ill–condensed matter systems. By extensive computer simulation we also address a previously conjectured relation between observed power–law distributions and the possible occurrence of a “critical state” during functionality of (e.g.) cortical networks, and describe the precise nature of such criticality in the model.

PACS: 05.10.-a, 84.35.+i, 05.45.-a, 87.19.lj, 87.18.-h, 05.45.Gg

1 Introduction

A network is said to have attractors when it can autonomously change its pattern of overall activity so as to converge with time towards one of a number of cases while being resilient to perturbations. Following psychological observations [1] and formal work by an engineer [2] and a physicist [3], the concept became popular two decades ago as a mathematical tool to explore the fundamentals of brain tasks attributed to cooperation between many neurons. According to the, say, standard model [4], patterns of information, corresponding to sets of values for the nodes' activity, are stored in a way that affects the intensities of the edges, representing synapses, which induces a heterogeneous distribution of the edge weights. The global activity may then converge towards one of the given patterns when starting with a degraded version of it. That is, the system exhibits a kind of resilience, often known as associative memory —a property that mimics the process of recognizing a childhood friend we have not seen for dozens of years— which, though natural to humans, is difficult to emulate efficiently with computers. Such a remarkable consequence of cooperation is also relevant to the understanding of complexity in a variety of systems and to solving actual optimization problems [2, 4, 5, 6, 7].

The systems of interest in nature do much more than just choose one out of a set of patterns and stay in its neighborhood, however (see [8, 9, 10] and references therein). For example, signals from the heart and cortical neural activities have been successfully interpreted using non–linear dynamic theory [11]-[18], and the standard model has been generalized along biologically–motivated lines that endow it with even more interesting behavior [19]-[26]. In particular, it was shown that one may capture some of the observed shaky mechanisms and instabilities by taking into account two features that seem to generically characterize excitable media [27], namely, by assuming both rapid activity–dependent fluctuations of the edge weights and the existence of nodes that are reluctant to change state during a time interval after operation. It is remarkable that incorporating these simple mechanisms into the standard model has allowed one to recreate [28] the transient dynamics of activity observed in experiments concerning the locust odor representation [29].

The nervous system is definitely not the only network that exhibits both varying edge weights and silent nodes at a basic level of observation and, as a reflection of this at a higher level, roaming dynamics characterized by a continuous wandering among attractors. This occurs in ill–condensed matter, for instance, whose emerging properties are determined by “microscopic disorder”. In fact, it is sensible to imagine that such disorder is more involved than assumed in familiar spin glass models. That is, the effective interactions between ions should certainly be expected to have short-time variations —associated with ion diffusion, basic chemical reactions, and other local changes concerning impurities, misfits, fields, rearrangements, strains, etc.— which would in general induce nonequilibrium patterns of activity as observed, for example, in reaction–diffusion systems [30, 31]. It is likely that the behavior of genetic networks during biological evolution is another case of microscopically–induced roaming dynamics [32, 33, 34]. Furthermore, though to our knowledge the relevance of roaming has not yet been described for other excitable systems, it is noticeable that variability of connections and occasional lack of individual activity are features that typically characterize friendship, social, professional and business contacts [35], as well as the interrelated metabolic reactions that run the cell, food webs, and transport and communication networked systems, for instance.

In this paper, we describe in detail model phenomenology bearing relevance to situations with spontaneously unstable dynamics associated with excitability as described in the two previous paragraphs. By extensive computer simulations, we show both first– and second–order phase transitions, characterize the nature of the different nonequilibrium phases [30] that occur as one modifies the system parameters, study the details of the network activity dynamics, and determine the conditions in which long–range correlations and non–Gaussian noise emerge. This results in a systematic study that adds to recent efforts to understand the origin of the observed relation between certain statistical criticality and dynamically critical functionality in neuroscience [9, 10, 36, 37, 38, 39, 40, 41]. Our study complements the analytical study of the simplest limits of the same model in Ref. [27] and the related exploratory numerical studies therein.

2 Definition of model

Consider a network of $N$ nodes in which the consequences of the activity changes of each node above threshold may be sketched by means of a binary variable, $s_i = \pm 1$: this is known to suffice in practice to investigate main effects of cooperation in different contexts [42]. Each node $i$ receives a signal —alternatively, endures a local field— $h_i(S) = \sum_{j \neq i} w_{ij} s_j$, where $S \equiv \{s_i\}$ stands for the global activity and $w_{ij}$ is the weight of the connection between nodes $i$ and $j$. In the problems of interest, one may typically single out $M$ patterns of activity, namely, $\vec{\xi}^{\mu} = \{\xi_i^{\mu} = \pm 1\}$ with $\mu = 1, \ldots, M$, that have some special relevance. The weights then follow accordingly, e.g., by means of the superposition rule $\overline{w}_{ij} = N^{-1} \sum_{\mu} \xi_i^{\mu} \xi_j^{\mu}$. This is one of the simplest conditions that transforms the special patterns into attractors of the dynamics [1, 4].

Short–time variability of connections will be introduced by assuming that their weights are given by $w_{ij} = \overline{w}_{ij}\, x_j$, where $x_j$ is a stochastic variable. In order to mimic the cases of interest, this variable should change very rapidly compared with the network characteristic time scale. Therefore, we shall assume it can be described by a stationary distribution. This is taken here as $P(x_j | S) = \zeta(\vec{m})\, \delta(x_j - \Phi) + [1 - \zeta(\vec{m})]\, \delta(x_j - 1)$. That is, with probability $\zeta(\vec{m})$, which in general depends on the global network activity, the weights are changed by a factor $\Phi$, but remain unchanged otherwise. Depending on the value of $\Phi$, this may simulate node excitability or potentiation or fatigue of the connections as a function of the degree of order in the system. The standard model corresponds to $\Phi = 1$. Other choices for $\zeta$ have been investigated [43], including one in which the weights change depending on the degree of local order, and it seems that these details do not modify essentially the system behavior. We shall further assume for simplicity that the relevant probability in $P(x_j | S)$ is a sort of order parameter, namely,

$$\zeta(\vec{m}) = \vec{m}^{\,2} \equiv \sum_{\mu=1}^{M} \left( m^{\mu} \right)^2 . \qquad\qquad (1)$$

Here, $\vec{m} = (m^1, \ldots, m^M)$ is a vector whose components are the overlaps of the current state with each of the singularized patterns, namely, $m^{\mu}(S) = N^{-1} \sum_{i} \xi_i^{\mu} s_i$.

Time evolution is a consequence of transitions $s_i \to -s_i$ that we perform with probability $\frac{1}{2}\left[ 1 - s_i \tanh\left( h_i / T \right) \right]$. Here, $T$ is a parameter that measures the degree of stochasticity driving the evolution —the so–called network temperature. Another main parameter is the fraction, $\rho$, of nodes which is updated at each unit of time —the Monte Carlo (MC) step (per node). For simplicity, we shall assume here that these nodes are chosen at random from the whole set of $N$. In this way, the result is a situation in between the limits $\rho \to 0$ (sequential or Glauber updating) and $\rho = 1$ (parallel or Little updating). The case of intermediate $\rho$ better corresponds to those situations in which, due to excitability or other causes, e.g., power economy, not all the elements are active at all times.

For simplicity, we shall be concerned only with mutually orthogonal patterns. This is achieved in practice by setting every $\xi_i^{\mu}$, for all $i$ and $\mu$, equal to $+1$ or $-1$ independently with the same probability, so that $N^{-1} \sum_i \xi_i^{\mu} \xi_i^{\nu} \simeq \delta^{\mu\nu}$ for any $\mu, \nu$ in a large system. (Assuming specific sets of correlated patterns, which is of great practical interest, is beyond the scope of this paper, which intentionally understates this model detail.) Then, under some restrictions which strictly require also the limit $\alpha \equiv M/N \to 0$ (see [44] for technical details), the conditions so far stated may be taken into account by assuming effective weights:

$$w_{ij} = \frac{1}{N}\, \vec{\xi}_i \cdot \vec{\xi}_j \left[ 1 + (\Phi - 1)\, \zeta(\vec{m}) \right] , \qquad\qquad (2)$$

where the components of $\vec{\xi}_i$ are $\xi_i^{\mu}$, $\mu = 1, \ldots, M$. We shall consider in the following this simplified version of our model, which coincides with the general case for any $\Phi$ after averaging over the stationary noise distribution $P(x_j | S)$. As a matter of fact, (2) may formally be viewed as any learning prescription affected by a multiplicative noise —with correlations built in due to the dependence on $\vec{m}$. Incidentally, connections that are roughly of this type were recently shown to induce a sort of criticality in (neural) population dynamics [45].
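To make the definition concrete, the following is a minimal sketch of one step of such a simulation in Python, assuming the notation used above —$N$ binary nodes $s_i = \pm 1$, $M$ random patterns, noise parameter $\Phi$, temperature $T$, updated fraction $\rho$— together with a standard Glauber flip probability and the form of $\zeta(\vec m)$ adopted here; the parameter values are illustrative, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative parameters (assumed for this sketch, not values from the paper)
N, M = 200, 3                     # nodes and stored patterns
T, rho, Phi = 0.05, 0.75, -0.5    # temperature, updated fraction, noise parameter

xi = rng.choice([-1, 1], size=(M, N))  # random, nearly orthogonal patterns
s = xi[0].copy()                       # start at pattern 0

def overlaps(s):
    """Overlap with each pattern: m^mu = (1/N) sum_i xi_i^mu s_i."""
    return xi @ s / N

def mc_step(s):
    """One MC step: update a randomly chosen fraction rho of the nodes."""
    m = overlaps(s)
    zeta = m @ m  # assumed form zeta(m) = |m|^2
    # Effective weights give the local fields h_i = (xi_i . m) [1 + (Phi - 1) zeta]
    h = (xi.T @ m) * (1.0 + (Phi - 1.0) * zeta)
    idx = rng.choice(N, size=int(rho * N), replace=False)
    # Glauber rule: flip s_i with probability (1/2)[1 - s_i tanh(h_i / T)]
    p_flip = 0.5 * (1.0 - s[idx] * np.tanh(h[idx] / T))
    s[idx] = np.where(rng.random(idx.size) < p_flip, -s[idx], s[idx])
    return s

for _ in range(100):
    s = mc_step(s)
print(overlaps(s))
```

Monitoring `overlaps(s)` along such a run is what produces time series like those discussed in the next section.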

3 Phases and diagrams

A main observation concerns the nature of the phases exhibited as one varies the noise parameter $\Phi$, the fraction $\rho$ of active nodes, the temperature $T$ and the load parameter $\alpha = M/N$. It turns out to be convenient to monitor the time evolution of various order parameters [46, 47]; in particular,

$$m \equiv \left\langle\, \overline{\left| m^{\mu^*} \right|} \,\right\rangle , \qquad\qquad (3)$$

where the asterisk denotes the value of $\mu$ that identifies the pattern having the largest squared overlap, and $\overline{(\cdot)}$ and $\langle \cdot \rangle$ stand, respectively, for averages over time and over independent realizations of the experiment (i.e., changing both the initial conditions and the set of the special, stored patterns). The set of the other overlaps, with $\mu \neq \mu^*$, may be characterized by:

$$m' \equiv \left\langle\, \overline{\, \frac{1}{M-1} \sum_{\mu \neq \mu^*} \left| m^{\mu} \right| \,} \,\right\rangle , \qquad\qquad (4)$$

where the sum is over all patterns excluding the one in Eq. (3). We also monitored the global activity by means of

$$a \equiv \left\langle\, \overline{\, \frac{1}{N} \sum_{i} s_i \,} \,\right\rangle . \qquad\qquad (5)$$

Our values for $m$, $m'$ and $a$ in the following involve sufficient averages of independent values to obtain smooth typical behavior, namely, from 200 to 1000 MCS and 50 to 100 systems for static values, and from 10000 to 50000 MCS and 10 systems for time–dependent values, unless indicated otherwise.
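For concreteness, the time averages above can be computed from a recorded overlap time series roughly as follows; this is our own illustration, with a toy series standing in for actual simulation data, and the extra average over independent realizations omitted for brevity.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy overlap time series, shape (timesteps, M): noisy permanence near pattern 0
M_pat = 5
m_t = rng.normal(0.0, 0.05, size=(1000, M_pat))
m_t[:, 0] += 0.9

t = np.arange(len(m_t))
mu_star = np.argmax(m_t**2, axis=1)     # pattern with the largest squared overlap
m_star = np.abs(m_t[t, mu_star])

m_order = m_star.mean()                 # time average of |m^{mu*}|, as in Eq. (3)

mask = np.ones_like(m_t, dtype=bool)    # all overlaps except mu* at each time
mask[t, mu_star] = False
m_prime = np.abs(m_t[mask]).mean()      # mean of the remaining overlaps, as in Eq. (4)

print(m_order, m_prime)                 # memory-like series: m_order large, m_prime small
```

A memory-like trajectory such as this toy one yields a large $m$ and a small $m'$, which is the signature used below to classify phases.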

In the standard case for uncorrelated patterns, the system shows three phases [46, 4]:


Memory phase, in which the system evolves towards one of the given patterns —often known as pure or Mattis states. The stationary state corresponds to maximum overlap with the particular pattern, so that $m$ is large while $m'$ remains small in the stationary state. (This case is illustrated by the two top graphs in figure 1.)


Mixture phase, in which a large system converges to a mixture of pure states, so that it exhibits some order but not associative memory. Therefore, one may have several relatively large overlaps, so that both $m$ and $m'$ take intermediate values with lower bounds —due to finite size— of order $N^{-1/2}$.


Disordered phase, in which the system remains completely disordered, dominated by thermal noise. Then, all the overlaps oscillate around zero, so that $m$ and $m'$ are of order $N^{-1/2}$ in the stationary state.

These cases correspond, respectively, to the familiar ferromagnetic, spin glass and paramagnetic phases that are well characterized in studies of equilibrium magnetic models.

The behavior of our system is more complex than suggested by this picture, however. A main novelty is that, as illustrated in figure 1,

Figure 1: The overlap functions showing typical different behaviors for nodes, patterns, noise parameter temperature and, from top to bottom: associative memory as in Ph at (a) (left) and (b) (right); irregular roaming among patterns at (c) (left) and (d) (right) as in Ph; eventual jumping between patterns after a set of oscillations between a pattern and its negative (antipattern) as in Ph at (e) (left); and pure pattern–antipattern oscillations as in Ph at (f)

the system exhibits different types of dynamic behavior that cannot be fitted into the above classification. That is, one observes that the dynamics may eventually destabilize in such a way that quite irregular jumping —among attractors as well as from one pattern to its negative (antipattern)— occurs. The observed behavior suggests defining the following dynamic scenarios, say, nonequilibrium phases, that do not occur in the standard model:


Irregular roaming in which the activity keeps randomly visiting the basins of attraction corresponding to different patterns. (This is the case in figures 1(c) and 1(d), i.e., the two middle graphs in figure 1.)


Irregular roaming as for Ph but eventually interrupted at random during some time by oscillations between a pattern and its antipattern. (This occurs in figure 1(e).)


Pure pattern–antipattern oscillations. (As in figure 1(f).)

These three genuine nonequilibrium cases correspond to characteristic values of $m$ and $m'$ (due to orthogonality), and the symmetry of the oscillations is also revealed in the time average of the overlap. In order to properly characterize these dynamic cases, we shall later monitor the statistics of the itinerant trajectory.

The different behaviors are better observed and interpreted at very low temperature. As shown in figure 2,

Figure 2: Nonequilibrium phase diagram at low temperature, obtained from a detailed analysis of all the order–parameter functions. The top (blue) line leaves the equilibrium phases above, where Ph occurs with probability 0.87 and Ph otherwise. Below it, the next (violet) line also leaves Ph above. The next (green) lines comprise an inverted–U shaped region. The inset shows the roaming region in more detail.

the disordered phase Ph is not observed at the chosen (low) temperature, while the ordered, ferromagnetic and spin–glass phases occur for any noise as long as the fraction of active nodes is not too large. That is, one may have familiar order as in equilibrium —practically independently (over a wide range) of the noise affecting the connections— as long as only a relatively small fraction of nodes are simultaneously active [44]. However, one observes small fluctuations or dispersion with time around the mean value, and the amplitude of this kind of “error” increases as one lowers $\Phi$ and increases $\rho$. This effect, which is evident when one compares the two top panels in figure 1, led us to indicate a zone Ph around the corresponding region. It is worth distinguishing this zone, which reveals how the ferromagnetic phase Ph has resilience, i.e., a remarkable stability of the attractor against large fluctuations. These increase monotonically with increasing $\rho$ and/or decreasing $\Phi$ further, and this finally results in jumping to other attractors (as in the two middle graphs in figure 1) when more than one half of the nodes are simultaneously active. This is the origin of the genuine nonequilibrium cases Ph, Ph and Ph. In fact, as shown in figure 2, one observes the onset of irregular roaming when more than one half of the nodes are updated at each step, over a range of values of the noise parameter.

The above picture and figure 2 follow from a detailed combined analysis of functions and as illustrated in figure 3. This also shows that two main types of phase transitions between equilibrium and nonequilibrium phases occur (see figure 4).

Figure 3: $m$ (left) and $m'$ (right; same axes, not shown for clarity). There is coexistence of Ph and Ph in one parameter region, while the latter phase does not show up elsewhere, and memory then occurs, but as Ph (see the main text), at sufficiently low values.

There is a second–order or continuous transition, as one maintains one of the model parameters at a constant value, from the memory phase with large “error”, i.e., Ph, to the irregular roaming phase Ph. Then, one also observes a first–order or discontinuous transition (figure 4),

Figure 4: Left: Second–order phase transition between Ph and Ph. Right: First–order phase transition between Ph and Ph. Note that different realizations using different seeds produce different values, corresponding to the different symbols; the mean over all the realizations is represented by a solid curve.

as a parameter is maintained constant, from the memory phase to the irregular roaming with pattern–antipattern oscillations, namely, Ph. Furthermore, it is noticeable here that, as illustrated in figure 5,

Figure 5: The second–order phase transition on the left of figure 4. For the same system as in that figure, the main graph here shows data for sizes 3200 and 6400, respectively from left to right in the middle of the transition. The inset is for the same values but 10 and 20 patterns, respectively, i.e., the same value of the load.

the transition region depends on the system size; that is, the critical value increases somewhat with decreasing size for finite systems, and it seems to approach a limiting value in the thermodynamic limit.

The peculiar shape of the roaming region in the phase diagram, which the inset of figure 2 shows in detail, is roughly the same as the one obtained analytically for the change of sign of the Lyapunov exponent in a closely related model (figure 2 in Ref. [27]). This confirms the general observation during our MC experiments of a kind of chaos within the inverted–U region which is delimited in figure 2 by the green lines. That is, one should ascribe a chaotic character to the roaming region. That similarity also reinforces the reliability of our measures of order, and it shows how robust the model here is in relation to the dynamically irregular behavior. It also follows, in particular, that this qualitative behavior is robust against model details, at least as long as not too many patterns are stored.

The “phases” Ph and Ph, e.g., cases (d) and (e) in figure 1, cannot be discriminated on the basis of $m$ and $m'$ only. The top panel in figure 6

Figure 6: Upper panel: Functions , and for and Bottom panel: Time series for the overlap functions in the same case. The value of is increased here during time evolution as indicated by the horizontal axis in the upper panel. Different colours correspond in this graph to different values of

illustrates how these functions change with $\rho$ for fixed $\Phi$ at low temperature. The bottom panel illustrates the dynamic transition from irregular roaming in Ph to the more regular behavior in Ph as a consequence of increasing the amplitude of fluctuations around the attractor as the fraction of active nodes is increased during the time evolution. As indicated in figure 2, the separation between the memory phase Ph or Ph and the nonequilibrium cases is clear cut, while it again proves more difficult to discriminate numerically the region Ph of pure pattern–antipattern oscillations from the chaotic region. In any case, however, our finding concerning this agrees with the analytical result in a related case [27].

4 The onset of irregularity

The above shows that the most intriguing behavior occurs when the system activity becomes irregular, e.g., as one crosses the second–order transition from the memory phase region to the nonequilibrium behavior —either at Ph with irregular roaming among attractors or at Ph where this may be randomly interrupted by series of pattern–antipattern oscillations. Figure 7

Figure 7: The local signal or field on a typical neuron (left panels) and five overlaps (right panels) indicated with different colours for a system with and, respectively from top to bottom, (near the transition point), and .

illustrates an aspect of this transition. In addition to the time evolution of some of the overlaps (right panels), which indicates where the activity is at each moment, this shows (left panels) the signal that can sometimes be monitored in experiments. As a matter of fact, this may be compared, for instance, with electrical signals measured in single neurons —as well as with more delocalized local fields— in the cortex and hippocampus of rats [48], and with MEG signals and recordings of single–neuron action potentials [49, 50].

It thus seems most interesting to characterize more quantitatively how the model signal transforms across the relevant transitions. That is, when moving from the case of random fluctuations around a constant value in the memory phase, to the case in which the amplitude of the fluctuations increases and the signal eventually switches to the negative of the original value, and finally to the case in which the frequency of switching and all the other variables become fully irregular in Ph and Ph. With this aim, we studied in detail the distribution of times of permanence in an interval around significant values of the signal. More specifically, in order to extract the relevant information from quite different signals such as those in figure 7, it turned out convenient to compute the distribution of time intervals $\tau$ in which the signal continuously stays within one of two ranges, either above a positive cutoff or below its negative. The cutoff is intended to suppress the smallest fluctuations, which correspond to non–significant noise. We thus observe, after averaging over the network, time and different experiments, that the interesting behavior requires relatively large systems, so that it does not occur for small sizes while it already becomes evident for the larger ones studied. The most interesting fact from this study is that the exponent in a power–law fit $P(\tau) \sim \tau^{-\beta}$ increases monotonically with size, in a way that might indicate a tendency towards 1.5–2 (though our data never reached this regime). These facts are illustrated in the following figures.
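The permanence–time statistics just described can be extracted from any recorded signal along the following lines; the function and the toy telegraph signal are our own illustration, with `theta` playing the role of the cutoff.

```python
import numpy as np

def permanence_times(signal, theta):
    """Lengths of the maximal runs during which the signal stays
    continuously above +theta or continuously below -theta; samples
    with |signal| <= theta count as non-significant noise."""
    state = np.where(signal > theta, 1, np.where(signal < -theta, -1, 0))
    times, run, prev = [], 0, 0
    for cur in state:
        if cur != 0 and cur == prev:
            run += 1
        else:
            if run > 0:
                times.append(run)
            run = 1 if cur != 0 else 0
        prev = cur
    if run > 0:
        times.append(run)
    return np.array(times)

# Toy telegraph-like signal: three runs of lengths 50, 30 and 20
sig = np.concatenate([np.ones(50), -np.ones(30), np.ones(20)])
print(permanence_times(sig, theta=0.5))  # -> [50 30 20]
```

A histogram of such intervals, accumulated over nodes, time and independent runs, is what is fitted to an exponential or a power law in the figures below.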

The left panel in figure 8

Figure 8: Logarithmic plots. Left: Distribution of time intervals in which the signal continuously stays in any of the two ranges, either above the cutoff or below its negative, for the sub–critical cases (a) —a practically horizontal signal in the Ph phase— and (b), the super–critical cases (d) and (e) —an exponential behavior in the Ph phase—, and the near–critical case (c). The latter, near–critical case approximately follows the dotted line over a large time interval. Each case corresponds to an average over 50 neurons and 20 independent systems running for 10 MCS. Right: Power spectra for the same cases as in the left panel using runs with 4 MCS. The power–law is illustrated with a dotted line.

shows a changeover from a generally exponential behavior to a power–law behavior near the interesting second–order phase transition. Analysis of the Fourier spectra reveals a similar situation, i.e., a changeover from exponential to power–law behavior, concerning both the signal (right panel in figure 8) and the overlap function. Figure 8 is definite evidence for statistical criticality as one approaches the relevant transition. On the other hand, figure 9

Figure 9: The same as in fig. 8 to show the effect of varying the size at fixed parameters. From bottom to top, the data —corresponding to an average over 50 neurons and 10 independent systems— are for runs of 3.5 MCS (red), 6 MCS (green), and 8 MCS (blue), respectively. (For clarity, the data points have been translated vertically.) Both the exponent in the power–law fit and the cutoff at which this power law fails clearly increase as the size is increased.

shows how the system activity close to the transition between the memory equilibrium phase Ph and the irregular behavior in Ph tends to follow the power–law distribution over a larger range as one increases the size. However, we observed (not shown) that the exponent does not depend on the load; namely, the same value is obtained for $N = 3200$ and $6400$.
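The spectral analysis referred to above can be reproduced for any recorded signal with a straightforward FFT estimate. The sketch below is our own, using white noise as a stand–in for the signal; it shows the kind of log–log slope estimate that distinguishes a flat or exponential spectrum from a power law.

```python
import numpy as np

def power_spectrum(x, dt=1.0):
    """One-sided power spectrum |FFT|^2 of a zero-mean version of x."""
    x = np.asarray(x, dtype=float) - np.mean(x)
    spec = np.abs(np.fft.rfft(x)) ** 2
    freqs = np.fft.rfftfreq(len(x), d=dt)
    return freqs[1:], spec[1:]  # drop the zero-frequency bin

# White noise: its spectrum is flat on average, so a log-log fit of S(f)
# vs f gives a slope near 0; a power-law spectrum S(f) ~ f^(-gamma)
# would instead give a slope near -gamma.
rng = np.random.default_rng(2)
f, S = power_spectrum(rng.normal(size=4096))
slope = np.polyfit(np.log(f), np.log(S), 1)[0]
print(round(slope, 2))
```

In practice one averages such spectra over many runs before fitting, as done for the figures above.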

5 Final discussion

Chemical reactions diffusing on a surface, forest fires with constant ignition of trees, parts of the nervous system vigorously reacting to weak stimuli, and the heart enduring tachycardia are paradigms of excitable systems —out of many cases in mathematics, physics, chemistry and biology; see [51, 52], for instance. Despite obvious differences, these systems share some characteristics. They comprise spatially distributed “excitable” units connected to each other and cooperating to allow for the propagation of signals without their being gradually damped by friction. The simplest realization of the relevant excitability consists in assuming that each element has a threshold and a refractory time between consecutive responses. In order to deal with a setting which is both realistic and mathematically convenient, one may suppose the system is networked with occasionally quiet nodes and connection weights that vary with activity on short time scales. As a matter of fact, experimental observations reveal rest states stable against small perturbations, which correspond to the silent nodes here, and rapidly varying strengths of connections, either facilitating or impeding transmission, which temporarily affect thresholds and may also induce time lags during response. Furthermore, it is known that such a nonequilibrium setting induces dynamic instabilities and attractors [27, 43]. On the other hand, we believe it is likely that this modelling of excitable media may in fact be related to the one by means of partial differential equations, such as when the simple FitzHugh–Nagumo model [53] is used to represent each unit.

With this motivation, we have studied excitable media by extensive computer simulations of a discrete–time model with an update rule which generalizes the Hopfield–like standard case. The resulting phenomenology as described here is expected to capture the basic behavior of a number of apparently diverse man–made and natural excitable systems. In particular, we explicitly show how the model exhibits, in the absence of stimuli, highly unstable dynamics when a sufficiently large fraction of nodes are synchronized and for certain values of a noise parameter that controls the noise within the connection strengths. We also illustrate how these instabilities induce the occurrence of novel, first– and second–order nonequilibrium phases. One of these happens to be most interesting, as it describes the global activity wandering irregularly among a number of attractors, with details strongly depending on the values of the model parameters. In particular, one may tune an efficient search in the model attractor space, which it is sensible to assume may be at the origin of phenomenology previously described for neural, genetic and ill–condensed matter systems. There is also definite evidence of non–Gaussian noise when the system is tuned into this irregular behavior, which may explain recent experimental observations of criticality and power–law distributions in cortical networks.

Finally, we remark that the mechanism behind the irregular jumping from one pattern to another is well understood in the model. That is, the relevant instabilities are to be directly associated with the effective local fields, which one may write as

$$h_i = \vec{\xi}_i \cdot \vec{m} \left[ 1 + (\Phi - 1)\, \vec{m}^{\,2} \right] \qquad\qquad (6)$$

for large $N$, i.e., neglecting terms of order $N^{-1}$. After some manipulation, one may write this more explicitly as

$$h_i = \sum_{\mu} \xi_i^{\mu} m^{\mu} + (\Phi - 1) \sum_{\mu} \xi_i^{\mu} \left( m^{\mu} \right)^3 + (\Phi - 1) \sum_{\mu \neq \nu} \xi_i^{\mu}\, m^{\mu} \left( m^{\nu} \right)^2 . \qquad\qquad (7)$$

Here, note that $\vec{m}^{\,2} = -2e$, where $e$ stands for the energy per neuron in the standard model, and the last sum is over all pairs of different indexes $\mu \neq \nu$. As discussed above, the first term tends to drive the system activity near the attractor associated with one of the stored patterns. Together with the second term in Eq. (7), this sums up to $\sum_{\mu} \xi_i^{\mu} m^{\mu} [1 + (\Phi - 1)(m^{\mu})^2]$, which, depending on the value of $\Phi$, induces instabilities and irregular behavior of the overlap dynamics similar to those in a cubic map [54]. The third term in (7), on the other hand, may be written as $\sum_j \tilde{w}_{ij} s_j$ with $\tilde{w}_{ij} = (\Phi - 1) N^{-1} \sum_{\mu \neq \nu} m^{\mu} m^{\nu} \xi_i^{\mu} \xi_j^{\nu}$. Given that $\mu$ differs from $\nu$ here, this only includes asymmetric terms similar to those that characterize the local fields for asymmetric learning rules, namely, $w_{ij} \propto \sum_{\mu} \xi_i^{\mu+1} \xi_j^{\mu}$, which are often used to store and retrieve ordered sequences of patterns [55, 47]. It is sensible to assume, therefore, that this term is most efficient in the present case in inducing transitions among patterns. Unlike for asymmetric learning [55], however, the destabilization here does not induce any order or predictability in the sequence of visited patterns.
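The cubic–map analogy can be illustrated with a one–pattern, parallel–updating mean–field sketch (our own reduction, under the assumptions of this section): the single overlap then evolves approximately as $m_{t+1} = \tanh\{ m_t [1 + (\Phi - 1) m_t^2] / T \}$, and iterating this map shows how the memory fixed point gives way to pattern–antipattern oscillations for sufficiently negative $\Phi$.

```python
import numpy as np

def step(m, Phi, T):
    """One-pattern mean-field map m -> tanh(m [1 + (Phi - 1) m^2] / T)."""
    return np.tanh(m * (1.0 + (Phi - 1.0) * m**2) / T)

def orbit_spread(Phi, T=0.1, m0=0.7, transient=1800, keep=200):
    """Spread (max - min) of the overlap after discarding a transient."""
    m = m0
    for _ in range(transient):
        m = step(m, Phi, T)
    tail = []
    for _ in range(keep):
        m = step(m, Phi, T)
        tail.append(m)
    return float(np.ptp(tail))

print(orbit_spread(1.0))    # Phi = 1, standard model: stable fixed point, zero spread
print(orbit_spread(-1.2))   # Phi < 0: overlap jumps between m and -m, large spread
```

With several stored patterns, the additional asymmetric term discussed above converts such oscillations into the irregular jumping among different patterns observed in the simulations.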

This work was supported by the Junta de Andalucía project FQM–01505, and by the Spanish MICINN–FEDER project FIS2009–08451.


  1. D.O. Hebb, The organization of behaviour: a neurophysiological theory, Wiley, New York 1949.
  2. S. Amari, “Learning patterns and pattern sequences by self-organizing nets of threshold elements”, IEEE Trans. on Computers 21, 1197 (1972)
  3. J.J. Hopfield, “Neural networks and physical systems with emergent collective computational abilities”, Proc. Natl. Acad. Sci. U.S.A. 79, 2554 (1982)
  4. D.J. Amit, Modeling brain function: The world of attractor neural networks, Cambridge Univ. Press, Cambridge 1989.
  5. J. Marro, P.L. Garrido, and J.J. Torres, “Effect of Correlated Fluctuations of Synapses in the Performance of Neural Networks”, Phys. Rev. Lett. 81, 2827 (1998)
  6. J.M. Cortes, P.L. Garrido, H.J. Kappen, J. Marro, C. Morillas, D. Navidad, and J.J. Torres, “Algorithms for Identification and Categorization”, AIP Proc. Series 779, 178 (2005)
  7. T.J. Wills, C. Lever, F. Cacucci, N. Burgess, and J. O’Keefe, “Attractor Dynamics in the Hippocampal Representation of the Local Environment”, Science, 308, 873 (2005)
  8. M. Rabinovich, R. Huerta, and G. Laurent, “Transient dynamics for neural processing”, Science 321, 48 (2008)
  9. M.O. Magnasco, O. Piro, and G.A. Cecchi, “Self-Tuned Critical Anti–Hebbian Networks”, Phys. Rev. Lett. 102, 258102 (2009)
  10. T. Petermann, T.C. Thiagarajan, M. Lebedev, M. Nicolelis, D.R. Chialvo, and D. Plenz, “Spontaneous cortical activity in awake monkeys composed of neural avalanches”, PNAS 106, 15921 (2009)
  11. E. Thelen and L.B. Smith, A Dynamic Systems Approach to the Development of Cognition and Action, The MIT Press, Cambridge MA 1994.
  12. W.J. Freeman, “A proposed name for aperiodic brain activity: stochastic chaos”, Neural Nets. 13, 11 (2000)
  13. H. Korn and P. Faure, “Is there chaos in the brain? II. Experimental evidence and related models”, C. R. Biol. 326, 787 (2003)
  14. K. Kaneko and I. Tsuda, “Chaotic itinerancy”, Chaos 13, 926 (2003)
  15. J.S. Perkiömäki, T.H. Mäkikallio, and H.V. Huikuri, “Fractal and complexity measures of heart rate variability”, Clinical and Experimental Hypertension 27, 149 (2005)
  16. M. Shen, G. Chang, S. Wang, and P.J. Beadle, “Nonlinear Dynamics of EEG Signal Based on Coupled Network Lattice Model”, LNCS 3973, 560 (2006)
  17. L.M. Jones, A. Fontanini, B. F. Sadacca, P. Miller, and D. B. Katz, “Natural stimuli evoke dynamic sequences of states in sensory cortical ensembles”, Proc. Natl. Acad. Sci. USA 104, 18772 (2007)
  18. M.I. Rabinovich and M.K. Muezzinoglu, “Nonlinear dynamics of the brain: emotion and cognition”, Physics – Uspekhi 53, 357 (2010)
  19. H. Sompolinsky, A. Crisanti, and H.J. Sommers, “Chaos in random neural networks”, Phys. Rev. Lett. 61, 259 (1988)
  20. M. Adachi and K. Aihara, “Associative dynamics in a chaotic neural network”, Neural Nets. 10, 83 (1997)
  21. H. Bersini and P. Sener, “The connections between the frustrated chaos and the intermittency chaos in small Hopfield networks”, Neural Nets. 15, 1197 (2002)
  22. H. Lu, “Chaotic attractors in delayed neural networks”, Phys. Lett. A 298, 109 (2002)
  23. X.S. Yang and Q. Yuan, “Chaos and transient chaos in simple Hopfield neural networks”, Neurocomputing 69, 232 (2005)
  24. W. Freeman and M.D. Holmes, “Metastability, instability, and state transitions in neocortex”, Neural Nets. 18, 497 (2005)
  25. C. Molter, U. Salihoglu, and H. Bersini, “The road to chaos by time–asymmetric Hebbian learning in recurrent neural networks”, Neural Comp. 19, 80 (2007)
  26. W.-Z. Huang and Y. Huang, “Chaos of a new class of Hopfield neural networks”, Applied Math. and Computation 206, 1 (2008)
  27. J. Marro, J.J. Torres, and J.M. Cortes, “Complex behaviour in a network with time-dependent connections and silent nodes”, J. Stat. Mech. P02017 (2008)
  28. J.J. Torres, J. Marro, J.M. Cortes, and B. Wemmenhove, “Instabilities in attractor networks with fast synaptic fluctuations and partial updating of the neurons activity”, Neural Nets. 21, 1272 (2008)
  29. O. Mazor and G. Laurent, “Transient dynamics versus fixed points in odor representations by locust antennal lobe projection neurons”, Neuron 48, 661 (2005)
  30. J. Marro and R. Dickman, Nonequilibrium phase transitions and critical phenomena in lattice systems, Cambridge Univ. Press, Cambridge 2005.
  31. J.J. Torres, P.L. Garrido and J. Marro, “Modeling Ionic Diffusion in Magnetic Systems”, Phys. Rev. B 58, 11488 (1998)
  32. Y. Bar-Yam, D. Harmon, and B. de Bivort, “Attractors and Democratic Dynamics”, Science 323, 1016 (2009)
  33. O. Hallatschek and D.R. Nelson, “Population genetics and range expansions”, Phys. Today, July, 42 (2009)
  34. A. Diaz–Guilera and E.R. Alvarez–Buylla, “Complexity of Boolean dynamics in simple models of signaling networks and in real genetic networks”, in Handbook on Biological Networks, edited by S. Boccaletti, V. Latora, and Y. Moreno, World Scientific Lecture Notes in Complex Systems, vol. 10, World Scientific 2009.
  35. There are models suggesting such behavior in a sociological context; see J.T. Sun, S.J. Wang, Z.G. Huang, L. Yang, Y. Do, and Y.H. Wang, “Effect of information transmission on cooperative behavior”, New J. of Physics 12, 063034 (2010) and references therein.
  36. D.R. Chialvo, “Critical brain networks”, Physica A 340, 756 (2004)
  37. V.M. Eguíluz, D.R. Chialvo, G.A. Cecchi, M. Baliki, and A.V. Apkarian, “Scale-free brain functional networks”, Phys. Rev. Lett. 94, 018102 (2005)
  38. W.J. Freeman, M.D. Holmes, A. West, and S. Vanhatalo, “Fine spatiotemporal structure of phase in human intracranial EEG”, Clinical Neurophys. 117, 1228 (2006)
  39. D.R. Chialvo, P. Balenzuela, and D. Fraiman, “The Brain: What is critical about it?”, in Collective Dynamics: Topics on Competition and Cooperation in the Biosciences, AIP Conf. Proc. 1028, 28 (2008)
  40. J.A. Bonachela, S. de Franciscis, J.J. Torres, and M.A. Muñoz, “Self–organization without conservation: are neuronal avalanches generically critical?”, J. Stat. Mech. P02015 (2010)
  41. D. Plenz and D.R. Chialvo, “Scaling properties of neuronal avalanches are consistent with critical dynamics”, to be published (2010) (arxiv.org/ftp/arxiv/papers/0912/0912.5369.pdf)
  42. J.J. Hopfield, “Neurons with graded response have collective computational properties like those of two–state neurons”, Proc. Natl. Acad. Sci. USA 81, 3088 (1984). See also M.E.J. Newman, “The structure and function of complex networks”, SIAM Review 45, 167 (2003); S. Boccaletti, V. Latora, Y. Moreno, M. Chavez, and D.U. Hwang, “Complex Networks: Structure and Dynamics”, Phys. Rep. 424, 175 (2006)
  43. S. Johnson, J. Marro, and J.J. Torres, “Functional optimization in complex excitable networks”, Europhys. Lett. 83, 46006 (2008)
  44. J.M. Cortes, J.J. Torres, J. Marro, P.L. Garrido, and H.J. Kappen, “Effect of fast presynaptic noise in attractor neural networks”, Neural Comp. 18, 614 (2006)
  45. J.F. Mejias, H.J. Kappen, and J.J. Torres, “Irregular dynamics in up and down cortical states”, to be published (2010); arxiv.org/abs/1007.3556
  46. D. J. Amit, H. Gutfreund, and H. Sompolinsky, “Storing infinite numbers of patterns in a spin-glass model of neural networks”, Phys. Rev. Lett. 55, 1530 (1985); ibid, “Spin-glass models of neural networks”, Phys. Rev. A 32, 1007 (1985)
  47. M. Griniasty, M.V. Tsodyks, and D.J. Amit, “Conversion of temporal correlations between stimuli to spatial correlation between attractors”, Neural Comp. 5, 1 (1993)
  48. J. Csicsvari, D.A. Henze, B. Jamieson, K.D. Harris, A. Sirota, P. Barthó, K.D. Wise, and G. Buzsáki, J. Neurophysiol. 90, 1314 (2003)
  49. M. Hämäläinen, R. Hari, R.J. Ilmoniemi, J. Knuutila, and O.V. Lounasmaa, Rev. Mod. Phys. 65, 413 (1993)
  50. See, however, Ref.[18] for an important concern on temporal and spatial resolution.
  51. D. Kaplan and L. Glass, Understanding Nonlinear Dynamics, Springer, New York 1995.
  52. E.M. Izhikevich, Dynamical Systems in Neuroscience: The Geometry of Excitability and Bursting, MIT Press, Cambridge MA 2007.
  53. R. FitzHugh, “Mathematical models of threshold phenomena in the nerve membrane”, Bull. Math. Biophys. 17, 257 (1955); ibid, “Impulses and physiological states in theoretical models of nerve membrane”, Biophys. J. 1, 445 (1961); J. Nagumo, S. Yoshizawa, and S. Arimoto, IEEE Trans. Commun. Technol. 12, 400 (1965)
  54. See, for instance, T.D. Rogers and D.C. Whitley, “Chaos in the cubic mapping”, Mathematical Modeling 4, 9 (1983)
  55. See, for instance, J. Hertz, A. Krogh, and R.G. Palmer, Introduction to the Theory of Neural Computation, Addison-Wesley 1991.