Mean-field dynamics of a population of stochastic map neurons


Igor Franović franovic@ipb.ac.rs Scientific Computing Laboratory, Center for the Study of Complex Systems, Institute of Physics Belgrade, University of Belgrade, Pregrevica 118, 11080 Belgrade, Serbia    Oleg V. Maslennikov olmaov@ipfran.ru Institute of Applied Physics of the Russian Academy of Sciences, 46 Ulyanov Street, 603950 Nizhny Novgorod, Russia    Iva Bačić Scientific Computing Laboratory, Center for the Study of Complex Systems, Institute of Physics Belgrade, University of Belgrade, Pregrevica 118, 11080 Belgrade, Serbia    Vladimir I. Nekorkin Institute of Applied Physics of the Russian Academy of Sciences, 46 Ulyanov Street, 603950 Nizhny Novgorod, Russia
July 3, 2019
Abstract

We analyze the emergent regimes and the stimulus-response relationship of a population of noisy map neurons by means of a mean-field model, derived within the framework of the cumulant approach complemented by the Gaussian closure hypothesis. It is demonstrated that the mean-field model can qualitatively account for the stability and bifurcations of the exact system, capturing all the generic forms of collective behavior, including macroscopic excitability, subthreshold oscillations, periodic or chaotic spiking, and chaotic bursting dynamics. Apart from the qualitative analogies, we find a substantial quantitative agreement between the exact and the approximate system, as reflected in the matching of the parameter domains admitting the different dynamical regimes, as well as in the characteristic properties of the associated time series. The effective model is further shown to reproduce with sufficient accuracy the phase response curves of the exact system and the assembly’s response to external stimulation of finite amplitude and duration.

pacs:
05.45.Xt, 89.75.Fb, 05.40.Ca

Cortical connectivity patterns exhibit a hierarchical modular organization from the microscopic level of interacting neurons, via cortical columns and other types of mesoscopic circuitry, up to fibers projecting between the distributed brain areas ZZHK06 (); BS09 (); MLB10 (). The architecture of neural assemblies comprising the anatomical modules closely reflects the functional specialization over different modalities SCKH04 (), such that the macroscopic dynamics of neuronal populations, as well as the interplay of the associated collective modes, underpin various stages of information processing and higher cognitive functions MPS10 (). In terms of evoked activity in the cortex, a prominent example concerns the sensory regions, where many neurons display similar responses to a given stimulus, which strongly indicates that the information content is primarily encoded by the collective assembly response ALP06 (). With regard to self-organized dynamics, the different forms of synchronization between spiking or bursting neurons give rise to macroscopic oscillations known to span several orders of magnitude in frequency range VLRM01 (); B09 (). Such oscillations are deemed to facilitate coordinated rhythmic tasks and provide the dynamical background behind perception, attention, memory consolidation, motor planning or sleep VLRM01 (); B09 (); VW09 (). Moreover, the interaction between multiple rhythms has been indicated as critical for merging the activities of distant brain regions through cross-frequency coupling and other mechanisms CS06 (); LJ13 (). For these reasons, gaining a deeper understanding of the macroscopic behavior of neural populations has become an outstanding issue in neuroscience, focused primarily on the mechanisms guiding the emergent collective dynamics and the fashion in which assemblies respond to various external stimuli.
Conceptually, such an approach is reminiscent of a common paradigm in nonlinear dynamics, where systems of coupled oscillators exhibiting a collective mode are typically treated as macroscopic oscillators, which may then be subjected to an external drive or can be influenced by collective rhythms from afferent populations BRZK09 ().

In the present paper, we systematically analyze the emergent dynamics and the stimulus-response relationship of a population of stochastic map neurons using a mean-field (MF) approach. The considered map neurons can exhibit a variety of regimes, including excitability, subthreshold oscillations, regular and chaotic spiking or bursting, as well as mixed spiking-bursting oscillations MN14 (); MKRN13 (); MNK15 (); MN15 (). Although the collective motion of spiking or bursting neurons subjected to noise has been extensively studied using different models of discrete local dynamics, such as the Rulkov R02 (); RTB04 (); WL07 (); WDPC08 (); BPVL07 (); ICS11 (); FM10 (); FM11 () or the Izhikevich neuron maps I06 (); IE08 (), an MF theory for a population of stochastic map neurons is obtained here for the first time. In the case of continuous-time systems, however, the MF approach has been a standard analytical tool for treating diverse problems in neuroscience and other fields FB05 (); LTGE02 (); B10 (); BC07 (); BH99 (); H03 (). Note that our derivation of the effective model relies on the Gaussian approximation, which is introduced within the framework of a Gaussian closure hypothesis LGNS04 (); FTVB14 (); FTVB13 (); FTVB12 (); KF15 (); ZSSN05 (); SZNS13 ().

The particular set of issues we address consists in establishing whether and how the MF model can be used to (i) qualitatively analyze the network stability and bifurcations of the exact system associated with the emergence of generic macroscopic regimes; (ii) provide adequate quantitative predictions in terms of the bifurcation thresholds, as well as the average interspike intervals or bursting cycles of the exact system; and (iii) accurately anticipate the population’s response to different forms of external stimuli. Within this context, it will be examined whether the effective model is capable of reproducing the properties of noise-activated, noise-induced and noise-perturbed modes of collective behavior.

The paper is organized as follows. In Section I, we make an overview of the local map dynamics and introduce the population model. Section II outlines the ingredients most relevant for the derivation of the MF system, with the remaining technical details left for the Appendix. In Section III, the qualitative and quantitative agreement between the dynamics of the exact system and the MF model is illustrated by the appropriate bifurcation diagrams, as well as by comparing the characteristic features of the associated regimes. Section IV concerns the assembly’s stimulus-response relationship, first investigating the analogy between the respective phase-response curves (PRCs) of the exact system and the effective model in the spiking and bursting regimes, and then considering the extent to which the MF model reproduces the population’s response to rectangular pulses of finite amplitude and duration. In Section V, we provide a summary of our main results.

I Map neuron dynamics and the population model

The dynamics of an isolated neuron conforms to a map model first introduced in NV07 (); CNV07 (), which is given by

x_{n+1} = x_n + F(x_n) − y_n − β H(x_n − d)
y_{n+1} = y_n + ε (x_n − J),   (1)

where n denotes the iteration step. The variable x qualitatively accounts for the membrane potential, whereas the recovery variable y, whose rate of change is set by the small parameter ε, mimics the behavior of ion-gating channels. The parameters a, β and d modify the profile of the ensuing oscillations, while J crucially influences the neural excitability, viz. the transitions from silence to active regimes.

The x-evolution features two nonlinear terms, one being a FitzHugh-Nagumo-like cubic nonlinearity F(x) = x(x − a)(1 − x), which is complemented by a discontinuity term β H(x − d), where H stands for the Heaviside step function. The parameters β and d are kept fixed throughout the paper. The impact of the discontinuity consists in making the fast subsystem (Eq. (1) with ε = 0) a Lorenz-type map within certain parameter domains CNV07 (); MN13 (), which endows the model with the ability to generate chaotic spike or burst oscillations, otherwise lacking in the FitzHugh-Nagumo type of systems.
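For readers who wish to experiment, the local dynamics is straightforward to iterate numerically. The sketch below assumes local dynamics of the form x_{n+1} = x_n + F(x_n) − y_n − βH(x_n − d), y_{n+1} = y_n + ε(x_n − J) with F(x) = x(x − a)(1 − x); the parameter values are illustrative placeholders, not the ones used in the paper.

```python
def heaviside(u):
    """H(u): 1 for u >= 0, else 0."""
    return 1.0 if u >= 0.0 else 0.0

def map_step(x, y, a=0.25, beta=0.1, d=0.45, eps=0.01, J=0.1):
    """One iteration of the neuron map; parameter values are illustrative."""
    F = x * (x - a) * (1.0 - x)                # FitzHugh-Nagumo-like cubic
    x_new = x + F - y - beta * heaviside(x - d)
    y_new = y + eps * (x - J)                  # slow recovery variable
    return x_new, y_new

def iterate(n_steps, x0=0.0, y0=0.0, **params):
    """Iterate the map and return the two time series."""
    xs, ys = [x0], [y0]
    for _ in range(n_steps):
        x, y = map_step(xs[-1], ys[-1], **params)
        xs.append(x)
        ys.append(y)
    return xs, ys
```

For the placeholder values above the fixed point is stable, so the trajectory settles to equilibrium; sweeping J and ε is the natural way to explore the regimes catalogued in Fig. 1.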

Figure 1: (Color online) Dynamical regimes exhibited by model (1). The heat map refers to the variation of the amplitude of oscillations of the x time series in the (J, ε) plane. The waveforms shown in the subfigures illustrate the different forms of the neuron’s behavior, including excitability, subthreshold oscillations, regular spiking, chaotic bursting, chaotic spiking, as well as the mixed spike-burst activity. The dots in the heat map indicate the particular values where the representative waveforms are obtained.

Under variation of J and ε, the map (1) may reproduce a rich repertoire of the generic regimes displayed by real neurons, as demonstrated in Fig. 1. In particular, the main frame shows the amplitudes of the corresponding x time series, while the remaining subfigures illustrate the characteristic waveforms pertaining to the excitable regime, subthreshold oscillations, regular or chaotic spiking, chaotic bursting, as well as the mixed chaotic spike-burst activity. Some of the indicated boundaries should be understood as tentative, since the associated transitions are smooth and therefore difficult to discern.

The detailed phase plane analysis concerning the relevant unstable invariant curves and the mechanisms underlying the transitions between the different dynamical regimes can be found in MN14b (). Here we briefly mention that under increasing J, the equilibrium loses stability via the Neimark-Sacker bifurcation, which gives rise to subthreshold oscillations. Note that the latter may be considered an excitable state, in the sense that a strong enough perturbation can elicit a genuine spike, though the phase point does not relax to the equilibrium, but rather to a closed invariant curve.

Figure 2: (Color online) Impact of noise on a single map neuron in the excitable regime. (a) indicates the mechanism behind noise-induced spiking: the equilibrium is deterministically stable, given that the line x = J intersects the invariant curve below the curve’s minimum. (b) shows the x series corresponding to noise-induced bursting, whereas (c) demonstrates stochastic spiking superimposed on subthreshold oscillations.

Adopting model (1) for the local dynamics, we focus on a population of stochastic neurons coupled in an all-to-all fashion via electrical synapses (diffusive couplings). Each neuron receives input from all the other units within the assembly, and is further influenced by synaptic noise from the embedding environment. The population activity is then described by the following system

x_{n+1}^{(i)} = x_n^{(i)} + F(x_n^{(i)}) − y_n^{(i)} − β H(x_n^{(i)} − d) + (c/N) Σ_{j=1}^{N} (x_n^{(j)} − x_n^{(i)}) + ξ_n^{(i)}
y_{n+1}^{(i)} = y_n^{(i)} + ε (x_n^{(i)} − J),   (2)

where i = 1, …, N specifies the particular neuron. The synaptic currents comprise two types of terms. The diffusive couplings are characterized by the strength c, which is assumed to be uniform over the network and is kept fixed in the remainder of the paper. The random inputs involve uncorrelated Gaussian white noise ξ_n^{(i)} of intensity σ.
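A direct simulation of such a population is easy to sketch. The fragment below is a minimal illustration, assuming the local map form quoted above, diffusive coupling through the assembly mean of x, and noise entering the x-update; the parameter values, and the exact placement of the noise term, are assumptions made for illustration rather than the paper's choices.

```python
import random

def population_step(xs, ys, c=0.04, sigma=0.001, a=0.25, beta=0.1,
                    d=0.45, eps=0.01, J=0.1, rng=random):
    """One synchronous update of N all-to-all coupled noisy map neurons."""
    N = len(xs)
    mean_x = sum(xs) / N
    new_xs, new_ys = [], []
    for x, y in zip(xs, ys):
        F = x * (x - a) * (1.0 - x)
        H = 1.0 if x - d >= 0.0 else 0.0
        coupling = c * (mean_x - x)      # equals (c/N) * sum_j (x_j - x_i)
        noise = rng.gauss(0.0, sigma)    # synaptic noise of intensity sigma
        new_xs.append(x + F - y - beta * H + coupling + noise)
        new_ys.append(y + eps * (x - J))
    return new_xs, new_ys
```

The global variables are then simply the assembly means of xs and ys recorded at each step.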

Confined to a single unit, the stochastic component may influence its dynamics either by perturbing the deterministic oscillatory regimes, or by inducing oscillations in the excitable regime, cf. Fig. 2(b). The onset of noise-induced spiking or bursting within the parameter domain where the fixed point is deterministically stable (cf. Fig. 1) corresponds to the phenomenon of stochastic bifurcation A99 (); AB04 (); GBV11 (); KSM10 (); ZSSN05 (). Stochastic bifurcations are typically described phenomenologically, in the sense that certain time-averaged quantities, such as the asymptotic probability distributions of the relevant variables or the associated power spectra, exhibit a qualitative change under variation of the noise intensity. For instance, in continuous-time systems, it has been shown that the stochastic Hopf bifurcation from a stochastically stable fixed point to a stochastically stable limit cycle is accompanied by the loss of the Gaussian property for the asymptotic distributions of the appropriate variables TP01 (). At variance with standard deterministic bifurcations, where one clearly observes a critical value of the control parameter, the change of the system’s behavior in noise-induced transitions is gradual ZSSN05 (). Note that noise can also play an important part in the region where the deterministic map shows subthreshold oscillations. Here noise can give rise to a form of dynamics reminiscent of mixed-mode oscillations, cf. Fig. 2(c).

So far, models similar to (2) have been applied to address a number of problems associated with collective phenomena in networks of coupled neurons, including the synchronization of electrically coupled units with spike-burst activity NM11 (); CMN12 (), pattern formation in complex networks with modular architecture MKRN13 (); MN14 (); MN12 (), transient cluster activity in evolving dynamical networks MN15 (), as well as the basin stability of synchronization regimes in small-world networks MNK15 (). Within this paper, the collective motion will be described in terms of the global variables X_n = (1/N) Σ_i x_n^{(i)} and Y_n = (1/N) Σ_i y_n^{(i)}.

II Derivation of the mean-field model

Considering a mean-field approximation, our main goal lies in deriving a reduced low-dimensional deterministic set of nonlinear difference equations whose dynamics is qualitatively analogous to the collective motion of the original system (2) comprised of coupled stochastic maps. In particular, the MF model should be able to generate all the regimes exhibited by the exact system, qualitatively reproducing the bifurcations that the latter undergoes. Also, applying the effective model, one should be capable of inferring with sufficient accuracy the parameter domains which admit the different collective states of the exact system, with the corresponding time series exhibiting similar characteristic quantitative features. Regarding the explicit effects of noise, the model is expected to account for the onset or suppression of the different types of collective modes associated with macroscopic spiking or bursting activity, which are mediated by synchronization or desynchronization of the individual neuron dynamics, respectively. The synchronization processes may be influenced by noise in a variety of ways, including the scenarios where noise acts as a perturbation to mainly deterministic (and chaotic) local oscillations, or the ones where noise plays a facilitatory role, in the sense that the collective mode emerges via synchronization of noise-induced local dynamics.

Given that we consider a system of discrete-time equations, one cannot adopt the usual method of deriving the MF model via the Fokker-Planck formalism SZNS13 (). Nevertheless, an analytically tractable model may still be built by focusing on the evolution of cumulants LGNS04 (); FTVB14 (); FTVB13 (); ZSSN05 (), whereby the full density of states is factorized into a series of marginal densities. The advantage of such an approach is that the simplifying approximations aimed at truncating the underlying cumulant series can be introduced in a controlled fashion. Such approximations, stated in the form of a closure hypothesis LGNS04 (), are required due to the nonlinearity of the original system, which causes the dynamics of the cumulants of a given order to be coupled to those of the higher orders.

In our case, the derivation of the effective model incorporates an explicit Gaussian closure hypothesis LGNS04 (); FTVB14 (); FTVB13 (); ZSSN05 (), by which all the cumulants above second order are assumed to vanish. The collective dynamics is then described by a set of five variables (the first- and second-order cumulants), including

  • the means, given by m_x = ⟨x_n^{(i)}⟩ and m_y = ⟨y_n^{(i)}⟩;

  • the variances, defined as S_x = ⟨(x_n^{(i)})²⟩ − m_x² and S_y = ⟨(y_n^{(i)})²⟩ − m_y²;

  • the covariance U = ⟨x_n^{(i)} y_n^{(i)}⟩ − m_x m_y, where ⟨·⟩ denotes the assembly average.

The expressions for higher order moments in terms of the first- and second-order cumulants G04 (), such as

⟨x³⟩ = m_x (m_x² + 3 S_x),   (3)

can be derived using the closure hypothesis.

The Gaussian approximation effectively amounts to an assumption that the relation

(1/N) Σ_{i=1}^{N} G(x_n^{(i)}, y_n^{(i)}) = E[G(x, y)]   (4)

holds, whereby E[·] refers to the expectation value obtained by averaging over an ensemble of different stochastic realizations. In other words, one supposes that the local variables are independent and are drawn from a normal distribution determined by the instantaneous first- and second-order cumulants. We do not know a priori whether such an assumption is fulfilled or not, but can only judge on its validity by verifying the correctness of the predictions on the population dynamics provided by the MF model. Also note that the effective model concerns the assembly dynamics in the thermodynamic limit N → ∞. The stochastic terms in this case can be neglected, as one may show them to contribute to finite-size effects which scale as 1/√N. This means that the influence of noise in our model is felt only via the noise intensity, which assumes the role of an additional bifurcation parameter.
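The content of the closure hypothesis is easy to check numerically: for Gaussian variables, the higher-order moments collapse onto the first two cumulants. The snippet below verifies the third-moment identity ⟨x³⟩ = m_x(m_x² + 3S_x) by Monte-Carlo sampling; the particular values of the mean and variance are arbitrary.

```python
import math
import random

def gaussian_third_moment(m, S):
    """Closed form <x^3> = m*(m^2 + 3*S) for x ~ N(m, S), with S the
    variance: the kind of identity used to close the cumulant hierarchy."""
    return m * (m * m + 3.0 * S)

rng = random.Random(42)
m, S = 0.3, 0.04                       # arbitrary illustrative cumulants
samples = [rng.gauss(m, math.sqrt(S)) for _ in range(200000)]
empirical = sum(x ** 3 for x in samples) / len(samples)
```

The Monte-Carlo estimate agrees with the closed form to within the sampling error of the ensemble.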

Let us now illustrate the main technical points required for the derivation of the MF model. Our focus will lie with a couple of relevant examples, whereas the remaining details are provided in the Appendix. We begin by considering the dynamics of the mean m_x, which is given by

m_x(n+1) = m_x(n) − m_y(n) + ⟨F(x_n^{(i)})⟩ − β ⟨H(x_n^{(i)} − d)⟩.   (5)

It is easy to see that there is no contribution from the coupling term. As far as the third term on the r.h.s. of Eq. (5) is concerned, using Eq. (3), one arrives at

⟨F(x)⟩ = −m(m² + 3S) + (1 + a)(m² + S) − am.   (6)

In the last expression, we have dropped the time index for simplicity and have introduced the shorthand notation m ≡ m_x, S ≡ S_x.

The key problem is how to treat the final term on the r.h.s. of Eq. (5). Our approach consists in replacing the assembly average ⟨H(x − d)⟩ by the expectation value E[H(x − d)], obtained by assuming that the local variables at an arbitrary time moment are normally distributed according to N(m_x, S_x). The expectation may then be evaluated as

E[H(x − d)] = (1/2) [1 + erf((m − d)/√(2S))],   (7)

with the error function erf(z) = (2/√π) ∫₀^z e^(−t²) dt. In the above calculation, we have explicitly used the assumption on the independence of the distributions of the local variables at any given moment of time.
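As a sanity check on the error-function expression, it can be compared against a direct Monte-Carlo estimate of the probability that a Gaussian variable exceeds the threshold; the numerical values below are arbitrary.

```python
import math
import random

def heaviside_expectation(m, S, d):
    """E[H(x - d)] for x ~ N(m, S) (variance S): the Gaussian mass lying
    above the discontinuity threshold d."""
    return 0.5 * (1.0 + math.erf((m - d) / math.sqrt(2.0 * S)))

rng = random.Random(7)
m, S, d = 0.2, 0.09, 0.45              # illustrative values
n_samples = 200000
mc = sum(1.0 for _ in range(n_samples)
         if rng.gauss(m, math.sqrt(S)) >= d) / n_samples
```

The sampled fraction and the closed form agree to within the Monte-Carlo error of the ensemble.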

In a similar fashion, one may consider the dynamics of the second-order cumulants, which constitutes the most demanding part of the derivation. In particular, proceeding from the definition, we obtain

(8)

As an illustration, let us evaluate one of the terms containing an average over the threshold function:

E[x H(x − d)] = (m/2) [1 + erf((m − d)/√(2S))] + √(S/(2π)) e^(−(m − d)²/(2S)).   (9)

Again, the time indexes have been suppressed to simplify the notation.

Leaving the remaining elements of the derivation for the Appendix, we now state the final equations of the MF model in the thermodynamic limit N → ∞:

(10)

III Analysis of stability and bifurcations

Figure 3: (Color online) Heat maps in (a) and (b) show the dependencies obtained by stochastic averaging for a network of neurons. (c) and (d) illustrate the analogous results for the MF model. The noise intensity is the same in all instances.

In this section, our goal is to demonstrate the qualitative and quantitative analogies between the dynamics of the exact system and the MF model. To this end, we first examine the succession of macroscopic regimes in the (J, ε) parameter plane for the noise intensity σ fixed at an intermediate value, see Fig. 3. As in the case of a single unit, changing J is relevant for the system’s excitability, viz. the transitions from silent to active regimes, while ε influences the waveforms of the active states (spiking, bursting, or mixed spike-bursting activity). The assembly is found to exhibit collective modes which qualitatively correspond to the dynamics of a single unit illustrated in the plates of Fig. 1. The heat maps in the left column of Fig. 3 provide a comparison between the oscillation amplitudes of the global variable X (top row) and the MF variable m_x (bottom row). The right column indicates how well the average interspike interval (or the average bursting cycle) of the exact system is matched by the corresponding characteristics of the dynamics of the MF model (II). In the given instances, the exact system comprises an assembly of neurons, with the average obtained over a sufficiently long time series, whereas for the MF model the average is taken over an ensemble of different stochastic realizations. For the interspike intervals, we have selected a convenient threshold, which allows a clear detection of individual spikes, and also enables one to unambiguously discern the initiation stage of bursts, as required for calculating the length of the bursting cycle.
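The interspike intervals referred to above can be extracted with a simple threshold-crossing criterion. The helper below is a minimal sketch: it detects upward crossings of a chosen threshold in a (global) time series and returns the intervals between consecutive crossings; the default threshold value is an arbitrary placeholder.

```python
def interspike_intervals(series, threshold=0.5):
    """Return intervals (in map steps) between successive upward crossings
    of `threshold` in `series`; each crossing marks a spike onset."""
    crossings = [n for n in range(1, len(series))
                 if series[n - 1] < threshold <= series[n]]
    return [b - a for a, b in zip(crossings, crossings[1:])]
```

The mean of the returned list gives the average interspike interval; applied to burst onsets, the same routine yields the length of the bursting cycle.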

Figure 4: (Color online) Macroscopic excitability feature. (a) and (b) show the maximum values of X and m_x reached within the time series of the exact and the MF system, starting from analogous sets of initial conditions. (c) illustrates the case where a strong enough perturbation elicits a single-spike response, whereas (d) corresponds to a bursting response made up of three spikes. In both instances, the time series of the MF model (dotted line) is indistinguishable from that of the exact system (dashed line).
Figure 5: (Color online) (a) shows a family of curves over for a network of size under fixed . Superimposed are the results for the model, whereby the symbols correspond to cases and , respectively. (b) and (c) illustrate the series associated to the spiking and the bursting collective modes. The considered network is made up of neurons, with the parameters set to in (b), and in (c). In (d) and (e) are provided the series obtained for parameters from (b) and (c).

Let us begin the analysis by focusing on the domain of J values where the exact system exhibits a stochastically stable equilibrium, while the MF model has a stable stationary state. The stochastic stability physically implies that fluctuations around the deterministic fixed point are typically of the order of noise, though some rare spikes may still be evoked. For J sufficiently close to the region admitting the subthreshold oscillations, the population manifests macroscopic excitability. To properly illustrate this feature, we have analyzed the assembly dynamics in the limit σ → 0, cf. Fig. 4. In particular, figures 4(a) and 4(b) show the maximum X and m_x values reached in the corresponding time series obtained for sets of different initial conditions. The comparison between the two plots clearly corroborates that the boundary defining the domain of spiking response is appropriately anticipated by the MF model. An important remark is that for the given J, the assembly may exhibit different forms of macroscopic excitability, generating a single spike or a burst of spikes, depending on the value of ε. This is demonstrated by the time series in figures 4(c) and 4(d). The former refers to a one-spike response. For smaller ε, one observes responses comprising two or more closely packed spikes, with Fig. 4(d) illustrating a three-spike burst. Note that the time series of the full system and the MF model are exactly matched in the limit σ → 0.

Next we address the noise-influenced transitions from silence to the active regimes observed under increasing J. To do so, in Figure 5(a) we have plotted the change of the firing (spiking or bursting) frequency for an assembly consisting of neurons. The average frequency is determined by considering an ensemble of different stochastic realizations, with the parameters fixed to the moderate values from Fig. 4. The results from simulations of the full system (2) are compared against the data obtained for the MF model. In this context, two points should be stressed. First, for moderate noise, the firing frequencies of the MF model lie in close agreement with those of the exact system. Second, one finds that such quantitative agreement extends to different forms of collective behavior, viz. it holds for different types of transitions from silent to active regimes. As already indicated, the waveforms pertaining to the active states depend on ε, such that the associated transitions are mediated by distinct synchronization processes. For certain ε, synchronization involves time series of single units that conform to the regular spiking activity from Figure 1, which is quite resilient to the impact of noise. For other ε values, the individual units exhibit chaotic bursting or spiking activity, such that the underlying synchronization process may be more susceptible to stochastic effects. The typical time series illustrating the different collective modes are compared to the corresponding MF series in figures 5(b)-(e). The top (bottom) row concerns the data for the exact system (MF model).

Figure 6: (Color online) Family of curves over obtained for a network of neurons under fixed . The different symbols correspond to cases (squares), (circles), (triangles) and (diamonds). The crosses connected by the dashed line highlight the curve for the model at .

In order to investigate more closely the influence of noise in the vicinity of the transition from silence to the active regimes, we examine how the profiles of the firing-frequency curves change under increasing σ. The results shown in Fig. 6 refer to a population comprised of neurons. As expected, the transition appears quite sharp for moderate noise, but is considerably flattened for larger σ. The crosses indicate the firing frequencies predicted by the MF model.

For larger σ, the MF model fails to reproduce the behavior of the exact system in the vicinity of the threshold, in the sense that it overestimates the maximal frequency, as well as the actual critical J characterizing the transition. Viewed from another angle, one may infer that for sufficiently large σ and J below the threshold given by the MF model, the latter fails to capture the impact of the synchronization processes taking place between the noise-induced oscillations of individual units. This especially refers to the interval where the spikes or bursts (depending on the given ε) are superimposed on the background of subthreshold oscillations. An example of such a discrepancy between the behavior of the exact and the effective system is provided in Fig. 7, cf. Fig. 7(a) and Fig. 7(c). Also, for strong noise and J values above the transition, the firing frequencies anticipated by the effective model are typically higher than those of the exact system (not shown). Within this region, the stochastic effects suppress synchronization between the chaotic oscillations of single neurons, thereby reducing the corresponding frequency. This is not accounted for with sufficient accuracy by the MF system. Note that such suppression of synchronization is reflected in the corresponding X series by the spike (burst) "skipping" mechanism, where the large-amplitude oscillations are occasionally replaced by subthreshold oscillations. For the associated parameter values, such a phenomenon is absent in the dynamics of the effective model, cf. Fig. 7(b) and Fig. 7(d). In both of the scenarios illustrated in Fig. 7, the MF model fails because the Gaussian approximation behind it breaks down under large stochastic fluctuations.

Figure 7: Noise-induced phenomena in the vicinity of the deterministic threshold. The X series in (a) shows the noise-induced spike-bursting activity on top of subthreshold oscillations. (b) illustrates the "skipping" phenomenon, where the stochastic effects occasionally suppress the large-amplitude oscillations of the X variable. In (c) and (d) are provided the MF series corresponding to the parameter sets from (a) and (b), respectively.

The fashion in which the validity of the effective model’s predictions deteriorates with increasing σ is made more explicit in Fig. 8, which shows the corresponding dependencies on σ for the exact and the approximate system at fixed J. Comparison between the respective plots in the left and right columns suggests that the range of σ values where the approximation applies is contingent on J. For instance, in the region below the deterministic threshold, one may estimate this range by noting that the effective bifurcation diagram in Fig. 8(a) indicates that noise-induced macroscopic oscillations emerge at a certain σ. Since this point is not adequately represented by the effective model, cf. Fig. 8(c), one may state that the Gaussian approximation breaks down around that σ value within the given region. Nevertheless, for J above the deterministic threshold, the validity of the MF model appears to depend rather strongly on the particular J, with the σ values where the Gaussian approximation effectively fails spanning a range of values.

Figure 8: (Color online) (a) and (b) respectively refer to and dependencies for the network of neurons under fixed . The results in (a) are obtained by averaging over a sufficiently long time series, whereas data in (b) derive from averaging over an ensemble of different stochastic realizations. In (c) and (d) are provided the and dependencies determined by numerical simulations of the model.

So far, we have investigated the impact of noise by comparing the results for a network of fixed size to those obtained for the effective system. Nevertheless, within Section II, it has already been emphasized that the MF model, deterministic in character, refers to the system’s behavior in the thermodynamic limit N → ∞, whereas the explicitly stochastic terms could only be incorporated as finite-size effects. This makes it relevant to examine how the behavior of the exact system within the domain around the deterministic threshold changes for fixed σ under increasing N. To this end, we have plotted in Fig. 9 the firing-frequency curves calculated for three increasing network sizes (squares, circles and diamonds) at fixed σ. The curve for the smallest size evinces that the given σ value is quite large, in the sense of being sufficient to induce collective oscillations within the excitable regime. Apart from the dependencies for the full system, we also show the curve associated with the MF model (dashed line with crosses). An interesting point regarding the latter is that the threshold for the emergence of the collective mode is shifted toward a larger J value compared to the smaller network sizes. While the given transition itself appears quite sharp, the curves corresponding to the exact system approach it with increasing N, both in terms of the threshold and the frequency values above the transition. This corroborates that the domain where the Gaussian approximation behind the MF model fails expectedly shrinks with the increasing system size.

Figure 9: (Color online) dependencies for increasing under fixed . The squares, circles and diamonds correspond to cases , and , respectively. The results predicted by the model are indicated by crosses connected via dashed line.

IV Response to external stimuli

The aim of this section is to investigate the extent to which the MF model can be used to predict the stimulus-response relationship of an assembly exhibiting different macroscopic regimes, including the excitable state, as well as the spiking and bursting collective modes. Let us first focus on the latter two instances and examine the sensitivity of a population to an external pulse perturbation within the framework of phase resetting theory SPB12 (); T07 (); I07 (); C06 (). In order to compare the behavior of the exact system and the effective model, we determine the corresponding phase resetting curves (PRCs), which describe the phase shift Δφ, induced by the perturbation, in terms of the phase φ at which the perturbation is applied. The considered stimulus has the form of a short pulse current, whose magnitude and width are small compared to the amplitude and the duration T of the spiking (or bursting) cycle, respectively. In the case of the exact system, the same pulse current is delivered to each neuron, adding to the x dynamics, whereas in the effective model, the stimulation is administered via the m_x variable. The phase φ of the moment when the perturbation is applied is defined in reference to the unperturbed cycle length T. The associated phase difference following the reset is calculated as Δφ = (T − T_p)/T, where T_p denotes the duration of the perturbed spiking or bursting cycle.
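The measurement protocol itself is easy to prototype. The toy example below mimics it on a bare-bones pacemaker cycle, a phase-like variable that grows by one per step and fires upon reaching the cycle length; using this stand-in instead of the neuron map is an assumption made purely for illustration. A pulse delivered at a given step shortens or lengthens the perturbed cycle T_p, and the PRC point is Δφ = (T − T_p)/T.

```python
def prc_point(T0, pulse_step, pulse_amp):
    """One PRC point for a toy pacemaker cycle of length T0.

    A phase-like variable v grows by 1 per step and 'fires' once v >= T0;
    the pulse instantaneously adds pulse_amp to v at step pulse_step.
    Returns (phi, delta_phi) in units of the full cycle; positive
    delta_phi means phase advancement.
    """
    v, n = 0.0, 0
    while True:
        if n == pulse_step:
            v += pulse_amp          # the pulse perturbation
        v += 1.0                    # free-running growth of the phase
        n += 1
        if v >= T0:                 # firing condition ends the cycle
            break
    T_p = n                         # perturbed cycle duration
    return pulse_step / T0, (T0 - T_p) / T0
```

Sweeping pulse_step across the cycle traces out the whole PRC; for the assembly, the same bookkeeping is applied to the perturbed cycle of the global variable or of m_x.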

Figure 10: (Color online) Assembly phase resetting. (a) and (b) show the PRCs for a population in the spiking regime under excitatory and inhibitory stimulation, respectively. Results for the exact system are indicated by the solid line, whereas the data for the MF model are denoted by circles. The bottom row illustrates the PRCs for an assembly exhibiting macroscopic bursting, whereby (c) describes the effect of an excitatory, and (d) of an inhibitory pulse perturbation. The insets in (a) and (c) demonstrate how the phases are assigned to the points within the spiking and bursting cycles, respectively. Phase is expressed in units of the unperturbed cycle.
Figure 11: Stimulus-response relationship in the excitable regime. The top (middle) row refers to the response of the full system (mean-field model), whereas the bottom row shows the profile of the external stimulation. In panels (a)-(c), the system parameters are , while the perturbation is characterized by . Panels (d)-(f) concern the response of an assembly subjected to a rectangular pulse. Panels (g)-(i) illustrate the response of a population influenced by the external stimulation. The considered network is of size .

The PRCs characterizing the assembly response in the spiking regime are provided in Fig. 10(a) and Fig. 10(b), whereby the former is obtained under the action of an excitatory, and the latter under the influence of an inhibitory stimulation. We stress that in both instances, the results derived from the effective model, denoted by circles, show excellent agreement with the data for the exact system (solid lines). In qualitative terms, one observes that an excitatory stimulation may advance the phase of the spiking cycle if it arrives sufficiently close to the spike, but still before the sharp rising stage. Nevertheless, an excitatory perturbation which acts during the spike or within the effective refractory period has a suppressive effect, reflected in a delay of the next spike. At variance with the excitatory stimulation, the inhibitory pulse postpones the next firing time if it is introduced within the interval close to the rising stage of the spike.

The PRCs determined for an assembly exhibiting collective bursting show qualitatively analogous effects to those described so far, see Fig. 10(c) and Fig. 10(d). This especially refers to the impact of a perturbation delivered sufficiently close to the moment of burst initiation. An apparent difference compared to Fig. 10(a) and Fig. 10(b) emerges during the bursting stage itself, where the associated PRCs expectedly exhibit strong fluctuations. Apart from that, one finds an interesting effect whereby both the excitatory and the inhibitory stimulation play a facilitatory role, i.e., cause phase advancement during the relaxation stage of the bursting cycle.

For a population in the excitable state, we consider scenarios where the system is influenced by a rectangular pulse perturbation of finite magnitude and duration, in the sense that the latter are comparable to the corresponding features of typical spiking or bursting cycles. Note that the selected value lies sufficiently away from the interval admitting the subthreshold oscillations. Again, our objective is to determine whether the mean-field model correctly anticipates the response of the exact system, now in the presence of small to moderate noise. Some illustrative examples concerning the stimulus-response relationship under the finite perturbation are provided in Fig. 11. The top and the middle row refer to the corresponding time series, while the bottom row shows the profile of the applied stimulus. We find that in the absence of noise or for sufficiently small noise, the effective model reproduces the evoked behavior of the full system quite accurately. This also refers to some highly complex forms of responses, as corroborated in Fig. 11(a)-(c). Under increasing noise, the ability of the model to predict the dynamics of the exact system gradually deteriorates, but in a fashion that involves a nontrivial parameter dependence. In particular, for smaller values, which would facilitate the macroscopic spiking mode for supercritical parameters, it turns out that the dynamics of the model agrees closely with that of the exact system even for moderate noise, cf. Fig. 11(d)-(f). However, for higher values, such an analogy between the responses of the exact and the effective system is lost, see Fig. 11(g)-(i). Naturally, the validity of the predictions given by the model deteriorates if the stimulation amplitude and duration are large, especially in the presence of non-negligible noise.

V Summary and discussion

We have developed a mean-field approach in order to systematically analyze the emergent dynamics and the input-output relationship of a population of stochastic map neurons. The reduced low-dimensional model has been derived within the framework of the Gaussian approximation, formally introduced in the form of a closure hypothesis. In physical terms, such an approximation suggests that the local variables at an arbitrary moment of time are independent and conform to a normal distribution centered about the assembly mean and characterized by the associated assembly variance. The validity of such an approximation cannot be established a priori, but has been systematically verified by numerically corroborating that the model reproduces the behavior of the exact system with sufficient accuracy.
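The closure hypothesis amounts to expressing all higher-order moments of the local variables through the assembly mean and variance alone. As an illustrative sanity check (not taken from the paper), one can numerically verify a closure identity that holds for a normal distribution with mean m and variance s², namely E[x³] = m³ + 3 m s²:

```python
import numpy as np

rng = np.random.default_rng(0)
m, s = 0.4, 0.7                      # assumed assembly mean and standard deviation

x = rng.normal(m, s, size=200_000)   # iid Gaussian stand-in for the local variables
empirical = np.mean(x ** 3)          # third raw moment estimated from samples
closure = m ** 3 + 3 * m * s ** 2    # Gaussian closure prediction

# The two agree to within sampling error; for a strongly non-Gaussian
# distribution (e.g. near a stochastic bifurcation) they would not.
```

The same mechanism lets every nonlinear assembly average in the derivation be rewritten in terms of just two collective variables, which is what makes the effective model low-dimensional.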

In particular, we have first demonstrated that the effective model can qualitatively capture all the bifurcations of the exact system leading to the onset of different generic regimes of collective behavior. As far as the quantitative agreement is concerned, we have established substantial matching between the parameter domains admitting the respective dynamical regimes for the exact and the approximate system. Moreover, the typical features of the associated regimes, such as the average interspike interval or the average bursting cycle, exhibit analogous changes with parameter variation, and in many parameter domains display numerically similar values.

An important issue has been to explicitly examine how the effects of noise are reflected in the behavior of the model. For noise-perturbed activity, where sufficiently small noise weakly influences the deterministic attractors of the system, the obtained results indicate that the Gaussian approximation holds. Nevertheless, the physical picture changes in the case of noise-induced collective behavior. In particular, for different scenarios of stochastic bifurcations, typically corresponding to transitions from subthreshold oscillations, which involve a generalized excitability feature, to spiking or bursting regimes, the exact system undergoes a gradual (smooth) change of collective dynamics, whereas the effective model exhibits a standard deterministic bifurcation with a sharp bifurcation threshold. In such instances, the collective variables of the exact system manifest large fluctuations, which explicitly violate the Gaussian approximation behind the effective model. Note that the loss of the Gaussianity property for the asymptotic distribution of the relevant variables, which accompanies the described stochastic bifurcations, does not imply per se that our Gaussian approximation fails in the supercritical state. This point is evinced by the fact that the dynamics of the effective model shows qualitatively and quantitatively similar features to those of the exact system if the considered parameters lie sufficiently above the stochastic bifurcation. In fact, the Gaussian approximation applied in the derivation of the model breaks down only in the vicinity of such transitions, where the finite-size effects neglected in Eq. (II) become most prominent. We have numerically verified the prevalence of finite-size effects in these parameter domains, showing that the change of the appropriate order parameter, such as the spiking frequency, becomes sharper as the size of the neural assembly is increased. Nevertheless, the validity of the Gaussian approximation is regained once the system is sufficiently above the bifurcation.
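The finite-size argument can be illustrated by a crude sketch (independent Gaussian units standing in for the neurons, which is an assumption made here purely for illustration): fluctuations of the assembly mean shrink as N^{-1/2}, so the terms neglected in the thermodynamic limit indeed die out for large populations, while for small N they can smear sharp bifurcation thresholds.

```python
import numpy as np

rng = np.random.default_rng(42)

def assembly_mean_fluctuation(n_units, n_trials=2000):
    """Std of the assembly mean across trials, with iid N(0, 1) units."""
    samples = rng.normal(0.0, 1.0, size=(n_trials, n_units))
    return samples.mean(axis=1).std()

s_small = assembly_mean_fluctuation(100)    # ~ 1/sqrt(100)   = 0.1
s_large = assembly_mean_fluctuation(10000)  # ~ 1/sqrt(10000) = 0.01
```

Increasing the population size by a factor of 100 reduces the fluctuations of the collective variable roughly tenfold, consistent with the observed sharpening of the order parameter.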

Apart from considering asymptotic dynamics, we have verified that the effective model is capable of capturing the stimulus-response features of the exact system. For short pulse-like perturbations, it has been found that the approximate system reproduces the PRCs of the exact system with high accuracy for both the spiking and bursting regimes of collective activity. Substantial analogies have also been observed in the case of the macroscopic excitable regime, for scenarios where the assembly is stimulated by rectangular pulse perturbations of finite amplitude and duration.

Having developed a viable mean-field approach, the present research has set the stage for a more systematic exploration of the collective dynamics of assemblies of map neurons by analytical means. We believe that the introduced techniques can be successfully applied to treat the emergent behavior of populations of chemically and delay-coupled neurons MN14 (). Moreover, the method may likely be used to explore the effects of parameter inhomogeneity, as well as to study the impact of complex network topologies MN14 (); MNK15 (). Our ultimate goal will be to extend the approach to account for the collective behavior of interacting populations of map neurons MKRN13 (); MN14 ().

Acknowledgements.
This work is supported by the Ministry of Education, Science and Technological Development of Republic of Serbia under project No. , and by the Russian Foundation for Basic Research under project No. 15-02-04245.

VI Appendix

In the following, we provide the remaining details concerning the calculation of the dynamics, which is the most complex part of the derivation of the effective model. After some algebra, Eq. (II) can be transformed to

(11)

The partial results required for completing the calculation are given by

(12)

where . Note that the time indexes have been omitted for simplicity. After some tedious work, it may also be shown that the expression for variance reads

(13)

Let us now explicitly calculate the terms containing the threshold function. First we have

(14)

Note that the second term containing the threshold function has been evaluated in the main text, cf. Eq. (II).

Finally, let us address the term , which can be estimated by considering the associated expectation . Applying the technique introduced in Sec. II, we obtain

(15)

Given that , one arrives at

(16)

This shows that the variance of the threshold function ultimately contributes to a finite-size effect which can be neglected in the thermodynamic limit.

References

  • (1) C. Zhou, L. Zemanova, G. Zamora, C. Hilgetag, and J. Kurths, Phys. Rev. Lett. 97, 238103 (2006).
  • (2) E. Bullmore, and O. Sporns, Nat. Rev. Neurosci. 10, 186 (2009).
  • (3) D. Meunier, R. Lambiotte, and E. Bullmore, Front. Neurosci. 4, 200 (2010).
  • (4) O. Sporns, D. Chialvo, M. Kaiser, and C. C. Hilgetag, Trends Cogn. Sci. 8, 418 (2004).
  • (5) Dynamic Coordination in the Brain, edited by C. von der Malsburg, W. A. Phillips, and W. Singer, (MIT Press, Cambridge, 2010).
  • (6) B. B. Averbeck, P. E. Latham, and A. Pouget, Nat. Rev. Neurosci. 7, 358 (2006).
  • (7) F. Varela, J. P. Lachaux, E. Rodriguez, and J. Martinerie, Nat. Rev. Neurosci. 2, 229 (2001).
  • (8) G. Buzsáki, Rhythms of the Brain, (Oxford University Press, Oxford, 2009).
  • (9) J. L. P. Velazquez, and R. Wennberg, Coordinated Activity in the Brain: Measurements and Relevance to Brain Function and Behavior, (Springer, New York, 2009).
  • (10) R. T. Canolty et al., Science 313, 1626 (2006).
  • (11) J. E. Lisman, and O. Jensen, Neuron 77, 1002 (2013).
  • (12) Y. Baibolatov, M. Rosenblum, Z. Z. Zhanabaev, M. Kyzgarina, and A. Pikovsky, Phys. Rev. E 80, 046211 (2009).
  • (13) O. V. Maslennikov, and V. I. Nekorkin, Phys. Rev. E 90, 012901 (2014).
  • (14) O. V. Maslennikov, D. V. Kasatkin, N. F. Rulkov, and V. I. Nekorkin, Phys. Rev. E 88, 042907 (2013).
  • (15) O. V. Maslennikov, V. I. Nekorkin, and J. Kurths, Phys. Rev. E 92, 042803 (2015).
  • (16) O. V. Maslennikov, and V. I. Nekorkin, Commun. Nonlinear Sci. Numer. Simul. 23, 10 (2015).
  • (17) N. F. Rulkov, Phys. Rev. E 65, 041922 (2002).
  • (18) N. F. Rulkov, I. Timofeev, and M. Bazhenov, J. Comput. Neurosci. 17, 203 (2004).
  • (19) D. Q. Wei, and X. S. Luo, Europhys. Lett. 77, 68004 (2007).
  • (20) Q. Y. Wang, Z. Duan, M. Perc, and G. Chen, Europhys. Lett. 83, 50008 (2008).
  • (21) C. A. S. Batista, A. M. Batista, J. A. C. de Pontes, R. L. Viana, and S. R. Lopes, Phys. Rev. E 76, 016218 (2007).
  • (22) B. Ibarz, J. M. Casado, and M. A. F. Sanjuán, Phys. Rep. 501, 1 (2011).
  • (23) I. Franović, and V. Miljković, Europhys. Lett. 92, 68007 (2010).
  • (24) I. Franović, and V. Miljković, Commun. Nonlinear Sci. Numer. Simul. 16, 623 (2011).
  • (25) E. M. Izhikevich, Neural Comput. 18, 245 (2006).
  • (26) E. M. Izhikevich, and G. M. Edelman, Proc. Natl. Acad. Sci. USA 105, 3593 (2008).
  • (27) S. E. Folias, and P. C. Bressloff, Phys. Rev. Lett. 95, 208107 (2005).
  • (28) C. R. Laing, W. C. Troy, B. Gutkin, and G. B. Ermentrout, SIAM J. Appl. Math. 63, 62 (2002).
  • (29) P. C. Bressloff, Phys. Rev. E 82, 051903 (2010).
  • (30) M. A. Buice, and J. D. Cowan, Phys. Rev. E 75, 051919 (2007).
  • (31) N. Brunel, and V. Hakim, Neural Comput. 11, 1621 (1999).
  • (32) H. Hasegawa, Phys. Rev. E 67, 041903 (2003).
  • (33) B. Lindner, J. Garcia-Ojalvo, A. Neiman, and L. Schimansky-Geier, Phys. Rep. 392, 321 (2004).
  • (34) I. Franović, K. Todorović, N. Vasović, and N. Burić, Phys. Rev. E 89, 022926 (2014).
  • (35) I. Franović, K. Todorović, N. Vasović, and N. Burić, Phys. Rev. E 87, 012922 (2013).
  • (36) I. Franović, K. Todorović, N. Vasović, and N. Burić, Chaos 22, 033147 (2012).
  • (37) V. Klinshov, and I. Franović, Phys. Rev. E 92, 062813 (2015).
  • (38) M. A. Zaks, X. Sailer, L. Schimansky-Geier, and A. B. Neiman, Chaos 15, 026117 (2005).
  • (39) B. Sonnenschein, M. A. Zaks, A. B. Neiman, and L. Schimansky-Geier, Eur. Phys. J. Special Topics 222, 2517 (2013).
  • (40) V. I. Nekorkin, and L. V. Vdovin, Izv. Vyssh. Uchebn. Zaved. Prikladn. Nelinejn. Din 15, 36 (2007).
  • (41) M. Courbage, V. I. Nekorkin, and L. V. Vdovin, Chaos 17, 043109 (2007).
  • (42) O. V. Maslennikov, and V. I. Nekorkin, Chaos 23, 023129 (2013).
  • (43) O. V. Maslennikov, and V. I. Nekorkin, ”Map-Based Approach to Problems of Spiking Neural Network Dynamics” in Nonlinear Dynamics and Complexity, V. Afraimovich, A. C. J. Luo, and X. Fu (Editors) (Springer International Publishing Switzerland, 2014).
  • (44) L. Arnold, Random Dynamical Systems, (Springer-Verlag, Berlin, 1999).
  • (45) J. A. Acebrón, A. R. Bulsara, and W.-J. Rappel, Phys. Rev. E 69, 026202 (2004).
  • (46) M. Gaudreault, J. M. Berbert, and J. Viñals, Phys. Rev. E 83, 011903 (2011).
  • (47) P. Kaluza, C. Strege, and H. Meyer-Ortmanns, Phys. Rev. E 82, 036104 (2010).
  • (48) S. Tanabe and K. Pakdaman, Phys. Rev. E 63, 031911 (2001).
  • (49) V. I. Nekorkin, and O. V. Maslennikov, Radiophys. Quantum Electron. (Engl. Transl.) 54, 56 (2011).
  • (50) M. Courbage, O. V. Maslennikov, and V. I. Nekorkin, Chaos Soliton. Fract. 45, 645 (2012).
  • (51) O. V. Maslennikov, and V. I. Nekorkin, Radiophys. Quantum Electron. (Engl. Transl.) 55, 198 (2012).
  • (52) C. W. Gardiner, Handbook of Stochastic Methods for Physics, Chemistry and the Natural Sciences, 3rd ed. (Springer-Verlag, Berlin, 2004).
  • (53) Phase Response Curves in Neuroscience: Theory, Experiment, and Analysis, edited by N. W. Schultheiss, A. A. Prinz, and R. J. Butera, (Springer, New York, 2012).
  • (54) P. A. Tass, Phase Resetting in Medicine and Biology: Stochastic Modeling and Data Analysis (Springer, Berlin, Heidelberg, 2007).
  • (55) E. M. Izhikevich, Dynamical Systems in Neuroscience: The Geometry of Excitability and Bursting (MIT Press, Cambridge, 2007), Chap. 10.
  • (56) C. C. Canavier, Scholarpedia 1, 1332 (2006).