Identification of functional information subgraphs in complex networks


Luís M. A. Bettencourt, Vadas Gintautas, and Michael I. Ham
T-7 and CNLS, Theoretical Division, MS B284, Los Alamos National Laboratory, Los Alamos NM 87545, USA
Abstract

We present a general information theoretic approach for identifying functional subgraphs in complex networks where the dynamics of each node are observable. We show that the uncertainty in the state of each node can be expressed as a sum of information quantities involving a growing number of correlated variables at other nodes. We demonstrate that each term in this sum is generated by successively conditioning mutual informations on new measured variables, in a way analogous to a discrete differential calculus. The analogy to a Taylor series suggests efficient search algorithms for determining the state of a target variable in terms of functional groups of other degrees of freedom. We apply this methodology to electrophysiological recordings of networks of cortical neurons grown in vitro. Despite strong stochasticity, we show that each cell’s patterns of firing are generally explained by the activity of a small number of other neurons. We identify these neuronal subgraphs in terms of their mutually redundant or synergetic character and reconstruct neuronal circuits that account for the state of each target cell.

PACS numbers: 87.10.+e, 87.18.Sn, 87.17.Nn, 05.45.Tp, 84.35.+i

Also at the Santa Fe Institute, 1399 Hyde Park Road, Santa Fe NM 87501, USA. Also at the Center for Complex Systems Research, Dept. of Physics, University of Illinois at Urbana-Champaign, Urbana IL 61801, USA. Also at the Center for Network Neuroscience, University of North Texas, Denton TX 76203, USA.

Information plays a central role in conditioning structure and determining collective dynamics in many complex systems. For example, the ability to process and react to information certainly influences how neurons and synapses, or genes and proteins, interact in large numbers to generate the complexity of cognitive and biological processes. Despite their importance, however, systematic methodologies for identifying functional relations between units of successive complexity, involved in information processing and storage, are still largely missing.

Motivated by recent theoretical developments and experimental breakthroughs, new interest has arisen in applications of information theory to dynamical and statistical systems with many degrees of freedom Borst and Theunissen (1999); Schreiber (2000); Paluš et al. (2001). Specifically, it has been shown that information quantities can identify and classify spatial Bialek et al. (2001) and temporal Crutchfield and Feldman (2003) correlations, and reveal if a group of variables may be mutually redundant or synergetic Schneidman et al. (2003); Bettencourt et al. (2007). In this way an information theoretic treatment of groups of correlated degrees of freedom can reveal their functional roles in terms of arrangements that can serve as memory structures or those capable of processing information.

The application of these insights to identify functional connectivity structure is still just beginning Bettencourt et al. (2007) but should provide a useful complement to other established approaches Stephan et al. (2000); Milo et al. (2002); Yeger-Lotem et al. (2004); Ziv et al. (2005) by directly relating observable dynamics or statistics to information structures. To date, the identification of functional relations between nodes of a complex network has relied on the statistics of motifs. These are specific (directed) subgraphs of nodes that appear more abundantly than expected in randomized networks with the same number of nodes and degree of connectivity Stephan et al. (2000); Milo et al. (2002); Yeger-Lotem et al. (2004). Although powerful for small subgraphs, this approach scales up poorly since the number of different subgraphs explodes combinatorially with an increasing number of nodes. Consequently, the extensive searches that are necessary for measuring motif frequencies become prohibitive beyond small subgraph sizes. A general solution to this curse of dimensionality is to perform targeted searches guided by quantitative expectations for finding the most informative node combinations relative to an external signal or to other parts of the system.

Here we present such an approach, based on the rigorous properties of information theory applied to the correlated statistical state of many variables. We show how the uncertainty in the state of any target variable, quantified by its Shannon entropy, can be expressed in terms of a cluster expansion of information quantities involving a successively larger number of variables. The sign and magnitude of each term in the expansion determines the functional connectivity among nodes to that order; specifically whether a set of nodes is functionally independent, redundant, or synergetic. Because the Shannon entropy is positive definite, this expansion gives a systematic approximation to the state of the target. As a result the expansion can be truncated at any order and used to construct approximate non-exhaustive search algorithms, analogous to gradient methods in other optimization problems. We demonstrate the efficacy of this method through its application to spike time series of cortical neuronal networks grown in vitro.

Information is a relative quantity, quantifying the increase in predictability (reduction in uncertainty) of a variable’s statistical state given knowledge of others with which it is correlated. Specifically, the uncertainty in the state of $X_0$ can be quantified by its Shannon entropy Cover and Thomas (1991) $S(X_0) = -\sum_{x_0} P(x_0) \log_2 P(x_0)$, where $P(x_0)$ are the marginals for each state $x_0$ of $X_0$. Note that $S(X_0) \ge 0$, where $S(X_0) = 0$ corresponds to precise knowledge of $X_0$ and $P(x_0) = 1$ for some state $x_0$. Measuring variables $X_1, \ldots, X_n$ correlated to $X_0$ contributes to knowledge of its state and reduces its uncertainty, thus

$S(X_0\,|\,\{X_k\}_n) \;\le\; S(X_0\,|\,\{X_k\}_{n-1}) \;\le\; \cdots \;\le\; S(X_0\,|\,X_1) \;\le\; S(X_0),$   (1)

with $n \le N$ for $N$ total variables and where $S(X_0\,|\,\{X_k\}_n)$ refers to the conditional entropy of $X_0$ given $\{X_k\}_n$ Cover and Thomas (1991). We use the notation $\{X_k\}_n$ to refer to the set $\{X_1, \ldots, X_n\}$. The difference between the entropy of $X_0$ and its entropy given the joint state of a set $\{X_k\}_n$ is the information in the set:

$I(X_0; \{X_k\}_n) = S(X_0) - S(X_0\,|\,\{X_k\}_n).$   (2)
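For concreteness, these quantities can be estimated directly from empirical state frequencies. The sketch below is ours (not the authors' code) and uses illustrative names; with aligned, discretized time series it computes $S(X_0)$, $S(X_0\,|\,\{X_k\}_n)$, and the information of Eq. (2) by simple counting:

```python
# Minimal estimators (ours, illustrative names) for the quantities in Eq. (2),
# assuming aligned, discretized time series (e.g. binary arrays of equal length).
import numpy as np
from collections import Counter

def entropy(states):
    """Shannon entropy (bits) of a sequence of hashable states."""
    counts = Counter(states)
    total = sum(counts.values())
    probs = np.array([c / total for c in counts.values()])
    return float(-np.sum(probs * np.log2(probs)))

def conditional_entropy(x0, xs):
    """S(X_0 | X_1, ..., X_n) = S(X_0, {X_k}_n) - S({X_k}_n)."""
    xs = list(xs)
    if not xs:                          # no measurements: just S(X_0)
        return entropy(x0)
    joint = list(zip(x0, *xs))          # samples of (X_0, X_1, ..., X_n)
    given = list(zip(*xs))              # samples of (X_1, ..., X_n)
    return entropy(joint) - entropy(given)

def information(x0, xs):
    """I(X_0; {X_k}_n) = S(X_0) - S(X_0 | {X_k}_n), Eq. (2)."""
    return entropy(x0) - conditional_entropy(x0, xs)
```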

These relations also specify the optimization problem of minimizing the uncertainty in $X_0$ given $n$ measurements within a larger (possibly infinite) set. Specifically, if a set $\{X_k\}_n$ exists at some order $n$ such that $S(X_0\,|\,\{X_k\}_n) = 0$, and therefore $I(X_0;\{X_k\}_n) = S(X_0)$, then it fully determines the state of $X_0$ and no uncertainty remains. Each measurement can only reduce or leave unchanged $S(X_0\,|\,\{X_k\}_n)$, while information quantities are symmetric under permutation of the $X_k$, so that the maximal entropy reduction from any given set is unique. The challenge resides in finding the measurement set of size $n$ resulting in the smallest remaining uncertainty. The computational complexity of this search grows combinatorially with the number of arrangements of size $n$ among $N$ variables, which quickly becomes prohibitive. To evade this problem, we introduce the exact expansion

$S(X_0\,|\,\{X_k\}_n) = S(X_0) + \sum_{i=1}^{n} \Delta_{X_i} S(X_0) + \sum_{i<j} \Delta_{X_j}\Delta_{X_i} S(X_0) + \cdots + \Delta_{X_n}\cdots\Delta_{X_1} S(X_0).$   (3)

The variational operators $\Delta_{X_k}$ in Eq. (3) define the change in entropy resulting from a measurement as

$\Delta_{X_1} S(X_0) = S(X_0\,|\,X_1) - S(X_0) = -I(X_0;X_1),$   (4)
$\Delta_{X_2}\Delta_{X_1} S(X_0) = \Delta_{X_1} S(X_0\,|\,X_2) - \Delta_{X_1} S(X_0) = I(X_0;X_1) - I(X_0;X_1\,|\,X_2),$   (5)
$\Delta_{X_3}\Delta_{X_2}\Delta_{X_1} S(X_0) = \Delta_{X_2}\Delta_{X_1} S(X_0\,|\,X_3) - \Delta_{X_2}\Delta_{X_1} S(X_0),$   (6)

and so on. Higher order variations follow automatically from the successive application of the first variation, resulting in a simple chain rule. Thus, variations to any order are symmetric under permutations of the $X_k$.
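The recursion implied by Eqs. (4)-(6) is straightforward to code. The sketch below builds on the estimators given after Eq. (2) (again ours, with illustrative names); each entry of `deltas` applies one further first variation, i.e. one further conditioning:

```python
# The variational operators of Eqs. (4)-(6) as a recursion (ours, built on the
# estimators sketched after Eq. (2)). Each entry of `deltas` applies one first
# variation Delta_X f = f|_X - f, i.e. one further conditioning.
def variation(x0, deltas, given=()):
    """Successive variation Delta_{deltas[-1]} ... Delta_{deltas[0]} S(X_0 | given)."""
    deltas = list(deltas)
    if not deltas:
        return conditional_entropy(x0, list(given))   # base case: S(X_0 | given)
    last = deltas.pop()                                # outermost operator, cf. Eq. (5)
    return (variation(x0, deltas, tuple(given) + (last,))
            - variation(x0, deltas, given))
```

For instance, `variation(x0, [x1])` gives $S(X_0\,|\,X_1) - S(X_0) = -I(X_0;X_1)$ as in Eq. (4), while `variation(x0, [x1, x2])` reproduces Eq. (5); by construction the result does not depend on the ordering of `deltas`.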

Figure 1: (a) Neuronal culture over a microelectrode array (electrodes: white circles; scale bar shown). (b) Detail of a spike time series. The box shows a network state (read bottom to top).

This expansion has two important properties. First, each term in the expansion at order $n$ accounts for an irreducible set of correlations among a size-$n$ group of nodes with the target $X_0$. Statistical independence among any of the $X_k$ results in a vanishing contribution at that order and terminates the expansion. For example, if all $X_k$ are mutually independent, all variations for $n \ge 2$ vanish identically and the information about $X_0$ is $\sum_i I(X_0;X_i) = -\sum_i \Delta_{X_i} S(X_0)$, that is, it is carried entirely by the first order terms in Eq. (3). If the $X_k$ are correlated in pairs, but not in higher order multiplets, then only terms with $n \le 2$ will be present, and so on. Thus, for a system where not all correlations are realized, the expansion in Eq. (3) allows the identification of correlated submultiplets, and determines their mutual organization in specifying the state of $X_0$.
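These properties can be checked numerically with the sketches above (the toy data and names below are ours): the terms of Eq. (3) sum exactly to the remaining uncertainty, and terms involving a variable with no correlation to the target vanish up to sampling noise.

```python
# Numerical sanity check (toy data and names are ours): the terms of Eq. (3)
# sum to S(X_0 | {X_k}_n) exactly, and terms that involve a variable carrying
# no correlation with the target vanish up to sampling noise.
import numpy as np
from itertools import combinations

def expansion_terms(x0, xs):
    """All terms of Eq. (3), keyed by the indices of the variables involved."""
    terms = {(): entropy(x0)}                         # zeroth order: S(X_0)
    for size in range(1, len(xs) + 1):
        for idx in combinations(range(len(xs)), size):
            terms[idx] = variation(x0, [xs[i] for i in idx])
    return terms

rng = np.random.default_rng(1)
xs = [rng.integers(0, 2, 50_000) for _ in range(3)]
x0 = (xs[0] | xs[1]) ^ (rng.random(50_000) < 0.1)     # noisy target; xs[2] is irrelevant

terms = expansion_terms(x0, xs)
print(abs(sum(terms.values()) - conditional_entropy(x0, xs)))   # ~ 0: Eq. (3) is exact
print(terms[(2,)], terms[(0, 2)])                     # ~ 0: terms involving xs[2] vanish
```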

The second important property of this expansion is that the sign of each nonvanishing variation reveals the informational character of the corresponding multiplet. Specifically, a negative sign indicates that the $n$-multiplet contributes to the state of $X_0$ with more information than the sum of all its subgroups (synergy), while a positive sign indicates the opposite (redundancy). We define a synergetic (redundant) core as a set such that its variation and the variations of all its subgroups of two or more variables are negative (positive). Explicit examples where the $X_k$ are inputs of a logical circuit and $X_0$ is the output (e.g. an AND circuit) confirm that the sign of any variation identifies synergetic arrangements to any order. Likewise, arrangements where the same information is shared among some of the $X_k$, as in a Markov chain, result in the sign of the variation indicating redundancy. Examples of these relations to low orders have been worked out recently Schneidman et al. (2003); Bettencourt et al. (2007), and their detailed generalization will appear elsewhere Gintautas et al. (2007). We also note that the concept of order-by-order synergy or redundancy captured by each of the terms in Eq. (3) generalizes the coefficient of redundancy proposed by Schneidman et al. Schneidman et al. (2003), which refers to the global information deficit (or excess, in the case of synergy) of a multiplet relative to only the first order terms in Eq. (3).
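A toy demonstration of this sign convention and of the core definition, built on the `variation` sketch above (the gate example follows the text; the helper `is_core` and all names are ours):

```python
# Toy demonstration (ours) of the sign convention and of the core definition,
# built on the `variation` sketch above.
import numpy as np
from itertools import combinations

def is_core(x0, xs, synergetic=True):
    """True if the variation of xs and of all its subgroups of two or more
    variables have the stated sign (negative for synergy, positive for redundancy)."""
    sign = -1.0 if synergetic else 1.0
    for size in range(2, len(xs) + 1):
        for idx in combinations(range(len(xs)), size):
            if sign * variation(x0, [xs[i] for i in idx]) <= 0:
                return False
    return True

rng = np.random.default_rng(0)
x1 = rng.integers(0, 2, 100_000)
x2 = rng.integers(0, 2, 100_000)
x0_and = x1 & x2                                    # synergetic circuit: AND-gate output
x0_copy = x1.copy()                                 # redundant arrangement: copies of X_0

print(variation(x0_and, [x1, x2]))                  # ~ -0.19 bits: negative, synergy
print(variation(x0_copy, [x1, x1.copy()]))          # ~ +1.0 bits: positive, redundancy
print(is_core(x0_and, [x1, x2], synergetic=True))   # True
```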

For the remainder of this Letter, we use the expansion in Eq. (3) to define the optimization problem of determining the set $\{X_k\}_n$ and its decomposition in terms of functional information arrangements that best account for the stochastic behavior of a target $X_0$. Because the entropy $S(X_0\,|\,\{X_k\}_n) \ge 0$ for all $\{X_k\}_n$, this approach defines a well posed optimization problem, with a single global minimum for each set of possible measurements.

To illustrate this methodology, we apply it to temporal action potential activity from murine frontal cortex neuronal cultures grown in vitro on non-invasive microelectrode arrays (MEAs) Maeda et al. (1995); Keefer et al. (2001); Haldeman and Beggs (2005). Fig. 1(a) shows an example network growing on an MEA and Fig. 1(b) typical time series data. Details of MEA fabrication and culture preparation are described elsewhere Gross and Schwalm (1994); Keefer et al. (2001); Bettencourt et al. (2007). These experimental platforms have become model systems for studying living neuronal networks in controlled environments. Recent progress includes studies of dynamical patterns of collective activity Segev et al. (2002); Beggs and Plenz (2003, 2004); Wagenaar et al. (2006); Ham et al. (2007), connectivity structure Jia et al. (2004); Bettencourt et al. (2007), network growth and development Wagenaar et al. (2006), and even learning and activity pattern modification Jimbo et al. (1999); Marom and Shahaf (2002); DeMarse et al. (2001) via external stimulation. Results presented here refer to cells of a mature cortical network (maturity assessed in days in vitro; see Tateno et al. (2002)). To analyze patterns of neuronal activity, binary states are constructed [see Fig. 1(b)] for each recorded neuron’s time series using temporal bins of fixed width in ms; a 1 is recorded if a neuron fires within a bin and a 0 otherwise. Probability distributions for states of neurons are estimated via frequencies and provide the basis for calculating information theoretic quantities. Probabilities are considered significant if substantially larger than those expected from a null model with randomized spiking at the observed rate of each neuron. Nearly all of the network activity occurs as global coordinated spiking events, known as network bursts or avalanches Segev et al. (2001); Beggs and Plenz (2003); Wagenaar et al. (2006); Ham et al. (2007).
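A sketch of this preprocessing is given below. It is ours, not the authors' pipeline: the bin width, data layout, and the bin-shuffling null model are illustrative assumptions consistent with the description above.

```python
# Preprocessing sketch (ours): bin spike times into binary states and build a
# rate-matched null model by shuffling bins; the bin width and shuffling scheme
# are illustrative assumptions, not the authors' exact procedure.
import numpy as np

def binarize(spike_times_ms, t_start_ms, t_stop_ms, bin_ms):
    """Binary state per bin: 1 if the neuron fired within the bin, 0 otherwise."""
    edges = np.arange(t_start_ms, t_stop_ms + bin_ms, bin_ms)
    counts, _ = np.histogram(spike_times_ms, bins=edges)
    return (counts > 0).astype(int)

def rate_matched_null(binary_state, rng):
    """Shuffle a neuron's bins, preserving its firing rate but destroying its
    correlations with the other neurons."""
    shuffled = np.array(binary_state, copy=True)
    rng.shuffle(shuffled)
    return shuffled
```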

Figure 2: Entropy of target neuron 46 remaining as other neurons are successively measured. The measurement order (rank) is determined by maximizing the variation at various orders of the expansion. Neuron numbers are shown for the exact curve. Inset: Histogram of the entropy of each neuron remaining after all possible measurements.

Fig. 2 shows the relative entropy reduction of a target neuron due to successive measurements of other neurons. Different lines correspond to searches for the optimal sequence of measurements at different orders of approximation in the expansion in Eq. (3). A search to exact order $n$ means that all size-$n$ sets of not-yet-measured variables are considered, given the previous measurements, and the set with the greatest information gain is chosen. Most neurons show an initial large drop in entropy due to the measurement of only a few other cells in the network, followed by a subsequent slower information gain as more cells are measured.
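The first-order version of such a search can be sketched as a greedy loop over candidates (names are ours; a search at higher exact order would instead score all new groups of the corresponding size at each step):

```python
# Greedy first-order search (ours, illustrative): repeatedly add the candidate
# neuron that most reduces the remaining conditional entropy of the target.
def greedy_search(x0, candidates, n_measurements):
    """Return [(chosen index, remaining entropy), ...] in measurement order."""
    measured, remaining = [], list(range(len(candidates)))
    trace = [(None, entropy(x0))]
    for _ in range(min(n_measurements, len(candidates))):
        scores = {i: conditional_entropy(x0, measured + [candidates[i]])
                  for i in remaining}
        best = min(scores, key=scores.get)        # greatest information gain
        measured.append(candidates[best])
        remaining.remove(best)
        trace.append((best, scores[best]))
    return trace
```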

Fig. 2(inset) shows the histogram of the ratio of final to initial entropy for all neurons. The final entropy refers to the fraction of a neuron’s initial entropy left unaccounted for once the set of all other available neurons is measured. Remarkably, the stochastic patterns of most cells can be nearly fully predicted by the activity of others, even though most degrees of freedom in the actual network remain unobserved (we estimate that only a small fraction of all neurons is measured). To better understand the informational nature of arrangements of neurons, we show in Fig. 3(a) the global information deficit or excess $R$ for each of the measured cells in the network. By this measure most cell groups are globally redundant (red) relative to their decomposition in terms of purely binary correlations to other cells. About a third of the cells, though, show substantial synergy (blue) that persists despite many sequential measurements. Fig. 3(b) shows the distribution of each term in the expansion in Eq. (3) as a function of order $n$. We include all multiplets up to low orders, and thereafter use a random sample of multiplets. Recall that the value and sign of each term in the expansion indicates redundancy or synergy relative to the sum of all submultiplets of lower order. Globally redundant multiplets often result in terms with alternating signs at lower orders, while a smaller number of multiplets corresponding to synergetic arrangements have negative contributions at every order.

Figure 3: (a) Sorted global information deficit/excess of each multiplet relative to the sum of the pairwise mutual informations, $R = \sum_i I(X_0;X_i) - I(X_0;\{X_k\}_n)$. (b) Values of each term in the expansion in Eq. (3) vs. order $n$ for 36,000 randomly sampled variable combinations. White to blue: negative values (synergy); red to yellow: positive values (redundancy).

Fig. 4(a) shows the frequency of synergetic and redundant cores, while Fig. 4(b) shows the reconstruction of circuits from functional subgraphs that account for the activity of the target neuron of Fig. 2 (neuron 46). Evidently the target neuron is part of both redundant and synergetic functional multiplets, with the former being substantially more abundant. The most informative neuron lies closest to the center of Fig. 4(b), but its information about the target is shared to a large extent with two other neurons. The target neuron is also part of a synergetic circuit with several other neurons, some of which belong to smaller mutually redundant subgraphs. Some of these can, at least partially, be interchanged with other neurons carrying the same information, resulting globally in an interconnected ensemble where specific synergetic functional relationships are embedded in robust redundant cell arrangements.

In summary, we present a new information theoretic approach to constructing functional subgraphs in complex networks where nodes display observable stochastic dynamics. By performing targeted searches guided by expected information gain from new measurements, we avoid some of the combinatorial issues usually involved in the search for motifs in complex networks. We apply this approach to action potential time series from networks of neurons and find that the activity of most neurons is to a large extent determined by the observation of other cells in the network. This finding is remarkable because only a small portion of the cells is accessible to measurement, indicating that large amounts of redundancy characterize neural network dynamics in these cultures. Although the activity of many neurons can be substantially accounted for by a relatively small number of other cells, an important fraction of a neuron’s entropy and detailed firing patterns is contained in multiple cell arrangements of varying size. These findings agree well with recent neuronal network reconstructions in terms of binary correlations Schneidman et al. (2006) and small multiplets Bettencourt et al. (2007), but also provide a new view of the contribution of higher order functional correlations. The identification of functional connectivity subgraphs in living neuronal cultures is critical for designing future experiments that promote computational tasks within neural networks, and should find applications more generally in other complex systems.

Figure 4: (a) Frequency of redundant (red) and synergetic (blue) cores versus size $n$. (b) Purely redundant (red) and purely synergetic (blue) circuits relative to the target neuron (neuron 46). Neurons and groups with the most information about it are closest to the center; cf. Fig. 2. Arcs identify neurons that participate in multiple functional groups.

We thank G. W. Gross for sharing his extensive expertise with growing and recording the activity of neuronal cultures. We also thank J. Crutchfield, A. Gutfriend, and A. Hagberg for helpful discussions. This work is supported by LANL’s LDRD project 20050411ER.

References

  • Borst and Theunissen (1999) A. Borst and F. E. Theunissen, Nature Neurosci. 2, 947 (1999).
  • Schreiber (2000) T. Schreiber, Phys. Rev. Lett. 85, 461 (2000).
  • Paluš et al. (2001) M. Paluš, V. Komárek, Z. Hrnčíř, and K. Štěrbová, Phys. Rev. E 63, 046211 (2001).
  • Bialek et al. (2001) W. Bialek, I. Nemenman, and N. Tishby, Physica A 302, 89 (2001).
  • Crutchfield and Feldman (2003) J. P. Crutchfield and D. P. Feldman, Chaos 13, 25 (2003).
  • Schneidman et al. (2003) E. Schneidman, W. Bialek, and M. J. Berry II, J. Neurosci. 23, 11539 (2003).
  • Bettencourt et al. (2007) L. M. A. Bettencourt, G. J. Stephens, M. I. Ham, and G. W. Gross, Phys. Rev. E 75, 021915 (2007).
  • Stephan et al. (2000) K. E. Stephan, C. C. Hilgetag, G. A. P. C. Burns, M. A. O’Neill, M. P. Young, and R. Kotter, Phil. Trans. R. Soc. Lond. B 355, 111 (2000).
  • Milo et al. (2002) R. Milo, S. Shen-Orr, S. Itzkovitz, N. Kashtan, D. Chklovskii, and U. Alon, Science 298, 824 (2002).
  • Yeger-Lotem et al. (2004) E. Yeger-Lotem, S. Sattath, N. Kashtan, S. Itzkovitz, R. Milo, R. Y. Pinter, U. Alon, and H. Margalit, Proc. Natl. Acad. Sci. U.S.A. 101, 5934 (2004).
  • Ziv et al. (2005) E. Ziv, R. Koytcheff, M. Middendorf, and C. Wiggins, Phys. Rev. E 71, 016110 (2005).
  • Cover and Thomas (1991) T. M. Cover and J. A. Thomas, Elements of Information Theory (Wiley, New York, 1991).
  • Gintautas et al. (2007) V. Gintautas, L. M. A. Bettencourt, and M. I. Ham, in preparation (2007).
  • Maeda et al. (1995) E. Maeda, H. P. Robinson, and A. Kawana, J. Neurosci. 15, 6834 (1995).
  • Keefer et al. (2001) E. W. Keefer, A. Gramowski, and G. W. Gross, J. Neurophysiol. 86, 3030 (2001).
  • Haldeman and Beggs (2005) C. Haldeman and J. M. Beggs, Phys. Rev. Lett. 94, 058101 (2005).
  • Gross and Schwalm (1994) G. W. Gross and F. U. Schwalm, J. Neurosci. Meth. 52, 73 (1994).
  • Segev et al. (2002) R. Segev, M. Benveniste, E. Hulata, N. Cohen, A. Palevski, E. Kapon, Y. Shapira, and E. Ben-Jacob, Phys. Rev. Lett. 88, 118102 (2002).
  • Beggs and Plenz (2003) J. M. Beggs and D. Plenz, J. Neurosci. 23, 11167 (2003).
  • Beggs and Plenz (2004) J. M. Beggs and D. Plenz, J. Neurosci. 24, 5216 (2004).
  • Wagenaar et al. (2006) D. A. Wagenaar, Z. Nadasdy, and S. M. Potter, Phys. Rev. E 73, 051907 (2006).
  • Ham et al. (2007) M. I. Ham, L. M. A. Bettencourt, G. W. Gross, and F. D. McDaniel, to appear in J. Comp. Neurosci. (2007).
  • Jia et al. (2004) L. C. Jia, M. Sano, P.-Y. Lai, and C. K. Chan, Phys. Rev. Lett. 93, 088101 (2004).
  • Jimbo et al. (1999) Y. Jimbo, T. Tateno, and H. P. C. Robinson, Biophys. J. 76, 670 (1999).
  • Marom and Shahaf (2002) S. Marom and G. Shahaf, Q. Rev. Biophys. 35, 63 (2002).
  • DeMarse et al. (2001) T. B. DeMarse, D. A. Wagenaar, A. W. Blau, and S. M. Potter, Auton. Rob. 11, 305 (2001).
  • Tateno et al. (2002) T. Tateno, A. Kawana, and Y. Jimbo, Phys. Rev. E 65, 051924 (2002).
  • Segev et al. (2001) R. Segev, Y. Shapira, M. Benveniste, and E. Ben-Jacob, Phys. Rev. E 64, 011920 (2001).
  • Schneidman et al. (2006) E. Schneidman, M. J. Berry II, R. Segev, and W. Bialek, Nature 440, 1007 (2006).