Generalized Synchronization of Coupled Chaotic Systems
In this paper we briefly report some recent developments in generalized synchronization. We discuss different methods of detecting generalized synchronization, first for two unidirectionally coupled systems and then for two mutually coupled systems. We then extend the study to a network of coupled systems. For generalized synchronization of coupled nonidentical systems we discuss the Master Stability Function (MSF) formalism for coupled nearly identical systems. Later we use this MSF to construct synchronized optimized networks. In the optimized networks the nodes whose parameter value lies at one extreme emerge as hubs, and pairs of nodes with a larger parameter difference are preferred when creating links.
The study of synchronization of coupled chaotic systems has attracted much attention recently Strogatz2008 (); Pikovsky2001 (); Boccaletti2002 (); Arenas2008 (). Synchronization of coupled dynamical systems can be defined as a process where two or more coupled systems adjust their trajectories to a common behavior. In the literature, the most commonly studied synchronization is between systems which have exactly the same, or identical, dynamical equations. Two identical coupled chaotic systems are said to be synchronized when the state variables of the coupled systems become equal. This type of synchronization is known as complete synchronization (CS) Fujisaka1983 (); Afraimovich1986 (); Pecora1990 (). However, for coupled nonidentical systems it is not possible to observe CS; instead one finds other forms of synchronization such as phase synchronization (PS) Rosenblum1996 () and generalized synchronization (GS) Abarbanel1995 (). PS is a weaker form of synchronization, where the phases of the coupled systems become locked but their amplitudes are in general unrelated. In GS the state variables of the coupled systems are related by some function. GS occurs mainly for coupled nonidentical systems, and CS can be considered as a special case of GS. In this paper we consider GS of coupled chaotic systems.
2 Generalized synchronization for two coupled chaotic systems
In this section we review generalized synchronization between two coupled chaotic systems. The concept of GS was introduced first for unidirectionally coupled systems, i.e. systems coupled in a drive-response configuration, by Abarbanel et al. Abarbanel1995 () (see Fig. 1(a)). Let us consider an n-dimensional drive system x driving an m-dimensional response system y. The dynamics of the drive and the response systems can be written as
where x and y are the state variables of the drive and response systems respectively, ε is the coupling parameter, F and G give the uncoupled dynamics of the drive and response systems respectively, and H is the driving function. For a suitable driving function H and a sufficiently large coupling parameter ε, systems (1a) and (1b) can exhibit GS.
When ε = 0 the evolution of the response system is independent of the drive system. As the coupling parameter is increased, the coupled systems are said to show generalized synchronization when there exists a map φ relating the state variables of the response system to those of the drive system, i.e.
The synchronization manifold is defined by the condition y = φ(x) and the motion of the synchronized systems collapses onto this synchronization manifold. In most cases it is difficult to determine the functional relation φ between the coupled systems. For GS this functional relation must hold for the trajectories on the attractors, but not necessarily for the transient trajectories.
3 Detection of Generalized Synchronization
In this section we will discuss the schemes that have been developed to detect generalized synchronization.
3.1 Mutual False Nearest Neighbors (MFNN) method
In this section we briefly discuss the mutual false nearest neighbors (MFNN) method for detecting generalized synchronization Abarbanel1995 (). The main feature of this method is the concept of local neighborliness. In the generalized synchronized state the trajectories of the drive and response systems are connected by the functional relationship (2). The method depends on the observation that in the synchronized state, two neighboring points in the phase space of the drive system correspond to two neighboring points in the phase space of the response system.
Let us consider a time series of the drive variable x and the corresponding time series of the response variable y. From the time series the attractors of the drive and the response systems can be reconstructed using embedding methods Farmer1980 (); Abarbanel1992 (). Let the embedding dimensions of the drive and the response reconstructions be larger than the respective global embedding dimensions required to unfold the attractors. Choose an arbitrary point x_n from the drive time series and let its nearest phase-space neighbor in that time series be x_n'. In generalized synchronization, we can expect that the corresponding points y_n and y_n' of the response system are close. Using Eq. (2), the distance between these two points of the response system can be written as
As the difference x_n − x_n' is expected to be small, we can write Eq. (3) as,
where Dφ is the Jacobian matrix of the map φ evaluated at x_n.
Similarly, we now consider the point y_n in the phase space of the response system and find its nearest neighbor y_n'' in the response time series. Let the corresponding points of the drive system be x_n and x_n''. Using Eq. (2), the distance between these points of the response variable can be written as
The MFNN parameter is defined as the ratio,
In the synchronized state the MFNN parameter will be of order unity.
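The idea can be sketched numerically. The following Python sketch is a simplified variant of the MFNN ratio (the exact normalization of the original method differs): for sampled points it compares the response-space distance between images of drive-space nearest neighbors with the genuine response-space nearest-neighbor distance. The function names, embedding parameters and brute-force neighbor search are our own illustrative choices, not taken from the paper.

```python
import numpy as np

def embed(series, dim=3, tau=1):
    """Delay-embed a scalar time series (Takens reconstruction)."""
    n = len(series) - (dim - 1) * tau
    return np.column_stack([series[i * tau : i * tau + n] for i in range(dim)])

def nearest_index(points, n):
    """Index of the nearest phase-space neighbor of points[n], excluding n itself."""
    d = np.linalg.norm(points - points[n], axis=1)
    d[n] = np.inf
    return int(np.argmin(d))

def mfnn_parameter(x_series, y_series, dim=3, tau=1, samples=200, rng=None):
    """Simplified MFNN-style ratio: O(1) values indicate a functional relation
    between the reconstructions, large values indicate its absence."""
    if rng is None:
        rng = np.random.default_rng(0)
    X, Y = embed(x_series, dim, tau), embed(y_series, dim, tau)
    ratios = []
    for n in rng.integers(0, len(X), size=samples):
        n_x = nearest_index(X, n)            # neighbor found in the drive reconstruction
        n_y = nearest_index(Y, n)            # neighbor found in the response reconstruction
        num = np.linalg.norm(Y[n] - Y[n_x])  # image distance in response space
        den = np.linalg.norm(Y[n] - Y[n_y])  # true NN distance in response space
        if den > 0:
            ratios.append(num / den)
    return float(np.mean(ratios))
```

For a response that is a smooth function of the drive the ratio stays of order unity, while for unrelated signals the image of a drive-space neighbor lands at a typical attractor distance, giving a much larger ratio.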
3.2 Auxiliary System Approach (ASA)
The auxiliary system approach (ASA) is another way of detecting generalized synchronization between the drive system and the response system of Eq. (1) Abarbanel1996 (). In ASA, we consider another replica z of the response system, driven by the same drive system (see Fig. 1(b)), and the dynamics of this auxiliary system is given as
where ε is the coupling constant.
When the system y is not in generalized synchronization with the system x, the trajectories of the response system and the auxiliary system will be unrelated. When x and y are in generalized synchronization with the relation y = φ(x), then there clearly exists a solution z = φ(x), i.e. z = y. The stability of the synchronization manifold ensures that z can track y as time increases. Thus, in the case of generalized synchronization the orbits of the response system and the auxiliary system tend to each other after the transients die out, i.e. z → y.
It can be shown that the linear stability of the manifold z = y is the same as the linear stability of the synchronization manifold y = φ(x). To see this, let us consider the linearized equations in δy = y − φ(x) and δz = z − φ(x) as
where DG is the Jacobian matrix of the response dynamics calculated at the synchronized solution φ(x). Since the linearized equations for δy and δz are identical, the linearized equation for δy − δz = y − z is also identical to them, i.e.
Therefore, if the manifold y = φ(x) of the generalized synchronized motion is linearly stable then the manifold z = y is linearly stable, and vice versa.
As an example, we consider two nonidentical chaotic Rössler systems coupled in drive response configuration. The drive Rössler system is
and the response Rössler system is
where a, b, c and ω are the Rössler parameters, with a parameter mismatch between the drive and the response. For this configuration GS can be observed for large coupling parameters. To test the ASA we make an auxiliary system which is a replica of the response system and drive this replica in exactly the same way as the response system. In Fig. 2, the projection of the post-transient phase-space trajectories of the response Rössler system and the auxiliary system is plotted for (a) no synchronization and (b) GS.
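The ASA test above can be reproduced with a short simulation. Since the coupled equations (11)-(12) are not reproduced here, the sketch below assumes standard Rössler parameters, a frequency mismatch between drive and response, and diffusive coupling through the y component; these assumptions, and names such as `asa_distance`, are ours. The response and its auxiliary replica are driven by the same drive signal, and the time-averaged distance between them is measured after transients.

```python
import numpy as np

A, B, C = 0.2, 0.2, 5.7                      # standard Rössler parameters (assumed)

def rossler(s, w):
    """Rössler vector field with frequency parameter w."""
    x, y, z = s
    return np.array([-w * y - z, w * x + A * y, B + z * (x - C)])

def rk4(s, f, dt):
    """One fourth-order Runge-Kutta step for ds/dt = f(s)."""
    k1 = f(s); k2 = f(s + 0.5 * dt * k1)
    k3 = f(s + 0.5 * dt * k2); k4 = f(s + dt * k3)
    return s + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)

def asa_distance(eps, w_d=0.95, w_r=1.05, n_trans=15000, n_meas=5000, dt=0.01):
    """Time-averaged |response - auxiliary| after transients, for a drive
    Rössler system unidirectionally coupled to the response through the
    y component (an assumed diffusive coupling form)."""
    rng = np.random.default_rng(0)
    x = rng.uniform(-1, 1, 3)                # drive
    y = rng.uniform(-1, 1, 3)                # response
    ya = rng.uniform(-1, 1, 3)               # auxiliary replica of the response
    total = 0.0
    for n in range(n_trans + n_meas):
        # response and auxiliary see exactly the same drive signal
        f_resp = lambda s, xv=x: rossler(s, w_r) + eps * np.array([0.0, xv[1] - s[1], 0.0])
        x = rk4(x, lambda s: rossler(s, w_d), dt)
        y = rk4(y, f_resp, dt)
        ya = rk4(ya, f_resp, dt)
        if n >= n_trans:
            total += np.linalg.norm(y - ya)
    return total / n_meas
```

For strong coupling the distance collapses to zero, signalling GS; for weak coupling it remains of the order of the attractor size, as in Fig. 2.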
3.3 Lyapunov Exponents Method
Here we briefly discuss the Lyapunov exponent method to analyze GS of coupled chaotic systems. The LE calculation is known to detect the GS boundary more precisely than the MFNN and ASA methods, which mainly give a qualitative confirmation of the GS regime. The criterion for stable CS in unidirectionally coupled identical systems is that all the Lyapunov exponents which are transverse to the synchronization manifold are negative Vaidya1992 (). These exponents are known as the transverse Lyapunov exponents (TLEs).
One can extend the same idea to analyze stability of GS for coupled nonidentical systems Parlitz1996 (). In this case, GS occurs if and only if all the transverse Lyapunov exponents are negative.
We consider the drive-response configuration given by Eq. (1), where the dimension of the drive system is n and that of the response system is m. The behavior of these unidirectionally coupled systems is characterized by the Lyapunov exponent spectrum of the entire (n + m)-dimensional system. For this configuration the drive system evolves independently, thus its LE spectrum is not affected by the response system. Among the n + m LEs of the entire coupled system it is easy to identify those corresponding to the drive system, since in the corresponding eigenvectors the components belonging to the response system are zero. The remaining m exponents are the transverse Lyapunov exponents (TLEs). GS is stable when all the TLEs are negative, or equivalently when the largest TLE is negative.
Let us consider two unidirectionally coupled nonidentical Rössler systems as given in Eq. (11) and Eq. (12). In Fig. 3(a), the four largest Lyapunov exponents of the coupled systems are shown as a function of the coupling parameter ε. For ε = 0, the figure shows two positive and two zero exponents. As the coupling parameter is increased, first one zero exponent becomes negative; at this point phase synchronization occurs between the coupled systems Rosenblum1996 (). With further increase in the coupling parameter, one positive exponent becomes negative and at this point the coupled systems undergo generalized synchronization Parlitz1996 (). For comparison with ASA we plot the time-averaged Euclidean distance between the response system and its auxiliary system as a function of the coupling parameter in Fig. 3(b), which shows that this distance tends to zero at the same coupling value at which the positive exponent becomes negative.
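The largest TLE can be estimated with the standard Benettin procedure: a tangent vector of the response subsystem is evolved with the response Jacobian, which for diffusive y-coupling (the same assumed coupling form as above; the paper's Eqs. (11)-(12) are not reproduced here) acquires a damping term −ε in its y-entry. All names below are our own.

```python
import numpy as np

A, B, C = 0.2, 0.2, 5.7                      # standard Rössler parameters (assumed)

def rossler(s, w):
    x, y, z = s
    return np.array([-w * y - z, w * x + A * y, B + z * (x - C)])

def rk4(s, f, dt):
    k1 = f(s); k2 = f(s + 0.5 * dt * k1)
    k3 = f(s + 0.5 * dt * k2); k4 = f(s + dt * k3)
    return s + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)

def largest_tle(eps, w_d=0.95, w_r=1.05, n_trans=5000, n_steps=40000, dt=0.01):
    """Largest transverse Lyapunov exponent (Benettin method) for a response
    Rössler system with assumed diffusive y-coupling to the drive."""
    rng = np.random.default_rng(1)
    x, y = rng.uniform(-1, 1, 3), rng.uniform(-1, 1, 3)
    v = rng.uniform(-1, 1, 3)
    v /= np.linalg.norm(v)
    lsum = 0.0
    for n in range(n_trans + n_steps):
        # response Jacobian at y; coupling contributes -eps in the (y, y) entry
        J = np.array([[0.0, -w_r, -1.0],
                      [w_r, A - eps, 0.0],
                      [y[2], 0.0, y[0] - C]])
        f_resp = lambda s, xv=x: rossler(s, w_r) + eps * np.array([0.0, xv[1] - s[1], 0.0])
        x = rk4(x, lambda s: rossler(s, w_d), dt)
        y = rk4(y, f_resp, dt)
        v = rk4(v, lambda u, Jm=J: Jm @ u, dt)   # tangent vector step
        if (n + 1) % 10 == 0:                    # renormalize periodically
            nv = np.linalg.norm(v)
            if n >= n_trans:
                lsum += np.log(nv)
            v /= nv
    return lsum / (n_steps * dt)
```

The GS transition in Fig. 3(a) corresponds to the sign change of this quantity: positive for weak coupling, negative once GS sets in.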
4 Generalized synchronization for mutually coupled systems
In the above section we have discussed the emergence and detection of GS for unidirectionally coupled systems. We have seen that in GS for unidirectionally coupled systems there exists a functional relation y = φ(x) between the state variables of the drive system x and the response system y. For unidirectionally coupled systems the evolution of the drive system does not depend on the evolution of the response system, but for bidirectionally or mutually coupled systems the state variables of each system depend on the state variables of the other system. So, for such a case the functional relation is to be modified to the form Cross2002 (); Moskalenko2012 (),
In Fig. 4(a) the schematic diagram of two mutually coupled systems x and y is shown. Let us consider the dynamics of the two mutually coupled systems x and y given by,
where x and y are the state variables of the coupled systems, ε1 and ε2 are the coupling parameters and H1 and H2 are the coupling functions. For suitable coupling functions and coupling parameters the systems x and y will show GS.
4.1 ASA to detect GS for mutually coupled systems
In section 3.2, we have discussed the ASA to detect GS for unidirectionally coupled systems. For unidirectionally coupled systems, an auxiliary system of the response system is created by replicating the response system and this auxiliary system is driven with the same drive system. In the stable GS the Euclidean distance between the response system and the auxiliary system goes to zero.
For mutually coupled systems we need to consider an auxiliary system corresponding to each system and drive these auxiliary systems in the same way as their original copies are driven. Fig. 4(b) shows the schematic diagram of ASA for two mutually coupled systems x and y. For Eq. (14) we consider the following auxiliary systems
where x′ and y′ are the state variables of the auxiliary systems. Let d1 and d2 denote the time-averaged Euclidean distances between the systems x and x′ and between the systems y and y′ respectively. In GS, the Euclidean distance between each auxiliary system and its original copy goes to zero.
Let us demonstrate the ASA method with the help of two mutually coupled nonidentical Rössler systems
In Fig. 5(a) the time-averaged Euclidean distances d1 and d2 are plotted as a function of the coupling parameter ε. As the coupling parameter is increased, the Euclidean distance d1 between the systems x and x′ goes to zero first, while d2 is still nonzero. At a larger coupling value d2 also goes to zero, and at this point the coupled systems undergo generalized synchronization note-ws ().
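The mutual-coupling variant of the ASA can be sketched in the same way as the unidirectional case. Since Eqs. (16)-(17) are not reproduced here, the sketch assumes two Rössler systems with a frequency mismatch and symmetric diffusive y-coupling; each auxiliary is driven by the partner's original signal, exactly as its original copy is. All names are ours.

```python
import numpy as np

A, B, C = 0.2, 0.2, 5.7                      # standard Rössler parameters (assumed)

def rossler(s, w):
    x, y, z = s
    return np.array([-w * y - z, w * x + A * y, B + z * (x - C)])

def rk4(s, f, dt):
    k1 = f(s); k2 = f(s + 0.5 * dt * k1)
    k3 = f(s + 0.5 * dt * k2); k4 = f(s + dt * k3)
    return s + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)

def mutual_asa(eps, w1=0.95, w2=1.05, n_trans=15000, n_meas=5000, dt=0.01):
    """Time-averaged distances between two mutually coupled Rössler systems
    and their auxiliaries, assuming symmetric diffusive y-coupling."""
    rng = np.random.default_rng(2)
    x, y, xa, ya = (rng.uniform(-1, 1, 3) for _ in range(4))
    d1 = d2 = 0.0
    for n in range(n_trans + n_meas):
        # each auxiliary is driven by the partner's *original* signal
        fx = lambda s, yv=y: rossler(s, w1) + eps * np.array([0.0, yv[1] - s[1], 0.0])
        fy = lambda s, xv=x: rossler(s, w2) + eps * np.array([0.0, xv[1] - s[1], 0.0])
        x, y = rk4(x, fx, dt), rk4(y, fy, dt)
        xa, ya = rk4(xa, fx, dt), rk4(ya, fy, dt)
        if n >= n_trans:
            d1 += np.linalg.norm(x - xa)
            d2 += np.linalg.norm(y - ya)
    return d1 / n_meas, d2 / n_meas
```

For strong coupling both distances collapse to zero, reproducing the qualitative behavior of Fig. 5(a); for weak coupling both stay of the order of the attractor size.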
4.2 LE method to analyze GS for mutually coupled systems
In section 3.3 we have discussed the LE method to detect GS for unidirectionally coupled systems. The LE method provides the exact boundary of synchronization both for CS Pecora1990 (); Pecora1998 () and for GS Parlitz1996 (). In the synchronized state the largest transverse Lyapunov exponent (TLE) is negative. We use this criterion to analyze the stability of GS for coupled nonidentical systems.
Let us demonstrate the LE method to analyze GS for mutually coupled systems with the help of the coupled nonidentical Rössler systems of Eqs. (16) and (17). In Fig. 5(b) the four largest Lyapunov exponents are shown as a function of the coupling parameter ε. As ε is increased, one zero exponent first becomes negative and the coupled systems become phase synchronized. When the coupling parameter is further increased, a positive Lyapunov exponent becomes negative and the coupled systems undergo generalized synchronization.
5 Generalized synchronization in networks
Recently, the studies of generalized synchronization in complex networks have received much attention Moskalenko2012 (); Hu2008 (); Guan2009 (). We consider the following network of coupled systems
where x_i is the state variable of system i, F_i gives the dynamics of an isolated system, H gives the coupling function, ε is the coupling parameter and A is the coupling matrix. Here, we take A to be the adjacency matrix, i.e. A_ij = 1 if the nodes i and j are coupled and A_ij = 0 otherwise. For suitable coupling functions and coupling parameter values the coupled systems will show GS. In stable GS, the state variables of the coupled systems are related; thus we can write a generic function Φ giving a functional relation between the state variables of all the coupled systems as,
5.1 Auxiliary System Approach
Now, we discuss the auxiliary system approach (ASA) to detect GS between the coupled systems given in Eq. (18). The auxiliary systems are created by replicating each coupled system, and they are driven in exactly the same way as their original copies. Let x_i′ denote the auxiliary system for system i, so we have
As an example, we consider a network of randomly coupled Rössler systems
The non-identity between the coupled Rössler systems is introduced through a mismatch in one of the Rössler parameters.
First, we consider the case when the coupled systems are identical. Fig. 6(a) shows the Euclidean distances between each auxiliary system and its original as a function of the coupling parameter ε for this case. As the coupling parameter increases some distances go to zero while others remain nonzero, and beyond a critical coupling all distances go to zero, indicating a transition to complete synchronization. A similar behavior is observed for coupled nonidentical systems. Fig. 6(b) shows the Euclidean distances between each auxiliary system and its original as a function of the coupling parameter for the case when the Rössler systems are nonidentical. As the coupling parameter increases some distances go to zero first, and beyond a critical coupling all distances go to zero, indicating a transition to generalized synchronization.
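The network version of the ASA can be sketched as follows. For simplicity the sketch uses a small all-to-all network of identical Rössler systems with assumed diffusive y-coupling; within each integration step the neighbor signals are frozen, so each auxiliary is advanced by exactly the same map as its original. The setup and names are ours, not the paper's.

```python
import numpy as np

A, B, C = 0.2, 0.2, 5.7                      # standard Rössler parameters (assumed)

def rossler_all(S):
    """Vectorized Rössler field for N systems stacked in an (N, 3) array."""
    x, y, z = S[:, 0], S[:, 1], S[:, 2]
    return np.column_stack([-y - z, x + A * y, B + z * (x - C)])

def rk4(S, f, dt):
    k1 = f(S); k2 = f(S + 0.5 * dt * k1)
    k3 = f(S + 0.5 * dt * k2); k4 = f(S + dt * k3)
    return S + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)

def network_asa(adj, eps, n_trans=15000, n_meas=5000, dt=0.01):
    """Per-node time-averaged distance between each auxiliary system and its
    original for identical Rössler systems with diffusive y-coupling."""
    n = len(adj)
    k = adj.sum(axis=1)                      # node degrees
    rng = np.random.default_rng(0)
    X = rng.uniform(-1, 1, (n, 3))           # original network
    Y = rng.uniform(-1, 1, (n, 3))           # auxiliary replicas
    dist = np.zeros(n)
    for step in range(n_trans + n_meas):
        drive = adj @ X[:, 1]                # frozen neighbor input for this step
        f = lambda S, d=drive: (rossler_all(S)
            + eps * np.column_stack([np.zeros(n), d - k * S[:, 1], np.zeros(n)]))
        X, Y = rk4(X, f, dt), rk4(Y, f, dt)  # same map drives originals and auxiliaries
        if step >= n_trans:
            dist += np.linalg.norm(X - Y, axis=1)
    return dist / n_meas
```

For sufficiently strong coupling every auxiliary locks onto its original and all distances vanish, as in Fig. 6; with no coupling each pair stays separated.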
5.2 Lyapunov Exponent method
For networks of coupled identical systems the stability of complete synchronization has been well analyzed. Pecora and Carroll (1998) Pecora1998 () introduced a master stability function (MSF) which can be calculated from a simple set of master stability equations. Using the master stability function one can calculate the largest transverse Lyapunov exponent for a network. For stable CS, the largest transverse Lyapunov exponent is negative.
5.3 Master Stability Function for coupled nearly-identical systems
In Ref. Acharyya2012 (), we extend the formalism of MSF to coupled nearly-identical systems. In this section we briefly review the analysis of MSF for coupled nearly-identical systems. We start by considering a network of coupled dynamical systems as
where x_i is the m-dimensional state vector of system i and r_i is the parameter which makes the systems nonidentical, F and H give respectively the dynamical evolution of a single system and the coupling function, G is the coupling matrix and ε is the coupling constant. The coupling matrix element G_ij = 1 when system j couples with system i, otherwise G_ij = 0; the diagonal elements are G_ii = −k_i, where k_i is the degree of system i. So, the coupling matrix satisfies the condition Σ_j G_ij = 0, which fulfills the condition for the invariance of the synchronization manifold Pecora1998 (). Let the parameter r_i = r̃ + δr_i, where r̃ is some typical value of the parameter and δr_i is a small mismatch.
When all the coupled systems are identical, i.e. δr_i = 0, the coupled systems exhibit complete synchronization for a suitable coupling constant Pecora1990 (). In complete synchronization all the state variables of the coupled systems become equal, i.e. x_1 = x_2 = … = x_N = s, and the motion of the coupled systems is confined to a subspace which is the synchronization manifold. The synchronized state is stable when all the transverse Lyapunov exponents are negative. For coupled identical systems the linearized equations can be obtained from Eq. (22) by expanding in a Taylor series about the complete synchronized state, i.e. x_i = s + z_i. These linearized equations can be diagonalized into modes Pecora1998 () and can be written in the form
where γ_k is the k-th eigenvalue of the coupling matrix G. Eq. (23) is called the master stability equation Pecora1998 (). The MSF is calculated as the largest Lyapunov exponent of Eq. (23) as a function of the parameter α = εγ_k.
For coupled nonidentical systems, the synchronization will be of the generalized type, where the state variables of the coupled systems are related by a functional relationship Abarbanel1995 (). For coupled nonidentical systems it is not possible to obtain a simple block-diagonalized form as in Eq. (23). In Ref. Acharyya2012 () we have shown how one can achieve an approximate block-diagonalized form similar to Eq. (23) for nearly-identical systems. Using this form we can obtain the master stability function and thus determine the stability of the generalized synchronization for nearly-identical systems.
For coupled identical systems the variational equations are obtained by expanding Eq. (22) around the synchronous solution s, where s can be obtained by integrating an isolated system. A similar expansion is not possible when we consider coupled nearly-identical systems, since the synchronous solution is no longer the solution of an isolated system but is instead given by some functional relation between the state variables of the coupled systems. One way of carrying out the expansion is to consider the average trajectory and expand Eq. (22) around it Sun2009 (); this requires the integration of all the systems, and the computational cost increases for large networks. Another way is to expand Eq. (22) around the solution of an isolated system with some typical parameter value r̃. In generalized synchronization the state variables of the coupled systems are related by some functional relation; hence we can assume that the attractors of the coupled systems are not very different from each other, so that the average rates of expansion and contraction of the attractors of the coupled systems are also not very different. One choice of the typical parameter value is the average value, and it gives a good approximation to the Lyapunov exponents (see Figure 1 of Ref. Acharyya2012 ()). Thus, we expand Eq. (22) in a Taylor series about the solution of an isolated system with the typical parameter value r̃. Retaining terms up to second order, we get
where z_i is the deviation of the i-th system from the typical trajectory. In Eq. (24) one term has been dropped, as we are interested in the solution near the typical trajectory. Eq. (24) contains both inhomogeneous and homogeneous terms. The exponential dependence of the solutions of a linear differential equation is given by the homogeneous terms Acharyya2012 (); Sorrentino2011 (). So, to calculate the Lyapunov exponents from Eq. (24) we can drop the inhomogeneous terms to obtain
Eq. (25) can be put in matrix form as
where G^T is the transpose of the coupling matrix G. Now, we want to decouple Eq. (26) along the eigenvalues of the coupling matrix G. Let γ_j be the eigenvalues of the coupling matrix and let the corresponding left and right eigenvectors be e_j^L and e_j^R respectively. We note that there exists a zero eigenvalue of the coupling matrix which defines the synchronization manifold, while the remaining eigenvalues define the transverse manifold. We multiply Eq. (26) by e_j^R from the right and use the m-dimensional vector φ_j = z e_j^R. Thus,
The generic variational equation or the master stability equation can be written by considering two complex parameters α and ν as
The master stability function (MSF) is defined as the largest Lyapunov exponent as a function of the parameters α and ν. The accuracy of the MSF for coupled nearly-identical systems is discussed in Ref. Acharyya2012 ().
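In the identical-systems limit (zero mismatch) the MSF reduces to the Pecora-Carroll form and can be computed directly: evolve a reference Rössler trajectory together with the variational equation dδ/dt = [DF(s) − αE]δ and extract the largest Lyapunov exponent. The sketch below assumes coupling in the y component (E = diag(0, 1, 0)) and real α; the mismatch corrections of Ref. Acharyya2012 () are omitted, and the names are ours.

```python
import numpy as np

A, B, C = 0.2, 0.2, 5.7                      # standard Rössler parameters (assumed)

def rossler(s):
    x, y, z = s
    return np.array([-y - z, x + A * y, B + z * (x - C)])

def jac(s):
    """Jacobian DF of the Rössler vector field."""
    x, y, z = s
    return np.array([[0.0, -1.0, -1.0],
                     [1.0, A, 0.0],
                     [z, 0.0, x - C]])

def rk4(s, f, dt):
    k1 = f(s); k2 = f(s + 0.5 * dt * k1)
    k3 = f(s + 0.5 * dt * k2); k4 = f(s + dt * k3)
    return s + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)

def msf(alpha, n_trans=8000, n_steps=40000, dt=0.01):
    """Identical-systems MSF for y-component coupling: the largest Lyapunov
    exponent of d(delta)/dt = [DF(s) - alpha*E] delta, E = diag(0, 1, 0),
    computed with the Benettin method along a reference trajectory s(t)."""
    E = np.diag([0.0, 1.0, 0.0])
    rng = np.random.default_rng(3)
    s = rng.uniform(-1, 1, 3)
    for _ in range(n_trans):                 # relax onto the attractor
        s = rk4(s, rossler, dt)
    v = rng.uniform(-1, 1, 3)
    v /= np.linalg.norm(v)
    lsum = 0.0
    for n in range(n_steps):
        M = jac(s) - alpha * E
        v = rk4(v, lambda u, Mm=M: Mm @ u, dt)
        s = rk4(s, rossler, dt)
        if (n + 1) % 10 == 0:                # renormalize periodically
            nv = np.linalg.norm(v)
            lsum += np.log(nv)
            v /= nv
    return lsum / (n_steps * dt)
```

At α = 0 the MSF equals the largest Lyapunov exponent of the free Rössler system (positive), and it becomes negative inside the stable region, which is how the zero-contour curves of Fig. 7 are traced out.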
Now we demonstrate the MSF by considering coupled nearly-identical Rössler systems. The dynamics of a single Rössler system is given by
We consider that the systems are coupled through one component. Let us consider the simpler case of Rössler systems with a mismatch in one parameter only. The zero-contour curves of the master stability function for this case are given in Fig. 7, with panels (a), (b), (c) and (d) giving the zero-contour curves when the mismatch is present in each of the four Rössler parameters respectively. The MSF is negative in region II, bordered by the two zero-contour curves, and is positive in region I outside. We will refer to region I as the unstable region and region II as the stable region.
From Fig. 7(a) we can see that the stable region increases with an increase in the mismatch parameter, and from Fig. 7(b) that the stable region decreases with a decrease in its mismatch parameter. Figs. 7(c) and (d) show that the stability of the GS is almost unaffected by a mismatch in the remaining two parameters.
Let us consider a specific network of coupled Rössler systems with coupling matrix G and a mismatch in one parameter. We determine the eigenvalues of G. There is one zero eigenvalue which corresponds to the synchronization manifold. For each of the other eigenvalues, we determine the pair of parameters α and ν. If all these parameter pairs lie in the stable region of the MSF then the coupled systems are in stable GS.
6 Synchronized optimized networks
The MSF for coupled chaotic systems can be used to construct synchronized optimized networks. By a synchronized optimized network we mean one for which the synchronization is stable for the widest possible interval of the coupling parameter. Let us consider a network of coupled nearly-identical Rössler systems with a mismatch in the parameter of Fig. 7(a), for which the stability region increases with an increase in the mismatch. To construct a synchronized optimized network from a given network we rewire its links and accept the new network if the stable interval increases; otherwise we accept it with a probability exp(−ΔI/T), where ΔI is the decrease in the stable interval due to the rewiring and T is a temperature-like quantity. The temperature is reduced after a certain number of iterations so that simulated annealing occurs. We stop the optimization when there is no change in the network for five successive temperature steps. At this point we assume that a good approximation to the optimal network has been achieved.
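The annealing loop can be sketched as follows. Evaluating the true MSF stable interval at every rewiring step is expensive, so the sketch below substitutes a cheap surrogate objective: for identical systems the width of the stable coupling interval is governed by the Laplacian eigenratio λ_N/λ_2 (smaller is better), which stands in here for the nearly-identical stable interval of the text. The Metropolis acceptance and geometric cooling follow the procedure described above; everything else (names, schedule constants) is our choice.

```python
import numpy as np

def eigenratio(adj):
    """lambda_N / lambda_2 of the graph Laplacian: a standard surrogate for
    synchronizability (a smaller ratio means a wider stable interval)."""
    lap = np.diag(adj.sum(axis=1)) - adj
    ev = np.sort(np.linalg.eigvalsh(lap))
    return ev[-1] / ev[1]

def connected(adj):
    """Depth-first check that the graph is connected."""
    n = len(adj)
    seen, stack = {0}, [0]
    while stack:
        i = stack.pop()
        for j in np.flatnonzero(adj[i]):
            if int(j) not in seen:
                seen.add(int(j))
                stack.append(int(j))
    return len(seen) == n

def anneal(adj, steps=1500, t0=0.5, cooling=0.9, rng=None):
    """Simulated-annealing rewiring: delete one random link, create another,
    always accept improvements, accept deteriorations with Metropolis
    probability exp(-increase/T), and cool T every 100 steps."""
    if rng is None:
        rng = np.random.default_rng(0)
    adj = adj.copy()
    e = eigenratio(adj)
    best, best_e = adj.copy(), e
    temp = t0
    for step in range(steps):
        trial = adj.copy()
        ones = np.argwhere(np.triu(trial, 1) > 0)
        zeros = np.argwhere((np.triu(np.ones_like(trial), 1) > 0) & (trial == 0))
        i1, j1 = ones[rng.integers(len(ones))]
        i0, j0 = zeros[rng.integers(len(zeros))]
        trial[i1, j1] = trial[j1, i1] = 0    # remove an existing link
        trial[i0, j0] = trial[j0, i0] = 1    # add a new link elsewhere
        if not connected(trial):             # keep the network connected
            continue
        e_new = eigenratio(trial)
        if e_new < e or rng.random() < np.exp(-(e_new - e) / temp):
            adj, e = trial, e_new
            if e < best_e:
                best, best_e = adj.copy(), e
        if (step + 1) % 100 == 0:
            temp *= cooling
    return best, best_e
```

Each move preserves the numbers of nodes and links, as in the optimization described in the text; only the placement of the links changes.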
When we construct synchronized optimized networks from coupled nonidentical systems there are additional questions, such as which nodes become the hubs of the network and which links are preferred in the optimal network. To answer these questions we introduce the following correlation coefficients. We define the average correlation coefficient between the degree of a node and its parameter as
where k_i is the degree of node i and r_i is its parameter. For a random network this correlation coefficient is zero. Next, to find which links are preferred in the optimal networks, we define the average correlation between the parameter difference of a pair of nodes and the existence of a link between them as
where |r_i − r_j| is the parameter difference between node i and node j and A is the adjacency matrix. For a random network this correlation coefficient is zero.
To see the use of these correlation coefficients, let us consider a network of coupled nearly-identical Rössler systems which have a mismatch in the Rössler parameter of Fig. 7(b), for which the stable region increases with a decrease in the first-order correction term. In Fig. 8(a) we plot the degree-parameter correlation coefficient as a function of the Monte Carlo steps. We can see that it starts near zero, then decreases and saturates at a negative value. This implies that the nodes with a smaller value of the parameter are likely to have a higher degree in the optimal network and thus become the hubs of the optimal network. Fig. 8(b) shows the correlation coefficient between parameter difference and linking as a function of the Monte Carlo steps. It increases from zero and saturates at a positive value. This implies that pairs of nodes with a higher parameter difference are preferred for creating the links of the optimal network.
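The two correlation coefficients are straightforward to compute from an adjacency matrix and a parameter vector. The sketch below uses the ordinary Pearson form (the paper's exact normalization may differ) and our own function names; the toy example is a star network whose hub is the node with the extreme (smallest) parameter, mimicking the optimal-network structure described above.

```python
import numpy as np

def degree_parameter_corr(adj, r):
    """Pearson correlation between node degrees k_i and node parameters r_i."""
    k = adj.sum(axis=1)
    return float(np.corrcoef(k, r)[0, 1])

def link_mismatch_corr(adj, r):
    """Pearson correlation, over all node pairs i < j, between the presence
    of a link A_ij and the parameter difference |r_i - r_j|."""
    n = len(r)
    iu = np.triu_indices(n, 1)
    a = adj[iu]                              # link indicator per pair
    dr = np.abs(np.subtract.outer(r, r))[iu] # parameter difference per pair
    return float(np.corrcoef(a, dr)[0, 1])
```

For the star network below the hub has the smallest parameter and the highest degree, so the degree-parameter correlation is negative, while the links connect the most dissimilar node pairs, so the link-mismatch correlation is positive, matching the saturation values seen in Fig. 8.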
To conclude, we have briefly discussed some recent developments in the study of GS. We discussed different methods of detecting GS, such as the mutual false nearest neighbors method, the auxiliary system approach and the Lyapunov exponent method. We analyzed the stability of GS for coupled nearly-identical systems with the help of the MSF. Using this MSF we then discussed the problem of constructing synchronized optimized networks from a given network with a fixed number of links and nodes, rewiring the links to achieve the optimal network. For the synchronized optimized networks we found that the nodes with a parameter value at one extreme become the hubs, and pairs of nodes with a relatively large parameter difference are preferred for creating links.
- (1) S. Strogatz, Sync: The Emerging Science of Spontaneous Order, Penguin Books, Limited (UK), (2008).
- (2) A. Pikovsky, M. Rosenblum, and J. Kurths, Synchronization: a universal concept in nonlinear sciences, Cambridge University Press, (2001).
- (3) S. Boccaletti, J. Kurths, G. Osipov, D.L. Valladares, and C.S. Zhou, Phys. Rep. 366 (2002) 1.
- (4) A. Arenas, A. D. -Guilera, J. Kurths, Y. Moreno, C. Zhou, Phys. Rep. 469 (2008) 93.
- (5) H. Fujisaka and T. Yamada, Prog. Theor. Phys. 69, (1983) 32.
- (6) V. S. Afraimovich, N. N. Verichev and M. I. Rabinovich, Radiophys. Quantum Electron. 29, (1986) 795.
- (7) L. M. Pecora and T. L. Carroll, Phys. Rev. Lett. 64, (1990) 821.
- (8) M. G. Rosenblum, A. S. Pikovsky, and J. Kurths, Phys. Rev. Lett. 76, (1996) 1804.
- (9) N. F. Rulkov, M. M. Sushchik, L. S. Tsimring, H. D. I. Abarbanel, Phys. Rev. E 51, (1995) 980.
- (10) N. H. Packard, J. P. Crutchfield, J. D. Farmer, and R. S. Shaw, Phys. Rev. Lett. 45, (1980) 712.
- (11) M. B. Kennel, R. Brown, and H. D. I. Abarbanel, Phys. Rev. A 45, (1992) 3403.
- (12) H. D. I. Abarbanel, N. F. Rulkov and M. M. Sushchik, Phys. Rev. E 53, (1996) 4528.
- (13) R. He and P. G. Vaidya, Phys. Rev. A 46, (1992) 7387.
- (14) L. Kocarev and U. Parlitz, Phys. Rev. Lett. 76, (1996) 1816.
- (15) Z. Zheng, X. Wang, and M. C. Cross, Phys. Rev. E 65, (2002) 056211.
- (16) O. I. Moskalenko, A. A. Koronovskii, A. E. Hramov, and S. Boccaletti, Phys. Rev. E 86, (2012) 036216.
- (17) In the region where one of the two Euclidean distances is zero but the other is nonzero, we get weak generalized synchronization. We do not discuss the details of this phenomenon here; they can be found in K. Pyragas, Phys. Rev. E 54, (1996) R4508.
- (18) L. M. Pecora and T. L. Carroll, Phys. Rev. Lett. 80, (1998) 2109.
- (19) Y. -C. Hung, Y.-T. Huang, M. -C. Ho, and C. -K. Hu, Phys. Rev. E 77, (2008) 016202.
- (20) S. Guan, K. Li, and C.-H. Lai, Chaos 16, (2006) 023107.
- (21) J. Sun, E. M. Bollt and T. Nishikawa, Europhys. Lett. 85 (2009) 60011.
- (22) S. Acharyya and R. E. Amritkar, Europhys. Lett. 99 (2012) 40005.
- (23) F. Sorrentino and M. Porfiri, Europhys Lett. 93 (2011) 50002.