
Stability preservation in stochastic Galerkin projections of dynamical systems


Roland Pulch

Institute of Mathematics and Computer Science,

Ernst-Moritz-Arndt-Universität Greifswald,

Walther-Rathenau-Str. 47, 17489 Greifswald, Germany.

Email: pulchr@uni-greifswald.de


Florian Augustin

Massachusetts Institute of Technology,

Cambridge, MA 02139

Email: fmaugust@mit.edu


Abstract

In uncertainty quantification, critical parameters of mathematical models are substituted by random variables. We consider dynamical systems composed of ordinary differential equations. The unknown solution is expanded into an orthogonal basis of the random space, e.g., a polynomial chaos expansion. A Galerkin method yields a numerical solution of the stochastic model. In the linear case, the Galerkin-projected system may be unstable, even though all realizations of the original system are asymptotically stable. We derive a basis transformation of the state variables in the original system, which guarantees a stable Galerkin-projected system. The transformation matrix is obtained from a symmetric decomposition of a solution of a Lyapunov equation. In the nonlinear case, we examine stationary solutions of the original system. Again, the basis transformation preserves the asymptotic stability of the stationary solutions in the stochastic Galerkin projection. We present results of numerical computations for both a linear and a nonlinear test example.

Key words: dynamical system, orthogonal expansion, polynomial chaos, stochastic Galerkin method, asymptotic stability, Lyapunov equation.

1 Introduction

Uncertainty quantification (UQ) examines the dependence of model outputs on uncertain input parameters, see [16]. Often the uncertain parameters are replaced by random variables or random processes, resulting in a stochastic problem. We consider dynamical systems consisting of ordinary differential equations (ODEs) with random parameters. The state variables can be expanded into a series of orthogonal basis functions, where often polynomials are applied (polynomial chaos), see [1, 3, 17]. Stochastic Galerkin methods or stochastic collocation techniques yield numerical approximations of the unknown coefficient functions. In this paper, we focus on the stochastic Galerkin approach, see [2, 8, 10].

Sonday et al. [14] analyzed the spectrum of the Jacobian matrix of a Galerkin-projected (nonlinear) system of ODEs. The authors conclude that the Galerkin projection can be unstable even if the original system is asymptotically stable. Even though this loss of stability occurs rather seldom, it leads to unexpected and erroneous results. Therefore, we derive a technique that guarantees the asymptotic stability of the Galerkin-projected system, provided that the original system is asymptotically stable.

Prajna [7] designed an approach to preserve stability in a projection-based model order reduction of a (nonlinear) system of ODEs. Therein, a dynamical system is reduced to a smaller dynamical system. We apply a similar strategy in the stochastic Galerkin method, where a random dynamical system is projected to a larger deterministic dynamical system. The resulting stability-preserving technique is based on a basis transformation of the original parameter-dependent system, where the transformation matrix is derived from the solution of a Lyapunov equation.

We construct and investigate this transformation for linear dynamical systems in detail. A proof of the stability preservation is given for the stabilized stochastic Galerkin method. Furthermore, we consider the asymptotic stability of stationary solutions (equilibria) of autonomous nonlinear dynamical systems. In [9], the existence and convergence of stationary solutions of stochastic Galerkin-projected systems were analyzed. Now we apply the basis transformation to guarantee the stability of the stationary solutions.

The paper is organized as follows. The stochastic Galerkin approach is described for the linear case in Section 2. The stability-preserving projection is derived and analyzed in Section 3. An analogous stabilization is specified for the nonlinear case in Section 4. Finally, Section 5 includes numerical results for both a linear and a nonlinear test example.

2 Problem definition

The class of linear problems under investigation is described in this section.

2.1 Linear dynamical systems and stability

We consider a linear dynamical system of the form

$\dot{x}(t,p) = A(p)\, x(t,p) + b(p) \qquad (1)$

where the matrix $A(p) \in \mathbb{R}^{n \times n}$ and the vector $b(p) \in \mathbb{R}^{n}$ depend on parameters $p \in \Pi$ for some subset $\Pi \subseteq \mathbb{R}^{q}$. Consequently, the state variables $x(t,p) \in \mathbb{R}^{n}$ are also parameter-dependent. Initial value problems are determined by

$x(t_0, p) = x_0(p)$

with a given function $x_0 : \Pi \to \mathbb{R}^{n}$. Since we are investigating stability properties, let, without loss of generality, $b \equiv 0$ in the system (1).

To analyze the stability, we recall some general properties of matrices.

Definition 1

Let $A \in \mathbb{R}^{n \times n}$ and $\lambda_1, \dots, \lambda_n \in \mathbb{C}$ be its eigenvalues. The spectral abscissa of the matrix $A$ reads as

$\alpha(A) = \max \{ \operatorname{Re}(\lambda_i) \,:\, i = 1, \dots, n \}.$

$A$ is called a stable matrix if it holds that $\alpha(A) < 0$.

A linear dynamical system $\dot{x}(t) = A\, x(t)$ is asymptotically stable if and only if the included matrix $A$ is stable. In the following, we assume that the matrices $A(p)$ in the system (1) are stable for all $p \in \Pi$.
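
For readers who want to check stability numerically, the following small sketch (not part of the paper) evaluates the spectral abscissa of Definition 1 with NumPy; the example matrix is a hypothetical stand-in.

```python
# Sketch: spectral abscissa of Definition 1 (hypothetical example matrix).
import numpy as np

def spectral_abscissa(A):
    """Largest real part of the eigenvalues of A."""
    return max(np.linalg.eigvals(A).real)

A = np.array([[-1.0, 3.0],
              [ 0.0, -2.0]])
print(spectral_abscissa(A))      # -1.0 < 0, hence A is a stable matrix
```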

2.2 Stochastic modeling and orthogonal expansions

Now we assume that the parameters in equation (1) are affected by uncertainties. In an uncertainty quantification, the parameters are replaced by independent random variables $p : \Omega \to \Pi$ on some probability space $(\Omega, \mathcal{A}, \mathbb{P})$. Let a joint probability density function $\rho : \Pi \to \mathbb{R}$ be given. Without loss of generality, we assume $\rho(p) > 0$ for all $p \in \Pi$, because the parameter space $\Pi$ can be restricted to the support of $\rho$ otherwise. For a measurable function $f : \Pi \to \mathbb{R}$, the expected value reads as

$\mathbb{E}[f] = \int_{\Pi} f(p)\, \rho(p)\, \mathrm{d}p, \qquad (2)$

provided that the integral exists. The Hilbert space

$L^{2}(\Pi, \rho) = \left\{ f : \Pi \to \mathbb{R} \ \text{measurable} \,:\, \mathbb{E}[f^{2}] < \infty \right\}$

is equipped with the inner product

$\langle f, g \rangle = \mathbb{E}[f g] \quad \text{for } f, g \in L^{2}(\Pi, \rho).$

Let a complete orthonormal system $(\Phi_i)_{i \in \mathbb{N}} \subset L^{2}(\Pi, \rho)$ be given. Thus the basis functions satisfy

$\langle \Phi_i, \Phi_j \rangle = \delta_{ij} \quad \text{for all } i, j \in \mathbb{N}. \qquad (3)$

Assuming $x_j(t, \cdot) \in L^{2}(\Pi, \rho)$ for each component $j = 1, \dots, n$ and each time point $t$, the state variables of the system (1) can be expanded into a series

$x(t,p) = \sum_{i=1}^{\infty} v_i(t)\, \Phi_i(p). \qquad (4)$

The coefficient functions $v_i : [t_0, \infty) \to \mathbb{R}^{n}$ are defined by

$v_i(t) = \mathbb{E}\big[ x(t, \cdot)\, \Phi_i \big] = \int_{\Pi} x(t,p)\, \Phi_i(p)\, \rho(p)\, \mathrm{d}p. \qquad (5)$

The series (4) converges in the norm of $L^{2}(\Pi, \rho)$ point-wise for each time $t$. Often polynomials are used as basis functions, following the concepts of the generalized polynomial chaos (gPC). More details can be found in [17].
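
As a brief illustration (a sketch assuming a single uniformly distributed parameter on [-1, 1], not code from the paper), the coefficients (5) can be approximated by Gauss-Legendre quadrature with normalized Legendre polynomials as the orthonormal basis; the function x below is a hypothetical parameter-dependent quantity.

```python
# Sketch: gPC coefficients (5) for one uniform parameter on [-1, 1],
# rho(p) = 1/2, orthonormal basis Phi_i(p) = sqrt(2i+1) * P_i(p).
import numpy as np
from numpy.polynomial.legendre import leggauss, Legendre

nodes, weights = leggauss(20)                             # 20-node Gauss-Legendre rule
expectation = lambda vals: 0.5 * np.dot(weights, vals)    # E[f] = int f(p) * 1/2 dp

m = 4
phi = [np.sqrt(2 * i + 1) * Legendre.basis(i)(nodes) for i in range(m)]

x = lambda p: np.sin(p) + p**2                            # hypothetical quantity x(p)
coefficients = [expectation(x(nodes) * phi_i) for phi_i in phi]   # v_i = E[x * Phi_i]
print(coefficients)
```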

2.3 Galerkin projection of linear dynamical systems

Stochastic Galerkin methods and stochastic collocation techniques yield approximations of the coefficient functions (5) in the expansion (4), see [2, 8, 10, 11]. We apply the stochastic Galerkin approach, where Equation (1) is projected onto a finite subset of basis functions $\Phi_1, \dots, \Phi_m$. The stochastic process (4) is approximated by the truncated expansion

$x^{(m)}(t,p) = \sum_{i=1}^{m} v_i(t)\, \Phi_i(p).$

The Galerkin projection of the dynamical system (1), neglecting the truncation error, results in the larger linear dynamical system

$\dot{\hat{v}}(t) = \hat{A}\, \hat{v}(t), \qquad \hat{v} = \big( v_1^{\top}, \dots, v_m^{\top} \big)^{\top}, \qquad (6)$

whose solution represents an approximation of the exact coefficient functions (5). The matrix $\hat{A} \in \mathbb{R}^{mn \times mn}$ is defined by its minors $\hat{A}_{ij} \in \mathbb{R}^{n \times n}$ with

$\hat{A}_{ij} = \mathbb{E}\big[ \Phi_i\, \Phi_j\, A \big] \quad \text{for } i, j = 1, \dots, m,$

using the matrix $A(p)$ from (1). Therein, the expected value, see (2), is applied componentwise. If the matrix $A(p)$ is symmetric for almost all $p \in \Pi$, then the matrix $\hat{A}$ is also symmetric. Otherwise, the matrix $\hat{A}$ is unsymmetric, which is the case in many situations.

The convergence properties of the stochastic Galerkin approach are not investigated in this paper. Instead, we examine the stability properties. The analysis in [14] shows that the matrix $\hat{A}$ may be unstable even though $A(p)$ is stable for all $p \in \Pi$ with respect to Definition 1. Yet the stability is guaranteed in the case of normal matrices $A(p)$ for all $p \in \Pi$. Even though stability can be lost for non-normal matrices, the stochastic Galerkin method is still convergent on compact time intervals under the usual assumptions.
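
The following sketch (not taken from the paper) assembles the Galerkin matrix from its minors by Gauss-Legendre quadrature for a single uniformly distributed parameter on [-1, 1]. The matrix A(p) is a hypothetical stand-in, since the test matrix (21) of Section 5 is not reproduced here; monitoring the spectral abscissa of the assembled matrix reveals whether the projection has lost stability.

```python
# Sketch: assemble \hat{A} with minors E[Phi_i Phi_j A(p)] by quadrature.
import numpy as np
from numpy.polynomial.legendre import leggauss, Legendre

def A(p):
    # hypothetical stable, non-normal parameter-dependent matrix (not matrix (21))
    return np.array([[-1.0, 4.0 + p],
                     [ 0.0, -0.1]])

m, n = 4, 2                                         # m basis polynomials, n states
nodes, weights = leggauss(20)
phi = np.array([np.sqrt(2 * i + 1) * Legendre.basis(i)(nodes) for i in range(m)])

A_hat = np.zeros((m * n, m * n))
for k, (p, w) in enumerate(zip(nodes, weights)):
    Ap = A(p)
    for i in range(m):
        for j in range(m):
            A_hat[i*n:(i+1)*n, j*n:(j+1)*n] += 0.5 * w * phi[i, k] * phi[j, k] * Ap

print(max(np.linalg.eigvals(A_hat).real))           # spectral abscissa of the Galerkin matrix
```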

2.4 Basis transformations

We consider a transformation of the linear dynamical system (1) to an equivalent system

$\dot{\tilde{x}}(t,p) = \tilde{A}(p)\, \tilde{x}(t,p) \qquad (7)$

with transformation matrices $T(p) \in \mathbb{R}^{n \times n}$ being point-wise regular. It holds that

$\tilde{A}(p) = T(p)^{-1} A(p)\, T(p) \qquad (8)$

and $\tilde{x}(t,p) = T(p)^{-1} x(t,p)$ for each $p \in \Pi$. The operation (8) represents a similarity transformation, i.e., the spectra of the matrices $A(p)$ and $\tilde{A}(p)$ coincide.

If the stochastic Galerkin system (6) is unstable, then our aim is to identify a basis transformation given by a matrix $T(p)$ such that the Galerkin projection of the dynamical system (7) yields a stable system. The following properties of the basis transformation are required:

  1. $T$ has to be non-constant in the variable $p$. The Galerkin approach is invariant with respect to constant basis transformations, and thus the stability properties cannot be changed by a constant transformation.

  2. If it holds that $A \in \big( C^{\ell}(\Pi) \big)^{n \times n}$, then $T \in \big( C^{\ell}(\Pi) \big)^{n \times n}$ is required to guarantee $\tilde{A} \in \big( C^{\ell}(\Pi) \big)^{n \times n}$ and thus $\tilde{x}(t, \cdot) \in \big( C^{\ell}(\Pi) \big)^{n}$ for each $t$. The convergence rate of orthogonal (gPC) expansions depends on the order of differentiability of the random-dependent functions, see [1, Sect. 2.1] and [17, Ch. 3.3.2].

3 Stability preservation

We derive a concept to guarantee the stability of the dynamical system resulting from the stochastic Galerkin projection.

3.1 General results

In this subsection, a constant matrix $A \in \mathbb{R}^{n \times n}$ is considered. If the matrix $A$ is unsymmetric, then the following definition allows for further investigations.

Definition 2

The symmetric part of a matrix $A \in \mathbb{R}^{n \times n}$ reads as

$\operatorname{sym}(A) = \tfrac{1}{2} \big( A + A^{\top} \big).$

The symmetric part of $A$ is negative definite if and only if the matrix $A + A^{\top}$ is negative definite. We will apply the following well-known property later.

Lemma 1

If the symmetric part of $A \in \mathbb{R}^{n \times n}$ is negative definite, then $A$ is a stable matrix.

Proof:

The spectral abscissa is bounded by $\alpha(A) \le \mu(A)$ for an arbitrary logarithmic norm $\mu$. The logarithmic norm associated with the Euclidean vector norm reads as $\mu_2(A) = \lambda_{\max}\big( \tfrac{1}{2}(A + A^{\top}) \big)$, see [6, Thm. 10.5]. The negative definiteness of the symmetric part implies $\mu_2(A) < 0$. It follows that $\alpha(A) < 0$. $\Box$

Assuming that the symmetric part of a given matrix $A$ is not negative definite, we construct a transformation such that the symmetric part of the similarity-transformed matrix becomes negative definite.

Theorem 1

Let $A \in \mathbb{R}^{n \times n}$ be a stable matrix and $B \in \mathbb{R}^{n \times n}$ be a symmetric positive definite matrix. The Lyapunov equation

$A\, P + P\, A^{\top} + B = 0 \qquad (9)$

has a unique symmetric positive definite solution $P \in \mathbb{R}^{n \times n}$. For a symmetric decomposition $P = F F^{\top}$ with a regular matrix $F \in \mathbb{R}^{n \times n}$, a similarity transformation yields the matrix

$\tilde{A} = F^{-1} A\, F, \qquad (10)$

which features a negative definite symmetric part.

Proof:

The stability of the matrix $A$ guarantees that the Lyapunov equation (9) exhibits a unique solution $P$. The symmetric matrix $P$ is positive definite, because $B$ is positive definite. The symmetric part of the matrix (10) becomes (neglecting the factor $\tfrac{1}{2}$)

$\tilde{A} + \tilde{A}^{\top} = F^{-1} A F + F^{\top} A^{\top} F^{-\top} = F^{-1} \big( A P + P A^{\top} \big) F^{-\top} = -F^{-1} B F^{-\top}.$

The matrix $-F^{-1} B F^{-\top}$ is negative definite due to the positive definiteness of $B$,

$v^{\top} \big( -F^{-1} B F^{-\top} \big) v = - \big( F^{-\top} v \big)^{\top} B \big( F^{-\top} v \big) < 0$

for all $v \neq 0$. $\Box$

The proof of Theorem 1 follows mainly the steps in [7, Thm. 5]. However, a decomposition $P = F F^{\top}$ with a symmetric factor $F$ is assumed in [7], which requires the computation of all eigenvalues and eigenvectors. Furthermore, this decomposition is not unique in the case of multiple eigenvalues. Uniqueness of the decomposition of the matrix $P$ is essential for deriving a stability-preserving transformation for the stochastic Galerkin approach. Theorem 1 holds true for arbitrary symmetric decompositions $P = F F^{\top}$. In particular, we may apply the Cholesky factorization $P = L L^{\top}$, where $L$ becomes a unique lower triangular matrix with strictly positive diagonal elements, see [15, Ch. 4.3]. Efficient algorithms are available to compute the Cholesky factor $L$ without first forming $P$, see [5]. More details on Lyapunov equations can be found in [4], for example.
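
A minimal numerical check of Theorem 1 can be sketched with SciPy under the assumption of a hypothetical stable but non-normal matrix A (the code is not from the paper): the Lyapunov equation (9) is solved, the solution is factorized by Cholesky, and the symmetric parts before and after the transformation (10) are compared.

```python
# Sketch: Theorem 1 with B = I; A is a hypothetical stable, non-normal matrix.
import numpy as np
from scipy.linalg import solve_continuous_lyapunov, solve_triangular

A = np.array([[-1.0, 5.0],
              [ 0.0, -2.0]])
B = np.eye(2)

P = solve_continuous_lyapunov(A, -B)     # solves A P + P A^T = -B, cf. (9)
P = 0.5 * (P + P.T)                      # symmetrize against round-off
F = np.linalg.cholesky(P)                # P = F F^T with lower triangular F
A_tilde = solve_triangular(F, A @ F, lower=True)    # F^{-1} A F, cf. (10)

sym = lambda M: 0.5 * (M + M.T)
print(np.linalg.eigvalsh(sym(A)))        # indefinite: A is stable, yet sym(A) is not negative definite
print(np.linalg.eigvalsh(sym(A_tilde)))  # all eigenvalues strictly negative
```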

3.2 Stability-preserving transformation

The linear dynamical system (1) is assumed to involve stable matrices $A(p)$ for all $p \in \Pi$. In view of (9), we use the parameter-dependent Lyapunov equations

$A(p)\, P(p) + P(p)\, A(p)^{\top} + B(p) = 0 \quad \text{for } p \in \Pi. \qquad (11)$

Let the matrices $B(p)$ be symmetric positive definite for all $p \in \Pi$. Constant choices $B(p) \equiv B$ are feasible. Consequently, the system (11) has a unique symmetric positive definite solution $P(p)$ for each $p \in \Pi$. The following result guarantees the preservation of smoothness.

Lemma 2

If $A, B \in \big( C^{\ell}(\Pi) \big)^{n \times n}$, then it follows that $\tilde{A} \in \big( C^{\ell}(\Pi) \big)^{n \times n}$ for the transformed matrix (8) with $T(p) = L(p)$ given by the Cholesky factorization $P(p) = L(p)\, L(p)^{\top}$ of the solutions of the Lyapunov equations (11).

Proof:

Each Lyapunov equation (11) represents a larger linear system of algebraic equations for the entries of $P(p)$. Assuming $A, B \in \big( C^{\ell}(\Pi) \big)^{n \times n}$, Cramer's rule implies that the entries of the solution $P$ are also in $C^{\ell}(\Pi)$. The Cholesky decomposition defines a continuous mapping from the set of symmetric positive definite matrices to the set of lower triangular matrices with strictly positive diagonal, see [12, Sect. 12.1.3]. Likewise, the smoothness can be shown for the Cholesky factor, i.e., $L \in \big( C^{\ell}(\Pi) \big)^{n \times n}$. The regularity of $L(p)$ implies $L^{-1} \in \big( C^{\ell}(\Pi) \big)^{n \times n}$. Now formula (8) demonstrates that $\tilde{A} = L^{-1} A\, L \in \big( C^{\ell}(\Pi) \big)^{n \times n}$. $\Box$

In particular, Lemma 2 is valid in the case of continuous functions ($\ell = 0$). Furthermore, just measurable matrix-valued functions $A, B$ imply measurable matrix-valued functions $\tilde{A}$.

If the entries of $A$ and $B$ are polynomials in the variable $p$, then the entries of $P$ become rational functions. However, the entries of the factor $L$ in a symmetric decomposition $P = L L^{\top}$ are not rational functions in general, because square roots appear in the computation. Consequently, the transformed matrix (10) typically includes irrational functions of $p$.

We transform the original dynamical system (1) into (7) using the transformation matrix $T(p) = F(p)$ for an arbitrary symmetric decomposition $P(p) = F(p)\, F(p)^{\top}$. The main result is formulated now.

Theorem 2

Let $A, B : \Pi \to \mathbb{R}^{n \times n}$ be given with $A(p)$ stable and $B(p)$ symmetric as well as positive definite for almost all $p \in \Pi$. The Lyapunov equation (11) yields a unique solution $P(p)$ for almost all $p \in \Pi$. Let a symmetric decomposition $P(p) = F(p)\, F(p)^{\top}$ be given for almost all $p \in \Pi$ such that the expected values $\mathbb{E}[\Phi_i \Phi_j \tilde{A}]$ of the transformed matrix (8) exist for $i, j = 1, \dots, m$. Using $T(p) = F(p)$, the Galerkin projection of the transformed system (7) with the matrix (8) results in an asymptotically stable linear dynamical system.

Proof:

The assumptions imply that the Galerkin projection of the transformed matrix (8) is well defined. The stochastic Galerkin approach applied to the transformed system (7) including the matrix (8) yields a dynamical system with the matrix $\hat{\tilde{A}} \in \mathbb{R}^{mn \times mn}$. The minors read as $\hat{\tilde{A}}_{ij} = \mathbb{E}[\Phi_i \Phi_j \tilde{A}]$ for $i, j = 1, \dots, m$. We investigate the symmetric part of the matrix $\hat{\tilde{A}}$. The minors of the symmetric part become

$\operatorname{sym}\big( \hat{\tilde{A}} \big)_{ij} = \tfrac{1}{2} \big( \hat{\tilde{A}}_{ij} + \hat{\tilde{A}}_{ji}^{\top} \big) = \mathbb{E}\big[ \Phi_i \Phi_j \operatorname{sym}\big( \tilde{A} \big) \big] \quad \text{for } i, j = 1, \dots, m.$

Given the transformation (8) with $T(p) = F(p)$, Theorem 1 shows that the symmetric part, $\operatorname{sym}(\tilde{A}(p))$, of the matrix (10) is negative definite for almost all $p \in \Pi$. Let $w = \big( w_1^{\top}, \dots, w_m^{\top} \big)^{\top} \in \mathbb{R}^{mn}$ with $w_i \in \mathbb{R}^{n}$. It follows that

$w^{\top} \operatorname{sym}\big( \hat{\tilde{A}} \big)\, w = \sum_{i=1}^{m} \sum_{j=1}^{m} w_i^{\top}\, \mathbb{E}\big[ \Phi_i \Phi_j \operatorname{sym}\big( \tilde{A} \big) \big]\, w_j = \mathbb{E}\Big[ \Big( \sum_{i=1}^{m} \Phi_i w_i \Big)^{\!\top} \operatorname{sym}\big( \tilde{A} \big) \Big( \sum_{j=1}^{m} \Phi_j w_j \Big) \Big].$

Since the basis functions are linearly independent, it holds that

$\sum_{i=1}^{m} \Phi_i(p)\, w_i \not\equiv 0 \quad \text{for } w \neq 0.$

Assuming $w \neq 0$, the function $\sum_{i=1}^{m} \Phi_i w_i$ is non-zero on a subset $\Pi' \subseteq \Pi$ with $\int_{\Pi'} \rho(p)\, \mathrm{d}p > 0$. It follows that the above expected value becomes strictly negative. Hence the symmetric part $\operatorname{sym}(\hat{\tilde{A}})$ is negative definite. Lemma 1 shows that $\hat{\tilde{A}}$ is a stable matrix. Consequently, the dynamical system is asymptotically stable. $\Box$

The above result is independent of the choice of orthogonal basis functions. Moreover, the system of basis functions is not required to be complete.
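
The complete stabilized projection of Theorem 2 can be sketched by combining the two previous snippets: at each quadrature node, the Lyapunov equation (11) is solved with B = I, the Cholesky factor defines the transformation, and the Galerkin matrix of the transformed system is assembled. Again, everything here is an illustrative sketch with a hypothetical A(p), not the computation of the paper.

```python
# Sketch: stabilized Galerkin matrix for one uniform parameter on [-1, 1].
import numpy as np
from numpy.polynomial.legendre import leggauss, Legendre
from scipy.linalg import solve_continuous_lyapunov, solve_triangular

def A(p):
    # hypothetical stable, non-normal parameter-dependent matrix
    return np.array([[-1.0, 4.0 + p],
                     [ 0.0, -0.1]])

def transform(Ap, B):
    """F(p)^{-1} A(p) F(p) with P = F F^T solving A P + P A^T = -B, cf. (10), (11)."""
    P = solve_continuous_lyapunov(Ap, -B)
    P = 0.5 * (P + P.T)                             # symmetrize against round-off
    F = np.linalg.cholesky(P)
    return solve_triangular(F, Ap @ F, lower=True)

m, n = 4, 2
nodes, weights = leggauss(20)
phi = np.array([np.sqrt(2 * i + 1) * Legendre.basis(i)(nodes) for i in range(m)])

A_hat = np.zeros((m * n, m * n))                    # Galerkin matrix of the transformed system
for k, (p, w) in enumerate(zip(nodes, weights)):
    A_tilde = transform(A(p), np.eye(n))
    for i in range(m):
        for j in range(m):
            A_hat[i*n:(i+1)*n, j*n:(j+1)*n] += 0.5 * w * phi[i, k] * phi[j, k] * A_tilde

print(max(np.linalg.eigvals(A_hat).real))           # strictly negative, cf. Theorem 2
```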

4 Nonlinear dynamical systems

We derive a stabilization for nonlinear dynamical systems now.

4.1 Stationary solutions

Let a nonlinear autonomous dynamical system

$\dot{x}(t,p) = f\big( x(t,p), p \big) \qquad (12)$

be given, including a sufficiently smooth function $f : \mathbb{R}^{n} \times \Pi \to \mathbb{R}^{n}$. We assume that a family of asymptotically stable stationary solutions $\bar{x} : \Pi \to \mathbb{R}^{n}$ exists, i.e.,

$f\big( \bar{x}(p), p \big) = 0 \quad \text{for all } p \in \Pi.$

The asymptotic stability means that the Jacobian matrix $\frac{\partial f}{\partial x}\big( \bar{x}(p), p \big)$ is stable for all $p \in \Pi$, cf. [13, Thm. 1.6].

The Galerkin projection of the nonlinear system (12) reads as

$\dot{\hat{v}}(t) = \hat{f}\big( \hat{v}(t) \big) \qquad (13)$

with the right-hand side

$\hat{f}_i(\hat{v}) = \mathbb{E}\Big[ \Phi_i\, f\Big( \sum_{j=1}^{m} v_j\, \Phi_j,\, \cdot \Big) \Big] \qquad (14)$

for $i = 1, \dots, m$, where the expected value is applied component-wise again. However, the existence of an equilibrium of the larger system (13) with the right-hand side (14) is not guaranteed in general. In [9], sufficient conditions are specified under which there exists a stationary solution of (13) for a sufficiently high polynomial degree in gPC expansions. Moreover, if (13) has a stationary solution $\bar{v} = \big( \bar{v}_1^{\top}, \dots, \bar{v}_m^{\top} \big)^{\top}$, then the function

$\bar{x}^{(m)}(p) = \sum_{i=1}^{m} \bar{v}_i\, \Phi_i(p) \qquad (15)$

is not an equilibrium of the original system (12) in general. Thus we cannot conclude the stability of an equilibrium of (13) from the stability of the solutions $\bar{x}(p)$ satisfying (12) directly.
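
To make the projection (14) concrete, the following sketch evaluates the Galerkin right-hand side by Gauss-Legendre quadrature for a single uniform parameter on [-1, 1]; the nonlinear function f and the dimensions are hypothetical placeholders, not the test example of Section 5.

```python
# Sketch: evaluate the Galerkin right-hand side (14) by quadrature.
import numpy as np
from numpy.polynomial.legendre import leggauss, Legendre

def f(x, p):
    # hypothetical nonlinear right-hand side of (12)
    return np.array([-x[0] + p * x[1]**2,
                     -x[1] + p * x[0]])

m, n = 3, 2
nodes, weights = leggauss(20)
phi = np.array([np.sqrt(2 * i + 1) * Legendre.basis(i)(nodes) for i in range(m)])

def galerkin_rhs(v_hat):
    """v_hat has shape (m, n); row i of the result is E[Phi_i f(sum_j v_j Phi_j, p)]."""
    out = np.zeros((m, n))
    for k, (p, w) in enumerate(zip(nodes, weights)):
        x = phi[:, k] @ v_hat                # truncated expansion evaluated at the node p
        out += 0.5 * w * np.outer(phi[:, k], f(x, p))
    return out

print(galerkin_rhs(0.1 * np.ones((m, n))))
```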

4.2 Stabilization of stationary solutions

Instead of arguing about the stability of an arbitrary equilibrium, we transform the system (12) into

$\dot{z}(t,p) = f\big( z(t,p) + \bar{x}(p),\, p \big) =: g\big( z(t,p), p \big) \qquad (16)$

using the family of stationary solutions $\bar{x}(p)$. It follows that zero represents an asymptotically stable stationary solution of the transformed system (16) for all $p \in \Pi$. Let

$\hat{g}_i(\hat{w}) = \mathbb{E}\Big[ \Phi_i\, g\Big( \sum_{j=1}^{m} w_j\, \Phi_j,\, \cdot \Big) \Big] \quad \text{for } i = 1, \dots, m \qquad (17)$

using the shifted function $g$ from (16) and $\hat{w} = \big( w_1^{\top}, \dots, w_m^{\top} \big)^{\top}$. Now zero is also a stationary solution of the Galerkin-projected system

$\dot{\hat{w}}(t) = \hat{g}\big( \hat{w}(t) \big) \qquad (18)$

with the right-hand side (17). Given a solution $\hat{w}$ of (18), an approximation of a solution of the original system (12) reads as

$x(t,p) \approx \sum_{i=1}^{m} \big( w_i(t) + \bar{x}_i \big)\, \Phi_i(p),$

assuming a convergent expansion

$\bar{x}(p) = \sum_{i=1}^{\infty} \bar{x}_i\, \Phi_i(p)$

of the original stationary solutions.

If the equilibrium zero of the Galerkin-projected system (18) is unstable, then a stabilized system can be constructed as in Section 3.2. We apply the transformation to the parameter-dependent Jacobian matrix

$J(p) = \frac{\partial f}{\partial x}\big( \bar{x}(p),\, p \big). \qquad (19)$

The stabilized system is given by

$\dot{\tilde{z}}(t,p) = F(p)^{-1}\, g\big( F(p)\, \tilde{z}(t,p),\, p \big) \qquad (20)$

using a symmetric decomposition $P(p) = F(p)\, F(p)^{\top}$ of the solution of the Lyapunov equations (11), where the parameter-dependent Jacobian matrix (19) takes the role of $A(p)$. However, an evaluation of the function $g$ in the system (20) requires the computation of the stationary solution, $\bar{x}(p)$, of (12) for a given parameter value. Note that the Jacobian matrix of (20) at the stationary solution $\tilde{z} = 0$ is

$F(p)^{-1}\, J(p)\, F(p).$

Thus, Section 3.2 yields that the spectrum of the symmetric part of the original Jacobian matrix is changed appropriately to preserve stability. The Galerkin projection of the dynamical system (20) results in a larger dynamical system, whose equilibrium is guaranteed to be asymptotically stable.
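
For one parameter value, the construction of the stabilized right-hand side (20) can be sketched as follows; the function f, the equilibrium and all names are hypothetical placeholders and do not reproduce the quadratic test example (22).

```python
# Sketch: stabilization (20) at a single parameter value p.
import numpy as np
from scipy.optimize import fsolve
from scipy.linalg import solve_continuous_lyapunov, solve_triangular

def f(x, p):
    # hypothetical right-hand side with a stable but non-normal Jacobian at the equilibrium
    return np.array([-x[0] + (5.0 + p) * x[1] + p * x[1]**2,
                     -2.0 * x[1] + p * x[0]**2])

def jacobian(fun, x, eps=1e-7):
    """Central finite-difference Jacobian of fun at x."""
    n = x.size
    J = np.zeros((n, n))
    for j in range(n):
        e = np.zeros(n); e[j] = eps
        J[:, j] = (fun(x + e) - fun(x - e)) / (2.0 * eps)
    return J

p = 0.3
x_eq = fsolve(lambda x: f(x, p), np.zeros(2))       # stationary solution of (12)
g = lambda z: f(z + x_eq, p)                        # shifted right-hand side, cf. (16)

J = jacobian(lambda x: f(x, p), x_eq)               # parameter-dependent Jacobian (19)
P = solve_continuous_lyapunov(J, -np.eye(2))        # Lyapunov equation (11) with B = I
P = 0.5 * (P + P.T)
F = np.linalg.cholesky(P)

h = lambda w: solve_triangular(F, g(F @ w), lower=True)   # stabilized right-hand side (20)
J_h = jacobian(h, np.zeros(2))
print(np.linalg.eigvalsh(0.5 * (J_h + J_h.T)))      # negative definite symmetric part
```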

5 Illustrative examples

We investigate a linear dynamical system as well as a nonlinear dynamical system with respect to the stability properties in the Galerkin-projected system.

5.1 Linear dynamical system

We consider the linear dynamical system (1) including the matrix

(21)

with a real parameter $p$. The eigenvalues of this matrix have a negative real part for all $p$ in the parameter domain. Thus the matrix (21) is stable in view of Definition 1.

In the stochastic model, we assume a uniform distribution for the parameter $p$. The expansion (4) includes the Legendre polynomials up to a given degree. We use Gauss-Legendre quadrature, see [15, Ch. 3.6], to compute the matrices in the linear dynamical systems (6) of the stochastic Galerkin method. This computation is exact, except for round-off errors, because the entries of the matrix (21) represent polynomials in $p$. However, the Galerkin projection always generates an unstable system. Figure 1 illustrates the spectral abscissae of the matrices $\hat{A}$ for the different polynomial degrees.

Now we use the transformation from Section 3.2 to obtain a stable system. In the Lyapunov equation (11), we choose the constant matrix $B = I$, the identity matrix, which is obviously symmetric and positive definite. The entries of the unique solution $P(p)$ of the Lyapunov equation represent rational functions in the variable $p$ with numerator and denominator polynomials of degrees up to ten. The Cholesky algorithm yields the decomposition $P(p) = L(p)\, L(p)^{\top}$ with the unique factor $L(p)$. Hence the transformed matrix (10) can be computed point-wise for each $p$.

The Galerkin projection of the matrix $\tilde{A}(p)$ in the transformed system (7) is computed numerically by a Gauss-Legendre quadrature with 20 nodes. Thus the matrix (10) is evaluated at each node of the quadrature. Figure 1 shows the spectral abscissae of the Galerkin-projected matrices for the different polynomial degrees. We recognize that all spectral abscissae are strictly negative, which confirms the asymptotic stability.

Furthermore, we examine the behavior of all eigenvalues in the Galerkin-projected systems. Figure 2 depicts the real parts of the eigenvalues for both the original system and the stabilized system for different polynomial degrees. Since complex conjugate eigenvalues arise, some real parts coincide for each matrix. We observe that the eigenvalues behave similarly, and thus the stabilization represents the crucial difference.

Figure 1: Spectral abscissae in linear dynamical systems, including matrix (21), from stochastic Galerkin method for different polynomial degrees, using a uniform distribution.
Figure 2: Eigenvalues of the original system, including matrix (21), (left) and the stabilized system (right) for different degrees of the polynomial expansion with uniform distribution.

Alternatively, we choose a beta distribution in the stochastic modeling. The probability density function reads as

$\rho(p) = C\, (1-p)^{\alpha} (1+p)^{\beta}$

with a constant $C > 0$ for standardization. We select fixed values of the parameters $\alpha, \beta > -1$. Jacobi polynomials yield an orthogonal basis. Again the stochastic Galerkin method results in unstable systems for all polynomial degrees. The Galerkin projections of the matrices of the original system and the stabilized system are computed by a Gauss-Jacobi quadrature with 20 nodes. The spectral abscissae of the Galerkin-projected matrices for the different polynomial degrees are depicted in Figure 3. The transformation yields again a stabilization of the critical systems.

Figure 3: Spectral abscissae in linear dynamical systems, including matrix (21), after the Galerkin projection for different polynomial degrees, using a beta distribution.

5.2 Nonlinear dynamical system

We now consider a two-dimensional dynamical system (12) with the quadratic right-hand side

(22)

and a real parameter $p$. This system is chosen such that

(23)

is a stationary solution for all $p$. Numerical computations confirm that these equilibria are asymptotically stable for all $p$. Since the stationary solution (23) includes trigonometric functions, an exact representation in terms of polynomials in the variable $p$ is not feasible.

Again, we assume the parameter $p$ to be a uniformly distributed random variable. Consequently, the gPC expansion uses the Legendre polynomials. The stochastic Galerkin method yields the nonlinear dynamical system (13). In the right-hand side (14), we evaluate the probabilistic integrals approximately by a Gauss-Legendre quadrature using 20 nodes.

Numerical computations show that the Galerkin-projected system (13) exhibits stationary solutions for all considered polynomial degrees. Therein, the respective nonlinear systems of algebraic equations are solved successfully by Newton iterations. The corresponding stationary solutions yield the functions (15), which represent approximations of the equilibrium of the original dynamical system. Figure 4 illustrates the functions (15) for several polynomial degrees. The approximations converge rapidly to sine and cosine, respectively, because these trigonometric terms are analytic functions in the variable $p$. However, the stationary solutions of the Galerkin-projected systems (13) are unstable for all polynomial degrees, which can be seen from the spectral abscissae of the Jacobian matrices in Figure 5 (left).

Figure 4: Approximations associated to stationary solutions of the Galerkin-projected system, using the right-hand side function (22), for different polynomial degrees in quadratic problem.
Figure 5: Spectral abscissae for the Jacobian matrices associated with the stationary solutions of the Galerkin projections for the original system, the shifted system and the stabilized system with right-hand side function (22) in the quadratic problem.

To stabilize the computation, we change to the shifted nonlinear dynamical system (16) and its Galerkin-projected system (18). The stationary solutions are still unstable for all polynomial degrees, which is illustrated by the spectral abscissae of the associated Jacobian matrices in Figure 5 (left). The values for the original Galerkin system and the novel Galerkin system become closer for higher polynomial degrees, in agreement with the convergence results in [9].

Now we apply the stabilization technique from Section 4.2. In the Lyapunov equations (11), the matrix $B = I$, the identity matrix, is selected. The Cholesky algorithm yields a decomposition $P(p) = L(p)\, L(p)^{\top}$ of the solutions. This procedure has to be done for each node of the Gauss-Legendre quadrature. The stochastic Galerkin method projects the transformed system (20) to a larger system of the form (18). Figure 5 (right) shows the spectral abscissae of the Jacobian matrices for the stationary solution zero for different polynomial degrees. It follows that the equilibria are asymptotically stable now.

Finally, we illustrate solutions of initial value problems computed using the stochastic Galerkin method for the original, shifted and stabilized right-hand side functions. We choose the polynomial degree equal to three, which results in eight coefficient functions. The trapezoidal rule yields the numerical solutions. Firstly, the original Galerkin-projected system (13) is solved, where the initial values are selected close to its stationary solution. Figure 6 (top) illustrates the numerical solution. The trajectories are nearly constant at the beginning. Later the solution changes from the unstable equilibrium to some stable equilibrium. Secondly, we solve the Galerkin-projected system (13) with the shifted right-hand side function from (16) using initial values close to the equilibrium zero. The same behavior as before appears, as depicted in Figure 6 (center). Thirdly, the Galerkin projection of the stabilized system (20) is solved, where the initial values are again selected close to the equilibrium zero. Figure 6 (bottom) shows the numerical solution. Now the trajectories tend to the stable stationary solution zero.

Figure 6: Solutions of initial value problems for the Galerkin projections of the original system, the shifted system and the stabilized system in quadratic test example.
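
The time integration mentioned above is the implicit trapezoidal rule; a generic sketch (with a hypothetical two-dimensional test right-hand side, not the Galerkin system of this section) reads as follows.

```python
# Sketch: implicit trapezoidal rule for y' = rhs(y) with constant step size.
import numpy as np
from scipy.optimize import fsolve

def trapezoidal(rhs, y0, t_end, steps):
    """y_{k+1} = y_k + h/2 * (rhs(y_k) + rhs(y_{k+1})), implicit stage solved by fsolve."""
    h = t_end / steps
    y = np.array(y0, dtype=float)
    trajectory = [y.copy()]
    for _ in range(steps):
        y = fsolve(lambda z: z - y - 0.5 * h * (rhs(y) + rhs(z)), y)
        trajectory.append(y.copy())
    return np.array(trajectory)

rhs = lambda y: np.array([-y[0] + 0.5 * y[1], -y[1]])    # hypothetical stable test system
print(trapezoidal(rhs, [1.0, 1.0], 10.0, 100)[-1])       # close to the equilibrium zero
```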

6 Conclusions

A basis transformation was constructed for linear dynamical systems including random variables. We proved that the stability properties are preserved in a Galerkin projection of the transformed system. The transformation matrix follows from a symmetric decomposition of a solution of a Lyapunov equation. We showed that the Cholesky factorization retains the smoothness of involved functions, while the computational effort is low in comparison to eigenvalue decompositions. Moreover, the transformation can be applied to guarantee the stability properties for stationary solutions in nonlinear dynamical systems. We performed numerical computations for test examples. The results demonstrate that the combination of the stochastic Galerkin method and the basis transformation yields an efficient numerical technique to preserve stability of the original system under the Galerkin projection.

References

  • [1] F. Augustin, A. Gilg, M. Paffrath, P. Rentrop, U. Wever, Polynomial chaos for the approximation of uncertainties: chances and limits, Euro. Jnl. of Applied Mathematics 19 (2008) 149–190.
  • [2] F. Augustin, P. Rentrop, Stochastic Galerkin techniques for random ordinary differential equations, Numer. Math. 122 (2012) 399–419.
  • [3] O.G. Ernst, A. Mugler, H.J. Starkloff, E. Ullmann, On the convergence of generalized polynomial chaos expansions, ESAIM: Mathematical Modelling and Numerical Analysis 46 (2012) 317–339.
  • [4] Z. Gajić, M.T.J. Qureshi, Lyapunov matrix equation in system stability and control, Dover Publications, Inc., 1995.
  • [5] S.J. Hammarling, Numerical solution of stable non-negative definite Lyapunov equation, IMA J. Numer. Anal. 2 (1982) 303–323.
  • [6] E. Hairer, S.P. Nørsett, G. Wanner, Solving Ordinary Differential Equations. Vol. 1: Nonstiff Problems, 2nd ed., Springer, Berlin 1993.
  • [7] S. Prajna, POD model reduction with stability guarantee, Proceedings of 42nd IEEE Conference on Decision and Control, Maui, Hawaii, USA, December 2003, pp. 5254–5258.
  • [8] R. Pulch, Polynomial chaos for linear differential algebraic equations with random parameters, Int. J. Uncertain. Quantif. 1 (2011) 223–240.
  • [9] R. Pulch, Stochastic Galerkin methods for analyzing equilibria of random dynamical systems, SIAM/ASA J. Uncertainty Quantification 1 (2013) 408–430.
  • [10] R. Pulch, Stochastic collocation and stochastic Galerkin methods for linear differential algebraic equations, J. Comput. Appl. Math. 262 (2014) 281–291.
  • [11] R. Pulch, E.J.W. ter Maten, F. Augustin, Sensitivity analysis and model order reduction for random linear dynamical systems, Math. Comput. Simulat. 111 (2015) 80–95.
  • [12] M. Schatzman, Numerical Analysis: A Mathematical Introduction, Clarendon Press, Oxford, 2002.
  • [13] R. Seydel, Practical Bifurcation and Stability Analysis, 3rd ed., Springer, 2010.
  • [14] B. Sonday, R. Berry, B. Debusschere, H. Najm, Eigenvalues of the Jacobian of a Galerkin-projected uncertain ODE system, SIAM J. Sci. Comput. 33 (2011) 1212–1233.
  • [15] J. Stoer, R. Bulirsch, Introduction to Numerical Analysis, 3rd ed., Springer, New York, 2002.
  • [16] T.J. Sullivan, Introduction to Uncertainty Quantification, Springer, 2015.
  • [17] D. Xiu, Numerical methods for stochastic computations: a spectral method approach, Princeton University Press, 2010.