SOS-Convex Lyapunov Functions and
Stability of Difference Inclusions
Abstract
We introduce the concept of sos-convex Lyapunov functions for stability analysis of both linear and nonlinear difference inclusions (also known as discrete-time switched systems). These are polynomial Lyapunov functions that have an algebraic certificate of convexity and that can be efficiently found via semidefinite programming. We prove that sos-convex Lyapunov functions are universal (i.e., necessary and sufficient) for stability analysis of switched linear systems. We show via an explicit example, however, that the minimum degree of a convex polynomial Lyapunov function can be arbitrarily higher than that of a nonconvex polynomial Lyapunov function. In the case of switched nonlinear systems, we prove that existence of a common nonconvex Lyapunov function does not imply stability, but existence of a common convex Lyapunov function does. We then provide a semidefinite programming-based procedure for computing a full-dimensional subset of the region of attraction of equilibrium points of switched polynomial systems, under the condition that their linearization be stable. We conclude by showing that our semidefinite program can be extended to search for Lyapunov functions that are pointwise maxima of sos-convex polynomials.
Key words. Difference inclusions, switched systems, nonlinear dynamics, convex Lyapunov functions, algebraic methods in optimization, semidefinite programming.
1 Introduction
The most commonly used Lyapunov functions in control theory, namely the quadratic ones, are convex functions. This convexity property is not always purposefully sought after; it is simply an artifact of the nonnegativity requirement of Lyapunov functions, which for quadratic forms coincides with convexity. If one however seeks Lyapunov functions that are polynomial functions of degree larger than two (for instance, for improving some sort of performance metric), then convexity is no longer implied by the nonnegativity requirement of the Lyapunov function (consider, e.g., a nonnegative quartic such as $x^4 - x^2 + 1$, which is not convex). In this paper we ask the following question: what is there to gain (or to lose) by requiring that a polynomial Lyapunov function be convex? We also present a computational methodology, based on semidefinite programming, for automatically searching for convex polynomial Lyapunov functions.
Our study of this question is motivated by, and for the purposes of this paper exclusively focused on, the stability problem for difference inclusions, also known as discrete-time switched systems. We are concerned with an uncertain and time-varying map
$x_{k+1} = g_k(x_k)$, (1.1)
where
$g_k \in \operatorname{conv}\{f_1,\dots,f_m\}$. (1.2)
Here, $f_1,\dots,f_m:\mathbb{R}^n\to\mathbb{R}^n$ are $m$ different (possibly nonlinear) continuous maps with $f_i(0)=0$, and $\operatorname{conv}$ denotes the convex hull operation. The question of interest is (local or global) asymptotic stability under arbitrary switching. This means that we would like to know whether the origin is stable in the sense of Lyapunov (see [29] for a definition) and attracts all initial conditions (either in a neighborhood or globally) for all possible values that the map $g_k$ can take at each time step $k$.
The special case of this problem where the maps $f_i$ are linear has been and continues to be the subject of intense study in the control community, as well as in the mathematics and computer science communities [19, 37, 15, 48, 25, 30, 11, 31]. A switched linear system in this setting is given by
$x_{k+1} = A_{\sigma(k)}\, x_k$, $\sigma(k) \in \{1,\dots,m\}$, (1.3)
where $A_1,\dots,A_m$ are $n \times n$ real matrices. Local (or equivalently global) asymptotic stability under arbitrary switching of this system is equivalent to the joint spectral radius of these matrices being strictly less than one.
Definition 1.1 (Joint Spectral Radius (JSR) [46])
The joint spectral radius of a set of matrices $\mathcal{A} = \{A_1,\dots,A_m\}$ is defined as
$\rho(\mathcal{A}) := \lim_{k\to\infty} \max_{\sigma \in \{1,\dots,m\}^k} \|A_{\sigma_k} \cdots A_{\sigma_1}\|^{1/k}$, (1.4)
where $\|\cdot\|$ is any matrix norm on $\mathbb{R}^{n\times n}$.
Deciding whether $\rho < 1$ is notoriously difficult. No finite-time procedure for this purpose is known to date, and the related problems of testing whether $\rho \le 1$ or whether the trajectories of (1.3) are bounded under arbitrary switching are known to be undecidable [50]. On the positive side however, a large number of sufficient conditions for stability of such systems are known. Most of these conditions are based on the numerical construction of special classes of Lyapunov functions, a subset of which enjoy theoretical guarantees in terms of their quality of approximation of the joint spectral radius [22, 10, 41, 43, 27].
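Although no finite procedure is known for deciding stability exactly, Definition 1.1 immediately yields computable bounds: any product of length $k$ gives the lower bound $\rho(A_{\sigma_k}\cdots A_{\sigma_1})^{1/k}$, and the maximum norm over such products gives an upper bound. A minimal brute-force sketch (the matrix pair here is our own illustrative choice, not one taken from the text):

```python
import itertools
import numpy as np

def jsr_bounds(matrices, k):
    """Crude JSR bounds from all products of length k:
    max_w rho(A_w)^(1/k) <= JSR <= max_w ||A_w||^(1/k)."""
    lower, upper = 0.0, 0.0
    for word in itertools.product(matrices, repeat=k):
        P = word[0] if k == 1 else np.linalg.multi_dot(word)
        lower = max(lower, max(abs(np.linalg.eigvals(P))) ** (1.0 / k))
        upper = max(upper, np.linalg.norm(P, 2) ** (1.0 / k))
    return lower, upper

# Illustrative pair: each matrix is nilpotent, but the product A1 @ A2 is not,
# so single-matrix analysis alone says nothing about the switched system.
A1 = np.array([[0.0, 1.0], [0.0, 0.0]])
A2 = np.array([[0.0, 0.0], [1.0, 0.0]])
for k in (1, 2, 4):
    print(k, jsr_bounds([A1, A2], k))
```

For this pair the two bounds already meet at one for $k = 2$, certifying that the JSR equals one; in general the bounds only converge as $k$ grows, which is consistent with the hardness results above.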
It is well known that if the switched linear system (1.3) is stable (throughout this paper, by the word “stable” we mean asymptotically stable under arbitrary switching), then it admits a common convex Lyapunov function, in fact a norm [25]. It is also known that stable switched linear systems admit a common polynomial Lyapunov function [41]. It is therefore natural to ask whether existence of a common convex polynomial Lyapunov function is also necessary for stability. One would in addition want to know how the degree of such a convex polynomial Lyapunov function compares with the degree of a nonconvex polynomial Lyapunov function. We address both of these questions in this paper.
It is not difficult to show (see [25, Proposition 1.8]) that stability of the linear inclusion (1.3) is equivalent to stability of its “corners”; i.e., to stability of a switched system that at each time step applies one of the matrices $A_1,\dots,A_m$, but never a matrix strictly inside their convex hull. This statement is no longer true for the switched nonlinear system in (1.1)–(1.2); see Example 1 in Section 4.1 of this paper. It turns out, however, that one can still prove switched stability of the entire convex hull by finding a common convex Lyapunov function for the corner systems $f_1,\dots,f_m$. This is demonstrated in Proposition 4.2 and the example that follows it, where we show that convexity of the Lyapunov function is essential in such a setting.
Such considerations motivate us to seek efficient algorithms that can automatically search over all candidate convex polynomial Lyapunov functions of a given degree. This task, however, is unfortunately intractable even when one restricts attention to quartic (i.e., degree-four) Lyapunov functions and switched linear systems. See our discussion in Section 2. In order to cope with this issue, we introduce the class of sos-convex Lyapunov functions (see Definition 2.1). Roughly speaking, these Lyapunov functions constitute a subset of convex polynomial Lyapunov functions whose convexity is certified through an explicit algebraic identity. One can search over sos-convex Lyapunov functions of a given degree by solving a single semidefinite program whose size is polynomial in the description size of the input dynamical system. The methodology can directly handle the linear switched system in (1.3) or its nonlinear counterpart in (1.1)–(1.2), if the maps $f_i$ are polynomial functions (while polynomial dynamical systems are already a broad and significant class of nonlinear dynamical systems, certain extensions are possible; for example, our methodology extends in a straightforward fashion to the case where the functions $f_i$ are rational functions with sign-definite denominators, and extensions to trigonometric dynamical systems may also be possible using the ideas in [36]).
We will review some results from the thesis of the first author which show that for certain dimensions and degrees, the sets of convex and sos-convex Lyapunov functions coincide. In fact, in relatively low dimensions and degrees, it is quite challenging to find convex polynomials that are not sos-convex [7]. This is evidence for the strength of this semidefinite relaxation and is encouraging from an application viewpoint. Nevertheless, since sos-convex polynomials are in general a strict subset of the convex ones, a more refined (and perhaps more computationally relevant) converse Lyapunov question for switched linear systems is whether their stability guarantees existence of an sos-convex Lyapunov function. This question is also addressed in this paper.
We should remark that there are other classes of convex Lyapunov functions whose construction is amenable to convex optimization. The main examples include polytopic Lyapunov functions, and piecewise quadratic Lyapunov functions that are a geometric combination of several quadratics [23, 44, 28, 43, 14, 20, 30]. These Lyapunov functions are mostly studied for the case of linear switched systems, where they are known to be necessary and sufficient for stability. The extension of their applicability to polynomial switched systems should be possible via the sum of squares relaxation. Our focus in this paper, however, is solely on studying the power of sos-convex polynomial Lyapunov functions. Only in our last section do we briefly comment on extensions to piecewise sos-convex Lyapunov functions.
1.1 Related work
The literature on stability of switched systems is too extensive for us to review here. We simply refer the interested reader to [48, 21, 25] and the references therein. Closer to the specific focus of this paper is the work of Mason et al. [35], where the authors prove existence of polynomial Lyapunov functions for switched linear systems in continuous time. Our proof of the analogous statement in discrete time closely follows theirs. In [9], Ahmadi and Parrilo show that in the continuous-time case, existence of the Lyapunov function of Mason et al. further implies existence of a Lyapunov function that can be found with sum of squares techniques. In [41], Parrilo and Jadbabaie prove that stable switched linear systems in discrete time always admit a (not necessarily convex) polynomial Lyapunov function which can be found with sum of squares techniques. Blanchini and Franco show in [12] that in contrast to the case of uncontrolled switching (our setting), controlled linear switched systems, both in discrete and continuous time, can be stabilized by means of a suitable switching strategy without necessarily admitting a convex Lyapunov function.
In [18], [17], Chesi and Hung motivate several interesting applications of working with convex Lyapunov functions or Lyapunov functions with convex sublevel sets. These include establishing more regular behavior of the trajectories, ease of optimization over sublevel sets of the Lyapunov function, stability of recurrent neural networks, etc. The authors in fact propose sum-of-squares-based conditions for imposing convexity of polynomials. However, it is shown in [6, Sect. 4] that these conditions lead to semidefinite programs of larger size than those of sos-convexity, while at the same time being at least as conservative. Moreover, the works in [18], [17] do not offer an analysis of the performance (existence) of convex Lyapunov functions.
On the optimization side, the reader interested in knowing more about sos-convex polynomials, their role in convex algebraic geometry and polynomial optimization, and their applications outside of control is referred to the works by Ahmadi and Parrilo [7], [8], Helton and Nie [24], and Magnani et al. [34], or to Section 3.3.3 of the edited volume [13]. Finally, we note that a shorter version of the current paper with some preliminary results appears in [2] as a conference paper.
1.2 Organization and contributions of the paper
The paper is organized as follows. In Section 2, we present the mathematical and algorithmic machinery for working with sos-convex Lyapunov functions and explain its connection to semidefinite programming. In Section 3, we study switched linear systems. We show that given any homogeneous Lyapunov function, the Minkowski norm defined by the convex hull of its sublevel set is also a valid (convex) Lyapunov function (Proposition 3.1). We then show that any stable switched linear system admits a convex polynomial Lyapunov function (Theorem 3.2). Furthermore, we give algebraic arguments to strengthen this result and prove existence of an sos-convex Lyapunov function (Theorem 3.4). While existence of a convex polynomial Lyapunov function is always guaranteed, we prove that in the worst case, the degree of such a Lyapunov function can be arbitrarily higher than that of a nonconvex polynomial Lyapunov function (Theorem 3.6).
In Section 4, we study nonlinear switched systems. We show that stability of these systems cannot be inferred from the existence of a common Lyapunov function for the corner systems (Example 1). However, we prove that this conclusion can be made if the common Lyapunov function is convex (Proposition 4.2). We also give a lemma showing that the radial unboundedness requirement of a Lyapunov function is implied by its convexity (Lemma 4.1). We then provide an algorithm based on semidefinite programming that, under mild conditions, finds a full-dimensional inner approximation to the region of attraction of a locally stable equilibrium point of a polynomial switched system. This algorithm is based on a search for an sos-convex polynomial whose sublevel set is proven to be in the region of attraction via a sum of squares certificate coming from Stengle’s Positivstellensatz. Some examples are provided later in Section 4.
Finally, in Section 5, we briefly describe some future directions and extensions of our framework to a broader class of convex Lyapunov functions that are constructed by combining several sos-convex polynomials. These extensions are still amenable to semidefinite programming and have connections to the theory of path-complete graph Lyapunov functions proposed in [4].
2 Sos-convex polynomials
A multivariate polynomial $p := p(x_1,\dots,x_n)$ is said to be nonnegative or positive semidefinite (psd) if $p(x) \ge 0$ for all $x \in \mathbb{R}^n$. We say that $p$ is a sum of squares (sos) if it can be written as $p = \sum_i q_i^2$, where each $q_i$ is a polynomial. It is well known that if $p$ is of even degree four or larger, then testing nonnegativity is NP-hard, while testing existence of a sum of squares decomposition, which provides a sufficient condition and an algebraic certificate for nonnegativity, can be done by solving a polynomially-sized semidefinite program [39], [40].
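As a concrete illustration of the semidefinite nature of the sos test, the toy script below (our own sketch; a practical implementation would instead call an off-the-shelf SDP solver, e.g. through SOSTOOLS or YALMIP) checks that $p(x) = x^4 - x^2 + 1$ is a sum of squares. In the monomial basis $z = (1, x, x^2)$, $p$ is sos if and only if $p(x) = z^T Q z$ for some positive semidefinite “Gram” matrix $Q$; the script looks for such a $Q$ by alternating projections between the affine set of Gram matrices of $p$ and the psd cone:

```python
import numpy as np

def project_affine(Q):
    """Project onto the symmetric matrices Q with z^T Q z = x^4 - x^2 + 1."""
    Q = (Q + Q.T) / 2
    Q[0, 0] = 1.0            # constant coefficient
    Q[0, 1] = Q[1, 0] = 0.0  # coefficient of x
    Q[1, 2] = Q[2, 1] = 0.0  # coefficient of x^3
    Q[2, 2] = 1.0            # coefficient of x^4
    # coefficient of x^2: enforce 2*Q[0,2] + Q[1,1] = -1 (weighted least
    # squares, since the off-diagonal entry appears twice in the matrix)
    lam = 2.0 * (-1.0 - 2.0 * Q[0, 2] - Q[1, 1]) / 3.0
    Q[0, 2] = Q[2, 0] = Q[0, 2] + lam / 2.0
    Q[1, 1] += lam / 2.0
    return Q

def project_psd(Q):
    """Project onto the psd cone by zeroing out negative eigenvalues."""
    w, V = np.linalg.eigh((Q + Q.T) / 2)
    return V @ np.diag(np.clip(w, 0.0, None)) @ V.T

Q = np.zeros((3, 3))
for _ in range(500):
    Q = project_psd(project_affine(Q))
Q = project_affine(Q)

min_eig = np.min(np.linalg.eigvalsh(Q))
print("Gram matrix found, min eigenvalue:", min_eig)  # ~0 => p is sos
```

The iteration converges to the Gram matrix of the explicit certificate $p(x) = (1 - \tfrac12 x^2)^2 + \tfrac34 x^4$, so a feasible psd $Q$ exists and $p$ is indeed sos (as every nonnegative univariate polynomial must be).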
A polynomial $p$ is convex if its Hessian $H(x)$ (i.e., the $n \times n$ polynomial matrix of the second derivatives of $p$) is a positive semidefinite matrix for all $x \in \mathbb{R}^n$. This is equivalent to the scalar-valued polynomial $y^T H(x)\, y$ in the $2n$ variables $(x, y)$ being nonnegative. It has been shown in [5] that testing if a polynomial of degree four is convex is NP-hard in the strong sense. This motivates the algebraic notion of sos-convexity, which can be checked with semidefinite programming and provides a sufficient condition for convexity.
Definition 2.1
A polynomial $p$ is sos-convex if its Hessian $H(x)$ can be factored as
$H(x) = M^T(x)\, M(x)$,
where $M(x)$ is a polynomial matrix; i.e., a matrix with polynomial entries.
Polynomial matrices which admit a decomposition as above are called sos matrices. The term sos-convex was coined in a seminal paper of Helton and Nie [24]. The following theorem is an algebraic analogue of a classical theorem in convex analysis and provides equivalent characterizations of sos-convexity.
Theorem 2.2 (Ahmadi and Parrilo [8])
Let $p$ be a polynomial of degree $d$ in $n$ variables with its gradient and Hessian denoted respectively by $\nabla p$ and $H$. Let $g_{1/2}$, $g_{\nabla}$, and $g_{H}$ be defined as
$g_{1/2}(x,y) = \tfrac{1}{2}p(x) + \tfrac{1}{2}p(y) - p\left(\tfrac{x+y}{2}\right)$, $g_{\nabla}(x,y) = (x-y)^T(\nabla p(x) - \nabla p(y))$, $g_{H}(x,y) = y^T H(x)\, y$. (2.1)
Then the following are equivalent to sos-convexity of $p$:
(a) $g_{1/2}$ is sos (the constant $\tfrac{1}{2}$ in condition (a) is arbitrary and chosen for convenience; one can show that $g_{1/2}$ being sos implies that $g_{\lambda}(x,y) := \lambda p(x) + (1-\lambda)p(y) - p(\lambda x + (1-\lambda)y)$ is sos for any fixed $\lambda \in [0,1]$, and conversely, if $g_{\lambda}$ is sos for some $\lambda \in (0,1)$, then $g_{1/2}$ is sos).
(b) $g_{\nabla}$ is sos.
(c) $g_{H}$ is sos.
The above theorem is reassuring in the sense that it demonstrates the invariance of the definition of sos-convexity with respect to the characterization of convexity that one chooses to apply the sos relaxation to. Since existence of an sos decomposition can be checked via semidefinite programming (SDP), any of the three equivalent conditions above, and hence sos-convexity of a polynomial, can also be checked by SDP. Even though the polynomials $g_{1/2}$, $g_{\nabla}$, $g_{H}$ above are all in $2n$ variables, the structure of these polynomials allows for much smaller SDPs (see [6] for details).
In general, finding examples of convex polynomials that are not sos-convex seems to be a nontrivial task, though a number of such constructions are known [7]. A complete characterization of the dimensions and the degrees for which the notions of convexity and sos-convexity coincide is available in [8].
Crucial for our purposes is the fact that semidefinite programming allows us to not only check if a given polynomial is sos-convex, but also search and optimize over the set of sos-convex polynomials of a given degree. This feature enables an automated search over a subset of convex polynomial Lyapunov functions. Of course, a Lyapunov function also needs to satisfy other constraints, namely positivity and monotonic decrease along trajectories. Following the standard approach, we replace the inequalities underlying these constraints with their sum of squares counterparts as well.
Throughout this paper, what we mean by an sos-convex Lyapunov function is a polynomial function which is sos-convex and satisfies all other required Lyapunov inequalities with sos certificates (even though an sos decomposition in general merely guarantees nonnegativity of a polynomial, sos decompositions obtained numerically from interior point methods generically provide proofs of its positivity; see the discussion in [1, p. 41]). In this paper, whenever we are concerned with asymptotic stability and prove a result about existence of a Lyapunov function satisfying certain sos conditions, we make sure that the resulting inequalities are strict (cf. Theorem 3.4). When the Lyapunov function can be taken to be homogeneous (as is the case when the dynamics are homogeneous [45]), the following lemma establishes that the convexity requirement of the polynomial automatically meets its nonnegativity requirement.
Recall that a homogeneous polynomial (or a form) is a polynomial whose monomials all have the same degree.
Lemma 2.3
Convex forms are nonnegative and sos-convex forms are sos.
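For the first claim, a short argument (our sketch; the text's own proof may differ) uses the first-order characterization of convexity together with the fact that the gradient of a form of degree at least two vanishes at the origin:

```latex
% V a convex form of degree d >= 2, so V(0) = 0 and \nabla V(0) = 0.
% The first-order condition for convexity, applied at the origin, gives
V(x) \;\ge\; V(0) + \nabla V(0)^{T} x \;=\; 0 \qquad \text{for all } x.
```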
For stability analysis of the switched linear system in (1.3), the requirements of a (common) sos-convex Lyapunov function $V$ are therefore the following:
$V$ is sos-convex, and $V(x) - V(A_i x)$ is sos for $i = 1,\dots,m$. (2.2)
Given a set of matrices with rational entries, the search for the coefficients of a fixed-degree polynomial $V$ satisfying the above conditions amounts to solving an SDP whose size is polynomial in the bit size of the matrices. If this SDP is (strictly) feasible, the switched system in (1.3) is stable under arbitrary switching. We remark that the same implication is true if the sos-convexity requirement on $V$ is replaced with the requirement that $V$ itself be sos; see [41, Thm. 2.2]. (This statement fails to hold for switched nonlinear systems.)
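For quadratic candidates $V(x) = x^T P x$, the conditions in (2.2) reduce to the linear matrix inequalities $P \succ 0$ and $P - A_i^T P A_i \succ 0$, which are easy to verify once candidate data are in hand. A small numerical verifier (the matrices and the choice $P = I$ below are our own illustrative data, not taken from the text):

```python
import numpy as np

def is_common_quadratic_lyapunov(P, matrices, tol=1e-9):
    """Check P > 0 and P - A^T P A > 0 (all eigenvalues above tol)."""
    if np.min(np.linalg.eigvalsh(P)) <= tol:
        return False
    return all(
        np.min(np.linalg.eigvalsh(P - A.T @ P @ A)) > tol
        for A in matrices
    )

gamma = 0.9  # contraction factor of the illustrative pair below
A1 = gamma * np.array([[0.0, 1.0], [0.0, 0.0]])
A2 = gamma * np.array([[0.0, 0.0], [1.0, 0.0]])
print(is_common_quadratic_lyapunov(np.eye(2), [A1, A2]))  # True
```

Searching for an unknown $P$ (or the coefficients of a higher-degree sos-convex $V$) is what requires the SDP solver described in the text; the check above only certifies a given candidate.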
In the next section, we study the converse question of existence of a Lyapunov function satisfying the semidefinite conditions in (2.2).
3 Sos-convex Lyapunov functions and switched linear systems
As remarked in the introduction, it is known that asymptotic stability of a switched linear system under arbitrary switching implies existence of a common convex Lyapunov function, as well as existence of a common polynomial Lyapunov function. In this section, we show that this stability property in fact implies existence of a common Lyapunov function that is both convex and polynomial (cf. Subsection 3.1). Moreover, we strengthen this result to show existence of a common sos-convex Lyapunov function (cf. Subsection 3.2).
Before we prove these results, we state a related proposition which shows that in the particular case of switched linear systems, any common Lyapunov function (e.g., a nonconvex polynomial one) can be turned into a common convex Lyapunov function, although not necessarily an efficiently computable one. We believe that this statement must be known, but since we could not pinpoint a reference, we include a proof here.
Proposition 3.1
Consider the switched linear system in (1.3). Suppose $W$ is a common homogeneous and continuous Lyapunov function for (1.3); i.e., $W$ satisfies $W(x) > 0$ for all $x \ne 0$ and $W(A_i x) < W(x)$ for all $x \ne 0$ and all $i \in \{1,\dots,m\}$. Let
$S := \operatorname{conv}\{x : W(x) \le 1\}$.
Then, the Minkowski (a.k.a. gauge) norm defined by the set $S$, i.e., the function
$V(x) := \inf\{\lambda > 0 : x/\lambda \in S\}$,
is a convex common Lyapunov function for (1.3).
Proof. Since under the assumptions of the proposition the set $S$ is compact, origin symmetric, and has nonempty interior, the function $V$ is a norm (see e.g. [16, p. 119]) and hence convexity and positivity of $V$ are already established. It remains to show that for any $x \ne 0$ and any $i \in \{1,\dots,m\}$ we have $V(A_i x) < V(x)$.
To see the inequality, first note that because $W$ is a common Lyapunov function, there must exist a constant $\gamma \in (0,1)$ such that if $W(x) \le 1$, then $W(A_i x) \le \gamma$ for all $i$. Now observe that if for some $\lambda > 0$ we have $x/\lambda \in S$, then by definition $x/\lambda = \sum_j \mu_j z_j$ for some $z_j$ with $W(z_j) \le 1$ and some $\mu_j \ge 0$ with $\sum_j \mu_j = 1$. Hence,
$A_i x/\lambda = \sum_j \mu_j\, A_i z_j$,
with $W(A_i z_j) \le \gamma$ for each $j$. But this means that $A_i x/\lambda$ lies in a uniformly contracted copy of $S$, and homogeneity of $W$ then yields $V(A_i x) \le \gamma' V(x)$ for some constant $\gamma' < 1$.
3.1 Existence of convex polynomial Lyapunov functions
The goal of this subsection is to prove the following theorem.
Theorem 3.2
If the switched linear system (1.3) is asymptotically stable under arbitrary switching, then there exists a convex positive definite homogeneous polynomial $V$ that satisfies $V(A_i x) < V(x)$ for all $x \ne 0$ and all $i \in \{1,\dots,m\}$.
Our proof is inspired by [35], which proves the existence of a convex polynomial Lyapunov function for continuous-time switched systems, but we are not aware of an equivalent statement in discrete time. We will also need the following classical result, which to the best of our knowledge first appears in [46].
Theorem 3.3 (see [46, 25])
Consider a set of matrices $\mathcal{A} = \{A_1,\dots,A_m\}$ with JSR $\rho$. For all $\varepsilon > 0$, there exists a vector norm $\|\cdot\|$ in $\mathbb{R}^n$ such that $\|Ax\| \le (\rho + \varepsilon)\|x\|$ for all $x$ and for any matrix $A$ in $\mathcal{A}$.
Proof. (of Theorem 3.2.) Let $\mathcal{A} = \{A_1,\dots,A_m\}$ and denote the JSR of $\mathcal{A}$ by $\rho$. By assumption we have $\rho < 1$ and by Theorem 3.3, there exists a norm, which from here on we simply denote by $\|\cdot\|$, such that
$\|A_i x\| < \|x\|$ for all $x \ne 0$ and all $i \in \{1,\dots,m\}$.
We denote the unit ball of this norm by $B$ and use the notation $A_i B := \{A_i x : x \in B\}$.
Hence, we have $A_i B \subset \operatorname{int}(B)$ for all $i$.
The goal is to construct a convex positive definite homogeneous polynomial $V$ of some degree $d$, such that its 1-sublevel set $B_V := \{x : V(x) \le 1\}$ satisfies
$A_i B \subseteq B_V \subseteq B$ for all $i \in \{1,\dots,m\}$.
As $A_i B_V \subseteq A_i B$ and $A_i B \subseteq B_V$, it would follow that
$A_i B_V \subseteq B_V$.
This would imply that $V(A_i x) \le V(x)$ for all $x$ and $V(A_i x) < V(x)$ for $x \ne 0$. By homogeneity of $V$, we get the claim in the statement of the theorem.
To construct , we proceed in the following way. Let
To any we associate a (nonzero) dual vector orthogonal to a supporting hyperplane of at . This means that . Since the set
is a relatively open nonempty subset of the boundary of our unit ball. Moreover, Now, the family of sets is an open covering of and hence we can extract a set of points such that the union of the sets covers Let For any natural number , we define ^{5}^{5}5Note that . In fact, we have , . Indeed, there exists such that and hence
Note that is convex as the sum of even powers of linear forms and homogeneous. We first show that
As , for all and for all , we have . Hence there exists a positive integer such that
and so for all .
We now show that Let , and so This implies that
From this, we deduce that Indeed if there exists such that , which implies that and contradicts the previous statement. Hence . As both and contain the zero vector, we conclude that . Note that this guarantees positive definiteness of as is homogeneous and its 1sublevel set is bounded.
3.2 Existence of sos-convex polynomial Lyapunov functions
We now strengthen the converse result of the previous subsection by showing that asymptotically stable switched linear systems admit an sos-convex Lyapunov function. This in particular implies that such a Lyapunov function can be found with semidefinite programming.
Recall that a homogeneous polynomial $p$ is said to be positive definite (pd) if $p(x) > 0$ for all $x \ne 0$.
Theorem 3.4
If the switched linear system (1.3) is asymptotically stable under arbitrary switching, then there exists a homogeneous polynomial $V$ that satisfies the sum of squares constraints
$V$ sos-convex, $V(x) - V(A_i x)$ sos for $i = 1,\dots,m$.
Moreover, this polynomial is positive definite and is such that the polynomials $V(x) - V(A_i x)$ are also positive definite.
Our proof will make crucial use of the following Positivstellensatz due to Scheiderer.
Theorem 3.5 (Scheiderer [47])
Given any two positive definite homogeneous polynomials $p$ and $q$, there exists a positive integer $r$ such that $p\, q^{r}$ is sos.
Proof. (of Theorem 3.4). We have already shown in the proof of Theorem 3.2 that under our assumptions, there exist vectors and a positive integer such that the convex form is positive definite and makes the forms also positive definite. Note also that as a sum of powers of linear forms, it is already sos-convex and sos. Let denote the unit sphere in and define
By definition of , we have that is positive definite. Furthermore, as (this is a consequence of being positive definite) and as is sos, we get that is positive definite. Hence, from Theorem 3.5, there exists an integer such that
(3.1)
is sos and an integer such that
(3.2)
is sos.
Take and define It is easy to see that is positive definite as is positive definite. We first show that is sos-convex. We have
As is sos, any power of it is also sos. Furthermore, we have
which implies that there exists a polynomial matrix such that . As a consequence, we see that
is a sum of squares and hence is sos-convex.
We now show that for , the form is positive definite and sos. For positive definiteness, simply note that as for any and is nonnegative, we get for any and
To show that it is sos, we make use of the following identity, valid for any scalars $a, b$ and any positive integer $r$:
$a^r - b^r = (a - b)\sum_{\ell=0}^{r-1} a^{\ell}\, b^{r-1-\ell}$. (3.3)
Applying (3.3), we have
(3.4)
For any , either or . Suppose that the index is such that : by definition of , this implies that . Since the polynomial in (3.1) is sos and since is sos, we get that the term
in the sum (3.4) is sos. Similarly, if the index is such that , we have that by definition of . Since the polynomial in (3.2) is sos, we come to the conclusion that the term
in the sum (3.4) is sos. When summing over all possible , as each term in the sum is sos, we conclude that the sum
itself is sos. Now note that we can write
which enables us to conclude that
is sos as the sum of sos polynomials.
3.3 Nonexistence of a uniform bound on the degree of convex polynomial Lyapunov functions
It is known that there are families of matrices for which the switched linear system (1.3) is asymptotically stable under arbitrary switching, but such that the minimum degree of a common polynomial Lyapunov function is arbitrarily large [3]. (In fact, this is the case already when $n = m = 2$.) In the case where a set $\mathcal{A}$ of matrices admits a common polynomial Lyapunov function of degree $d$, it is natural to ask whether one can expect $\mathcal{A}$ to also admit a common convex polynomial Lyapunov function of some degree $\bar{d}$, where $\bar{d}$ is a function of $d$ only. In this subsection, we answer this question in the negative.
Consider the set of matrices $\mathcal{A} = \{A_1, A_2\}$ with
$A_1 = \begin{pmatrix} 1 & 0 \\ 1 & 0 \end{pmatrix}$, $A_2 = \begin{pmatrix} 0 & 1 \\ 0 & -1 \end{pmatrix}$. (3.5)
This is a benchmark set of matrices that has been studied in [10], [41], mainly because it provides a “worst-case” example for the method of common quadratic Lyapunov functions. Indeed, it is easy to show that $\rho(\mathcal{A}) = 1$, but a common quadratic Lyapunov function can only produce an upper bound of $\sqrt{2}$ on the JSR. In [41], Parrilo and Jadbabaie give a simple degree-4 (nonconvex) common polynomial Lyapunov function that proves stability of the switched linear system defined by the matrices $\gamma A_1, \gamma A_2$ for any $\gamma \in [0,1)$. In sharp contrast, we show the following:
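The quadratic obstruction can be observed numerically. Taking $A_1$ and $A_2$ as in the script below (our reading of the benchmark pair; the experiment is easily repeated for any candidate pair), the best contraction factor achievable by an ellipsoidal norm, $\min_{P \succ 0} \max_i \|M A_i M^{-1}\|_2$ with $P = M^T M$, should never fall below $\sqrt{2}$ even though the JSR is one:

```python
import numpy as np

A1 = np.array([[1.0, 0.0], [1.0, 0.0]])
A2 = np.array([[0.0, 1.0], [0.0, -1.0]])

def quad_bound(P):
    """max_i ||M A_i M^{-1}||_2 where P = M^T M: the JSR upper bound
    certified by the quadratic Lyapunov function V(x) = x^T P x."""
    M = np.linalg.cholesky(P).T
    Minv = np.linalg.inv(M)
    return max(np.linalg.norm(M @ A @ Minv, 2) for A in (A1, A2))

rng = np.random.default_rng(1)
best = min(
    quad_bound(np.eye(2) + R @ R.T)  # random positive definite P
    for R in (rng.normal(size=(2, 2)) for _ in range(2000))
)
print(best)  # stays at or above sqrt(2) ~ 1.4142
```

The random search only explores a sample of positive definite matrices, but the lower barrier at $\sqrt{2}$ can also be verified in closed form for this pair by optimizing over $2 \times 2$ matrices $P$ directly.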
Theorem 3.6
Let $A_1, A_2$ be as in (3.5) and consider the sets of matrices $\mathcal{A}_\gamma := \{\gamma A_1, \gamma A_2\}$ parameterized by a scalar $\gamma \in (0,1)$. As $\gamma \to 1$, the minimum degree of a common convex polynomial Lyapunov function for $\mathcal{A}_\gamma$ goes to infinity.
Proof. It is sufficient to prove that the set $\mathcal{A} = \{A_1, A_2\}$ has no convex invariant set defined as the sublevel set of a polynomial. Indeed, if there were a uniform bound $d$ on the degree of a convex polynomial Lyapunov function for the sets $\mathcal{A}_\gamma$, this would imply the existence of an invariant set (the sublevel set of a convex polynomial function of degree at most $d$) for the set $\mathcal{A}$ itself.
We prove our claim by contradiction. In fact, we will prove the slightly stronger fact that for these matrices, the only convex invariant set is the unit square
$\{(x_1, x_2) : |x_1| \le 1,\ |x_2| \le 1\}$,
or, of course, a scaling of it.
Let $\mathcal{A} = \{A_1, A_2\}$ and let $\bar{\mathcal{A}}$ denote the set of all matrix products out of $\mathcal{A}$. Suppose for the sake of contradiction that there was a convex bivariate polynomial whose unit level set was the boundary of an invariant set for the switched system defined by $\mathcal{A}$. More precisely, suppose we had
(3.6)
Let be such that
It is easy to check that the following matrices can be obtained as products of matrices in $\mathcal{A}$:
(3.7)
This implies that
as well, because these points can all be mapped onto each other with matrices from (3.7).
Suppose that there is an such that Then we reach a contradiction because (3.7) implies that can be mapped on which contradicts (3.6) because This implies that . However, convexity of implies that . Thus, we have proved that The same is true for by symmetry.
In the same vein, if there is a such that this point can be mapped on which again leads to a contradiction, because Hence, which concludes the proof.
4 Sos-convex Lyapunov functions and switched nonlinear systems
In this section, we turn our attention to stability analysis of switched nonlinear systems
$x_{k+1} = g_k(x_k)$, $g_k \in \operatorname{conv}\{f_1,\dots,f_m\}$, (4.1)
where $f_1,\dots,f_m:\mathbb{R}^n\to\mathbb{R}^n$ are continuous and satisfy $f_i(0) = 0$. We start by demonstrating the significance of convexity of Lyapunov functions in this setting. We then consider the case where the maps $f_i$ are polynomials and devise algorithms that, under mild conditions, find algebraic certificates of local asymptotic stability under arbitrary switching. These algorithms produce a full-dimensional inner approximation to the region of attraction of the origin, which comes in the form of a sublevel set of an sos-convex polynomial.
4.1 The significance of convexity of the Lyapunov function
The following example demonstrates that unlike the case of switched linear systems, one cannot simply resort to a common Lyapunov function for the individual maps $f_i$ to infer a proof of stability of a nonlinear difference inclusion.
Example 1
Consider the nonlinear switched system (4.1) with maps $f_1, f_2$ given by
(4.2)
The function
(4.3)
is a common Lyapunov function for both $f_1$ and $f_2$, but nevertheless the system in (4.1) is unstable.
To see this, note that
for and for all
On the other hand, (4.1) is unstable since in particular the dynamics with
is obviously unstable.
Note that the Lyapunov function in (4.3) was not convex. Proposition 4.2 below shows that a convexity requirement on the Lyapunov function gets around the problem that arose above. To prove this proposition, we first give a lemma which is potentially of independent interest for global stability analysis. Recall that Lyapunov’s theorem for global asymptotic stability commonly requires that the Lyapunov function be radially unbounded (i.e., satisfy $V(x) \to \infty$ as $\|x\| \to \infty$). Our lemma shows that convexity brings this property for free (we remind the reader that radial unboundedness is not equivalent to radial unboundedness along restrictions to all lines, hence the need for the subtleties in this proof).
Lemma 4.1
Suppose a function $V:\mathbb{R}^n\to\mathbb{R}$ satisfies $V(0) = 0$ and $V(x) > 0$ for all $x \ne 0$. If $V$ is convex, then it is radially unbounded.
Proof. We proceed by contradiction. Suppose that $V$ is not radially unbounded. This implies that there exists a scalar $c > 0$ for which the sublevel set
$S_c := \{x : V(x) \le c\}$
of $V$ is unbounded. As $V$ is convex, $S_c$ is convex, and as any nonempty sublevel set of $V$ contains the origin, $S_c$ contains the origin. We claim that $S_c$ must in fact contain an entire ray originating from the origin.
Indeed, as $S_c$ is unbounded, there exists a sequence of points $x_k \in S_c$ such that $\|x_k\| \to \infty$ and such that $\|x_k\| \ge 1$ for all $k$. Consider now the sequence $x_k/\|x_k\|$: this is a bounded sequence and hence has a subsequence that converges. Let $y$ be the limit of this subsequence. We argue that the ray $\{\alpha y : \alpha \ge 0\}$ is contained in $S_c$. Suppose that it was not: then $\alpha y \notin S_c$ for some fixed $\alpha > 0$, and since $S_c$ is closed (as a sublevel set of a continuous function), there exists a scalar $\epsilon > 0$ such that for all $z$ with $\|z - \alpha y\| \le \epsilon$, we have $z \notin S_c$. As $\|x_k\| \to \infty$ and a subsequence of $x_k/\|x_k\|$ converges to $y$, there must exist an integer $k$ such that
$\|x_k\| \ge \alpha$ and $\left\|\alpha \frac{x_k}{\|x_k\|} - \alpha y\right\| \le \epsilon$.
Note that the point $\alpha x_k/\|x_k\|$ is within $\epsilon$ of $\alpha y$, which implies that it does not belong to $S_c$. But this contradicts convexity of $S_c$, as
$\alpha \frac{x_k}{\|x_k\|} = \left(1 - \frac{\alpha}{\|x_k\|}\right)\cdot 0 + \frac{\alpha}{\|x_k\|}\, x_k$,
and $0$ and $x_k$ are in $S_c$.
We now consider the restriction of $V$ to this ray, which we denote by $g(\alpha) := V(\alpha y)$, where $\alpha \ge 0$. We remark that as a univariate function, $g$ is convex, and positive everywhere except at zero where it is equal to zero. By convexity of $g$, we have the inequality
$g(1) \le \frac{1}{\alpha}\, g(\alpha) + \left(1 - \frac{1}{\alpha}\right) g(0)$ for all $\alpha \ge 1$. This is equivalent to
$g(\alpha) \ge \alpha\, g(1)$. (4.4)
Note that $g(1) > 0$, but $g(\alpha) \le c$ for all $\alpha$ since $g$ is a restriction of $V$ to a ray contained in $S_c$. This contradicts the inequality in (4.4) when $\alpha$ is large. Hence, $S_c$ cannot be unbounded and it follows that $V$ must be radially unbounded.
Proposition 4.2
Consider the nonlinear switched system in (4.1).
(i) If there exists a convex function $V:\mathbb{R}^n\to\mathbb{R}$ that satisfies $V(0) = 0$, $V(x) > 0$ for all $x \ne 0$, and
$V(f_i(x)) < V(x)$ for all $x \ne 0$ and all $i \in \{1,\dots,m\}$, (4.5)
then the origin is globally asymptotically stable under arbitrary switching.
(ii) If there exist a scalar $\beta > 0$ and a convex function $V:\mathbb{R}^n\to\mathbb{R}$ that satisfies $V(0) = 0$, $V(x) > 0$ for all $x \ne 0$, and
$V(f_i(x)) < V(x)$ for all $x \ne 0$ with $V(x) \le \beta$ and all $i \in \{1,\dots,m\}$, (4.6)
then the origin is locally asymptotically stable under arbitrary switching and the set $\{x : V(x) \le \beta\}$ is a subset of the region of attraction of the origin.
Proof. The proof of this proposition is similar to the standard proofs of Lyapunov’s theorem except for the parts where convexity intervenes. Hence we only prove part (i) and leave the very analogous proof of part (ii) to the reader.
Suppose the assumptions of (i) hold. Then, for all $x \ne 0$ and any $g = \sum_i \lambda_i f_i$ with $\lambda_i \ge 0$ and $\sum_i \lambda_i = 1$, we have
$V(g(x)) = V\left(\sum_i \lambda_i f_i(x)\right) \le \sum_i \lambda_i V(f_i(x)) < V(x)$, (4.7)
where the first inequality follows from convexity of $V$ and the second from (4.5) and the fact that $\sum_i \lambda_i = 1$. Hence, our Lyapunov function decreases in each iteration independent of the realization of the uncertain and time-varying map in (4.1).
To show that the origin is stable in the sense of Lyapunov, consider an arbitrary scalar $\epsilon > 0$ and the ball $B_\epsilon := \{x : \|x\| \le \epsilon\}$. Recall that as a consequence of Lemma 4.1, all sublevel sets of $V$ are bounded. Let $\delta > 0$ be the radius of a ball that is contained in a (full-dimensional) sublevel set of $V$ which itself is contained in $B_\epsilon$. Then, from (4.7), we get that any trajectory starting in the ball of radius $\delta$ remains in $B_\epsilon$.
To show that the origin attracts all initial conditions, consider an arbitrary nonzero point $x_0$ and denote by $\{x_k\}$ any sequence that this initial condition can generate under the iterations of (4.1). We know that the sequence $\{V(x_k)\}$ is positive and decreasing (unless in finite time $x_k$ lands on the origin, in which case the proof is finished). It follows that $V(x_k) \to c$ for some scalar $c \ge 0$. We claim that $c = 0$, in which case we must have $x_k \to 0$ as $k \to \infty$, which is the desired statement. Suppose for the sake of contradiction that we had $c > 0$. Then, we must have
$x_k \in S := \{x : c \le V(x) \le V(x_0)\}$ for all $k$.
Note that the set $S$ is closed and bounded as $V$, being a convex function, is continuous and by Lemma 4.1 also radially unbounded. Let $\Delta$ denote the unit simplex in $\mathbb{R}^m$ and let
$\eta := \sup_{x \in S,\ \lambda \in \Delta} V\left(\sum_i \lambda_i f_i(x)\right) - V(x)$.
We claim that $\eta < 0$. This is because of (4.7) and the fact that the above supremum is achieved, as the objective function is continuous and the feasible set is compact. Hence, the sequence $\{V(x_k)\}$ decreases in each step by at least $|\eta|$ and hence must go to $-\infty$. This however contradicts positivity of $V$ on $S$.
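The hypotheses of Proposition 4.2(ii) are easy to spot-check numerically before attempting a formal sos certificate. The toy switched polynomial system and convex candidate $V$ below are our own illustrative choices (not an example from the text); the script samples the sublevel set $\{0 < V(x) \le \beta\}$ and checks the strict decrease along both corner maps:

```python
import numpy as np

def f1(x):  # illustrative polynomial corner map with f1(0) = 0
    return np.array([0.5 * x[0] + 0.1 * x[1] ** 2, 0.4 * x[1]])

def f2(x):  # second corner map, also fixing the origin
    return np.array([0.3 * x[0], 0.5 * x[1] + 0.1 * x[0] ** 2])

def V(x):  # convex candidate Lyapunov function
    return x[0] ** 2 + x[1] ** 2

beta = 1.0
rng = np.random.default_rng(0)
ok = True
for _ in range(10000):
    x = rng.uniform(-1.5, 1.5, size=2)
    if 0.0 < V(x) <= beta:
        ok = ok and V(f1(x)) < V(x) and V(f2(x)) < V(x)
print(ok)  # True: no violation found on the sampled sublevel set
```

Of course, sampling can only falsify (or fail to falsify) condition (4.6); the semidefinite programs described in this section are what turn the decrease condition into a proof. By Proposition 4.2(ii), a convex $V$ passing the exact version of this test certifies that $\{x : V(x) \le \beta\}$ lies in the region of attraction.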