
# Dependence of supertropical eigenspaces

Adi Niv, INRIA Saclay–Île-de-France and CMAP, École Polytechnique, Route de Saclay, 91128 Palaiseau Cedex, France, and Louis Rowen, Department of Mathematics, Bar-Ilan University, Ramat-Gan 52900, Israel
###### Abstract.

We study the pathology that causes tropical eigenspaces of distinct supertropical eigenvalues of a nonsingular matrix to be dependent. We show that in lower dimensions the eigenvectors of distinct eigenvalues are independent, as desired. The index set that differentiates between subsequent essential monomials of the characteristic polynomial yields an eigenvalue, and corresponds to the columns of the eigenmatrix from which the eigenvectors are taken. We ascertain the cause of failure in higher dimensions, and prove that independence of the eigenvectors is recovered in case a certain “difference criterion” holds, defined in terms of disjoint differences between index sets of subsequent coefficients. We conclude by considering the eigenvectors of powers of the matrix and the connection of the independence question to generalized eigenvectors.

The first author is supported by the French Chateaubriand grant and an INRIA postdoctoral fellowship.

## 1. Introduction

Although supertropical matrix algebra as developed in [20, 21] follows the general lines of classical linear algebra (i.e., a Cayley-Hamilton Theorem, correspondence between the roots of the characteristic polynomial and eigenvalues, Cramer’s rule, etc.), one encounters the anomaly in [21, Remark 5.3 and Theorem 5.6] of a matrix whose supertropical eigenvalues are distinct but whose corresponding supertropical eigenspaces are dependent. In this paper we examine how this happens, and give a criterion for the supertropical eigenspaces to be independent, which we call the difference criterion, cf. Definition 3.1 and Theorem 3.4. A pathological example (Example 3.3) is studied in depth to show why the difference criterion is critical. We resolve the difficulty in general in Theorem 3.11 by passing to powers of the matrix and considering generalized supertropical eigenspaces.

### 1.1. The tropical algebra and related structures

We start by discussing briefly the max-plus algebra, its refinements, and their relevance to applications.

The use of the max-plus algebra in tropical mathematics was inspired by the behavior of the function $\log_t$ as the base $t$ of the logarithm approaches $\infty$. In the literature, this structure is usually studied via valuations (see [16] and [17]) over the field of Puiseux series with powers in $\mathbb{Q}$ (resp. $\mathbb{R}$), taking values in the ordered group $(\mathbb{Q},+)$ (resp. $(\mathbb{R},+)$). The valuation is given by the lowest exponent appearing nontrivially in the series. We then look at the dual structure, obtained by negating the valuation, denoted as the tropicalization. The tropical structure deals with the uncertainty of equality in the valuation of a sum: the tropicalized valuation $v$ satisfies $v(f+g) \leq \max(v(f), v(g))$, with equality whenever $v(f) \neq v(g)$.

### 1.2. The max-plus algebra

The tropical max-plus semifield is an ordered group (usually the additive group of real numbers $\mathbb{R}$ or of rational numbers $\mathbb{Q}$), together with a formal element $-\infty$ adjoined. The ordered group is made into a semiring equipped with the operations

$$a \oplus b = \max\{a, b\} \quad\text{and}\quad a \odot b = a + b,$$

denoted here as $\oplus$ and $\odot$ respectively (see [1], [14] and [15]). The multiplicative unit is really the real number $0$, and the adjoined element $-\infty$ serves as the zero element.
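The two operations above can be sketched in a few lines of code; this is a minimal illustration of the max-plus semifield, not code from the paper, and the names `oplus`/`odot` are ours.

```python
# A minimal sketch of the max-plus semifield (R ∪ {-inf}, ⊕, ⊙).
NEG_INF = float('-inf')  # the adjoined zero element

def oplus(a, b):
    """Tropical addition: a ⊕ b = max{a, b}."""
    return max(a, b)

def odot(a, b):
    """Tropical multiplication: a ⊙ b = a + b."""
    return a + b

# The real number 0 is the multiplicative unit, -inf the zero element:
assert odot(5, 0) == 5           # 5 ⊙ 0 = 5
assert oplus(5, NEG_INF) == 5    # 5 ⊕ (-inf) = 5
assert odot(5, NEG_INF) == NEG_INF
```

Note that $\oplus$ is idempotent (`oplus(a, a) == a`), which is the source of the “uncertainty of equality” discussed above.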

Tropicalization enables one to simplify non-linear questions by putting them into a linear setting (see [13]), which can be applied to discrete mathematics (see  [4]), optimization (see  [10]) and algebraic geometry (see  [14]).

In  [12] Gaubert and Sharify introduce a general scaling technique, based on tropical algebra, which applies in particular to the companion form, determining the eigenvalues of a matrix polynomial. Akian, Gaubert and Guterman show in  [3] that several decision problems originating from max-plus or tropical convexity are equivalent to zero-sum two player game problems.

[25] is a collection of papers put together by Litvinov and Sergeev. One main theme is the Maslov dequantization applied to traditional mathematics over fields, built on the foundations of idempotent analysis, tropical algebra, and tropical geometry. Applications of idempotent mathematics were introduced by Litvinov and Maslov in  [24].

On the side of pure mathematics, contributions are made in [25] to idempotent analysis, tropical algebras, tropical linear algebra and tropical convex geometry. Elaborate geometric background with applications to problems in classical (real and complex) geometry can be found in [26]. Here Mikhalkin views the tropical structure as a branch of geometry manipulating certain piecewise-linear objects that take over the role of classical algebraic varieties, and describes hypersurfaces, varieties, morphisms and moduli spaces in this setting.

Extensive mathematical applications have been made in combinatorics. In this max-plus language, we may use notions of linear algebra to interpret combinatorial problems. In [23] Jonczy presents some problems described by the path algebra and solved by means of the $\oplus$ and $\odot$ operations. Combinatorial overviews are given in [7], [8] of Butkovic and [9] of Butkovic and Murfitt, which focus on presenting a number of links between basic max-algebraic problems on the one hand and combinatorial problems on the other hand. This indicates that the max-algebra may be regarded as a linear-algebraic encoding of a class of combinatorial problems.

### 1.3. Supertropical algebra

We pass to the supertropical semiring, equipped with the ghost ideal $\mathcal{G}$, as established and studied by Izhakian and Rowen in [18] and [19].

We denote by $R$ the “standard” supertropical semiring, which contains the set $\mathcal{T}$ of so-called tangible elements of the structure, and on which we have a ghost projection $\nu$ (which is the identity map on the ghosts). The ghost elements of the structure, as defined in [19], form the ideal $\mathcal{G}$. We write $\mathbb{0}$ for the least element, to stress its role as the zero element. On the one hand, $\mathcal{T}$ is a copy of the max-plus semifield, so $R$ can be viewed as a cover of the max-plus semifield.

The supertropical semiring enables us to distinguish between a maximal element that is attained only once in a sum, which remains tangible and hence invertible, and a maximum that is attained at least twice, i.e., $a + a = a^\nu$, which is ghost and not invertible. We do not distinguish between a maximum attained twice and one attained more often in this structure. Note that $\nu$ projects the standard supertropical semiring onto the ghosts, which can be identified with the usual tropical structure.
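The tangible/ghost bookkeeping can be modeled concretely; below is a toy encoding of supertropical elements as `(value, is_ghost)` pairs. The encoding and names are ours, chosen only to illustrate the rule $a + a = a^\nu$.

```python
# Toy model of the standard supertropical semiring:
# an element is a pair (value, is_ghost).
def s_add(a, b):
    """Supertropical addition: the larger value wins; a tie becomes ghost."""
    (va, ga), (vb, gb) = a, b
    if va > vb:
        return (va, ga)
    if vb > va:
        return (vb, gb)
    return (va, True)   # a + a = a^nu: a maximum attained twice is ghost

def s_mul(a, b):
    """Supertropical multiplication: classical addition; ghosts absorb."""
    (va, ga), (vb, gb) = a, b
    return (va + vb, ga or gb)

assert s_add((5, False), (3, False)) == (5, False)  # attained once: tangible
assert s_add((5, False), (5, False)) == (5, True)   # attained twice: ghost
assert s_mul((2, False), (3, True)) == (5, True)    # ghosts absorb
```

The projection $\nu$ corresponds here to forcing the flag to `True` while keeping the value.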

In this new supertropical sense, we use the following order relation to describe two elements that are equal up to a ghost supplement:

###### Definition 1.1.

Let $a, b$ be any two elements in $R$. We say that $a$ ghost surpasses $b$, denoted $a \models_{gs} b$, if $a = b + g$ for some ghost $g$. That is, $a = b$, or $a$ is a ghost with $a \geq_\nu b$.

We say $a$ is $\nu$-equivalent to $b$, denoted $a \cong_\nu b$, if $a^\nu = b^\nu$. That is, in the tropical structure, $\nu$-equivalence projects to equality.

Important properties of $\models_{gs}$:

1. $\models_{gs}$ is a partial order relation (see [21, Lemma 1.5]).

2. If then .

3. If and then and .

4. If and , then .

Considering this relation, we regain basic algebraic properties that were not accessible in the usual tropical setting, such as multiplicativity of the tropical determinant, the near multiplicativity of the tropical adjoint, the role of roots in the factorization of polynomials, the role of the determinant in matrix singularity, a matrix that acts like an inverse, common behavior of similar matrices, classical properties of and the use of elementary matrices. Tropical eigenspaces and their dependences are of considerable interest, as one can see in  [2],  [5],  [7],  [18],  [21] and  [29].

Many of these properties will be formulated in the Preliminaries section. We would also like to attain a supertropical analog of the classical eigenspace decomposition (i.e., eigenvectors corresponding to distinct eigenvalues are linearly independent, and the generalized eigenvectors generate the whole space), but we encounter the example of [21, Example 5.7], where the eigenvectors of distinct eigenvalues are supertropically dependent, extensively studied in Section 3.2. Our objective in this paper is to understand how such an example arises, and how it can be circumvented, either by introducing the difference criterion of Definition 3.1 or by passing to generalized eigenspaces in § 3.3.3.

## 2. Preliminaries

In this section, we present well-known and recent results on tropical polynomials. Then we introduce properties of matrices and vectors in the tropical structure, with definitions extended to the supertropical framework.

### 2.1. Tropical Polynomials

###### Notation 2.1.

Throughout, for each element $a \in R$, we choose a tangible element $\widehat{a} \in \mathcal{T}$ with $\widehat{a} \cong_\nu a$. (We define $\widehat{\mathbb{0}} = \mathbb{0}$, so that $\widehat{a}$ is defined for every $a$.)

Likewise, for a vector $v$, $\widehat{v}$ denotes its tangible value and $v^\nu$ its ghost value. The same holds for matrices and for polynomials (according to their entries and coefficients, respectively).

###### Definition 2.2.

Let $a \in R$ and $m \in \mathbb{N}$. Defining $a^m$ to be the tropical product of $a$ by itself $m$ times (i.e., the classical product $ma$), we may consider $a$ to be an $m$-th root of $a^m$. This operation is well-defined on $R$.

Clearly, any tropical polynomial takes the value of the dominant monomial along the -axis. That having been said, it is possible that some monomials in the polynomial would not dominate for any .

###### Definition 2.3.

Let $f$ be a tropical polynomial. We call monomials in $f$ that dominate for some $x$ essential, and monomials in $f$ that do not dominate for any $x$ inessential. We write $f^{es}$ for the sum of the essential monomials of $f$, called the essential polynomial of $f$.

In the classical sense, a root of a tropical polynomial can only be the zero element $-\infty$, which occurs if and only if the polynomial has constant term $-\infty$. We would like the roots to indicate the factorization of the polynomial, which leads to the following tropical definition of a root.

###### Definition 2.4.

We define an element $a$ to be a root of a tropical polynomial $f$ if $f(a) \in \mathcal{G}$, i.e., $f(a)$ is a ghost.

We refer to roots of a polynomial obtained as a simultaneous value of two leading tangible monomials as corner roots, and to roots obtained from one leading ghost monomial as non-corner roots. We factor polynomials viewing them as functions. Then, for every corner root of , we may write as for some and , where is the difference between the exponents of the tangible essential monomials attaining .

### 2.2. Matrices

As defined over a ring, for matrices

$$A + B = (c_{i,j}):\quad c_{i,j} = a_{i,j} + b_{i,j}, \quad\text{defined iff } n = s,\ m = t,$$
$$AB = (d_{i,j}):\quad d_{i,j} = \sum_{k \in [m]} a_{i,k} b_{k,j}, \quad\text{defined iff } m = s.$$
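Over the max-plus semifield these formulas become entrywise maxima and max-of-sums; the following sketch (our own illustrative code, with lists of lists as matrices and `-inf` as the zero) makes this concrete.

```python
NEG_INF = float('-inf')  # tropical zero element

def trop_add(A, B):
    """Tropical matrix sum: entrywise maximum (shapes must agree)."""
    return [[max(a, b) for a, b in zip(ra, rb)] for ra, rb in zip(A, B)]

def trop_mul(A, B):
    """Tropical matrix product: d[i][j] = max_k (a[i][k] + b[k][j])."""
    n, m, t = len(A), len(B), len(B[0])
    return [[max(A[i][k] + B[k][j] for k in range(m)) for j in range(t)]
            for i in range(n)]

A = [[0, 3], [NEG_INF, 1]]
B = [[2, NEG_INF], [0, 0]]
assert trop_mul(A, B) == [[3, 3], [1, 1]]
assert trop_add(A, B) == [[2, 3], [0, 1]]
```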
###### Definition 2.5.

Let $A = (a_{i,j}) \in M_n(R)$ and $\pi \in S_n$. The word of the permutation $\pi$ in $A$ is

$$a_{1,\pi(1)}\, a_{2,\pi(2)} \cdots a_{n,\pi(n)}.$$

The word $a_{1,1} a_{2,2} \cdots a_{n,n}$ is denoted as the identity or Id-permutation, corresponding to the diagonal of $A$. We write a permutation of $A$ as a product of disjoint cycles, where each factor corresponds to one of the disjoint cycles composing $\pi$.

We define the tropical trace and determinant of to be

$$\operatorname{tr}(A) = \sum_{k \in [n]} a_{k,k} \quad\text{and}\quad \det(A) = \sum_{\sigma \in S_n} a_{1,\sigma(1)} \cdots a_{n,\sigma(n)},$$

respectively.

We refer to any entry attaining the trace as a dominant diagonal entry. The weight contributed by a permutation $\sigma$ to the determinant is $a_{1,\sigma(1)} \cdots a_{n,\sigma(n)}$, and any permutation whose weight has the same $\nu$-value as the determinant is a dominant permutation of $A$.

If there is a single dominant permutation, its weight equals the determinant.

Unlike over a field, the tropical concepts of singularity, invertibility and factorizability do not coincide. We would like the determinant to indicate the singularity of a matrix. Hence, we define a matrix $A$ to be tropically singular if there exist at least two different dominant permutations. Otherwise the matrix is tropically nonsingular. Consequently, a matrix $A$ is supertropically singular if $\det(A) \in \mathcal{G}$, and supertropically nonsingular if $\det(A) \in \mathcal{T}$. A matrix $A$ is strictly singular if $\det(A) = \mathbb{0}$.
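Tropical singularity can be tested directly from the definition by counting dominant permutations; this brute-force sketch (our own code, exponential in $n$, for illustration only) does exactly that.

```python
from itertools import permutations

def trop_det_info(A):
    """Return (tropical determinant, number of dominant permutations)."""
    n = len(A)
    weights = [sum(A[i][s[i]] for i in range(n))
               for s in permutations(range(n))]
    d = max(weights)
    return d, weights.count(d)

# Nonsingular: the identity is the single dominant permutation (3+1 > 0+0).
assert trop_det_info([[3, 0], [0, 1]]) == (4, 1)

# Tropically singular: two permutations attain the determinant (3+1 = 2+2).
assert trop_det_info([[3, 2], [2, 1]]) == (4, 2)
```

In the supertropical language, the second matrix would have a ghost determinant $4^\nu$.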

A surprising result in this context is that the product of two nonsingular matrices might be singular, but we do have:

###### Theorem 2.6.

For matrices over the supertropical semiring , we have

$$\det(AB) \models_{gs} \det(A)\det(B).$$

This theorem was proved in [20, Theorem 3.5] using graph-theoretic considerations, and also in [11, Proposition 2.1.7] using the transfer principles (see [2, Theorem 3.3 and Theorem 3.4]). These principles allow one to obtain such results automatically in a wider class of semirings, including the supertropical semiring.

###### Definition 2.7.

Suppose $R$ is a semiring. An $R$-module is a semigroup $(M, +)$ together with a scalar multiplication $R \times M \to M$ satisfying the following properties for all $r, s \in R$ and $v, w \in M$:

1. $r(v + w) = rv + rw$;

2. $(r + s)v = rv + sv$;

3. $(rs)v = r(sv)$;

4. $\mathbb{1}_R v = v$.

For any semiring $R$, let $R^{(n)}$ be the free module of rank $n$ over $R$. We define the standard base to be $e_1, \dots, e_n$, where

$$e_i = \begin{cases} \mathbb{1}_T = \mathbb{1}_R & \text{in the } i\text{th coordinate},\\ \mathbb{0}_T = \mathbb{0}_R & \text{otherwise}.\end{cases}$$

The tropical identity matrix $I$ is the $n \times n$ matrix with the standard base for its columns.

A matrix $A$ is invertible if there exists a matrix $B$ such that $AB = BA = I$.

From now on we work over the supertropical semifield $F$, whose tangible set is presumed to be a group, together with its ghost elements. We write $F^{(n)}$ with the standard base $e_1, \dots, e_n$.

###### Definition 2.8.

We define vectors $v_1, \dots, v_k$ in $F^{(n)}$ to be (supertropically) dependent if there exist tangible scalars $\alpha_1, \dots, \alpha_k$, not all zero, such that $\sum_i \alpha_i v_i$ is a ghost vector. Otherwise, this set of tropical vectors is called independent.

We say that subspaces $V_1, \dots, V_k$ of $F^{(n)}$ are (supertropically) dependent if there are tangible vectors $v_i \in V_i$ which are (supertropically) dependent.

By [20, Theorem 6.5], $n$ vectors in $F^{(n)}$ are dependent iff $\det(A)$ is a ghost, where $A$ is the matrix having these vectors for its columns.

We define two types of special matrices:

###### Definition 2.9.

An $n \times n$ matrix $P = (p_{i,j})$ is a permutation matrix if there exists $\pi \in S_n$ such that

$$p_{i,j} = \begin{cases} \mathbb{0}_F, & j \neq \pi(i),\\ \mathbb{1}_F, & j = \pi(i).\end{cases}$$

Since $\pi$ is the single dominant permutation and its weight is invertible, a permutation matrix is always invertible.

An $n \times n$ matrix $D = (d_{i,j})$ is a diagonal matrix if

$$\exists\, a_1, \dots, a_n \in F:\quad d_{i,j} = \begin{cases} \mathbb{0}_F, & j \neq i,\\ a_i, & j = i,\end{cases}$$

which is invertible if and only if each $a_i$ is invertible (i.e., tangible).

###### Remark 2.10.

(See [20, Proposition 3.9].) A tropical matrix is invertible if and only if it is a product of a permutation matrix and an invertible diagonal matrix. Such products are called generalized permutation matrices.

We define three types of tropical elementary matrices, corresponding to the three elementary matrix operations, obtained by applying one such operation to the identity matrix.

A transposition matrix is obtained from the identity matrix by switching two rows (resp. columns). This matrix is invertible, and a product of transposition matrices yields a permutation matrix.

An elementary diagonal multiplier is obtained from the identity matrix where one row (resp. column) has been multiplied by an invertible scalar. This matrix is invertible, and a product of diagonal multipliers yields an invertible diagonal matrix.

A Gaussian matrix is defined to differ from the identity matrix by having a non-zero entry in a non-diagonal position. We denote as the elementary Gaussian matrix adding row , multiplied by , to row . By Remark 2.10, this matrix is not invertible.

###### Definition 2.11.

A nonsingular matrix $A$ is said to be definite if

$$\det(A) = 0 = a_{i,i} \quad \forall i.$$

#### 2.2.1. The supertropical approach

Having established that the ghosts and $\models_{gs}$ effectively take the roles of singularity and equality over $F$, we would like to extend additional definitions to the supertropical setting, using ghosts for zero.

A quasi-zero matrix is a matrix equal to $\mathbb{0}_F$ on the diagonal, and whose off-diagonal entries are ghost or $\mathbb{0}_F$.

A diagonally dominant matrix is a nonsingular matrix with a dominant permutation along the diagonal.

A quasi diagonally dominant matrix is a diagonally dominant matrix whose off-diagonal entries are ghost or $\mathbb{0}_F$.

A quasi-identity matrix is a nonsingular, multiplicatively idempotent matrix equal to $I + Z_{\mathcal{G}}$, where $Z_{\mathcal{G}}$ is a quasi-zero matrix.

Thus, every quasi-identity matrix is quasi diagonally dominant. Using the tropical determinant, we attain the tropical analog for the well-known adjoint.

###### Definition 2.12.

The $(i,j)$-minor $A_{i,j}$ of a matrix $A$ is obtained by deleting row $i$ and column $j$ of $A$. The adjoint matrix $\operatorname{adj}(A)$ of $A$ is defined as the matrix $(a'_{i,j})$, where $a'_{i,j} = \det(A_{j,i})$. When $\det(A)$ is invertible, the matrix $A^\nabla$ denotes $\frac{\operatorname{adj}(A)}{\det(A)}$.

Notice that $\det(A_{j,i})$ may be obtained as the sum of the weights of all permutations in $S_n$ passing through $a_{j,i}$, but with $a_{j,i}$ deleted:

$$\det(A_{j,i}) = \sum_{\sigma \in S_n:\ \sigma(j) = i} a_{1,\sigma(1)} \cdots a_{j-1,\sigma(j-1)}\, a_{j+1,\sigma(j+1)} \cdots a_{n,\sigma(n)}.$$

When writing each permutation as the product of disjoint cycles, can be presented as:

$$\det(A_{j,i}) = \sum_{\sigma \in S_n:\ \sigma(j) = i} \bigl(a_{i,\sigma(i)}\, a_{\sigma(i),\sigma^2(i)} \cdots a_{\sigma^{-1}(j),j}\bigr)\, C_\sigma,$$

where $C_\sigma$ is the product of the remaining cycles.

###### Definition 2.13.

We say that $A^\nabla$ is the quasi-inverse of $A$ over $F$, denoting

$$I_A = A A^\nabla \quad\text{and}\quad I'_A = A^\nabla A,$$

where $I_A$ and $I'_A$ are quasi-identities (see [21, Theorem 2.8]).
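The adjoint and quasi-inverse are directly computable; the sketch below (our own illustrative code, working with tangible real values only and ignoring ghost bookkeeping) checks on a small example that $I_A = A A^\nabla$ is multiplicatively idempotent with $0$ (the tropical unit) on the diagonal.

```python
from itertools import permutations

def trop_det(M):
    """Tropical determinant: dominant permutation weight."""
    n = len(M)
    return max(sum(M[i][s[i]] for i in range(n)) for s in permutations(range(n)))

def trop_adj(M):
    """adj(M)[i][j] = det(M_{j,i}) (delete row j, column i), per Definition 2.12."""
    n = len(M)
    def minor(j, i):
        return [[M[r][c] for c in range(n) if c != i] for r in range(n) if r != j]
    return [[trop_det(minor(j, i)) for j in range(n)] for i in range(n)]

def quasi_inverse(M):
    """M∇ = adj(M)/det(M); tropical division by an invertible element is subtraction."""
    d = trop_det(M)
    return [[x - d for x in row] for row in trop_adj(M)]

def trop_mul(A, B):
    n = len(A)
    return [[max(A[i][k] + B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

A = [[3, 0], [0, 1]]
IA = trop_mul(A, quasi_inverse(A))           # I_A = A ⊙ A∇
assert IA == trop_mul(IA, IA)                # multiplicatively idempotent
assert all(IA[i][i] == 0 for i in range(2))  # tropical unit on the diagonal
```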

These supertropical definitions provide tropical versions of two well-known algebraic properties, proved in Proposition 4.8 and Theorem 4.9 of [20].

###### Proposition 2.14.

.

As a result, one concludes from the fourth property of $\models_{gs}$ (see Definition 1.1) and Theorem 2.6 that , when $A$ is nonsingular.

1. .

2. .

###### Remark 2.16.

(See [28, Remark 2.18].) For a definite matrix $A$ we have $A^\nabla = \operatorname{adj}(A)$ (since $\det(A) = 0$, the tropical unit), which is also definite.

The following lemma has been proved in  [28, Lemma 3.2], and states the connection between multiplicity of the determinant and the quasi-inverse matrix:

###### Lemma 2.17.

Let be an invertible matrix and be nonsingular.

1. .

2. .

3. .

4. If , where  is the definite form of  with left normalizer , then  where  is definite, with right normalizer .

#### 2.2.2. Matrix invariants

Let $A \in M_n(F)$. We continue the supertropical approach by defining a tangible vector $v$ such that $Av \models_{gs} \lambda v$ to be a supertropical eigenvector of $A$ with a supertropical eigenvalue $\lambda$, having an eigenmatrix $A + \lambda I$. The eigenspace of $\lambda$ is the set of eigenvectors with eigenvalue $\lambda$.

The characteristic polynomial of $A$ (also called the maxpolynomial, cf. [8]) is defined to be

$$f_A(x) = \det(xI + A).$$

The tangible values of its roots are the eigenvalues of $A$, as shown in [20, Theorem 7.10]. Following Definition 2.4, we may have corner eigenvalues and non-corner eigenvalues.

The coefficient of $x^{n-k}$ in this polynomial is the sum of the determinants of all $k \times k$ principal sub-matrices, otherwise known as the trace of the $k$-th compound matrix of $A$. Thus, this coefficient, which we denote as $\alpha_k$, takes the dominant value among the permutations on all subsets of indices of size $k$:

$$\alpha_k = \sum_{I \subseteq [n]:\ |I| = k}\ \sum_{\sigma \in S_I}\ \prod_{i \in I} a_{i,\sigma(i)}.$$
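The formula for $\alpha_k$ translates directly into a (brute-force) computation over $k$-subsets and their permutations; the sketch below is our own illustration, not the paper's code.

```python
from itertools import combinations, permutations

def trop_char_coeffs(A):
    """alpha_k = dominant weight of a permutation on some k-subset of indices."""
    n = len(A)
    coeffs = []
    for k in range(1, n + 1):
        best = float('-inf')
        for I in combinations(range(n), k):
            for s in permutations(I):           # bijections I -> I
                best = max(best, sum(A[i][j] for i, j in zip(I, s)))
        coeffs.append(best)
    return coeffs

# For A = [[3,0],[0,1]]: alpha_1 = max diagonal entry = 3,
# alpha_2 = det(A) = max(3+1, 0+0) = 4, so f_A(x) = x^2 + 3x + 4.
assert trop_char_coeffs([[3, 0], [0, 1]]) == [3, 4]
```

Note that $\alpha_1 = \operatorname{tr}(A)$ and $\alpha_n = \det(A)$, in keeping with the classical analogy.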

We define the index set of $\alpha_k$, denoted by $I_{\alpha_k}$, to be a set of indices on which the dominant permutation defining $\alpha_k$ is attained.

Let $f_A$ be the characteristic polynomial of $A$, with the essential polynomial

$$f^{es}_A(x) = \sum_k \alpha_{i_k} x^{n - i_k}.$$

Let $\lambda$ be the corner eigenvalue obtained between the essential monomial $\alpha_{i_k} x^{n-i_k}$ and the subsequent essential monomial $\alpha_{i_{k+1}} x^{n-i_{k+1}}$. We denote $I_\lambda = I_{\alpha_{i_{k+1}}} \setminus I_{\alpha_{i_k}}$.

###### Theorem 2.18.

(The eigenvectors algorithm, see [21, Remark 5.3 and Theorem 5.6].) Let $\lambda$ be a corner eigenvalue of $A$ with index set $I_\lambda$. The tangible value of an $i$-th column of $\operatorname{adj}(A + \lambda I)$, for $i \in I_\lambda$ (see Notation 2.1), is a tropical eigenvector of $A$ with respect to the eigenvalue $\lambda$.

This algorithm will be demonstrated in §3.2.
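Before the worked example of §3.2, the algorithm can be sketched in code: form the eigenmatrix $A + \lambda I$, take a column of its adjoint, and check the eigenvector equation. This is our own illustrative implementation over tangible values only (ghost checks omitted), not the paper's code.

```python
from itertools import permutations

NEG_INF = float('-inf')

def trop_det(M):
    n = len(M)
    return max(sum(M[i][s[i]] for i in range(n)) for s in permutations(range(n)))

def trop_adj(M):
    n = len(M)
    def minor(j, i):
        return [[M[r][c] for c in range(n) if c != i] for r in range(n) if r != j]
    return [[trop_det(minor(j, i)) for j in range(n)] for i in range(n)]

def eigenvector_candidate(A, lam, col):
    """Column `col` of adj(A + lam*I): the algorithm's eigenvector candidate."""
    n = len(A)
    E = [[max(A[i][j], lam if i == j else NEG_INF) for j in range(n)]
         for i in range(n)]  # eigenmatrix A + lam*I
    adj = trop_adj(E)
    return [adj[i][col] for i in range(n)]

A = [[3, 0], [0, 1]]              # f_A(x) = x^2 + 3x + 4, eigenvalues 3 and 1
v = eigenvector_candidate(A, 3, 0)  # I_{lambda_1} = {1}: first column
Av = [max(A[i][j] + v[j] for j in range(2)) for i in range(2)]
assert Av == [3 + v[0], 3 + v[1]]   # A ⊙ v = 3 ⊙ v holds on the nose here
```

For smaller eigenvalues the equation generally holds only up to ghost surpassing, which is exactly the supertropical relaxation discussed above.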

The Supertropical Cayley-Hamilton Theorem has been proved in  [20, Theorem 5.2], and is as follows:

###### Theorem 2.19.

Any matrix $A$ satisfies its tangible characteristic polynomial $\widehat{f}_A$, in the sense that $\widehat{f}_A(A)$ is ghost.

One can find a combinatorial proof in  [30] and a proof using the transfer principle in  [2].

In analogy to the classical theory, we have

###### Proposition 2.20.

([20, Proposition 7.7]) The roots of the polynomial $f_A$ are precisely the supertropical eigenvalues of $A$.

###### Remark 2.21.

Recall that a supertropical polynomial is $\lambda$-primary if it has the unique supertropical root $\lambda$. It is well-known that any tropical $\lambda$-primary polynomial has the form $(x + \lambda)^m$ for some $m$, and any tropical essential polynomial can be factored, as a function, into a product of primary polynomials, and is thus of the form $\prod_k (x + \lambda_k)^{m_k}$, where the $\lambda_k$ are its distinct corner roots. The supertropical version of this is given in [19, Theorem 8.25 and Theorem 8.35].

Another classical property attained in this extended structure is:

###### Proposition 2.22.

If $\lambda$ is a supertropical eigenvalue of a matrix $A$ with eigenvector $v$, then $\lambda^m$ is a supertropical eigenvalue of $A^m$, for every $m \in \mathbb{N}$, with respect to the same eigenvector.

###### Theorem 2.23.

Let be a nonsingular matrix.

1. ([27, Theorem 3.6]) For any $m \in \mathbb{N}$ we have

$$f_{A^m}(x^m) \models_{gs} (f_A(x))^m,$$

implying that the $m$-th root of every corner eigenvalue of $A^m$ is a corner eigenvalue of $A$.

2. ([6, Theorem 4.1]) For $A^\nabla$, the quasi-inverse of $A$, we have

$$\det(A)\, f_{A^\nabla}(x) \models_{gs} x^n f_A(x^{-1}),$$

implying that the inverse of every corner eigenvalue of $A$ is a corner eigenvalue of $A^\nabla$.

## 3. Dependence of eigenvectors

A well-known decomposition of $F^{(n)}$, where $F$ is a field, is the decomposition into eigenspaces of a matrix. In particular, this decomposition is obtained when the eigenvalues are distinct since, in the classical case, eigenspaces of distinct eigenvalues are linearly independent and together compose a basis for $F^{(n)}$. In the tropical case, considering that dependence occurs when a tropical linear combination ghost-surpasses a ghost vector, such a property need not hold.

In the upcoming section we analyze the dependence between eigenvectors, using their definition according to the algorithm described in Theorem 2.18. We present special cases in which this undesired dependence is resolved.

###### Definition 3.1.

The matrix $A$ satisfies the difference criterion if the sets $I_\lambda = I_{\alpha_{i_{k+1}}} \setminus I_{\alpha_{i_k}}$, such that $\lambda$ is a corner root of $f_A$, are pairwise disjoint.

### 3.1. Eigenspaces in lower dimensions

In the following proposition, we verify independence of eigenvectors having distinct eigenvalues, for dimensions $n \leq 3$.

###### Proposition 3.2.

Let $A$ be a nonsingular $n \times n$ matrix, where $n \leq 3$, with a tangible characteristic polynomial (coefficient-wise) and distinct eigenvalues. Then the eigenvectors of $A$ are tropically independent.

###### Proof.

The case $n = 2$:

Let $f_A(x) = x^2 + \operatorname{tr}(A)x + \det(A)$ be the characteristic polynomial of $A$. If $f_A$ has two distinct eigenvalues, then these must be $\lambda_1 = \operatorname{tr}(A)$ and $\lambda_2 = \frac{\det(A)}{\operatorname{tr}(A)}$.

We must have $\operatorname{tr}(A)^2 > \det(A)$, for otherwise either

$$f_A(\lambda_2) = \frac{\det(A)}{\operatorname{tr}(A)}\left(\frac{\det(A)}{\operatorname{tr}(A)} + \operatorname{tr}(A)^\nu\right) = \left(\frac{\det(A)}{\operatorname{tr}(A)}\right)^2 \in \mathcal{T},$$

or $\operatorname{tr}(A)^2 \cong_\nu \det(A)$, which means the polynomial has one root with multiplicity $2$.

Without loss of generality, we may assume that $a_{1,1} > a_{2,2}$. According to the algorithm, since $I_{\lambda_1} = \{1\}$, $\lambda_1$ has the eigenvector obtained as the tangible value of the first column of its eigenmatrix. Since $I_{\lambda_2} = \{2\}$, $\lambda_2$ has the eigenvector obtained as the tangible value of the second column of its eigenmatrix.

The determinant is either:

$$\det(A) = a_{1,1} a_{2,2}, \quad\text{where } a_{1,1} > a_{2,2} \text{ and } a_{1,1} a_{2,2} > a_{1,2} a_{2,1},$$

(and then the eigenvalues are $a_{1,1}$ and $a_{2,2}$) or

$$\det(A) = a_{1,2} a_{2,1}, \quad\text{where } a_{1,1} a_{2,2} < a_{1,2} a_{2,1},$$

(and then the eigenvalues are $a_{1,1}$ and $\frac{a_{1,2} a_{2,1}}{a_{1,1}}$, satisfying $a_{1,1}^2 > a_{1,2} a_{2,1}$).

In both cases, the first column of $\operatorname{adj}(A + \lambda_1 I)$ is $(a_{1,1}, a_{2,1})$ and the second column of $\operatorname{adj}(A + \lambda_2 I)$ is $(a_{1,2}, a_{1,1})$, which are tropically independent since $a_{1,1}^2 > a_{1,2} a_{2,1}$.

The case $n = 3$:

This case indicates key techniques for understanding and motivating the general proof on matrices satisfying the difference criterion in §3.3.1.

Let $f_A(x) = x^3 + \operatorname{tr}(A)x^2 + \alpha x + \det(A)$ be the characteristic polynomial of $A$, recalling that $\alpha$ is the sum of the determinants of all of the $2 \times 2$ principal sub-matrices. We assign $a_{1,1}$ to be the dominant diagonal entry, i.e.,

$$a_{1,1} > a_{t,t} \quad \forall t \neq 1. \tag{3.1}$$

For the determinant we have the six permutations of $S_3$. In order to obtain three distinct eigenvalues, we must have

$$\lambda_1 = \operatorname{tr}(A) > \lambda_2 = \frac{\alpha}{\operatorname{tr}(A)} > \lambda_3 = \frac{\det(A)}{\alpha}, \tag{3.2}$$

for otherwise $\lambda_1 \cong_\nu \lambda_2$ or $\lambda_2 \cong_\nu \lambda_3$. Thus

$$\lambda_1 \lambda_2 = \alpha \quad\text{and}\quad \lambda_1 \lambda_2 \lambda_3 = \det(A). \tag{3.3}$$

As a result, ; otherwise, together with yields a permutation whose weight is dominated by , and we get contrary to (3.2).

Therefore,

$$\begin{cases} I_{\lambda_1} = \{1\} \setminus \emptyset = \{1\},\\ I_{\lambda_2} = \{1, j\} \setminus \{1\} = \{j\},\\ I_{\lambda_3} = \{1, j, k\} \setminus \{1, j\} = \{k\},\end{cases}$$

where $j, k \neq 1$ are distinct. Without loss of generality, we may take $j = 2$ and $k = 3$, and obtain the eigenmatrices:

because  is a summand of , , and

where , since  is a summand in .

Recalling the algorithm in Theorem 2.18, we let $W$ be the matrix with the (tangible values of the) eigenvectors for its columns:

$$W = \begin{pmatrix} \lambda_1^2 & a_{1,2}\lambda_2 + a_{1,3}a_{3,2} & a_{1,3}\beta + a_{1,2}a_{2,3}\\ a_{2,1}\lambda_1 + a_{2,3}a_{3,1} & \lambda_1\lambda_2 & a_{2,3}\lambda_1 + a_{2,1}a_{1,3}\\ a_{3,1}\lambda_1 + a_{3,2}a_{2,1} & a_{3,2}\lambda_1 + a_{3,1}a_{1,2} & \lambda_1\beta + a_{1,2}a_{2,1} \end{pmatrix}.$$

We get $\det(W) \in \mathcal{T}$, since

Due to relations (3.1)-(3.3), all non-identity permutations in :

are strictly dominated by .

We further study this property in the generalization proved in Theorem 3.4. The cases in Step 3 of its proof are demonstrated above.

### 3.2. The pathology appears

We follow Example 3.3, introduced in [21], to show how independence of eigenspaces might fail for dimensions higher than $3$, due to the increased variety of indices. While applying the eigenvectors algorithm, we utilize a supertropical analog of classical Gaussian elimination, treating the ghosts as “zero elements”. This illustrative example will provide the motivation for Theorem 3.4, Conjecture 3.5 and Conjecture 3.6, generalizing the connection of the index sets to the dependence of the eigenvectors.

###### Example 3.3.

Let

$$A = \begin{pmatrix} 10 & 10 & 9 & -\infty\\ 9 & 1 & -\infty & -\infty\\ -\infty & -\infty & -\infty & 9\\ 9 & -\infty & -\infty & -\infty \end{pmatrix}.$$

The characteristic polynomial of is

$$f_A(x) = x^4 + 10x^3 + 19x^2 + 27x + 28,$$

obtained from the permutations $(1)$, $(1\,2)$, $(1\,3\,4)$ and $(2)(1\,3\,4)$, respectively. Therefore,

$$\begin{cases} I_{\lambda_1} = \{1\} \setminus \emptyset = \{1\},\\ I_{\lambda_2} = \{1,2\} \setminus \{1\} = \{2\},\\ I_{\lambda_3} = \{1,3,4\} \setminus \{1,2\} = \{3,4\},\\ I_{\lambda_4} = \{1,2,3,4\} \setminus \{1,3,4\} = \{2\},\end{cases} \tag{3.4}$$

where $\lambda_1 = 10 > \lambda_2 = 9 > \lambda_3 = 8 > \lambda_4 = 1$ are the eigenvalues of $A$. As we saw in §3.1, the overlap of the second and fourth sets cannot occur in lower dimensions.

The eigenmatrices and eigenvectors are as follows:

For $\lambda_1 = 10$:

$$A + 10I = \begin{pmatrix} 10^\nu & 10 & 9 & -\infty\\ 9 & 10 & -\infty & -\infty\\ -\infty & -\infty & 10 & 9\\ 9 & -\infty & -\infty & 10 \end{pmatrix},$$

and the tangible value of the first column of its adjoint is

$$v_1 = (30, 29, 28, 29) = 28 \cdot (2, 1, 0, 1).$$

This can also be obtained when multiplying the eigenmatrix by

$$E^2_{4\text{th row}\,+\,1 \cdot 3\text{rd row}}\; E_{4\text{th row}\,+\,1 \cdot 2\text{nd row}}\; E_{2\text{nd row}\,+\,1\text{st row}}\; E_{1,4}$$

on the left:

$$\begin{pmatrix} 9 & -\infty & -\infty & 10\\ 9^\nu & 10 & -\infty & 10\\ -\infty & -\infty & 10 & 9\\ 10^\nu & 10^\nu & 12^\nu & 11^\nu \end{pmatrix},$$

and solving the tropically linear system

$$\begin{cases} 9x + 10w \in \mathcal{G},\\ 10y + 10w \in \mathcal{G},\\ 10z + 9w \in \mathcal{G},\end{cases}$$

which yields a multiple of $(2, 1, 0, 1)$.
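The computation above can be replayed numerically. The entries of $A$ below are as read off from the eigenmatrices $A + 10I$, $A + 9I$, $A + 8I$ of this example, with `-inf` for the tropical zero; this is our own sanity-check sketch, not code from the paper.

```python
from itertools import combinations, permutations

N = float('-inf')  # tropical zero
A = [[10, 10, 9,  N],
     [9,  1,  N,  N],
     [N,  N,  N,  9],
     [9,  N,  N,  N]]

def alpha(A, k):
    """k-th characteristic coefficient: dominant k-subset permutation weight."""
    n = len(A)
    return max(sum(A[i][j] for i, j in zip(I, s))
               for I in combinations(range(n), k) for s in permutations(I))

# Coefficients of f_A(x) = x^4 + 10x^3 + 19x^2 + 27x + 28:
assert [alpha(A, k) for k in (1, 2, 3, 4)] == [10, 19, 27, 28]

# v1 = (2,1,0,1) satisfies A ⊙ v1 = 10 ⊙ v1 on the nose:
v1 = [2, 1, 0, 1]
Av = [max(A[i][j] + v1[j] for j in range(4)) for i in range(4)]
assert Av == [10 + x for x in v1]
```

The eigenvalues $10, 9, 8, 1$ are then the successive differences of the coefficient sequence, as in (3.4).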

For $\lambda_2 = 9$:

$$A + 9I = \begin{pmatrix} 10 & 10 & 9 & -\infty\\ 9 & 9 & -\infty & -\infty\\ -\infty & -\infty & 9 & 9\\ 9 & -\infty & -\infty & 9 \end{pmatrix},$$

and the tangible value of the second column of its adjoint is

$$v_2 = (28, 28, 28, 28) = 28 \cdot (0, 0, 0, 0).$$

This can also be obtained when multiplying the eigenmatrix by

$$E_{4\text{th row}\,+\,2 \cdot 3\text{rd row}}\; E_{4\text{th row}\,+\,1 \cdot 2\text{nd row}}\; E_{2\text{nd row}\,+\,1\text{st row}}\; E_{1,4}$$

on the left:

$$\begin{pmatrix} 9 & -\infty & -\infty & 9\\ 9^\nu & 9 & -\infty & 9\\ -\infty & -\infty & 9 & 9\\ 10^\nu & 10^\nu & 9^\nu & 9^\nu \end{pmatrix},$$

and solving the tropically linear system

$$\begin{cases} 9x + 9w \in \mathcal{G},\\ 9y + 9w \in \mathcal{G},\\ 9z + 9w \in \mathcal{G},\end{cases}$$

which yields a multiple of $(0, 0, 0, 0)$.

For $\lambda_3 = 8$:

$$A + 8I = \begin{pmatrix} 10 & 10 & 9 & -\infty\\ 9 & 8 & -\infty & -\infty\\ -\infty & -\infty & 8 & 9\\ 9 & -\infty & -\infty & 8 \end{pmatrix},$$

and the tangible value of the third column of its adjoint is

$$v_3 = (25, 26, 27, 26) = 25 \cdot (0, 1, 2, 1).$$

This can also be obtained when multiplying the eigenmatrix by

$$E_{4\text{th row}\,+\,1 \cdot 3\text{rd row}}\; E_{4\text{th row}\,+\,2 \cdot 2\text{nd row}}\; E_{2\text{nd row}\,+\,1\text{st row}}\; E_{1,4}$$

on the left:

$$\begin{pmatrix} 9 & -\infty & -\infty & 8\\ 9^\nu & 8 & -\infty & 8\\ -\infty & -\infty & 8 & 9\\ 11^\nu & 10^\nu & 9^\nu & 10^\nu \end{pmatrix},$$

and solving the tropically linear system

$$\begin{cases} 9x + 8w \in \mathcal{G},\\ 8y + 8w \in \mathcal{G},\\ 8z + 9w \in \mathcal{G},\end{cases}$$

which yields a multiple of $(0, 1, 2, 1)$.