Fast algorithm for border bases of Artinian Gorenstein algebras
Given a multi-index sequence , we present a new efficient algorithm to compute generators of the linear recurrence relations between the terms of . We transform this problem into an algebraic one by identifying multi-index sequences, multivariate formal power series and linear functionals on the ring of multivariate polynomials. In this setting, the recurrence relations are the elements of the kernel of the Hankel operator associated to . We describe the correspondence between multi-index sequences with a Hankel operator of finite rank and Artinian Gorenstein algebras. We show how the algebraic structure of the Artinian Gorenstein algebra associated to the sequence yields the structure of its terms for all . This structure is explicitly given by a border basis of , which is presented as a quotient of the polynomial ring by the kernel of the Hankel operator . The algorithm provides generators of constituting a border basis, pairwise orthogonal bases of and the tables of multiplication by the variables in these bases. It is an extension of the Berlekamp-Massey-Sakata (BMS) algorithm, with improved complexity bounds. We present applications of the method to several problems, such as the decomposition of functions into weighted sums of exponential functions, sparse interpolation, fast decoding of algebraic codes, computing the vanishing ideal of points, and tensor decomposition. Some benchmarks illustrate the practical behavior of the algorithm.
BERNARD MOURRAIN (UCA, Inria Méditerranée, Aromath, Sophia Antipolis, France, firstname.lastname@example.org)
Discovering hidden structures from probing or sampling is a problem which appears in many contexts and in many applications. An interesting instance of this general problem is recovering the structure of a sequence of values, from the knowledge of some of its terms. It consists in guessing any term of the sequence from the first known terms. A classical way to tackle this problem, which goes back to Bernoulli, is to find linear recurrence relations between the first terms of a sequence, to compute the roots of the associated characteristic polynomial and to deduce the expression of any term of the sequence from these roots.
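This classical univariate scheme can be sketched concretely. The snippet below is our own Python illustration (the function names `find_recurrence` and `extrapolate` are ours, not the paper's): it recovers the recurrence coefficients of a sequence of known order by solving a Hankel linear system, extracts the roots of the associated characteristic polynomial, and extrapolates further terms.

```python
import numpy as np

def find_recurrence(terms, r):
    # Solve the r x r Hankel system for the recurrence coefficients c,
    # assuming terms[k + r] = c[0]*terms[k] + ... + c[r-1]*terms[k + r - 1]
    H = np.array([[terms[i + j] for j in range(r)] for i in range(r)], float)
    b = np.array([terms[i + r] for i in range(r)], float)
    return np.linalg.solve(H, b)

def extrapolate(terms, c, n):
    # Extend the sequence to n terms using the recurrence c
    out, r = list(terms), len(c)
    while len(out) < n:
        out.append(sum(c[j] * out[len(out) - r + j] for j in range(r)))
    return out

# Fibonacci: s[k+2] = s[k] + s[k+1], characteristic polynomial x^2 - x - 1
c = find_recurrence([0, 1, 1, 2, 3, 5, 8, 13], 2)   # c = [1, 1]
roots = np.roots([1, -c[1], -c[0]])                 # golden ratio and conjugate
ext = extrapolate([0, 1, 1, 2], c, 10)
```

The roots determine a closed-form expression of the general term, which is the "guessing" step described above.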
In this paper, we consider the structure discovering problem for multi-index sequences of values in a field . Given a finite set of values for , we want to guess a formula for the general terms of the sequence . An important step of this approach is to compute characteristic polynomials of the sequence . They correspond to multi-index recurrence relations with constant coefficients between the terms of . The ideal of these recurrence relation polynomials defines an Artinian Gorenstein algebra. We present a fast algorithm to compute a border basis of this ideal from the first terms of the sequence . This method also yields a basis of the Artinian Gorenstein algebra as well as its multiplicative structure.
The approach that we present is related to Prony’s method in the univariate case and to its variants , , , , … and to the more recent extensions in the multivariate case , , , . Linear algebra tools are used to determine a basis of the quotient algebra or to compute an -basis for the presentation of . An analysis of the complexity of these approaches yields bounds in (or ), where is the size of the Hankel matrices involved in these methods, typically the number of monomials in degree at most if the terms of the sequence are known up to the degree . The problem is also related to Padé approximants, which are well investigated in the univariate case , , , but much less developed in the multivariate case , .
Finding recurrence relations is a problem which is also well developed in the univariate case. Berlekamp  and Massey  proposed an efficient algorithm to compute such recurrence relations, with a complexity in where is the size of the minimal recurrence relation. Exploiting further the properties of Hankel matrices, the complexity of computing recurrence relations can be reduced in the univariate case to .
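For reference, a direct transcription of the classical Berlekamp-Massey algorithm over the rationals is sketched below. This is an illustration of the standard univariate algorithm, not of the multivariate extension developed in this paper.

```python
from fractions import Fraction

def berlekamp_massey(s):
    # Shortest linear recurrence for the sequence s, over the rationals:
    # returns the connection polynomial C and its length L
    s = [Fraction(x) for x in s]
    C, B = [Fraction(1)], [Fraction(1)]   # current / previous connection polys
    L, m, b = 0, 1, Fraction(1)
    for n in range(len(s)):
        # discrepancy between s[n] and the prediction of the current LFSR
        d = s[n] + sum(C[i] * s[n - i] for i in range(1, L + 1))
        if d == 0:
            m += 1
            continue
        T = C[:]
        coef = d / b
        C = C + [Fraction(0)] * (len(B) + m - len(C))
        for i, Bi in enumerate(B):                  # C <- C - (d/b) x^m B
            C[i + m] -= coef * Bi
        if 2 * L <= n:                              # the length L changes
            L, B, b, m = n + 1 - L, T, d, 1
        else:
            m += 1
    return C, L

# Fibonacci: s[n] = s[n-1] + s[n-2], so C = 1 - x - x^2 and L = 2
C, L = berlekamp_massey([0, 1, 1, 2, 3, 5, 8, 13])
```

The quadratic cost in the size of the minimal recurrence is visible here: each term of the sequence triggers a discrepancy computation and, possibly, an update of the connection polynomial.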
Sakata extended the Berlekamp-Massey algorithm to multi-index sequences, computing a Gröbner basis of the polynomials in the kernel of a multi-index Hankel matrix . See also  for an analysis and overview of the algorithm. The computation of multivariate linear recurrence relations has been further investigated, e.g. in  and more recently in , where the coefficients of the Gröbner basis are computed by solving multi-index Hankel systems.
We translate the structure discovering problem into an algebraic setting, by identifying multi-index sequences of values, generating formal power series and linear functionals on the ring of polynomials. Through this identification, we associate to a multi-index sequence , a Hankel operator whose kernel defines an Artinian Gorenstein algebra when is of finite rank. We present a new efficient algorithm to compute the algebraic structure of , using the first terms for . The structure is described by a border basis of the ideal .
This algorithm is an extension of the Berlekamp-Massey-Sakata (BMS) algorithm. It computes border bases of the recurrence relations, which are more general than Gröbner bases. They also offer better numerical stability  in the solving steps required to address the decomposition problem. The algorithm, based on a Gram-Schmidt orthogonalisation process, is simplified. The complexity bound also improves the previously known bounds for computing such recurrence relations. We show that the arithmetic complexity of computing a border basis is in where is the number of roots of (counted with multiplicities), is the size of the border of the monomial basis and is the number of known terms of the sequence .
The algorithm outputs generators of the recurrence relations, a monomial basis, an orthogonal basis and the tables of multiplication by the variables in this basis of . The structure of the terms of the sequence can be deduced from this output, by applying classical techniques for solving polynomial systems from tables of multiplication. We show how the algorithm can be applied to different problems such as the decomposition of functions into weighted sums of exponential functions, sparse interpolation, fast decoding of algebraic codes, vanishing ideal of points, and tensor decomposition.
Let be a field, its algebraic closure, be the ring of polynomials in the variables with coefficients in the field , be the ring of formal power series in the variables with coefficients in . We denote by the set of sequences of numbers , indexed by . , . The monomials in are the elements of the form for . For a set , , . A set of monomials of is connected to 1, if and for different from 1, there exists and such that . For , is the vector space of spanned by and is the ideal generated by . For , is the set of products of an element of by an element of .
2 Polynomial-Exponential series
In this section, we recall the correspondence between sequences associated to polynomial-exponential series and Artinian Gorenstein Algebras.
A sequence is naturally associated to a linear form operating on polynomials, that is, an element of , as follows:
This correspondence is bijective since a linear form is uniquely defined by the sequence for . The coefficients for are also called the moments of . Hereafter, we will identify with .
The dual space has a natural structure of -module, defined as follows: ,
For any , the inner product associated to on is defined as follows:
Sequences in are also in correspondence with series in , via the so-called -transform:
If is a field of characteristic , we can identify the sequence with the series Using this identification, we have , .
Through these identifications, the dual basis of the monomial basis is in and in .
Among the elements of , we have the evaluation at a point , which corresponds to the sequence or to the series , or to the series in . These series belong to the more general family of polynomial-exponential series with . This set corresponds in to the set of series of the form
with and finite.
For a subset , the inverse system generated by is the vector space spanned by the elements for , , that is, by the elements in and all their derivatives.
For , we denote by the dimension of the inverse system of , generated by and all its derivatives. For , .
2.2 Hankel operators
The external product allows us to define a Hankel operator as a multiplication operator by a dual element :
The Hankel operator associated to an element is
Its kernel is denoted . We say that the series has finite rank if .
As , , we easily check that is an ideal of and that is an algebra.
Given a sequence , the kernel of is the set of polynomials such that for all . This kernel is the set of linear recurrence relations of the sequence .
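As a toy illustration of this kernel (our own example, with ad hoc notation): for the 2-index sequence of moments of the evaluation at the point (2, 3), the polynomials x − 2 and y − 3 are linear recurrence relations, i.e. elements of the kernel of the associated Hankel operator.

```python
# 2-index sequence sigma[a][b] = 2^a * 3^b: the moments of the
# evaluation at the point (2, 3)
sigma = [[2 ** a * 3 ** b for b in range(5)] for a in range(5)]

def pair(p, a, b):
    # <sigma * p, x^a y^b> = sum over terms ((i, j), c) of p
    # of c * sigma[a + i][b + j]
    return sum(c * sigma[a + i][b + j] for (i, j), c in p.items())

px = {(1, 0): 1, (0, 0): -2}   # the polynomial x - 2
py = {(0, 1): 1, (0, 0): -3}   # the polynomial y - 3

# Both polynomials annihilate all the shifts of the sequence:
# they are recurrence relations, elements of the kernel
checks = [pair(p, a, b) for p in (px, py) for a in range(4) for b in range(4)]
```

Here the kernel is the ideal of the point (2, 3), and the quotient algebra has dimension 1, matching the rank of the Hankel operator.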
The matrix of the operator in the bases and its dual basis is
The coefficients of depend only on the sum of the multi-indices indexing the rows and columns, which explains why it is called a Hankel operator.
In the reconstruction problem, we are dealing with truncated series with known coefficients for in a subset of . This leads to the definition of truncated Hankel operators.
For two vector spaces and , the truncated Hankel operator on , denoted by , is the following map:
If (resp. ) is a basis of (resp. ), then the matrix of the operator in and the dual basis of is
If and are monomial sets, we obtain the so-called truncated moment matrix of :
(identifying a monomial with its exponent ). These structured matrices share with the classical univariate Hankel matrices many interesting properties (see e.g. in ).
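A small numerical illustration (ours, not taken from the paper): for the moments of the sum of the evaluations at the points (2, 3) and (1, 1), the truncated moment matrix on the monomial set {1, x, y} exhibits the Hankel symmetry, and its rank equals the number of points of the decomposition.

```python
import numpy as np

# Moments of the sum of the evaluations at (2, 3) and (1, 1):
# sigma_{(a, b)} = 2^a * 3^b + 1
def moment(alpha):
    a, b = alpha
    return 2 ** a * 3 ** b + 1

# Truncated moment matrix on the monomial sets A = B = {1, x, y}:
# the entry indexed by (alpha, beta) is sigma_{alpha + beta}
monomials = [(0, 0), (1, 0), (0, 1)]
H = np.array([[moment((a[0] + b[0], a[1] + b[1])) for b in monomials]
              for a in monomials])
# Entries depend only on alpha + beta: H[1][2] = H[2][1] = sigma_{(1, 1)},
# and the rank equals the number of points of the decomposition (here 2)
```

This rank property is precisely what the reconstruction algorithm exploits when extending the monomial basis.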
2.3 Artinian Gorenstein algebra
A -algebra is Artinian if . It can be represented as the quotient of a polynomial ring by a (zero-dimensional) ideal .
A classical result states that the quotient algebra is finite dimensional, i.e. Artinian, iff is finite, that is, defines a finite number of (isolated) points in (see e.g. [Theorem 6] or [Theorem 4.3]).
The dual of is naturally identified with the sub-space
A Gorenstein algebra is defined as follows:
A -algebra is Gorenstein if such that with and implies .
In other words, is Gorenstein iff and implies . Equivalently, is Gorenstein iff there exists such that we have the exact sequence:
so that induces an isomorphism between and . In other words, a Gorenstein algebra is the quotient of a polynomial ring by the kernel of a Hankel operator, or equivalently by an ideal of recurrence relations of a multi-index sequence.
An Artinian Gorenstein algebra can thus be described by an element , such that is finite. In the following, we will assume that the Artinian Gorenstein algebra is given by such an element . The corresponding algebra will be where .
By a multivariate generalization of Kronecker’s theorem [Theorem 3.1], the sequences such that are the polynomial-exponential series with .
The aim of the method we are presenting is to compute the structure of the Artinian Gorenstein algebra from the first terms of the sequence . We are going to determine bases of and generators of the ideal , from which we can deduce directly the multiplicative structure of .
The following lemma gives a simple way to test linear independence in using truncated Hankel matrices (see [Lemma 3.3]):
Let , , . If the matrix is invertible, then (resp. ) is linearly independent in .
This lemma implies that if , and is invertible, then and are bases of .
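Continuing with a toy example of ours: for the univariate series whose moments are the sums of the evaluations at 2 and 1 (so the Hankel operator has rank 2), the truncated Hankel matrix on {1, x} is invertible, which by the lemma certifies that {1, x} is a basis of the quotient algebra.

```python
import numpy as np

# Univariate moments sigma_a = 2^a + 1: evaluation at 2 plus evaluation at 1
sigma = [2 ** a + 1 for a in range(6)]

# Truncated Hankel matrix on the bases B = B' = {1, x}
H = np.array([[sigma[i + j] for j in range(2)] for i in range(2)], float)

# det(H) = sigma_0*sigma_2 - sigma_1^2 = 2*5 - 3*3 = 1 != 0, so {1, x}
# is linearly independent in the quotient algebra; since the rank of the
# full Hankel operator is 2, it is in fact a basis
d = np.linalg.det(H)
```

In the algorithm, such invertibility tests drive the decision to extend the current monomial basis.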
Given a Hankel operator of finite rank , it is clear that the truncated operators will have at most rank . We are going to use the so-called flat extension property, which gives conditions under which a truncated Hankel operator of rank can be extended to a Hankel operator of the same rank (see  and extensions , , ).
Let be vector spaces connected to , such that and let . Let , such that . If , then there is a unique extension such that coincides with on and . In this case, with and .
2.4 Border bases
We recall briefly the definition of border basis and the main properties, that we will need. Let be a monomial set of .
A rewriting family for a (monomial) set is a set of polynomials such that with , , if . The rewriting family is complete if .
The monomial is called the leading monomial of and denoted
A family is a border basis with respect to if it is a complete rewriting family for such that .
This means that any element of can be projected along the ideal onto a unique element of . In other words, is a basis of the quotient algebra .
Let and for , . If , then for any , there exist , such that .
For a complete rewriting family with respect to a monomial set containing , a projection of on can be defined recursively on the set of monomials of by
if , ;
if , where is the (unique) polynomial in for which ,
if for , there exists and such that . Let .
This map defines a projector from onto . The kernel of is contained in the ideal . The family is a border basis iff .
Checking that a complete rewriting family is a border basis reduces to checking commutation properties. This leads to efficient algorithms to compute a border basis. For more details, see , , .
A special case of border basis is when the leading term of is the maximal monomial of for a monomial ordering . Then is a Gröbner basis of for this monomial ordering .
A border basis with respect to a monomial set gives directly the tables of multiplication by the variables in the basis . For a monomial , with such that if and otherwise.
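As a minimal univariate illustration (our example): with the basis B = {1, x} and the border relation x² → 3x − 2, i.e. the border basis of the ideal generated by (x − 1)(x − 2), the table of multiplication by x is the companion-type matrix below, whose eigenvalues recover the roots of the ideal.

```python
import numpy as np

# Basis B = {1, x}; border {x^2}; rewriting rule x^2 -> 3x - 2.
# Column j holds the coordinates of x * b_j in the basis B:
#   x * 1 = x              -> (0, 1)
#   x * x = x^2 = -2 + 3x  -> (-2, 3)
M = np.array([[0.0, -2.0],
              [1.0,  3.0]])

# Classical technique for solving from tables of multiplication:
# the eigenvalues of M are the roots of the ideal, here 1 and 2
roots = sorted(np.linalg.eigvals(M).real)
```

This is the standard eigenvalue approach to polynomial system solving from multiplication tables, mentioned earlier among the applications of the algorithm's output.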
3 Border bases of series
Given the first terms for of the sequence , where is a finite set of exponents, we are going to compute a basis of and generators of . We assume that the monomial set is connected to 1.
3.1 Orthogonal bases of
An important step in the decomposition method consists in computing a basis of . In this section, we describe how to compute a monomial basis and two other bases and , which are pairwise orthogonal for the inner product :
To compute these pairwise orthogonal bases, we will use a projection process similar to the Gram-Schmidt orthogonalization process. The main difference is that we compute pairs of orthogonal polynomials. As the inner product may be isotropic, the two polynomials of a pair may not be equal up to a scalar. For a polynomial and two families of polynomials , , we will use the following procedure .
Algorithm LABEL:algo:proj corresponds to the Modified Gram-Schmidt algorithm when the scalar product is positive definite. It is known to have better numerical behavior than the direct Gram-Schmidt orthogonalization process [Lecture 8]. It computes the polynomial characterized by the following lemma.
If and , there is a unique polynomial such that with and for .
We prove by induction on the index of the loop that is orthogonal to . For , is such that .
If the property is true at step , i.e. for , then is such that by induction hypothesis. By construction, and the induction hypothesis is true for . As the matrix is invertible, there exists a unique polynomial of the form , such that for , which concludes the proof of the lemma.
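The two-sided orthogonalization underlying this lemma can be sketched numerically as follows. This is our own simplified rendering in Python, operating on coefficient vectors in the monomial basis, and it assumes the diagonal entries used in the divisions are nonzero (which holds here, where the sequence has rank 2).

```python
import numpy as np

def pairwise_orthogonal(H):
    # Two-sided modified Gram-Schmidt on the Gram (Hankel) matrix H:
    # returns triangular P, Q with Q.T @ H @ P diagonal; column k holds
    # the coefficients of the k-th polynomial of each of the two bases
    n = H.shape[0]
    P, Q = np.eye(n), np.eye(n)
    for k in range(n):
        for j in range(k):
            d = Q[:, j] @ H @ P[:, j]
            P[:, k] -= (Q[:, j] @ H @ P[:, k]) / d * P[:, j]
            Q[:, k] -= (Q[:, k] @ H @ P[:, j]) / d * Q[:, j]
    return P, Q

# Moments sigma_a = 1 + 2^a (evaluations at 1 and 2), truncated at degree 2
sigma = [1 + 2 ** a for a in range(5)]
H = np.array([[sigma[i + j] for j in range(3)] for i in range(3)], float)
P, Q = pairwise_orthogonal(H)
D = Q.T @ H @ P   # diagonal; the last diagonal entry vanishes (rank 2)
# The last column of P is (2, -3, 1), i.e. 2 - 3x + x^2 = (x - 1)(x - 2):
# the orthogonal polynomial in the kernel is a recurrence relation
```

Note that the vanishing diagonal entry flags exactly the polynomials that fall in the kernel of the Hankel operator, which is how orthogonalization and border basis computation interact in the algorithm.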
Algorithm LABEL:algo:1 for computing a border basis of proceeds inductively, starting from , extending the basis with a new polynomial , orthogonal to the vector space spanned by for the inner product , extending with a new monomial , such that and for and extending with .
The main difference with Algorithm 4.1 in  is the projection procedure and the list of monomials , used to generate new monomials and to perform the projections. The lists are lists of exponents, identified with monomials.
We verify that at each loop of the algorithm, the lists , and are disjoint and . We also verify that are monomials up to a scalar, that the set of their exponents is , that and are disjoint and that .
The algorithm uses the function , which computes the set of monomials in , which are not in and such that .
We denote by the order induced by the treatment of the monomials of in the loops of the algorithm, so that the monomials treated at the loop are smaller than the monomials in at the loop. For , we denote by the list of monomial exponents with and by the vector space spanned by these monomials. For , let .
The following properties are also satisfied during this algorithm:
For , we have , and . For , for all such that .
is orthogonal to for . We consider two exclusive cases: and .
If , then there exists such that . Thus is such that . By construction, for .
If , then there is no such that and . Thus is orthogonal to for all with . By construction, is orthogonal to for . As , for all such that .
This concludes the proof of this lemma.
For , and the bases are pairwise orthogonal.
We prove it by induction on . If is not in b, then for all , and are empty and the property is satisfied. If is in , then and is such that . The property is true for .
Suppose that it is true for all . By construction, the polynomial is orthogonal to for . By induction hypothesis, are pairwise orthogonal, thus
By the induction hypothesis, we deduce that
By Lemma LABEL:lem:semiortho, is orthogonal to for and thus to for . We deduce that
This shows that and are pairwise orthogonal and concludes the proof by induction.
At the loop of the algorithm, the polynomials for are of the form