Variance of the volume of random real algebraic submanifolds
Let $\mathcal X$ be a complex projective manifold of dimension $n$ defined over the reals and let $M$ denote its real locus. We study the vanishing locus $Z_{s_d}$ in $M$ of a random real holomorphic section $s_d$ of $\mathcal E\otimes\mathcal L^d$, where $\mathcal L\to\mathcal X$ is an ample line bundle and $\mathcal E\to\mathcal X$ is a rank $r$ Hermitian bundle. When $r<n$, we obtain an asymptotic of order $d^{r-\frac n2}$, as $d$ goes to infinity, for the variance of the linear statistics associated with $Z_{s_d}$, including its volume. Given an open set $U\subset M$, we show that the probability that $Z_{s_d}$ does not intersect $U$ is a $O\bigl(d^{-\frac n2}\bigr)$ when $d$ goes to infinity. When $n\geq 3$, we also prove almost sure convergence for the linear statistics associated with a random sequence $(s_d)_{d\in\mathbb N}$ of sections of increasing degree. Our framework contains the case of random real algebraic submanifolds of $\mathbb{RP}^n$ obtained as the common zero set of $r$ independent Kostlan–Shub–Smale polynomials.
Keywords: Random submanifolds, Kac–Rice formula, Linear statistics, Kostlan–Shub–Smale polynomials, Bergman kernel, Real projective manifold.
Mathematics Subject Classification 2010: 14P99, 32A25, 53C40, 60G57, 60G60.
Framework. Let us first describe our framework and state the main results of this article (see Section 2 for more details). Let $\mathcal X$ be a smooth complex projective manifold of positive complex dimension $n$. Let $\mathcal L$ be an ample holomorphic line bundle over $\mathcal X$ and let $\mathcal E$ be a rank $r$ holomorphic vector bundle over $\mathcal X$, with $1\leq r\leq n$. We assume that $\mathcal X$, $\mathcal E$ and $\mathcal L$ are endowed with compatible real structures and that the real locus $M$ of $\mathcal X$ is not empty. Let $h_{\mathcal E}$ and $h_{\mathcal L}$ denote Hermitian metrics on $\mathcal E$ and $\mathcal L$ respectively that are compatible with the real structures. We assume that $(\mathcal L,h_{\mathcal L})$ has positive curvature $\omega$. Then $\omega$ is a Kähler form on $\mathcal X$ and it induces a Riemannian metric $g$ on $\mathcal X$.
For any $d\in\mathbb N$, the Kähler form $\omega$, $h_{\mathcal E}$ and $h_{\mathcal L}$ induce a $L^2$-inner product on the space $\mathbb R H^0\bigl(\mathcal X,\mathcal E\otimes\mathcal L^d\bigr)$ of real holomorphic sections of $\mathcal E\otimes\mathcal L^d$ (see (2.1)). Let $d\in\mathbb N$ and $s\in\mathbb R H^0\bigl(\mathcal X,\mathcal E\otimes\mathcal L^d\bigr)$, we denote by $Z_s$ the real zero set $s^{-1}(0)\cap M$ of $s$. For $d$ large enough, for almost every $s$, $Z_s$ is a codimension $r$ smooth submanifold of $M$ and we denote by $|dV_s|$ the Riemannian measure on $Z_s$ induced by $g$ (see Sect. 2.2). In the sequel, we will consider $|dV_s|$ as a positive Radon measure on $M$. Let us also denote by $|dV_M|$ the Riemannian measure on $M$.
Let $s_d$ be a standard Gaussian vector in $\mathbb R H^0\bigl(\mathcal X,\mathcal E\otimes\mathcal L^d\bigr)$. Then $|dV_{s_d}|$ is a random positive Radon measure on $M$. We set $Z_d=Z_{s_d}$ and $|dV_d|=|dV_{s_d}|$ to avoid too many subscripts. In a previous paper [16, thm. 1.3], we computed the asymptotic of the expected Riemannian volume of $Z_d$ as $d\to+\infty$. Namely, we proved that:
\[\mathbb E\bigl[\operatorname{Vol}(Z_d)\bigr] = d^{\frac r2}\operatorname{Vol}(M)\,\frac{\operatorname{Vol}\bigl(\mathbb S^{n-r}\bigr)}{\operatorname{Vol}\bigl(\mathbb S^{n}\bigr)} + O\bigl(d^{\frac r2-1}\bigr), \tag{1.1}\]
where $\operatorname{Vol}(M)$ is the volume of $M$ for $|dV_M|$ and the volumes of spheres are Euclidean volumes. Here and throughout this paper, $\mathbb E[\,\cdot\,]$ denotes the expectation of the random variable between the brackets, and $\mathbb S^{m}$ stands for the unit Euclidean sphere of dimension $m$.
Let $\varphi\in C^0(M)$, we denote by $\|\varphi\|_\infty$ its sup norm. Besides, we denote by $\langle\cdot\,,\cdot\rangle$ the duality pairing between $C^0(M)$ and its topological dual. Then, (1.1) can be restated as:
\[\mathbb E\bigl[\langle|dV_d|,\mathbf 1\rangle\bigr] = d^{\frac r2}\operatorname{Vol}(M)\,\frac{\operatorname{Vol}\bigl(\mathbb S^{n-r}\bigr)}{\operatorname{Vol}\bigl(\mathbb S^{n}\bigr)} + O\bigl(d^{\frac r2-1}\bigr),\]
where $\mathbf 1$ stands for the unit constant function on $M$. The same proof gives similar asymptotics for $\mathbb E\bigl[\langle|dV_d|,\varphi\rangle\bigr]$ for any continuous $\varphi$ (see [16, section 5.3]).
Theorem 1.1. Let $\mathcal X$ be a complex projective manifold of positive dimension $n$ defined over the reals, we assume that its real locus $M$ is non-empty. Let $\mathcal E\to\mathcal X$ be a rank $r$ Hermitian vector bundle and let $\mathcal L\to\mathcal X$ be a positive Hermitian line bundle, both equipped with compatible real structures. For every $d\in\mathbb N$, let $s_d$ be a standard Gaussian vector in $\mathbb R H^0\bigl(\mathcal X,\mathcal E\otimes\mathcal L^d\bigr)$. Then the following holds as $d\to+\infty$:
\[\forall\varphi\in C^0(M),\qquad \mathbb E\bigl[\langle|dV_d|,\varphi\rangle\bigr] = d^{\frac r2}\left(\int_M\varphi\,|dV_M|\right)\frac{\operatorname{Vol}\bigl(\mathbb S^{n-r}\bigr)}{\operatorname{Vol}\bigl(\mathbb S^{n}\bigr)} + \|\varphi\|_\infty\,O\bigl(d^{\frac r2-1}\bigr).\]
Moreover the error term $O\bigl(d^{\frac r2-1}\bigr)$ does not depend on $\varphi$.
In particular, we can define a sequence of Radon measures $\mathbb E\bigl[|dV_d|\bigr]$ on $M$ by: for every $d\in\mathbb N$ and every $\varphi\in C^0(M)$, $\bigl\langle\mathbb E[|dV_d|],\varphi\bigr\rangle = \mathbb E\bigl[\langle|dV_d|,\varphi\rangle\bigr]$. Then Thm. 1.1 implies that:
\[ d^{-\frac r2}\,\mathbb E\bigl[|dV_d|\bigr] \xrightarrow[d\to+\infty]{} \frac{\operatorname{Vol}\bigl(\mathbb S^{n-r}\bigr)}{\operatorname{Vol}\bigl(\mathbb S^{n}\bigr)}\,|dV_M|\]
as continuous linear functionals on $C^0(M)$.
Statement of the results. The main result of this paper is an asymptotic for the covariances of the linear statistics $\langle|dV_d|,\varphi\rangle$, where $\varphi\in C^0(M)$. Before we can state our theorem, we need to introduce some additional notations.
As usual, we denote by $\operatorname{Var}(X)$ the variance of the real random variable $X$, and by $\operatorname{Cov}(X,Y)$ the covariance of the real random variables $X$ and $Y$. We call variance of $|dV_d|$, and we denote by $\operatorname{Var}\bigl(|dV_d|\bigr)$, the symmetric bilinear form on $C^0(M)$ defined by:
\[\operatorname{Var}\bigl(|dV_d|\bigr) : (\varphi_1,\varphi_2)\longmapsto \operatorname{Cov}\bigl(\langle|dV_d|,\varphi_1\rangle,\langle|dV_d|,\varphi_2\rangle\bigr).\]
Let $\varphi\in C^0(M)$, we denote by $\omega_\varphi$ its continuity modulus, which is defined by:
\[\omega_\varphi : \varepsilon\longmapsto \sup\bigl\{|\varphi(x)-\varphi(y)| \bigm| (x,y)\in M^2,\ \rho_g(x,y)\leq\varepsilon\bigr\},\]
where $\rho_g$ stands for the geodesic distance on $(M,g)$.
Since $M$ is compact, $\omega_\varphi(\varepsilon)$ is well-defined for every $\varepsilon>0$. Moreover every $\varphi\in C^0(M)$ is uniformly continuous and we have: $\omega_\varphi(\varepsilon)\to 0$ as $\varepsilon\to 0$.
Note that, if $\varphi$ is Lipschitz continuous, then $\omega_\varphi(\varepsilon)=O(\varepsilon)$ as $\varepsilon\to 0$.
Let $A$ be a linear map between two Euclidean spaces, we denote by $\operatorname{Jac}(A)$ the Jacobian of $A$:
\[\operatorname{Jac}(A) = \sqrt{\det\bigl(A A^*\bigr)},\]
where $A^*$ is the adjoint operator of $A$.
See Section 4.1 for a quick discussion of the properties of this Jacobian. If $X$ is an element of $\mathcal M_{r,n}(\mathbb R)$, the space of matrices of size $r\times n$ with real coefficients, we denote by $\operatorname{Jac}(X)$ the Jacobian of the linear map from $\mathbb R^n$ to $\mathbb R^r$ associated with $X$ in the canonical bases of $\mathbb R^n$ and $\mathbb R^r$.
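As a numerical aside (an illustration, not part of the paper's argument; the matrix dimensions below are arbitrary), the Jacobian $\operatorname{Jac}(A)=\sqrt{\det(A A^*)}$ of an $r\times n$ real matrix with $r\leq n$ coincides with the product of its singular values:

```python
import numpy as np

rng = np.random.default_rng(0)

# A random linear map from R^n to R^r (r <= n), as an r x n matrix.
r, n = 2, 5
A = rng.standard_normal((r, n))

# Jacobian as defined in the text: sqrt(det(A A^*)).
jac = np.sqrt(np.linalg.det(A @ A.T))

# Equivalently, the product of the singular values of A.
singular_values = np.linalg.svd(A, compute_uv=False)
print(jac, np.prod(singular_values))
```

Both quantities agree because $\det(A A^*)$ is the product of the squared singular values of $A$.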
For every $t>0$, we define $(X(t),Y(t))$ to be a centered Gaussian vector in $\mathcal M_{r,n}(\mathbb R)\times\mathcal M_{r,n}(\mathbb R)$ whose distribution is as follows: if we denote by $X_{ij}(t)$ (resp. $Y_{ij}(t)$) the coefficients of $X(t)$ (resp. $Y(t)$), the couples $\bigl(X_{ij}(t),Y_{ij}(t)\bigr)$ with $1\leq i\leq r$ and $1\leq j\leq n$ are independent from one another and all have the same $2\times 2$ variance matrix, depending only on $t$. From $(X(t),Y(t))$ we define the constant $\mathcal V_{n,r}$ appearing in the leading term of (1.5) below; its finiteness is established in Lemma 4.25.
We can now state our main result.
Theorem 1.6. Let $\mathcal X$ be a complex projective manifold of dimension $n$ defined over the reals, we assume that its real locus $M$ is non-empty. Let $\mathcal E\to\mathcal X$ be a rank $r$ Hermitian vector bundle and let $\mathcal L\to\mathcal X$ be a positive Hermitian line bundle, both equipped with compatible real structures. We assume that $1\leq r<n$. For every $d\in\mathbb N$, let $s_d$ be a standard Gaussian vector in $\mathbb R H^0\bigl(\mathcal X,\mathcal E\otimes\mathcal L^d\bigr)$.
Let $\alpha\in\left(0,\frac12\right)$, then there exists $C_\alpha>0$ such that, for all $d\in\mathbb N$, for all $\varphi_1$ and $\varphi_2\in C^0(M)$, the following holds as $d\to+\infty$:
\[\operatorname{Cov}\bigl(\langle|dV_d|,\varphi_1\rangle,\langle|dV_d|,\varphi_2\rangle\bigr) = d^{\,r-\frac n2}\left(\mathcal V_{n,r}\int_M\varphi_1\varphi_2\,|dV_M| + \|\varphi_1\|_\infty\,\|\varphi_2\|_\infty\,O\bigl(d^{-\alpha}\bigr) + \|\varphi_1\|_\infty\,\omega_{\varphi_2}\bigl(C_\alpha\,d^{-\alpha}\bigr)\,O(1)\right). \tag{1.5}\]
Moreover the error terms in (1.5) do not depend on the test-functions $\varphi_1$ and $\varphi_2$.
We obtain the variance of the volume of $Z_d$ by applying Thm. 1.6 to $\varphi_1=\varphi_2=\mathbf 1$. When $\varphi_1=\varphi_2=\varphi$, we get the following.
Corollary 1.7 (Variance of the linear statistics).
In the same setting as Thm. 1.6, let $\alpha\in\left(0,\frac12\right)$, then there exists $C_\alpha>0$ such that, for all $\varphi\in C^0(M)$ and all $d\in\mathbb N$, the following holds as $d\to+\infty$:
\[\operatorname{Var}\bigl(\langle|dV_d|,\varphi\rangle\bigr) = d^{\,r-\frac n2}\left(\mathcal V_{n,r}\int_M\varphi^2\,|dV_M| + \|\varphi\|_\infty^2\,O\bigl(d^{-\alpha}\bigr) + \|\varphi\|_\infty\,\omega_\varphi\bigl(C_\alpha\,d^{-\alpha}\bigr)\,O(1)\right).\]
Moreover, the error terms do not depend on $\varphi$.
Some remarks are in order.
The value of the constant should not be taken too seriously. This constant appears for technical reasons and it is probably far from optimal.
If $\varphi$ is Lipschitz continuous with Lipschitz constant $K$, then $\omega_\varphi\bigl(C_\alpha d^{-\alpha}\bigr)\leq K C_\alpha d^{-\alpha}$, and the corresponding error term in eq. (1.5) can be replaced by:
\[ K\,O\bigl(d^{-\alpha}\bigr),\]
by fixing the value of $\alpha\in\left(0,\frac12\right)$, which is possible since the constant $C_\alpha$ only depends on $\alpha$.
The fact that the leading constant in (1.5) is finite is part of the statement and is proved below (Lemma 4.25). This constant is necessarily non-negative. Numerical evidence suggests that it is positive, but we do not know how to prove this at this point.
Corollary 1.9 (Concentration in probability).
In the same setting as Thm. 1.6, let $\varphi\in C^0(M)$ and let $\varepsilon>0$. Then, for every $d\in\mathbb N$, we have:
\[\mathbb P\Bigl(\bigl|\langle|dV_d|,\varphi\rangle-\mathbb E\bigl[\langle|dV_d|,\varphi\rangle\bigr]\bigr| > d^{\frac r2}\,\varepsilon\Bigr) = O\bigl(d^{-\frac n2}\bigr),\]
where the error term is independent of $d$, but depends on $\varphi$ and $\varepsilon$.
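Heuristically, this concentration estimate is the one suggested by combining the variance asymptotic of Thm. 1.6 with Chebyshev's inequality (a sketch only; the precise proof is given in Section 5):

```latex
\mathbb{P}\Bigl(\bigl|\langle|dV_d|,\varphi\rangle-\mathbb{E}\bigl[\langle|dV_d|,\varphi\rangle\bigr]\bigr|
  > d^{\frac r2}\,\varepsilon\Bigr)
\;\le\; \frac{\operatorname{Var}\bigl(\langle|dV_d|,\varphi\rangle\bigr)}{d^{\,r}\,\varepsilon^{2}}
\;=\; \frac{O\bigl(d^{\,r-\frac n2}\bigr)}{d^{\,r}\,\varepsilon^{2}}
\;=\; O\Bigl(\frac{d^{-\frac n2}}{\varepsilon^{2}}\Bigr).
```

The deviation threshold $d^{\frac r2}\varepsilon$ is natural since the expected linear statistics grow like $d^{\frac r2}$ (Thm. 1.1).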
Corollary 1.10.
In the same setting as Thm. 1.6, let $U\subset M$ be an open subset, then as $d\to+\infty$ we have:
\[\mathbb P\bigl(Z_d\cap U=\emptyset\bigr) = O\bigl(d^{-\frac n2}\bigr).\]
Our last corollary is concerned with the convergence of a random sequence of sections of increasing degree. Let us denote by $d\nu_d$ the standard Gaussian measure on $\mathbb R H^0\bigl(\mathcal X,\mathcal E\otimes\mathcal L^d\bigr)$ (see (2.4)). Let $d\nu$ denote the product measure $\bigotimes_{d\in\mathbb N}d\nu_d$ on $\prod_{d\in\mathbb N}\mathbb R H^0\bigl(\mathcal X,\mathcal E\otimes\mathcal L^d\bigr)$. Then we have the following.
Corollary 1.11 (Almost sure convergence).
In the same setting as Thm. 1.6, let us assume that $n\geq 3$. Let $(s_d)_{d\in\mathbb N}$ be a random sequence of sections, distributed according to $d\nu$. Then, $d\nu$-almost surely, we have:
\[\forall\varphi\in C^0(M),\qquad d^{-\frac r2}\bigl\langle|dV_{s_d}|,\varphi\bigr\rangle \xrightarrow[d\to+\infty]{} \frac{\operatorname{Vol}\bigl(\mathbb S^{n-r}\bigr)}{\operatorname{Vol}\bigl(\mathbb S^{n}\bigr)}\int_M\varphi\,|dV_M|.\]
That is, $d\nu$-almost surely,
\[ d^{-\frac r2}\,|dV_{s_d}| \xrightarrow[d\to+\infty]{} \frac{\operatorname{Vol}\bigl(\mathbb S^{n-r}\bigr)}{\operatorname{Vol}\bigl(\mathbb S^{n}\bigr)}\,|dV_M|\]
in the sense of the weak convergence of measures.
We expect this result to hold for $n=2$ as well, but our proof fails in this case.
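Heuristically, the role of the assumption $n\geq 3$ is to make the variances of the normalized linear statistics summable, so that a Borel–Cantelli type argument can apply (a sketch only, using the order of growth from Thm. 1.6):

```latex
\sum_{d\ge 1}\operatorname{Var}\Bigl(d^{-\frac r2}\langle|dV_d|,\varphi\rangle\Bigr)
= \sum_{d\ge 1} d^{-r}\,O\bigl(d^{\,r-\frac n2}\bigr)
= O\Bigl(\sum_{d\ge 1} d^{-\frac n2}\Bigr)
< +\infty \qquad\text{for } n\ge 3.
```

For $n=2$ the right-hand side becomes a harmonic series, which diverges; this is where the argument breaks down.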
The Kostlan–Shub–Smale polynomials. Let us consider the simplest example of our framework. We choose $\mathcal X$ to be the complex projective space $\mathbb{CP}^n$, with the real structure defined by the usual conjugation in $\mathbb C^{n+1}$. Then $M$ is the real projective space $\mathbb{RP}^n$. Let $\mathcal L$ be the hyperplane line bundle $\mathcal O(1)$, equipped with its natural real structure and the metric dual to the standard metric on the tautological line bundle over $\mathbb{CP}^n$. Then the curvature form of $\mathcal L$ is the Fubini–Study form $\omega_{FS}$, normalized so that the induced Riemannian metric is the quotient of the Euclidean metric on the unit sphere $\mathbb S^{2n+1}\subset\mathbb C^{n+1}$. Let $\mathcal E$ be the rank $r$ trivial bundle $\mathbb C^r\times\mathbb{CP}^n$, with the trivial real structure and the trivial metric.
In this setting, the global holomorphic sections of $\mathcal L^d$ are the complex homogeneous polynomials of degree $d$ in $n+1$ variables, and those of $\mathcal E\otimes\mathcal L^d$ are $r$-tuples of such polynomials, since $\mathcal E$ is trivial. Finally, the real structures being just the usual conjugations, we have:
\[\mathbb R H^0\bigl(\mathbb{CP}^n,\mathcal E\otimes\mathcal L^d\bigr) = \Bigl(\mathbb R_d^{\hom}[X_0,\dots,X_n]\Bigr)^r,\]
where $\mathbb R_d^{\hom}[X_0,\dots,X_n]$ is the space of real homogeneous polynomials of degree $d$ in the variables $X_0,\dots,X_n$. The $r$ copies of this space in $\mathbb R H^0\bigl(\mathbb{CP}^n,\mathcal E\otimes\mathcal L^d\bigr)$ are pairwise orthogonal for the inner product (2.1). Hence a standard Gaussian in $\mathbb R H^0\bigl(\mathbb{CP}^n,\mathcal E\otimes\mathcal L^d\bigr)$ is an $r$-tuple of independent standard Gaussians in $\mathbb R_d^{\hom}[X_0,\dots,X_n]$.
It is well-known (cf. [3, 4, 13]) that the monomials $X^\alpha$ are pairwise orthogonal for the $L^2$-inner product (2.1), but not orthonormal. Let $\alpha=(\alpha_0,\dots,\alpha_n)\in\mathbb N^{n+1}$, we denote its length by $|\alpha|=\alpha_0+\dots+\alpha_n$. We also define $X^\alpha = X_0^{\alpha_0}\cdots X_n^{\alpha_n}$ and $\alpha! = \alpha_0!\cdots\alpha_n!$. Finally, if $|\alpha|=d$, we denote by $\binom{d}{\alpha}$ the multinomial coefficient $\frac{d!}{\alpha!}$. Then, an orthonormal basis of $\mathbb R_d^{\hom}[X_0,\dots,X_n]$ for the inner product (2.1) is given by the family:
\[\left(\sqrt{\frac{(d+n)!}{d!\,\pi^n}\binom{d}{\alpha}}\;X^\alpha\right)_{|\alpha|=d}.\]
Thus a standard Gaussian vector in $\mathbb R_d^{\hom}[X_0,\dots,X_n]$ is a random polynomial:
\[\sqrt{\frac{(d+n)!}{d!\,\pi^n}}\,\sum_{|\alpha|=d} a_\alpha\sqrt{\binom{d}{\alpha}}\;X^\alpha,\]
where the coefficients $(a_\alpha)_{|\alpha|=d}$ are independent real standard Gaussian variables. Since we are only concerned with the zero set of this random polynomial, we can drop the factor $\sqrt{\frac{(d+n)!}{d!\,\pi^n}}$.
Finally, in this setting, $Z_{s_d}$ is the common zero set of $r$ independent random polynomials in $\mathbb R_d^{\hom}[X_0,\dots,X_n]$ of the form:
\[\sum_{|\alpha|=d} a_\alpha\sqrt{\binom{d}{\alpha}}\;X^\alpha, \tag{1.8}\]
with independent coefficients $(a_\alpha)$ distributed according to the real standard Gaussian distribution. Such polynomials are known as the Kostlan–Shub–Smale polynomials. They were introduced in [13, 29] and have been actively studied since (cf. [1, 5, 8, 23, 34]).
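As a numerical aside (an illustration, not part of the paper), one can sample Kostlan–Shub–Smale polynomials in dimension $n=1$ and compare the empirical mean number of real roots with the Edelman–Kostlan value $\sqrt d$; the root-counting tolerance below is an ad hoc numerical choice:

```python
import numpy as np
from math import comb, sqrt

rng = np.random.default_rng(0)

def kss_real_root_count(d, rng, tol=1e-6):
    """Sample p(x) = sum_k a_k * sqrt(C(d, k)) * x^k with a_k i.i.d. N(0, 1)
    and count its real roots."""
    a = rng.standard_normal(d + 1)
    # np.roots expects coefficients ordered from the highest degree down.
    coeffs = [a[k] * sqrt(comb(d, k)) for k in range(d, -1, -1)]
    roots = np.roots(coeffs)
    return int(np.sum(np.abs(roots.imag) < tol))

d, n_samples = 15, 2000
mean_count = np.mean([kss_real_root_count(d, rng) for _ in range(n_samples)])
# Edelman-Kostlan: the expected number of real roots is exactly sqrt(d).
print(mean_count, sqrt(d))
```

The empirical mean should be close to $\sqrt{15}\approx 3.87$, up to Monte Carlo error.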
Related works. As we just said, zero sets of systems of independent random polynomials distributed as (1.8) were studied by Kostlan [13] and by Shub and Smale [29]. The expected volume of these random algebraic manifolds was computed by Kostlan, and their expected Euler characteristic was computed by Podkorytov in codimension $1$, and by Bürgisser in higher codimension. Both these results were extended to the setting of the present paper in [16].
Wschebor obtained an asymptotic bound, as the dimension $n$ goes to infinity, for the variance of the number of real roots of a system of $n$ independent Kostlan–Shub–Smale polynomials. Recently, Dalmao computed an asymptotic of order $\sqrt d$ for the variance of the number of real roots of one Kostlan–Shub–Smale polynomial in dimension $n=1$. His result is very similar to (1.5), which leads us to think that such a result should hold for $r=n$. He also proved a central limit theorem for this number of real roots, using Wiener chaos methods.
In [14, thm. 3], Kratz and León considered the level curves of a centered stationary Gaussian field with unit variance on the plane $\mathbb R^2$. More precisely, they considered the length of a level curve intersected with some large square $[-T,T]^2$. As $T\to+\infty$, they proved asymptotics of order $T^2$ for both the expectation and the variance of this length. They also proved that this length satisfies a central limit theorem as $T\to+\infty$. In particular, their result applies to the centered Gaussian field on $\mathbb R^2$ with correlation function $(x,y)\mapsto\exp\bigl(-\frac12\|y-x\|^2\bigr)$. This field can be seen as the scaling limit of the centered Gaussian field defined by our random sections, when $n=2$ and $r=1$.
The study of more general random algebraic submanifolds, obtained as the zero sets of random sections, was pioneered by Shiffman and Zelditch [25, 26, 27]. They considered the integration current over the common complex zero set $Z_{s_d}\subset\mathcal X$ of $r$ independent random sections of $\mathcal E\otimes\mathcal L^d$, distributed as standard complex Gaussians. In [25], they computed the asymptotic, as $d$ goes to infinity, of the expectation of the associated smooth statistics. They also provided an upper bound for the variance of these quantities and proved the equivalent of Cor. 1.11 in this complex algebraic setting. In [26], they gave an asymptotic for the variance of the volume of $Z_{s_d}\cap U$, where $U$ is a domain satisfying some regularity conditions. In [27], they proved a similar asymptotic for the variance of the smooth statistics associated with $Z_{s_d}$. They also deduced a central limit theorem from these estimates and an asymptotic normality result of Sodin and Tsirelson. Finally, in [28, thm. 1.4], Shiffman, Zelditch and Zrebiec proved that the probability that $Z_{s_d}\cap U=\emptyset$, where $U$ is any open subset of $\mathcal X$, decreases exponentially fast as $d$ goes to infinity.
Coming back to our real algebraic setting, one should be able to deduce from the general result of Nazarov and Sodin [21, thm. 3] that, given an open set $U\subset M$, the probability that $Z_d\cap U=\emptyset$ goes to $0$ as $d$ goes to infinity. Corollary 1.10 gives an upper bound for the convergence rate. In particular, this bounds the probability for $Z_d$ to be empty. In the same spirit, Gayet and Welschinger proved the following result. Let $\Sigma$ be a fixed diffeomorphism type of codimension $r$ submanifolds of $\mathbb R^n$, let $x\in M$ and let $B_d(x)$ denote the geodesic ball of center $x$ and radius $d^{-\frac12}$. Then, the probability that $Z_d\cap B_d(x)$ contains a submanifold diffeomorphic to $\Sigma$ is bounded from below by a positive constant independent of $d$. On the other hand, when $n=2$ and $r=1$, the Harnack–Klein inequality shows that the number of connected components of $Z_d$ is bounded by a polynomial in $d$. Gayet and Welschinger proved that the probability for $Z_d$ to have the maximal number of connected components decreases exponentially fast with $d$.
Another well-studied model of random submanifolds is that of Riemannian random waves, i.e. zero sets of random eigenfunctions of the Laplacian associated with some eigenvalue $\lambda$. In this setting, Rudnick and Wigman computed an asymptotic bound, as $\lambda\to+\infty$, for the variance of the volume of a random hypersurface on the flat $n$-dimensional torus $\mathbb T^n=\mathbb R^n/\mathbb Z^n$. On $\mathbb T^2$, this result was improved by Krishnapur, Kurlberg and Wigman, who computed the precise asymptotic of the variance of the length of a random curve. Wigman then computed the asymptotic variance of the linear statistics associated with a random curve on the Euclidean sphere $\mathbb S^2$. His result holds for a large class of test-functions that contains the characteristic functions of open sets satisfying some regularity assumption. In relation with Cor. 1.10, Nazarov and Sodin proved that, on the Euclidean sphere $\mathbb S^2$, the number of connected components of a random curve times $\lambda^{-1}$ converges exponentially fast in probability to a deterministic constant as $\lambda\to+\infty$.
About the proof. The idea of the proof is the following. The random section $s_d$ defines a centered Gaussian field $\bigl(s_d(x)\bigr)_{x\in\mathcal X}$. The correlation kernel of this field equals the Bergman kernel $E_d$, that is the kernel of the orthogonal projection onto $\mathbb R H^0\bigl(\mathcal X,\mathcal E\otimes\mathcal L^d\bigr)$ for the inner product (2.1) (compare [3, 16, 25, 26, 27]).
In order to compute the covariance of the smooth statistics $\langle|dV_d|,\varphi_1\rangle$ and $\langle|dV_d|,\varphi_2\rangle$, we apply a Kac–Rice formula (cf. [2, 3, 8, 32, 33]). This allows us to write $\operatorname{Cov}\bigl(\langle|dV_d|,\varphi_1\rangle,\langle|dV_d|,\varphi_2\rangle\bigr)$ as the integral over $M\times M$ of some function $\mathcal D_d$, defined by (4.9). This density is the difference of two terms, coming respectively from $\mathbb E\bigl[\langle|dV_d|,\varphi_1\rangle\langle|dV_d|,\varphi_2\rangle\bigr]$ and from $\mathbb E\bigl[\langle|dV_d|,\varphi_1\rangle\bigr]\mathbb E\bigl[\langle|dV_d|,\varphi_2\rangle\bigr]$.
Since the Bergman kernel decreases exponentially fast outside of the diagonal in $\mathcal X\times\mathcal X$ (see Section 3.4), the values of $s_d(x)$ and $s_d(y)$ are almost uncorrelated for $x$ far from $y$. As a consequence, when the distance between $x$ and $y$ is much larger than $\frac{\ln d}{\sqrt d}$, the above two terms in the expression of $\mathcal D_d(x,y)$ are equal, up to a small error (see Sect. 4.3.2 for a precise statement). Thus, $\mathcal D_d$ is small far from the diagonal, and its integral over this domain only contributes a remainder term to the covariance.
The main contribution to the value of the covariance comes from the integration of $\mathcal D_d$ over a neighborhood of the diagonal of size about $\frac{\ln d}{\sqrt d}$. We perform a change of variable in order to express this term as an integral over a domain of fixed size. This rescaling by $\sqrt d$ explains the factor $d^{-\frac n2}$ in (1.5). Besides, the order of growth of $\mathcal D_d$ close to the diagonal is $d^{\,r}$, that is the order of growth of the square of the expected linear statistics (see Thm. 1.1). Finally, we get an order of growth of $d^{\,r-\frac n2}$ for the covariance. The constant $\mathcal V_{n,r}$ in (1.5) appears as the scaling limit of the integral of $\mathcal D_d$ over a neighborhood of the diagonal of typical size $\frac{\ln d}{\sqrt d}$.
The difficulty in making this sketch of proof rigorous comes from the combination of two facts. First, we do not know the Bergman kernel (our correlation function) and its derivatives exactly; we only know asymptotics. In addition, the conditioning in the Kac–Rice formula is singular along the diagonal, and so is $\mathcal D_d$. Because of this, we lose all uniformity in the control of the error terms close to the diagonal. Nonetheless, by careful bookkeeping of the error terms, we can make the above heuristic precise.
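Schematically, and with indicative notation ($\mathcal D_d$ for the Kac–Rice density of (4.9), $\rho_g$ for the geodesic distance, $b>0$ an auxiliary constant), the strategy described above reads:

```latex
\operatorname{Var}\bigl(\langle|dV_d|,\varphi\rangle\bigr)
 = \int_{M\times M}\varphi(x)\,\varphi(y)\,\mathcal D_d(x,y)\,|dV_M|^2
 = \underbrace{\int_{\rho_g(x,y)\,\le\, b\frac{\ln d}{\sqrt d}}\!\!\cdots}_{\substack{\text{near-diagonal term: after rescaling by }\sqrt d,\\ \text{contributes the leading term of order } d^{\,r-\frac n2}}}
 \;+\; \underbrace{\int_{\rho_g(x,y)\,>\, b\frac{\ln d}{\sqrt d}}\!\!\cdots}_{\substack{\text{far-diagonal term: remainder, by the}\\ \text{off-diagonal decay of the Bergman kernel}}}
```

This is only a roadmap; the precise statements are those of Sects. 4.3.2 and 4.3.3 below.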
Outline of the paper. In Section 2 we describe precisely our framework and the construction of the random measures $|dV_d|$. We also introduce the Bergman kernel and explain how it is related to our random submanifolds.
In Section 3, we recall various estimates for the Bergman kernel that we use in the proof of our main theorem. These estimates were established by Dai, Liu and Ma, and by Ma and Marinescu [17, 18, 19], in a complex setting. Our main contribution in this section consists in checking that the preferred trivialization used by Ma and Marinescu to state their near-diagonal estimates is well-behaved with respect to the real structures on $\mathcal X$, $\mathcal E$ and $\mathcal L$ (see Section 3.1).
Section 4 is concerned with the proof of Thm. 1.6. In Sect. 4.1, we prove a Kac–Rice formula adapted to our problem, using Federer’s coarea formula and Kodaira’s embedding theorem. In Sect. 4.2 we prove an integral formula for the variance, using the Kac–Rice formula (Thm. 4.4). The core of the proof is contained in Sect. 4.3.
Acknowledgments. I am thankful to Damien Gayet for his guidance in the course of this work and for countless mathematical discussions, on this topic and others.
- 1 Introduction
- 2 Random real algebraic submanifolds
- 3 Estimates for the Bergman kernel
- 4 Proof of Theorem 1.6
- 4.1 The Kac–Rice formula
- 4.2 An integral formula for the variance
- 4.3 Asymptotic for the variance
- 5 Proofs of the corollaries
2 Random real algebraic submanifolds
2.1 General setting
In this section, we introduce our framework. It is the same as the algebraic setting of [16]; see also [11, 10]. Classical references for the material of this section are [12, chap. 0] and [30, chap. 1].
Let $\mathcal X$ be a smooth complex projective manifold of complex dimension $n$. We assume that $\mathcal X$ is defined over the reals, that is $\mathcal X$ is equipped with an anti-holomorphic involution $c_{\mathcal X}:\mathcal X\to\mathcal X$. The real locus of $\mathcal X$ is the set of fixed points of $c_{\mathcal X}$. In the sequel, we assume that it is non-empty and we denote it by $M$. It is a classical fact that $M$ is a smooth closed (i.e. compact without boundary) submanifold of $\mathcal X$ of real dimension $n$ (see [30, chap. 1]).
Let $\mathcal E\to\mathcal X$ be a holomorphic vector bundle of rank $r$. Let $c_{\mathcal E}$ be a real structure on $\mathcal E$, compatible with $c_{\mathcal X}$ in the sense that the projection $\pi_{\mathcal E}:\mathcal E\to\mathcal X$ satisfies $\pi_{\mathcal E}\circ c_{\mathcal E} = c_{\mathcal X}\circ\pi_{\mathcal E}$ and $c_{\mathcal E}$ is fiberwise $\mathbb C$-anti-linear. Let $h_{\mathcal E}$ be a real Hermitian metric on $\mathcal E$, that is $(c_{\mathcal E})^* h_{\mathcal E} = \overline{h_{\mathcal E}}$.
Similarly, let $\mathcal L\to\mathcal X$ be an ample holomorphic line bundle equipped with a compatible real structure $c_{\mathcal L}$ and a real Hermitian metric $h_{\mathcal L}$. Moreover, we assume that the curvature form $\omega$ of $(\mathcal L,h_{\mathcal L})$ is a Kähler form. Recall that if $\zeta$ is any non-vanishing holomorphic section of $\mathcal L$ on the open set $\Omega\subset\mathcal X$, then the restriction of $\omega$ to $\Omega$ is given by:
\[\omega_{|\Omega} = \frac{1}{2i}\,\partial\bar\partial\log\bigl(h_{\mathcal L}(\zeta,\zeta)\bigr).\]
This Kähler form is associated with a Hermitian metric $g_{\mathbb C}$ on $\mathcal X$. The real part of $g_{\mathbb C}$ defines a Riemannian metric $g$ on $\mathcal X$, compatible with the complex structure. Note that, since $h_{\mathcal L}$ is compatible with the real structures on $\mathcal X$ and $\mathcal L$, we have $(c_{\mathcal X})^*\omega = -\omega$. Then we have $(c_{\mathcal X})^* g = g$, hence $c_{\mathcal X}$ is an isometry of $(\mathcal X,g)$.
Then $g$ induces a Riemannian measure on every smooth submanifold of $\mathcal X$. In the case of $\mathcal X$ itself, this measure is given by the volume form $\frac{\omega^n}{n!}$. We denote by $|dV_M|$ the Riemannian measure on $M$.
Let $d\in\mathbb N$, then the rank $r$ holomorphic vector bundle $\mathcal E\otimes\mathcal L^d$ can be endowed with a real structure $c_d = c_{\mathcal E}\otimes(c_{\mathcal L})^d$, compatible with $c_{\mathcal X}$, and a real Hermitian metric $h_d = h_{\mathcal E}\otimes h_{\mathcal L}^d$. If $x\in M$, then $c_d$ induces a $\mathbb C$-anti-linear involution of the fiber $\bigl(\mathcal E\otimes\mathcal L^d\bigr)_x$. We denote by $\mathbb R\bigl(\mathcal E\otimes\mathcal L^d\bigr)_x$ the fixed-point set of this involution, which is a dimension $r$ real vector space.
Let $\Gamma\bigl(\mathcal E\otimes\mathcal L^d\bigr)$ denote the space of smooth sections of $\mathcal E\otimes\mathcal L^d$. We can define a Hermitian inner product on $\Gamma\bigl(\mathcal E\otimes\mathcal L^d\bigr)$ by:
\[\langle s_1,s_2\rangle = \int_{\mathcal X} h_d\bigl(s_1(x),s_2(x)\bigr)\,\frac{\omega^n}{n!}. \tag{2.1}\]
We say that a section $s\in\Gamma\bigl(\mathcal E\otimes\mathcal L^d\bigr)$ is real if it is equivariant for the real structures, that is: $c_d\circ s = s\circ c_{\mathcal X}$. Let $\mathbb R\Gamma\bigl(\mathcal E\otimes\mathcal L^d\bigr)$ denote the real vector space of real smooth sections of $\mathcal E\otimes\mathcal L^d$. The restriction of $\langle\cdot\,,\cdot\rangle$ to $\mathbb R\Gamma\bigl(\mathcal E\otimes\mathcal L^d\bigr)$ is a Euclidean inner product.
In this paper, $\langle\cdot\,,\cdot\rangle$ will always denote either the inner product on the relevant Euclidean (or Hermitian) space or the duality pairing between a space and its topological dual. Which one is meant will be clear from the context.
Let $H^0\bigl(\mathcal X,\mathcal E\otimes\mathcal L^d\bigr)$ denote the space of global holomorphic sections of $\mathcal E\otimes\mathcal L^d$. This space has finite complex dimension $N_d$ by Hodge theory (compare [17, thm. 1.4.1]). We denote by $\mathbb R H^0\bigl(\mathcal X,\mathcal E\otimes\mathcal L^d\bigr)$ the space of global real holomorphic sections of $\mathcal E\otimes\mathcal L^d$:
\[\mathbb R H^0\bigl(\mathcal X,\mathcal E\otimes\mathcal L^d\bigr) = \bigl\{ s\in H^0\bigl(\mathcal X,\mathcal E\otimes\mathcal L^d\bigr) \bigm| c_d\circ s = s\circ c_{\mathcal X} \bigr\}.\]
The restriction of the inner product (2.1) to $\mathbb R H^0\bigl(\mathcal X,\mathcal E\otimes\mathcal L^d\bigr)$ makes it into a Euclidean space of real dimension $N_d$.
Note that, even when we consider real sections restricted to $M$, the inner product is defined by integrating on the whole complex manifold $\mathcal X$.
2.2 Random submanifolds
This section is concerned with the definition of the random submanifolds we consider and the related random variables.
Let $d\in\mathbb N$ and $s\in\mathbb R H^0\bigl(\mathcal X,\mathcal E\otimes\mathcal L^d\bigr)$, we denote the real zero set $s^{-1}(0)\cap M$ of $s$ by $Z_s$. If the restriction of $s$ to $M$ vanishes transversally, then $Z_s$ is a smooth submanifold of codimension $r$ of $M$. In this case, we denote by $|dV_s|$ the Riemannian measure on $Z_s$ induced by $g$, seen as a Radon measure on $M$. Note that this includes the case where $Z_s$ is empty.
Recall the following facts, that we already discussed in [16].
Definition 2.3 (see [16]).
We say that $\mathcal E\otimes\mathcal L^d$ is $0$-ample if, for any $x\in M$, the evaluation map
\[\operatorname{ev}_x : s\in\mathbb R H^0\bigl(\mathcal X,\mathcal E\otimes\mathcal L^d\bigr)\longmapsto s(x)\in\mathbb R\bigl(\mathcal E\otimes\mathcal L^d\bigr)_x\]
is surjective.
Lemma 2.4 (see [16], cor. 3.10).
There exists $d_1\in\mathbb N$, depending only on $\mathcal X$, $\mathcal E$ and $\mathcal L$, such that for all $d\geq d_1$, $\mathcal E\otimes\mathcal L^d$ is $0$-ample.
Lemma 2.5 (see [16], section 2.6).
If $\mathcal E\otimes\mathcal L^d$ is $0$-ample, then for almost every section $s\in\mathbb R H^0\bigl(\mathcal X,\mathcal E\otimes\mathcal L^d\bigr)$ (for the Lebesgue measure), the restriction of $s$ to $M$ vanishes transversally.
From now on, we only consider the case $d\geq d_1$, so that $|dV_s|$ is a well-defined measure on $M$ for almost every $s\in\mathbb R H^0\bigl(\mathcal X,\mathcal E\otimes\mathcal L^d\bigr)$. Let $s_d$ be a standard Gaussian vector in $\mathbb R H^0\bigl(\mathcal X,\mathcal E\otimes\mathcal L^d\bigr)$, that is $s_d$ is a random vector whose distribution admits the density:
\[ s\longmapsto \exp\left(-\frac{\|s\|^2}{2}\right) \tag{2.4}\]
with respect to the normalized Lebesgue measure on $\mathbb R H^0\bigl(\mathcal X,\mathcal E\otimes\mathcal L^d\bigr)$. Here $\|\cdot\|$ is the norm associated with the Euclidean inner product (2.1). Then $Z_{s_d}$ is almost surely a submanifold of codimension $r$ of $M$ and $|dV_{s_d}|$ is almost surely a random positive Radon measure on $M$. To simplify notations, we set $Z_d = Z_{s_d}$ and $|dV_d| = |dV_{s_d}|$. For more details concerning Gaussian vectors, we refer to [16, appendix A] and the references therein.
Let $\varphi\in C^0(M)$, for every $s\in\mathbb R H^0\bigl(\mathcal X,\mathcal E\otimes\mathcal L^d\bigr)$ vanishing transversally, we set
\[\bigl\langle|dV_s|,\varphi\bigr\rangle = \int_{Z_s}\varphi\,|dV_s|.\]
Such a $\varphi$ will be referred to as a test-function. We call linear statistic of degree $d$ associated with $\varphi$ the real random variable $\bigl\langle|dV_d|,\varphi\bigr\rangle$.
2.3 The correlation kernel
Let $d\geq d_1$, then $\bigl(s_d(x)\bigr)_{x\in\mathcal X}$ is a smooth centered Gaussian field on $\mathcal X$. As such, it is characterized by its correlation kernel. In this section, we recall that the correlation kernel of $s_d$ equals the Bergman kernel of $\mathcal E\otimes\mathcal L^d$. This is now a well-known fact (see [3, 11, 25, 27]) and was already used by the author in [16].
Let us first recall some facts about random vectors (see for example [16, appendix A]). In this paper, we only consider centered random vectors (that is, their expectation vanishes), so we give the following definitions in this restricted setting. Let $X_1$ and $X_2$ be centered random vectors taking values in Euclidean (or Hermitian) vector spaces $V_1$ and $V_2$ respectively, then we define their covariance operator $\operatorname{Cov}(X_1,X_2)$ as the operator from $V_2$ to $V_1$ characterized by:
\[\forall(v_1,v_2)\in V_1\times V_2,\qquad \bigl\langle\operatorname{Cov}(X_1,X_2)\,v_2\,,\,v_1\bigr\rangle = \mathbb E\Bigl[\langle X_1,v_1\rangle\,\overline{\langle X_2,v_2\rangle}\Bigr].\]
The variance operator of a centered random vector $X$ is defined as $\operatorname{Var}(X) = \operatorname{Cov}(X,X)$. We denote by $X\sim\mathcal N(0,\Lambda)$ the fact that $X$ is a centered Gaussian vector with variance operator $\Lambda$. Finally, we say that $X$ is a standard Gaussian vector if $\operatorname{Var}(X) = \operatorname{Id}$, where $\operatorname{Id}$ is the identity operator. A standard Gaussian vector admits the density (2.4) with respect to the normalized Lebesgue measure on the corresponding space.
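In finite dimensions these definitions are easy to check numerically; the following sketch (an illustration only, with arbitrary dimensions and sample size) verifies that the empirical variance operator of a standard Gaussian vector is close to the identity:

```python
import numpy as np

rng = np.random.default_rng(1)

# Sample a standard Gaussian vector in R^N, i.e. N(0, Id).
N, samples = 4, 200_000
X = rng.standard_normal((samples, N))

# Empirical variance operator: the matrix of E[<X, e_i><X, e_j>]
# in the canonical basis, which should be close to the identity.
emp_cov = X.T @ X / samples
print(np.round(emp_cov, 2))
```

Off-diagonal entries fluctuate around $0$ at scale $\approx 1/\sqrt{\text{samples}}$, while diagonal entries fluctuate around $1$.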
Recall that $\bigl(\mathcal E\otimes\mathcal L^d\bigr)\boxtimes\bigl(\mathcal E\otimes\mathcal L^d\bigr)^*$ stands for the bundle $P_1^*\bigl(\mathcal E\otimes\mathcal L^d\bigr)\otimes P_2^*\bigl(\mathcal E\otimes\mathcal L^d\bigr)^*$ over $\mathcal X\times\mathcal X$, where $P_1$ (resp. $P_2$) denotes the projection from $\mathcal X\times\mathcal X$ onto the first (resp. second) factor. The covariance kernel of $s_d$ is the section of $\bigl(\mathcal E\otimes\mathcal L^d\bigr)\boxtimes\bigl(\mathcal E\otimes\mathcal L^d\bigr)^*$ defined by:
\[ (x,y)\longmapsto \mathbb E\bigl[s_d(x)\otimes s_d(y)^*\bigr].\]
The orthogonal projection from $\mathbb R\Gamma\bigl(\mathcal E\otimes\mathcal L^d\bigr)$ onto $\mathbb R H^0\bigl(\mathcal X,\mathcal E\otimes\mathcal L^d\bigr)$ admits a Schwartz kernel (see [17, thm. B.2.7]). That is, there exists a unique section $E_d$ of $\bigl(\mathcal E\otimes\mathcal L^d\bigr)\boxtimes\bigl(\mathcal E\otimes\mathcal L^d\bigr)^*$ such that, for any $s\in\mathbb R\Gamma\bigl(\mathcal E\otimes\mathcal L^d\bigr)$, the projection of $s$ onto