Lower deviation and moderate deviation probabilities for maximum of a branching random walk

Xinxin Chen and Hui He

Abstract: Given a super-critical branching random walk on $\mathbb{R}$ started from the origin, let $M_n$ be the maximal position of individuals at the $n$-th generation. Under some mild conditions, it is known from [2] that, as $n \to \infty$, $M_n - x^* n + \frac{3}{2\theta^*}\log n$ converges in law for some suitable constants $x^*$ and $\theta^*$. In this work, we investigate its moderate deviation, in other words, the convergence rates of

$\mathbb{P}\big(M_n \le x^* n - \ell_n\big)$

for any positive sequence $(\ell_n)$ such that $\ell_n \to \infty$ and $\ell_n = O(n)$. As a by-product, we also obtain the lower deviation of $M_n$; i.e., the convergence rate of

$\mathbb{P}\big(M_n \le x n\big)$

for $x < x^*$ in the Böttcher case, where the offspring number is at least two. Finally, we apply our techniques to study the small ball probability of the limit of the derivative martingale.

Mathematics Subject Classifications (2010): 60J60; 60F10.

Key words and phrases: Branching random walk; maximal position; moderate deviation; lower deviation; Schröder case; Böttcher case; small ball probability; derivative martingale.

1 Introduction

1.1 Branching random walk and its maximum

We consider a discrete-time branching random walk on the real line, which, as a generalized branching process, has been a very attractive object in probability theory in recent years. It is closely related to many other random models, for example, random walks in random environment, random fractals and the discrete Gaussian free field; see [26], [31], [34], [11], [3] and references therein. One can refer to [39] and [40] for recent developments on branching random walks.

Generally, to construct a branching random walk, we take a random point measure as the reproduction law, which describes both the number of children and their displacements. Each individual independently produces its children according to the law of this random point measure. In this way, one develops a branching structure with motions.

In this work, we study a relatively simpler model, which is constructed as follows. We take a Galton-Watson tree $\mathcal{T}$, rooted at $\rho$, with offspring distribution given by $\{p_k\}_{k \ge 0}$. For any $u, v \in \mathcal{T}$, we write $u \preceq v$ if $u$ is an ancestor of $v$ or $u = v$. Moreover, to each node $v \in \mathcal{T} \setminus \{\rho\}$, we attach a real-valued random variable $X_v$ to represent its displacement. So the position of $v$ is defined by

$V(v) := \sum_{\rho \prec u \preceq v} X_u.$
Let $V(\rho) := 0$ for convenience. Suppose that, given the tree $\mathcal{T}$, the $(X_v)$ are i.i.d. copies of some random variable $X$ (which is called the displacement or step size). Note here that the reproduction law is determined by the offspring law $\{p_k\}$ together with the law of $X$. Thus, $(V(v); v \in \mathcal{T})$ is our branching random walk, with independence between offspring and motions. This independence will be necessary for our arguments.
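To fix ideas, here is a minimal simulation of the model just described — a sketch, not from the paper; the binary offspring law ($p_2 = 1$, so every node has exactly two children) and the standard Gaussian step size are illustrative assumptions.

```python
import random

# Hypothetical concrete choices for illustration: p_2 = 1 (binary tree)
# and standard Gaussian displacements X.
def brw_positions(n, offspring=lambda: 2, step=lambda: random.gauss(0.0, 1.0)):
    """Return the positions V(u) of all individuals at generation n."""
    positions = [0.0]  # the root sits at the origin, V(root) = 0
    for _ in range(n):
        next_gen = []
        for v in positions:
            # each individual produces its children independently;
            # a child's position is its parent's position plus an i.i.d. step
            next_gen.extend(v + step() for _ in range(offspring()))
        positions = next_gen
    return positions

random.seed(1)
pos = brw_positions(8)
M_n = max(pos)  # the maximal position at generation n = 8
print(len(pos), round(M_n, 3))
```

With $p_2 = 1$ the genealogical tree is deterministic and generation $n$ contains $2^n$ individuals; $M_n$ is the maximal position whose deviations are studied below.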

For any $n \ge 0$, let $M_n$ be the maximal position at the $n$-th generation; in other words,

$M_n := \max_{|u| = n} V(u),$

where $|u|$ denotes the generation of the node $u$, i.e., the graph distance between $u$ and the root $\rho$. The asymptotics of $M_n$ have been studied by many authors, both in the subcritical/critical case and in the supercritical case. One can refer to [30], [37] and [39] for more details.

We are interested in the supercritical case, where the mean offspring number exceeds one and the system survives with positive probability. Let $(S_n)_{n \ge 0}$ be a random walk started from $S_0 = 0$ with i.i.d. increments distributed as $X$. Observe that for any individual $u$ of the $n$-th generation, $V(u)$ is distributed as $S_n$. If $X$ is integrable, the classical law of large numbers tells us that $S_n / n \to \mathbb{E}[X]$ almost surely. However, as there are so many individuals in this supercritical system, the asymptotic behavior of $M_n$ is not the same as that of $S_n$.

Conditionally on survival, under some mild conditions, it is known from [24, 29, 6] that

$\lim_{n \to \infty} \frac{M_n}{n} = x^* \quad \text{a.s.},$
where $x^*$ is a constant depending on both the offspring law and the displacement law. Later, the logarithmic order of $M_n - x^* n$ was obtained in [1] and [27] in different ways. Aïdékon in [2] showed that $M_n - x^* n + \frac{3}{2\theta^*}\log n$ converges in law for some suitable constant $\theta^* > 0$, which is an analogue of Bramson's result for branching Brownian motion in [10]; see also [12]. More details on these results will be given in Section 2.

For the maximum of branching Brownian motion, Chauvin and Rouault [13] first studied the large deviation probability. Recently, Derrida and Shi [16, 17, 18] considered both the large deviation and the lower deviation, and established precise estimates. On the other hand, for branching random walk, Hu in [25] studied the moderate deviation for $M_n$. Later, Gantert and Höfelsauer [23] and Bhattacharya [5] investigated the large deviation probability for $M_n$. In the same paper [23], Gantert and Höfelsauer also studied the lower deviation probability for $M_n$, mainly in the Schröder case, where $p_0 + p_1 > 0$. In fact, a branching random walk in the Schröder case can be viewed as a generalized version of branching Brownian motion.

Motivated by [25], [23] and [17], the goal of this article is to study the moderate deviation $\mathbb{P}(M_n \le x^* n - \ell_n)$ with $\ell_n \to \infty$. As we already mentioned, [25] first considered this problem; see Remarks 1.2 and 1.6 below for more details. As a by-product of our main results, in the Böttcher case where the offspring number is at least two, we also obtain the lower deviation of $M_n$, i.e., the asymptotics of $\mathbb{P}(M_n \le xn)$ for $x < x^*$, which completes the work [23]. We shall see that the lower deviation of $M_n$ in the Böttcher case turns out to be very different from that in the Schröder case. In fact, Gantert and Höfelsauer [23] proved that in the Schröder case $\mathbb{P}(M_n \le xn)$ decays exponentially. By contrast, in the Böttcher case, we shall show that $\mathbb{P}(M_n \le xn)$ may decay double-exponentially or super-exponentially, depending on the tail behavior of the step size $X$. We will consider three typical left tail distributions of the step size and obtain the corresponding decay rates and rate functions. Finally, we also apply our techniques to study the small ball probability for the limit of the derivative martingale. The corresponding problem was also considered in [25] for a class of Mandelbrot's cascades, in the Böttcher case with bounded step size and in the Schröder case; see also [32] and [33] for more background. Let us state the theorems in the following subsection.

As usual, $f_n = O(g_n)$ or $f_n \lesssim g_n$ means that $f_n \le C g_n$ for all $n$ and some constant $C \in (0, \infty)$. $f_n \asymp g_n$ means that $f_n$ is bounded above and below by a positive and finite constant multiple of $g_n$ for all $n$. $f_n = o(g_n)$ or $f_n \ll g_n$ means $\lim_{n \to \infty} f_n / g_n = 0$.

1.2 Main results

Suppose that we are in the supercritical case, where the tree survives with positive probability. Formally, we assume that the offspring law $\{p_k\}_{k \ge 0}$ satisfies:


At the same time, suppose that the step size $X$ satisfies


for some positive constant. We define the large deviation rate function of the corresponding random walk with i.i.d. step sizes by

$I(x) := \sup_{\theta \in \mathbb{R}} \big( \theta x - \log \mathbb{E}[e^{\theta X}] \big), \quad x \in \mathbb{R}.$
Then it is known from Theorem 3.1 in [8] that under (1.1) and (1.2),

where . Note that if , then since is continuous in , and
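As a concrete illustration of the quantities just introduced (a standard Gaussian example, not one of the paper's assumptions), suppose the step size $X$ is standard normal; then everything is explicit:

```latex
% log-moment generating function of a standard Gaussian step:
\Lambda(\theta) := \log \mathbb{E}\big[e^{\theta X}\big] = \frac{\theta^2}{2},
\qquad
% its Legendre transform, the large deviation rate function:
I(x) = \sup_{\theta \in \mathbb{R}} \Big( \theta x - \frac{\theta^2}{2} \Big) = \frac{x^2}{2},
% and the speed of the maximum solves I(x^*) = \log m, so that
x^* = \sqrt{2 \log m}, \qquad \theta^* = I'(x^*) = \sqrt{2 \log m}.
```

For instance, binary branching ($m = 2$) gives the familiar speed $x^* = \sqrt{2 \log 2}$.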


According to Theorem 4.1 in [8], it further follows from (1.3) that, almost surely,

which fails if and . Besides (1.1), (1.2) and (1.3), if we further suppose that


then it is shown in [1] and [27] that, conditionally on survival,

$\lim_{n \to \infty} \frac{M_n - x^* n}{\log n} = -\frac{3}{2\theta^*} \quad \text{in probability},$

where $\theta^* > 0$ is determined by $\theta^* x^* - \log \mathbb{E}[e^{\theta^* X}] = \log m$.
Define the so-called derivative martingale by

$D_n := \sum_{|u| = n} \big( x^* n - V(u) \big)\, e^{\theta^* (V(u) - x^* n)}, \quad n \ge 0.$
It is known from [9] and [2] that under assumptions (1.1), (1.2), (1.3) and (1.4), there exists a non-negative random variable $D_\infty$ such that

$\lim_{n \to \infty} D_n = D_\infty \quad \text{a.s.},$
where $D_\infty > 0$ a.s. on the survival event. Next, given (1.1), (1.2), (1.3) and (1.4), Aïdékon [2] proved the convergence in law of $M_n - x^* n + \frac{3}{2\theta^*}\log n$ as follows. For any $y \in \mathbb{R}$,

$\lim_{n \to \infty} \mathbb{P}\Big( M_n \le x^* n - \tfrac{3}{2\theta^*}\log n + y \Big) = \mathbb{E}\Big[ \exp\big( -C_* e^{-\theta^* y} D_\infty \big) \Big], \qquad (1.6)$
where $C_*$ is a positive constant. In this work, we are going to study the asymptotics of $\mathbb{P}(M_n \le x^* n - \ell_n)$ for $\ell_n \to \infty$, as well as the small ball probability $\mathbb{P}(D_\infty \le \varepsilon)$ as $\varepsilon \downarrow 0$, which is closely related to the former by (1.6). Let us introduce the minimal offspring number:

$b := \min\{ k \ge 0 : p_k > 0 \}.$
We first present the main results in the Böttcher case, where $p_0 = 0$ and $p_1 = 0$ (i.e., $b \ge 2$).

Theorem 1.1 (Böttcher case, bounded step size).

Assume (1.1), (1.2) and $b \ge 2$. Suppose that $\mathbb{P}(X \ge -L) = 1$ for some $L \in (0, \infty)$; then for $x \in (-L, x^*)$,


If $\mathbb{P}(X = -L) > 0$, then (1.7) holds also for $x = -L$.

Remark 1.1.

Note that the assumptions (1.1) and (1.2) alone do not imply the second (logarithmic) order of $M_n$.

Theorem 1.2 (Bounded step size: moderate deviation).

Assume (1.1), (1.2), (1.3), (1.4) and $b \ge 2$. Suppose that $\mathbb{P}(X \ge -L) = 1$ for some $L \in (0, \infty)$. Then for any positive increasing sequence $(\ell_n)$ such that $\ell_n \to \infty$ and $\ell_n = o(n)$,


where because of (1.3).

Remark 1.2.

Hu [25] obtained this moderate deviation (1.8), in a more general setting, with bounded step size and without assuming independence between offspring and motions. One can check that the rate here is consistent with that defined in (1.10) of [25].

Remark 1.3.

Suppose that all assumptions in Theorem 1.2 hold. Then by Theorem 1.3 in [25], we have

Theorem 1.3 (Böttcher case, Weibull left tail).

Assume (1.1), (1.2), (1.3), (1.4) and $b \ge 2$. Suppose that $-\log \mathbb{P}(X \le -z) \sim \lambda z^{\alpha}$ as $z \to \infty$ for some constants $\lambda > 0$ and $\alpha > 1$. Then for any positive increasing sequence $(\ell_n)$ such that $\ell_n \to \infty$ and $\ell_n = O(n)$,


where for convenience, for . In particular, for any ,

Remark 1.4.

Note that if the Weibull exponent is smaller than $1$, the assumption (1.2) cannot be satisfied, and we are in another regime, where the growth is faster than linear in time; see [22].

The weak convergence (1.6) shows that the lower deviations of $M_n$ and the lower tail of $D_\infty$ are closely related. So, inspired by Theorem 1.3, one obtains the following result.

Proposition 1.4 (Böttcher case, Weibull left tail).

Suppose that all assumptions in Theorem 1.3 hold. Then

Theorem 1.5 (Böttcher case, Gumbel left tail).

Assume (1.1), (1.2), (1.3), (1.4) and $b \ge 2$. Suppose that $-\log \mathbb{P}(X \le -z) \sim e^{\lambda z}$ as $z \to \infty$ for some constant $\lambda > 0$. Then for any positive increasing sequence $(\ell_n)$ such that $\ell_n \to \infty$ and $\ell_n = O(n)$,


In particular, for any ,


Again, inspired by Theorem 1.5 and the weak convergence (1.6), we have the following result.

Proposition 1.6 (Böttcher case, Gumbel left tail).

Suppose that all assumptions in Theorem 1.5 hold. Then


The next theorem concerns the Schröder case, where $p_0 + p_1 > 0$. Let $q$ be the extinction probability and $f(s) := \sum_{k \ge 0} p_k s^k$, $s \in [0, 1]$, be the generating function of the offspring law. Let $\gamma := f'(q)$. Denote by $\lfloor r \rfloor$ the integer part of any real number $r$.

Theorem 1.7 (Schröder case).

Assume (1.1), (1.2), (1.3), (1.4) and $p_0 + p_1 > 0$. Then for any positive sequence $(\ell_n)$ such that $\ell_n \to \infty$ and such that $\lim_{n \to \infty} \ell_n / n$ exists and is small enough, we have


where and


In particular, we have for any ,

Remark 1.5.

(1.17) was first obtained by Gantert and Höfelsauer in [23]. In fact, it is shown in [23] that for any ,

Then one can check that

Remark 1.6.

When , (1.15) was obtained by Hu in [25] in a more general framework. In fact, restricted to our setting, conditions (1.5) and (1.6) in [25] are equivalent to saying that there exists a constant such that

Since , then So conditions (1.5) and (1.6) in [25] make sure that is exactly the of on ; i.e.;

Remark 1.7.

If all assumptions in Theorem 1.7 hold, then by Theorem 1.3 in [25], we have

General strategy:

Let us explain our main ideas here, especially for the lower deviations in the Böttcher case. Intuitively, to obtain an unusually low maximum, we need to control both the size of the genealogical tree and the displacements of individuals. More precisely, we need that, at the very beginning, the size of the genealogical tree is small, with all individuals moving to some atypically low place. So, we take some intermediate time and suppose that the genealogical tree is $b$-regular up to this time and that all individuals at this time are located below a certain “critical” position. Then the system continues with i.i.d. branching random walks started from places below this critical level. By choosing the intermediate time and the critical position in an appropriate way, we can expect that the maximum at time $n$ stays below the desired level with high probability.
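Schematically, with hypothetical notation (an intermediate time $t$, a level $z > 0$, and the population size $Z_t$ at time $t$ — none of these taken from the statements above), the two-phase lower bound reads:

```latex
\mathbb{P}\big( M_n \le x n \big)
  \;\ge\;
  \underbrace{\mathbb{P}\Big( Z_t = b^t,\; \max_{|u| = t} V(u) \le -z \Big)}_{\text{phase 1: $b$-regular tree pushed below $-z$}}
  \cdot
  \underbrace{\mathbb{P}\big( M_{n-t} \le x n + z \big)^{b^t}}_{\text{phase 2: the $b^t$ i.i.d.\ subtrees stay low}} .
```

Indeed, if every individual at time $t$ lies below $-z$ and each of the $b^t$ descendant branching random walks rises at most $xn + z$ above its starting point, then every individual at time $n$ lies below $-z + (xn + z) = xn$; the branching property yields the product of probabilities.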

Note that, the time varies in different cases. If the step size is bounded from below, . If the step size has Weibull tail or Gumbel tail, .

Our arguments and techniques are also inspired by [14], where we studied the large deviation of the empirical distribution of branching random walk. All these ideas also work for studying the small ball probability of $D_\infty$.

The rest of this paper is organized as follows. We treat the cases with bounded step size in Section 2. Then, Section 3 proves Theorems 1.3 and 1.5, concerning the cases with unbounded step size. In Section 4, we study $D_\infty$ and prove Propositions 1.4 and 1.6. Finally, we prove Theorem 1.7, for the Schröder case, in Section 5.

Let $c$ and $C$ denote positive constants whose values may change from line to line.

2 Böttcher case with step size bounded on the left side: proofs of Theorems 1.1 and 1.2

In this section, we always suppose that $b \ge 2$ and that $\mathbb{P}(X \ge -L) = 1$ with $L \in (0, \infty)$. Assumption (1.2) yields that with . We are going to prove that for any $x \in (-L, x^*)$,


with . Next, for the second order of , there are several regimes. We assume (1.3) and (1.4) to get the classical one: with . In this regime, we are going to prove that for any positive sequence such that ,


The proofs of (2.1) and (2.2) basically follow the same ideas. But (2.1) needs to be treated in a more general regime, without second order estimates.

For later use, let us introduce the counting measures as follows: for any $n \ge 0$ and any Borel set $A \subseteq \mathbb{R}$,

$Z_n(A) := \#\{ u \in \mathcal{T} : |u| = n,\ V(u) \in A \}.$

For simplicity, we write $Z_n$ for $Z_n(\mathbb{R})$ to represent the total population of the $n$-th generation. It is clear that $Z_n \ge b^n$ a.s. For any $u \in \mathcal{T}$, let

be the maximal position, relative to $V(u)$, among the descendants of $u$ at $n$ generations below $u$. Clearly, it is distributed as $M_n$.

2.1 Proof of Theorem 1.1

In this section, we show that (2.1) holds for any $x$ in the stated range. We use different notation for the intermediate times chosen for the lower and the upper bounds.

2.1.1 Lower bound of Theorem 1.1

As , let with any sufficiently small such that . Notice that implies that for any . Observe that for some intermediate time , whose value will be determined later, if we let every individual before the -th generation make a displacement less than , then

where are i.i.d. copies of . By the Markov property at time , one gets that


Next, we shall estimate . The rest of this proof is divided into two subparts, depending on whether or not.

Subpart 1: the case . Note that we have now. Take so that . Thus,

Going back to (2.3), one sees that for some ,


It follows readily that for any ,


Letting yields what we need.

Subpart 2: the case . Now we have because is finite and continuous in . Moreover, for some . For any sufficiently small , one has

Recall that . Let and so that for all large enough. Therefore,

Here we apply the large deviation result obtained in [23]. More precisely, as the maximum of independent random walks stochastically dominates , one has

which yields that

Note that for any . Let . Then for all sufficiently large ,

Plugging this into (2.3) implies that


Thus we have


Since , letting (hence and ) gives

which implies the desired lower bound because is arbitrarily small.

2.1.2 Upper bound of Theorem 1.1

In this section, we show that

Note that for any , is supported on a.s. Moreover, . Observe that


It remains to estimate . Again, the proof will be divided into two subparts.

Subpart 1: the case . By taking so that , one has


where we use the fact that for some and all . In fact, we can construct a Galton-Watson tree with offspring . Here since . Its survival probability is positive if . Even when , it is critical, and its survival probability up to generation is larger than for some and all . In fact, survival up to generation implies that some individual at time has position . So, . We hence conclude from (2.1.2) and (2.11) that

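For reference, the survival estimate invoked in Subpart 1 is classical: for a critical Galton-Watson process with finite offspring variance $\sigma^2$, Kolmogorov's asymptotic gives

```latex
\mathbb{P}\big( Z_n^{\mathrm{GW}} > 0 \big) \;\sim\; \frac{2}{\sigma^2 n},
\qquad n \to \infty,
```

so in particular the survival probability up to generation $n$ is bounded below by $c/n$ for some constant $c > 0$, which is the form of the bound used above.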
Subpart 2: the case . First recall a result from [23] (see Theorem 3.2), which says that


So for any sufficiently small such that , for any , let and so that . Then for all large enough,


where the second inequality follows from (2.13). Plugging this into (2.1.2) yields that

Again letting (hence and ) gives the desired upper bound.

If , then the arguments for the lower bound work well for and