Lower deviation and moderate deviation probabilities for maximum of a branching random walk

Xinxin Chen and Hui He

Abstract: Given a super-critical branching random walk on started from the origin, let be the maximal position of individuals at the -th generation. Under some mild conditions, it is known from [2] that as , converges in law for some suitable constants and . In this work, we investigate its moderate deviation, in other words, the convergence rates of

for any positive sequence such that and . As a by-product, we also obtain the lower deviation of ; i.e., the convergence rate of

for in the Böttcher case, where the offspring number is at least two. Finally, we apply our techniques to study the small ball probability of the limit of the derivative martingale.


Mathematics Subject Classifications (2010): 60J60; 60F10.


Key words and phrases: Branching random walk; maximal position; moderate deviation; lower deviation; Schröder case; Böttcher case; small ball probability; derivative martingale.

1 Introduction

1.1 Branching random walk and its maximum

We consider a discrete-time branching random walk on the real line, which, as a generalization of the branching process, has long been an attractive object in probability theory. It is closely related to many other random models, for example, random walks in random environment, random fractals and the discrete Gaussian free field; see [26], [31], [34], [11], [3] and references therein. One can refer to [39] and [40] for recent developments on branching random walks.

Generally, to construct a branching random walk, we take a random point measure as the reproduction law, which describes both the number of children and their displacements. Each individual produces its children independently according to the law of this random point measure. In this way, one obtains a branching structure with motions.

In this work, we study a relatively simple model, constructed as follows. We take a Galton-Watson tree , rooted at , with offspring distribution given by . For any , we write if is an ancestor of or . Moreover, to each node , we attach a real-valued random variable to represent its displacement. So the position of is defined by

Let for convenience. Suppose that, given the tree , are i.i.d. copies of some random variable (which is called the displacement or step size). Note here that the reproduction law is given by . Thus, is our branching random walk, with independence between offspring and motions. This independence will be necessary for our arguments.
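For concreteness, the position just defined can be written as follows (the symbols $V$, $X$ and $\rho$ are our own shorthand and may differ from the authors' notation): to each node $u$ one associates

\[
V(u) \;:=\; \sum_{\rho < v \le u} X_v , \qquad V(\rho) := 0,
\]

where the sum runs over the ancestors $v$ of $u$ with $v \ne \rho$, including $u$ itself, and $X_v$ denotes the displacement attached to $v$.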

For any , let be the maximal position at the -th generation, in other words,

where denotes the generation of node , i.e., the graph distance between and . The asymptotics of have been studied by many authors, both in the subcritical/critical case and in the supercritical case. One can refer to [30], [37] and [39] for more details.
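In the shorthand introduced above (again assumed, not the authors' display), the maximal position at generation $n$ reads

\[
M_n \;:=\; \max_{u \in \mathcal{T},\, |u| = n} V(u),
\]

with the usual convention $M_n = -\infty$ on the event that the $n$-th generation is empty.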

We are interested in the supercritical case where and the system survives with positive probability. Let be a random walk started from with i.i.d. increments distributed as . Observe that for any individual of the -th generation, is distributed as . If , the classical law of large numbers tells us that almost surely. However, as there are exponentially many individuals in this supercritical system, the asymptotic behavior of differs from that of .
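A one-line first-moment bound, stated with the shorthand above ($Z_n$ the population size, $m$ the mean offspring number, $S_n$ the associated random walk), already explains why the maximum travels strictly faster than a single walk: by Markov's inequality and the branching property,

\[
\mathbb{P}(M_n \ge an) \;\le\; \mathbb{E}\big[\#\{u : |u| = n,\ V(u) \ge an\}\big] \;=\; \mathbb{E}[Z_n]\,\mathbb{P}(S_n \ge an) \;=\; m^n\, \mathbb{P}(S_n \ge an).
\]

Conversely, one expects (and it is true under the assumptions below) that linear levels $an$ are reached as long as the exponential growth $m^n$ of the population compensates the exponentially small probability $\mathbb{P}(S_n \ge an)$, which pushes the speed of the maximum above the law-of-large-numbers speed of $S_n$.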

Conditionally on survival, under some mild conditions, it is known from [24, 29, 6] that

where is a constant depending on both the offspring law and the displacement distribution. Later, the logarithmic order of was obtained in [1] and [27] by different methods. Aïdékon [2] showed that converges in law for some suitable , which is an analogue of Bramson's result for branching Brownian motion in [10]; see also [12]. More details on these results will be given in Section 2.

For the maximum of branching Brownian motion, Chauvin and Rouault [13] first studied the large deviation probability. Recently, Derrida and Shi [16, 17, 18] considered both the large deviations and the lower deviations, and established precise estimates. On the other hand, for branching random walks, Hu [25] studied the moderate deviations of ; i.e., with . Later, Gantert and Höfelsauer [23] and Bhattacharya [5] investigated the large deviation probability of . In the same paper [23], Gantert and Höfelsauer also studied the lower deviation probability of , mainly in the Schröder case when . In fact, a branching random walk in the Schröder case can be viewed as a generalized version of branching Brownian motion.

Motivated by [25], [23] and [17], the goal of this article is to study the moderate deviations with . As we already mentioned, [25] first considered this problem with ; see Remarks 1.2 and 1.6 below for more details. As a by-product of our main results, in the Böttcher case when , we also obtain the lower deviation of , i.e., for , which complements the work [23]. We shall see that the lower deviation of in the Böttcher case turns out to be very different from that in the Schröder case. In fact, Gantert and Höfelsauer [23] proved that in the Schröder case decays exponentially. By contrast, in the Böttcher case, we shall show that may decay double-exponentially or super-exponentially, depending on the tail behavior of the step size . We consider three typical left-tail distributions of the step size and obtain the corresponding decay rates and rate functions. Finally, we also apply our techniques to study the small ball probability of the limit of the derivative martingale. The corresponding problem was also considered in [25] for a class of Mandelbrot's cascades, in the Böttcher case with bounded step size and in the Schröder case; see also [32] and [33] for more background. We state the theorems in the following subsection.

As usual, or means that for all . means that is bounded above and below by a positive and finite constant multiple of for all . or means .
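For definiteness, the standard conventions behind these symbols, as we use them here, are the following: for positive sequences $a_n, b_n$,

\[
a_n = O(b_n) \;\Longleftrightarrow\; a_n \le C\, b_n \ \text{for all } n, \qquad
a_n \asymp b_n \;\Longleftrightarrow\; c\, b_n \le a_n \le C\, b_n \ \text{for all } n,
\]

\[
a_n \sim b_n \;\Longleftrightarrow\; a_n / b_n \to 1, \qquad
a_n = o(b_n) \;\Longleftrightarrow\; a_n / b_n \to 0,
\]

where $0 < c \le C < \infty$ are constants.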

1.2 Main results

Suppose that we are in the supercritical case where the tree survives with positive probability. Formally, we assume that for the offspring law :

(1.1)

At the same time, suppose that for the step size ,

(1.2)

for some . We define the rate function of large deviation for the corresponding random walk with i.i.d. step sizes by
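In the standard Cramér setting, and with notation that is ours rather than the authors' ($\Lambda$ for the log-moment generating function and $I$ for the rate function), this is the Fenchel-Legendre transform

\[
\Lambda(\theta) := \log \mathbb{E}\big[e^{\theta X}\big], \qquad
I(x) := \sup_{\theta \in \mathbb{R}} \big\{\theta x - \Lambda(\theta)\big\}
\]

(restricting the supremum to $\theta \ge 0$ does not change the value for $x$ above the mean of $X$). For instance, if $X$ takes the values $\pm 1$ with probability $1/2$ each, then $I(x) = \frac{1+x}{2}\log(1+x) + \frac{1-x}{2}\log(1-x)$ for $x \in [-1,1]$.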

Then it is known from Theorem 3.1 in [8] that under (1.1) and (1.2),

where . Note that if , then since is continuous in , and

(1.3)

According to Theorem 4.1 in [8], it further follows from (1.3) that -a.s.,

which fails if and . Besides (1.1), (1.2) and (1.3), if we further suppose that

(1.4)

then it is shown in [1] and [27] that where

(1.5)

Define the so-called derivative martingale by

It is known from [9] and [2] that under assumptions (1.1), (1.2), (1.3) and (1.4), there exists a non-negative random variable such that

where a.s. Next, given (1.1), (1.2), (1.3) and (1.4), Aïdékon [2] proved the convergence in law of as follows. For any ,

(1.6)

where is a constant. In this work, we are going to study the asymptotic of for , as well as that of which is closely related to by (1.6). Let us introduce the minimal offspring for :

We first present the main results in the Böttcher case, where and .

Theorem 1.1 (Böttcher case, bounded step size).

Assume (1.1), (1.2) and . Suppose that for some . Then for ,

(1.7)

If , then (1.7) holds also for

Remark 1.1.

Note that the assumptions (1.1) and (1.2) do not imply the second logarithmic order of .

Theorem 1.2 (Bounded step size: moderate deviation).

Assume (1.1), (1.2), (1.3), (1.4) and . Suppose that for some . Then for any positive increasing sequence such that and ,

(1.8)

where because of (1.3).

Remark 1.2.

Hu [25] obtained this moderate deviation (1.8) for in a more general setting, with bounded step size and without assuming independence between offspring and motions. One can check that is consistent with that defined in (1.10) of [25].

Remark 1.3.

Suppose that all assumptions in Theorem 1.2 hold. Then by Theorem 1.3 in [25], we have

Theorem 1.3 (Böttcher case, Weibull left tail).

Assume (1.1), (1.2), (1.3), (1.4) and . Suppose as for some constant and . Then for any positive increasing sequence such that and ,

(1.9)

where for convenience, for . In particular, for any ,

(1.10)

Remark 1.4.

Note that if , assumption (1.2) cannot be satisfied and we are in another regime where grows faster than linearly in time; see [22].

The weak convergence (1.6) shows that and are closely related. So inspired by Theorem 1.3, one obtains the following result.
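Heuristically, and with constants and normalizations that are assumed here rather than taken from (1.6): if (1.6) takes the classical form

\[
\lim_{n\to\infty} \mathbb{P}\big(M_n \le m_n + x\big) \;=\; \mathbb{E}\Big[\exp\big(-C_\star\, D_\infty\, e^{-\theta^\ast x}\big)\Big], \qquad x \in \mathbb{R},
\]

with $m_n$ the centering sequence, $\theta^\ast$ the critical parameter and $C_\star$ a positive constant, then letting $x \to -\infty$ shows that lower deviations of $M_n$ below its typical level are governed by the small ball probability $\mathbb{P}(D_\infty \le \varepsilon)$ as $\varepsilon \downarrow 0$; conversely, sharp lower deviation estimates for $M_n$ can be transferred back into small ball estimates for $D_\infty$.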

Proposition 1.4 (Böttcher case, Weibull left tail).

Suppose that all assumptions in Theorem 1.3 hold. Then

(1.11)

Theorem 1.5 (Böttcher case, Gumbel left tail).

Assume (1.1), (1.2), (1.3), (1.4) and . Suppose as for some constant . Then for any positive increasing sequence such that and ,

(1.12)

In particular, for any ,

(1.13)

Again, inspired by Theorem 1.5 and the weak convergence (1.6), we have the following result.

Proposition 1.6 (Böttcher case, Gumbel left tail).

Suppose that all assumptions in Theorem 1.5 hold. Then

(1.14)

The next theorem concerns the Schröder case where . Let be the extinction probability and , the generating function of the offspring law. Let . Denote by for any real number .
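As a reminder of the Galton-Watson background (a standard fact, stated with the generic notation $f$ and $q$ above and under mild moment assumptions on the offspring law): in the Schröder case one has $f'(q) \in (0,1)$, and for every sufficiently large fixed $K$,

\[
\mathbb{P}\big(1 \le Z_n \le K\big) \;\asymp\; f'(q)^{\,n} \qquad (n \to \infty).
\]

This geometric cost of keeping the population small is the mechanism behind the exponential decay of the lower deviation probability in the Schröder case, in contrast with the double-exponential or super-exponential decay in the Böttcher case discussed above.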

Theorem 1.7 (Schröder case).

Assume (1.1), (1.2), (1.3), (1.4) and . Then for any positive sequence such that and that exists with , we have

(1.15)

where and

(1.16)

In particular, we have for any ,

(1.17)

Remark 1.5.

(1.17) was first obtained by Gantert and Höfelsauer in [23]. In fact, it is shown in [23] that for any ,

Then one can check that

Remark 1.6.

When , (1.15) was obtained by Hu in [25] in a more general framework. In fact, when restricted to our setting, conditions (1.5) and (1.6) in [25] are equivalent to saying that there exists a constant such that

Since , we have . So conditions (1.5) and (1.6) in [25] ensure that is exactly the of on ; i.e.,

Remark 1.7.

If all assumptions in Theorem 1.7 hold, then by Theorem 1.3 in [25], we have

General strategy:

Let us explain our main ideas here, especially for the Böttcher case. Intuitively, to get an unusually low maximum, we need to control both the size of the genealogical tree and the displacements of individuals. More precisely, we need that, at the very beginning, the genealogical tree is small and all individuals move to some atypically low position. So we take some intermediate time and suppose that the genealogical tree is -regular up to time and that all individuals at time are located below a certain “critical” position . Then the system continues with i.i.d. branching random walks started from positions below . By choosing and appropriately, we can expect the maximum at time to stay below with high probability.

Note that the time varies between the different cases. If the step size is bounded from below, . If the step size has a Weibull or Gumbel left tail, .
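The following schematic inequality summarizes the lower-bound mechanism; it is a sketch under assumed notation, with $b \ge 2$ the minimal offspring number, $p_b$ its probability, $t$ the intermediate time, and $\kappa > 0$ a free parameter such that $\mathbb{P}(X \le -\kappa) > 0$. On the event that the tree is $b$-regular up to generation $t$, that every displacement in the first $t$ generations is at most $-\kappa$, and that each of the $b^t$ subtrees rooted at generation $t$ has relative maximum at most $y + \kappa t$ after the remaining $n - t$ steps, one has $M_n \le y$; hence, by the branching property and the independence between offspring and motions,

\[
\mathbb{P}(M_n \le y) \;\ge\; p_b^{\,1 + b + \cdots + b^{t-1}}\;
\mathbb{P}(X \le -\kappa)^{\,b + b^2 + \cdots + b^{t}}\;
\mathbb{P}\big(M_{n-t} \le y + \kappa t\big)^{\,b^{t}}.
\]

Optimizing over $t$ and $\kappa$ is what produces the double-exponential or super-exponential rates, depending on how heavy the left tail of the step size is.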

Our arguments and techniques are also inspired by [14], where we studied the large deviations of the empirical distribution of a branching random walk. All these ideas also work for studying the small ball probability of .

The rest of this paper is organized as follows. We treat the cases with bounded step size in Section 2. Then, Section 3 proves Theorems 1.3 and 1.5, concerning the cases with unbounded step size. In Section 4, we study and prove Propositions 1.4 and 1.6. Finally, we prove Theorem 1.7 for the Schröder case in Section 5.

Let and denote positive constants whose values may change from line to line.

2 Böttcher case with step size bounded from below: Proofs of Theorems 1.1 and 1.2

In this section, we always suppose that and with . Assumption (1.2) yields that with . We are going to prove that for any ,

(2.1)

with . Next, for the second order of , there are several regimes. We assume (1.3) and (1.4) to get the classical one: with . In this regime, we are going to prove that for any positive sequence such that ,

(2.2)

The proofs of (2.1) and (2.2) basically follow the same ideas. But (2.1) needs to be treated in a more general regime, without second order estimates.

For later use, let us introduce the counting measures as follows: for any ,

For simplicity, we write for to represent the total population of the -th generation. It is clear that . For any , let

be the maximal relative position of the descendants of . Clearly, is distributed as .
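Spelling out the branching property behind this (with our assumed notation, and recalling that in this section $p_0 = 0$, so no subtree dies out): conditionally on the first $t$ generations, the relative maxima of the subtrees rooted at the generation-$t$ individuals are i.i.d. copies of $M_{n-t}$, and

\[
M_n \;=\; \max_{|u| = t}\Big( V(u) + M^{(u)}_{n-t}\Big),
\]

where $M^{(u)}_{n-t}$ denotes the maximal relative position, $n - t$ generations later, among the descendants of $u$. This is the identity used repeatedly below when applying the Markov property at the intermediate time.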

2.1 Proof of Theorem 1.1

In this section, we show that for any , (2.1) holds. We use to denote the intermediate time chosen for the lower bounds and for upper bounds.

2.1.1 Lower bound of Theorem 1.1

As , let with any sufficiently small such that . Notice that implies that for any . Observe that for some intermediate time , whose value will be determined later, if we let every individual before the -th generation make a displacement less than , then

where are i.i.d. copies of . By the Markov property at time , one gets that

(2.3)
(2.4)
(2.5)

Next, we shall estimate . The rest of this proof is divided into two subparts, depending on whether or not.

Subpart 1: the case . Note that we have now. Take so that . Thus,

Going back to (2.3), one sees that for some ,

(2.6)

It follows readily that for any ,

(2.7)

Letting yields what we need.

Subpart 2: the case . Now we have because is finite and continuous in . Moreover, for some . For any sufficiently small , one has

Recall that . Let and so that for all large enough. Therefore,

Here we apply the large deviation result obtained in [23]. More precisely, as the maximum of independent random walks stochastically dominates , one has

which yields that

Note that for any . Let . Then for all sufficiently large ,

Plugging this into (2.3) implies that

(2.8)

Thus we have

(2.9)

Since , letting (hence and ) gives

which implies the desired lower bound because is arbitrarily small.

2.1.2 Upper bound of Theorem 1.1

In this section, we show that

Note that for any , is supported on a.s. Moreover, . Observe that

(2.10)

It remains to estimate . Again, the proof will be divided into two subparts.

Subpart 1: the case . By taking so that , one has

(2.11)
(2.12)

where we use the fact that for some and all . In fact, we can construct a Galton-Watson tree with offspring distribution . Here since . Its survival probability is positive if . Even when , it is critical and its survival probability up to generation is larger than for some and all . Indeed, its survival up to generation implies that some individual at time has position . So, . We hence conclude from (2.1.2) and (2.11) that
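For the reader's convenience, the survival estimate invoked here is the classical one (a standard fact recalled with generic notation, not a statement from the displays above): if $(\tilde Z_n)$ is a critical Galton-Watson process with offspring variance $\sigma^2 \in (0,\infty)$, Kolmogorov's estimate gives

\[
\mathbb{P}\big(\tilde Z_n > 0\big) \;\sim\; \frac{2}{\sigma^2 n} \qquad (n \to \infty),
\]

so in particular the survival probability up to generation $n$ is bounded below by $c/n$ for some $c > 0$ and all large $n$, which yields the polynomial lower bound used in the argument.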

Subpart 2: the case . First recall a result from [23] (see Theorem 3.2), which says that

(2.13)

So for any sufficiently small such that , for any , let and so that . Then for all large enough,

(2.14)
(2.15)
(2.16)

where the second inequality follows from (2.13). Plugging (2.1.2) into (2.1.2) yields that

Again letting (hence and ) gives the desired upper bound.

If , then the arguments for the lower bound work well for and