Cutoff for the East process
Abstract.
The East process is a 1d kinetically constrained interacting particle system, introduced in the physics literature in the early 90's to model liquid-glass transitions. Spectral gap estimates of Aldous and Diaconis in 2002 imply that its mixing time on $L$ sites has order $L$. We complement that result and show cutoff with an $O(\sqrt{L})$ window.
The main ingredient is an analysis of the front of the process (its rightmost zero in the setup where zeros facilitate updates to their right). One expects the front to advance as a biased random walk, whose normal fluctuations would imply cutoff with an $O(\sqrt{L})$ window. The law of the process behind the front plays a crucial role: Blondel showed that it converges to an invariant measure $\nu$, on which very little is known. Here we obtain quantitative bounds on the speed of convergence to $\nu$, finding that it is exponentially fast. We then derive that the increments of the front behave as a stationary mixing sequence of random variables, and a Stein-method based argument of Bolthausen ('82) implies a CLT for the location of the front, yielding the cutoff result.
Finally, we supplement these results by a study of analogous kinetically constrained models on trees, again establishing cutoff, yet this time with an $O(1)$ window.
1. Introduction
The East process is a one-dimensional spin system that was introduced in the physics literature by Jäckle and Eisinger [JE91] in 1991 to model the behavior of cooled liquids near the glass transition point, specializing a class of models that goes back to [FH]. Each site in $\mathbb{Z}$ has a value in $\{0,1\}$ (vacant/occupied), and, denoting this configuration by $\omega$, the process attempts to update $\omega_x$ to $1$ at rate $p$ (a parameter) and to $0$ at rate $q=1-p$, only accepting the proposed update if $\omega_{x-1}=0$ (a "kinetic constraint").
It is the properties of the East process before and towards reaching equilibrium — it is reversible w.r.t. $\mu$, the product of Bernoulli($p$) variables — which are of interest, with the standard gauges for the speed of convergence to stationarity being the inverse spectral gap and the total-variation mixing time on a finite interval, where we fix a zero at the origin for ergodicity (postponing formal definitions to §2). That the spectral gap is uniformly bounded away from 0 for any $p\in(0,1)$ was first proved in a beautiful work of Aldous and Diaconis [AD02] in 2002. This implies that the mixing time on $n$ sites is of order $n$ for any fixed threshold $0<\epsilon<1$ for the total-variation distance from $\mu$.
For a configuration $\omega$ with a rightmost 0, call this rightmost 0 its front $X(\omega)$; key questions on the East process revolve around the law of the sites behind the front at time $t$, basic properties of which remain unknown. One can imagine that the front advances to the right as a biased walk, behind which equilibrium has been restored (its trail is mixed). Indeed, if one (incorrectly!) ignores dependencies between sites as well as the randomness in the position of the front, it is tempting to conclude that the law behind the front converges to $\mu$, since upon updating a site its marginal is forever set to Bernoulli($p$). Whence, the positive vs. negative increments to $X(t)$ would have rates $q$ (a 0-update at $X(t)+1$) vs. $pq$ (a 1-update at $X(t)$ with a 0 at its left), giving the front an asymptotic speed of $q-pq=q^2$.
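Under this (deliberately naive) independence assumption, the drift computation can be written out as follows; this is our own summary of the heuristic, not a formula from the analysis below:

```latex
v_{\text{naive}}
  \;=\; \underbrace{q}_{\substack{\text{front advances:}\\ 0\text{-update at } X(t)+1}}
  \;-\; \underbrace{p\,q}_{\substack{\text{front retreats: } 1\text{-update at } X(t),\\ \text{a } 0 \text{ at its left w.p. } q}}
  \;=\; q(1-p) \;=\; q^{2}.
```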
Of course, ignoring the irregularity near the front is problematic, since it is precisely the distribution of those spins that governs the speed of the front (hence mixing). Still, just as a biased random walk, one expects the front to move at a positive speed with normal fluctuations, whence its concentrated passage time through an interval of $n$ sites would imply total-variation cutoff — a sharp transition in mixing — within an $O(\sqrt{n})$ window.
To discuss the behavior behind the front, consider the set of configurations on the negative half-line with a fixed 0 at the origin, and let the configuration evolve via the East process constantly recentered (shifted by at most 1) to keep its front at the origin. Blondel [Blondel] showed (see Theorem 2.1) that this process converges to an invariant measure $\nu$, on which very little is known, and that $X(t)/t$ converges in probability to a positive limiting value $v$ as $t\to\infty$ (an asymptotic velocity) given by the formula
(We note that by the invariance of the measure and the fact that .)
The East process of course entails the joint distribution of the front and of the law behind it; thus, it is crucial to understand the dependencies between these, as well as the rate at which the process seen from the front converges to $\nu$, as a prerequisite for results on the fluctuations of $X(t)$.
Our first result confirms the biased random walk intuition for the front $X(t)$ of the East process, establishing a CLT for its fluctuations around $vt$ (illustrated in Fig. 1).
Theorem 1.
There exists a nonnegative constant $\sigma$ such that for all ,
(1.1)  
(1.2)  
(1.3) 
Moreover, $X(t)$ obeys a central limit theorem:
(1.4) 
A key ingredient for the proof is a quantitative bound on the rate of convergence to $\nu$, showing that it is exponentially fast (Theorem 3.1). We then show that the increments
(1.5) 
behave (after an initial burn-in time) as a stationary sequence of weakly dependent random variables (Corollary 3.2), whence one can apply an ingenious Stein's-method based argument of Bolthausen [Bolthausen] from 1982 to derive the CLT.
Moving our attention to finite volume, recall that the cutoff phenomenon (coined by Aldous and Diaconis [AD86]; see [Aldous, DiSh] as well as [Diaconis] and the references therein) describes a sharp transition in the convergence of a finite Markov chain to stationarity: over a negligible period of time (the cutoff window) the distance from equilibrium drops from near 1 to near 0. Formally, a sequence of chains indexed by $n$ has cutoff around $t_n$ with window $w_n=o(t_n)$ if $t^{(n)}_{\mathrm{mix}}(\epsilon)=t_n+O(w_n)$ for any fixed $0<\epsilon<1$.
It is well-known (see, e.g., [DiFi]*Example 4.46) that a biased random walk with speed $v$ on an interval of $n$ sites has cutoff at $n/v$ with an $O(\sqrt{n})$ window due to normal fluctuations. Recalling the heuristics that depicts the front of the East process as a biased walk flushing an equilibrated law in its trail, one expects precisely the same cutoff behavior. Indeed, the CLT in Theorem 1 supports a result exactly of this form.
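The passage-time concentration behind this classical fact is easy to see numerically. The following Monte Carlo sketch is our own illustration (the rates 0.6/0.4 and all names are arbitrary choices, not from the text): the mean passage time of a biased walk through $n$ sites is close to $n/v$, with fluctuations of order $\sqrt{n}$.

```python
import random
import statistics

def passage_time(n, right=0.6, left=0.4, rng=random):
    """First time a continuous-time walk on Z, started at 0 and jumping
    right at rate `right` and left at rate `left` (right > left),
    reaches level n."""
    pos, t, total = 0, 0.0, right + left
    while pos < n:
        t += rng.expovariate(total)                   # waiting time to next jump
        pos += 1 if rng.random() < right / total else -1
    return t

rng = random.Random(0)
n, v = 400, 0.6 - 0.4                                 # drift v = right - left
samples = [passage_time(n, rng=rng) for _ in range(200)]
# Mean passage time concentrates around n/v; fluctuations are O(sqrt(n)).
print(statistics.mean(samples), statistics.stdev(samples))
```

With these parameters the empirical mean is close to $n/v = 2000$ while the standard deviation is an order of magnitude smaller, which is exactly the separation of scales that produces a cutoff window of order $\sqrt{n}$.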
Theorem 2.
The East process on $n$ sites with parameter $p\in(0,1)$ exhibits cutoff at $n/v$ with an $O(\sqrt{n})$ window: for any fixed $0<\epsilon<1$ and large enough $n$,
where $\Phi$ is the c.d.f. of the standard normal distribution and the implicit constant in the $O(\cdot)$ depends only on $p$.
While these new results rely on a refined understanding of the convergence of the process behind the front to its invariant law $\nu$ (shown in Fig. 2), various basic questions on $\nu$ remain unanswered. For instance, are the single-site marginals of $\nu$ monotone in the distance from the front? What are the correlations between adjacent spins? Can one explicitly obtain $\nu$, thus yielding an expression for the velocity $v$? For the latter, we remark that the well-known upper bound on the mixing time in terms of the spectral gap (Eq. (2.2)), together with Theorem 2, gives the lower bound (cf. also [CFM])
Finally, we accompany the concentration for $X(t)$ and cutoff for the East process by analogous results — including cutoff with an $O(1)$ window — on the corresponding kinetically constrained models on trees, where a site is allowed to update (i.e., to be reset into a Bernoulli($p$) variable) given a certain configuration of its children (e.g., all-zeros/at least one zero/etc.). These results are detailed in §5 (Theorems 5.1–5.2).
Remark.
The concentration and cutoff results for the kinetically constrained models on trees (Theorems 5.1–5.2) do not apply to every scale but rather to infinitely many scales, as is sometimes the case in the context of tightness for maxima of branching random walks or discrete Gaussian Free Fields; see, e.g., [BDZ, DH91] as well as the beautiful method in [BZ1, BZ2] to overcome this hurdle for certain branching random walks. Indeed, similarly to the latter, one of the models here gives rise to a distributional recursion involving the maximum of i.i.d. copies of the random variable of interest, plus a nonnegative increment. Unfortunately, unlike branching random walks, here this increment is not independent of those two copies, and extending our analysis to every scale appears to be quite challenging.
2. Preliminaries and tools for the East process
2.1. Setup and notation
Let $\Omega=\{0,1\}^{\mathbb{Z}}$ and let $\Omega_F$ consist of those configurations $\omega$ for which the variable $X(\omega):=\sup\{x:\omega_x=0\}$ is finite. In the sequel, for any $\omega\in\Omega_F$ we will often refer to $X(\omega)$ as the front of $\omega$. Given $\omega\in\Omega$ and $\Lambda\subset\mathbb{Z}$ we will write $\omega_\Lambda$ for the restriction of $\omega$ to $\Lambda$.

The East process. For any $\omega\in\Omega$ and $x\in\mathbb{Z}$ let $c_x(\omega)$ denote the indicator of the event $\{\omega_{x-1}=0\}$. We will consider the Markov process on $\Omega$ with generator $\mathcal{L}$ acting on local functions $f$ (i.e. depending on finitely many coordinates) given by
$$\mathcal{L}f(\omega)=\sum_{x\in\mathbb{Z}}c_x(\omega)\left[p\,f(\omega^{x,1})+q\,f(\omega^{x,0})-f(\omega)\right],$$
where $\omega^{x,1}$ and $\omega^{x,0}$ are the configurations in $\Omega$ obtained from $\omega$ by fixing equal to $1$ or to $0$, respectively, the coordinate at $x$. In the sequel the above process will be referred to as the East process on $\mathbb{Z}$ and we will write $\mathbb{P}_\omega$ for its law when the starting configuration is $\omega$. Average and variance w.r.t. $\mathbb{P}_\omega$ will be denoted by $\mathbb{E}_\omega$ and $\mathrm{Var}_\omega$ respectively. Similarly we will write $\mathbb{P}_\omega^{(t)}$ and $\mathbb{E}_\omega^{(t)}$ for the law and average at a fixed time $t$. If the starting configuration is distributed according to an initial distribution $\pi$ we will simply write $\mathbb{P}_\pi$ for $\int \mathbb{P}_\omega\,d\pi(\omega)$ and similarly for $\mathbb{E}_\pi$.
It is easily seen that the East process has the following graphical representation. To each $x\in\mathbb{Z}$ we associate a rate-1 Poisson process and, independently, a family of independent Bernoulli($p$) random variables; the occurrences of the Poisson process associated to $x$ will be denoted by $(t_{x,k})_{k\ge 1}$. We assume independence as $x$ varies in $\mathbb{Z}$. That fixes the probability space. Notice that almost surely all the occurrences are different. On the above probability space we construct a Markov process according to the following rules. At each time $t_{x,k}$ the site $x$ queries the state of its own constraint $c_x$. If and only if the constraint is satisfied ($c_x=1$) then $t_{x,k}$ is called a legal ring and the configuration resets its value at site $x$ to the value of the corresponding Bernoulli variable. Using the graphical construction it is simple to see that if $\omega\in\Omega_F$ then the process started from $\omega$ has a front at all later times.
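The rules above can be sketched in code. The following is a minimal illustration of ours (all names are hypothetical), simulating the East dynamics on finitely many sites via pre-sampled Poisson rings, with site 0 left unconstrained as in the half-line version discussed below.

```python
import random

def east_graphical(initial, p, t_max, rng):
    """East process on sites 0..n-1 via the graphical construction: each site
    carries an independent rate-1 Poisson clock; a ring at x is *legal* iff
    the constraint holds (the spin at x-1 is 0), in which case the spin at x
    is reset to an independent Bernoulli(p) coin. Site 0 is unconstrained,
    mimicking a frozen zero at -1."""
    omega = list(initial)
    rings = []
    for x in range(len(omega)):                 # pre-sample every clock up to t_max
        t = rng.expovariate(1.0)
        while t <= t_max:
            rings.append((t, x))
            t += rng.expovariate(1.0)
    for t, x in sorted(rings):                  # a.s. no ties; process in time order
        if x == 0 or omega[x - 1] == 0:         # legal ring
            omega[x] = 1 if rng.random() < p else 0
    return omega
```

One small liberty: the Bernoulli coins are drawn lazily at legal rings rather than attached to all rings in advance; since illegal rings never consult their coin, the resulting law is the same.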

The half-line East process. Consider now and let consist of those configurations with a leftmost zero at . Clearly, for any , because for any . We will refer to the corresponding process in as the East process on the half-line . Notice that in this case the variable at will always be unconstrained because for all . The corresponding generator will be denoted by .

The finite volume East process. Finally, if is a discrete interval of the form , the projection on of the half-line East process on is a continuous-time Markov chain because each vertex only queries the state of the spin to its left. In the sequel the above chain will be referred to as the East process in . Let denote the corresponding generator.
The main properties of the above processes can be summarized as follows (cf. [Eastsurvey] for a survey). They are all ergodic and reversible w.r.t. the product Bernoulli($p$) measure (on the corresponding state space). Their generators are self-adjoint operators on the corresponding $L^2$ space satisfying the following natural ordering:
Remark.
By translation invariance the value of does not depend on and, similarly, depends only on the cardinality of .
As mentioned before, the fact that (but only for ) was first proved by Aldous and Diaconis [AD02], where it was further shown that
(2.1) 
the order of the exponent in the lower bound matching nonrigorous predictions in the physics literature. The positivity of was rederived and extended to all in [CMRT] by different methods, and the correct asymptotics of the exponent as — matching the upper bound in (2.1) — was very recently established in [CFM]. It is easy to check (e.g., from [CMRT]) that , a fact that will be used later on.
For the East process in finite volume it is natural to consider its mixing times $t_{\mathrm{mix}}(\epsilon)$, $\epsilon\in(0,1)$, defined by
where denotes totalvariation distance. It is a standard result for reversible Markov chains (see e.g. [AF, LPW, Saloff]) that
(2.2) 
where . In particular the mixing time grows at most linearly in the length of the interval. A lower bound which also grows linearly in the length of the interval follows easily from the finite speed of information propagation: if we run the East model on $n$ sites starting from the configuration identically equal to one except for a zero at the origin, then, in order to create zeros near the right boundary, a sequence of order $n$ of successive rings of the Poisson clocks at consecutive sites must have occurred. That happens with probability bounded away from zero only if we allow a time which is linear in $n$ (see §2.4 and in particular Lemma 2.6).
2.2. The process behind the front
Given two probability measures $\pi_1,\pi_2$ on $\Omega$ and $\Lambda\subset\mathbb{Z}$, we will write $\|\pi_1-\pi_2\|_\Lambda$ to denote the total variation distance between the marginals of $\pi_1$ and $\pi_2$ on $\Lambda$.
When the process starts from an initial configuration with a front, it is convenient to define a new process as the process as seen from the front [Blondel]. Such a process is obtained from the original one by a random shift which forces the front to be always at the origin. More precisely we define the Markov process with generator given by
where
That is, the generator incorporates the moves of the East process behind the front plus shifts corresponding to whenever the front itself jumps forward/backward.
Remark.
The same graphical construction that was given for the East process applies to the process : this is clear for the East part of the generator ; for the shift part , simply apply a positive shift when there is a ring at the origin and the corresponding Bernoulli variable is one. If the Bernoulli variable is zero, operate a negative shift .
With this notation, the main result of Blondel [Blondel] can be summarized as follows.
Theorem 2.1 ([Blondel]).
The front of the East process, , and the process as seen from the front, , satisfy the following:

There exists a unique invariant measure $\nu$ for the process . Moreover, decreases exponentially fast in .

Let and let . Then and for any ,
Thus, if the East process has a front at time zero then it will have a front at any later time. The latter progresses in time with an asymptotically constant speed $v$.
2.3. Local relaxation to equilibrium
In this section we review the main technical results on the local convergence to the stationary measure $\mu$ for the (infinite volume) East process. The key message here is that each vacancy in the starting configuration, within a time lag $t$, induces the law $\mu$ in an interval in front of its position of length proportional to $t$. That explains why the distance between the invariant measure $\nu$ and $\mu$ deteriorates when we approach the front from behind.
Definition 2.2.
Given a configuration and an interval we say that satisfies the Strong Spacing Condition (SSC) in if the largest subinterval of where is identically equal to one has length at most . Similarly, given , we will say that satisfies the Weak Spacing Condition (WSC) in if the largest subinterval of where is identically equal to one has length at most .
For brevity, we will omit the dependence on the parameters in the SSC and WSC notation when these are made clear from the context.
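In code, checking a spacing condition amounts to computing the longest run of ones. A small helper of ours (the specific SSC/WSC thresholds of Definition 2.2 are abstracted into a generic `max_gap` parameter):

```python
def longest_run_of_ones(config):
    """Length of the longest subinterval on which config is identically 1."""
    best = run = 0
    for spin in config:
        run = run + 1 if spin == 1 else 0
        best = max(best, run)
    return best

def satisfies_spacing(config, max_gap):
    """Generic spacing condition: no run of 1's longer than max_gap."""
    return longest_run_of_ones(config) <= max_gap
```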
Proposition 2.3.
There exist universal positive constants independent of such that the following holds. Let and let be such that . Further let be largest between the maximal spacing between two consecutive zeros of in and the distance of the last zero of from the vertex . Then
To prove this proposition, we need the following lemma.
Lemma 2.4.
There exist universal positive constants independent of such that the following holds. Fix with , let and let with . Let also denote the new function obtained by averaging w.r.t. the marginal of over the spin at . Then,
(2.3) 
Remark.
If we replaced the r.h.s. of (2.3) with , then the statement would coincide with that in [Blondel]*Proposition 4.3. Notice that as , the term does not blow up— unlike —and as remarked below (2.1), stays bounded away from . Hence, as , the time after which the r.h.s. in (2.3) becomes small is bounded from above by for some universal not depending on . This fact will be crucially used in the proofs of some of the theorems to follow.
Proof of Lemma 2.4.
As mentioned in the remark, using [Blondel]*Proposition 4.3 it suffices to assume that . Fix as in the lemma and let be the set of all configurations which coincide with on the half-line . The special configuration in which is identically equal to one in the interval will be denoted by . Observe that, using reversibility together with the fact that the updates in do not check the spins to the right of the origin,
(2.4) 
Using the graphical construction as a grand coupling for the processes with initial condition in , it is easy to verify that, at the hitting time of the set for the process started from , the processes starting from all possible initial conditions in have coupled. Let be distributed according to . Then, using the grand coupling,
The first equality follows by adding and subtracting from the l.h.s. and then using (2.3). The rest of the inequalities are immediate from the above discussion. In order to bound the above probability, we observe that the front, initially at , can be coupled to an asymmetric random walk, with as jump rate to the right (resp. left), in such a way that for all . Since we have assumed that , by standard hitting time estimates for biased random walks there exist universal constants such that, for , the above probability is smaller than . ∎
Proof of Proposition 2.3.
Let be such that . Then
where we applied the above lemma to the shifted configuration in which the origin coincides with the rightmost zero in of . We now observe that the new function depends only on the first coordinates of and that . Thus we can iterate the above bound times to get that
Corollary 2.5.
Fix , and let . Then
(2.5)  
(2.6)  
(2.7) 
Proof.
By construction, for any . Thus the first statement follows at once from Proposition 2.3. The other two statements follow from the fact that
and
2.4. Finite speed of information propagation
As the East process is an interacting particle system whose rates are bounded by one, it is well known that information can only travel through the system at finite speed. A quantitative statement of this general fact goes as follows.
Lemma 2.6.
For and , define the "linking event" as the event that there exists an ordered sequence or of rings of the Poisson clocks associated to the corresponding sites in . Then there exists a constant such that, for all ,
Proof.
The probability of is equal to the probability that a Poisson process of intensity one has at least instances within time ; the conclusion then follows from a standard Poisson tail (Chernoff) bound. ∎
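The Poisson tail used here can be made quantitative via the standard Chernoff bound $\mathbb{P}(\mathrm{Po}(t)\ge n)\le e^{-t}(et/n)^n$ for $n>t$ (obtained by optimizing $\exp(t(e^a-1)-an)$ at $a=\log(n/t)$). A small numerical check, our own illustration:

```python
import math

def poisson_tail(t, n):
    """P(N >= n) for N ~ Poisson(t), computed as 1 - P(N <= n-1)."""
    return 1.0 - sum(math.exp(-t) * t**k / math.factorial(k) for k in range(n))

def chernoff_bound(t, n):
    """Chernoff bound exp(-t) * (e*t/n)**n on P(Poisson(t) >= n), valid for n > t."""
    return math.exp(-t) * (math.e * t / n) ** n
```

Once the number of required rings exceeds a constant multiple of the elapsed time, the linking probability decays exponentially, which is the content of the lemma.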
Remark 2.7.
An important consequence of the above lemma is the following fact. Let and let be the σ-algebra generated by all the rings of the Poisson clocks and all the coin tosses up to time in the graphical construction of the East process. Fix and let be two events depending on and respectively. Then
This is because: (i) on the event the occurrence of the event does not depend anymore on the Poisson rings and coin tosses to the left of ; (ii) the occurrence of the event depends only on the Poisson rings and coin tosses to the left of because of the oriented character of the East process.
The finite speed of information propagation, together with the results of [AD02], implies the following rough bound on the position of the front for the East process started from (also see, e.g., [Blondel]*Lemma 3.2).
Lemma 2.8.
There exist constants and such that
Remark 2.9.
When one can obtain the above statement with and uniformly bounded away from by using our Proposition 2.3 instead of [Blondel]*Proposition 4.3 in the proof of [Blondel]*Lemma 3.2.
The second consequence of the finite speed of information propagation is a kind of mixing result behind the front for the process started from . We first need some additional notation.
Definition 2.10.
For any , we define the shifted configuration by
Proposition 2.11.
Let and let . Assume . Then for any and any the following holds:
To see roughly what the proposition says, we first assume that the front at time is at . Then the above result says that at a later time any event supported on is almost independent of the location of the front.
Proof.
Recall the definition of the event from Lemma 2.6 and let
We now write
We first note that given for any ,
and hence
Thus, we may assume that . Now
because under the assumption that , the two events are functions of an independent set of variables in the graphical construction (cf. Remark 2.7). By Lemma 2.6 we know that and the proof is complete. ∎
3. The law behind the front of the East process
Our main result in this section is a quantitative estimate on the rate of convergence, as $t\to\infty$, of the law of the process seen from the front to its invariant measure $\nu$. Consider the process seen from the front (recalling §2.2) and let be its law at time when the starting configuration is .
Theorem 3.1.
For any there exist and such that
Moreover, and can be chosen uniformly as .
A corollary of this result — which will be key in the proof of Theorem 1 — is that, for any , the increments in the position of the front (the variables below) behave asymptotically as a stationary sequence of weakly dependent random variables with exponential moments.
Corollary 3.2.
To prove Theorem 3.1 we will require a technical result, Theorem 3.3 below, which can informally be summarized as follows:

Starting from , at any fixed large time , with high probability the configuration satisfies WSC apart from an interval behind the front of length proportional to .

If the above property is true at time , then at a later time the law of the process will be very close to apart from a small interval behind the front where the strong spacing property will occur with high probability.
Formally, fix a constant to be chosen later on and . Let , where appears in the WSC, and let . Let denote the set of configurations which fail to satisfy SSC in the interval and let be those configurations which fail to satisfy WSC in the interval .
Theorem 3.3.
It is possible to choose small enough and large enough depending only on in such a way that for all large enough the following holds:
(3.7)  
(3.8)  
(3.9) 
Moreover, stays bounded as
3.1. Nonequilibrium properties of the law behind the front: Proof of Theorem 3.3
We begin by proving (3.7). Bounding from above is equivalent to bounding from above, where denotes the set of configurations which do not satisfy the spacing condition in .
Using Lemma 2.8, with probability greater than we can assume that . Next we observe that, for any , the events and imply that there exists with the following properties:

;

The hitting time is smaller than ;

is identically equal to one in the interval ;

The linking event defined in Lemma