# Dimension of Gibbs measures with infinite entropy

Felipe Pérez, School of Mathematics, University of Bristol, University Walk, Clifton, Bristol, BS8 1TW, UK
July 30, 2019
###### Abstract.

We study the Hausdorff dimension of Gibbs measures with infinite entropy for maps of the interval with countably many branches. We show that under simple conditions, such measures are symbolic-exact dimensional, and we provide an almost sure value for the symbolic dimension. We also show that the lower local dimension is almost surely equal to zero, while the upper local dimension is almost surely equal to the symbolic dimension. In particular, we prove that a large class of Gibbs measures with infinite entropy for the Gauss map have Hausdorff dimension zero and packing dimension equal to the decay ratio, and hence such measures are not exact dimensional.

## 1. Introduction

In this paper we study the dimension of measures invariant under a certain class of maps of the unit interval $[0,1]$: Expanding Markov Renyi (EMR) maps. These maps admit representations by means of symbolic dynamics and satisfy smoothness properties that allow us to use ergodic theoretic methods to study their geometric properties. Given an ergodic $T$-invariant probability measure $\mu$, we are interested in the pointwise behavior of the local dimension

$d_\mu(x)=\lim_{r\to 0}\frac{\log\mu(B(x,r))}{\log r}.$

Knowledge of the almost sure behavior of the local dimension yields information about the Hausdorff and the packing dimension of the measure. There are two dynamical quantities which are particularly relevant when studying the local dimension of such measures: the metric entropy $h_\mu$ (or simply the entropy) and the Lyapunov exponent $\lambda_\mu$ of $\mu$. The connection between the entropy, the Lyapunov exponent, and the local dimension is well understood when the entropy is finite. Our goal is to investigate the case when both of these quantities are infinite.

Formulae relating the dynamical invariants and the local dimension have been extensively studied over the last few decades in the finite entropy case. For Bernoulli measures invariant under the Gauss map, Kinney and Pitcher proved in [KP66] that if the measure $\mu_p$ is defined by a probability vector $p=(p_1,p_2,\dots)$, the Hausdorff dimension of $\mu_p$ can be computed with the formula

$\dim_H\mu_p=\frac{-\sum_{n=1}^{\infty}p_n\log p_n}{2\int_0^1|\log x|\,d\mu_p(x)}$

provided that $-\sum_{n=1}^{\infty}p_n\log p_n<\infty$. In [LM85] the authors proved that for a map $T$ such that $T$ and $\log|T'|$ are piecewise monotonic, with positive Lyapunov exponent, if $\mu$ is an invariant ergodic probability measure, then we have that

$\lim_{r\to0}\frac{\log\mu(B(x,r))}{\log r}=\frac{h_\mu}{\lambda_\mu}.$

In particular, $\dim_H\mu=h_\mu/\lambda_\mu$. Other versions of the formula were proved by Young and by Hofbauer and Raith in [You82] and [HR92], among others. In all of these examples, it is assumed that the entropy is finite. In the context of countable Markov systems, Mauldin and Urbański proved the following theorem:

###### Theorem 1.1 (Volume Lemma, [Mu03]).

Let $(\Sigma,\sigma)$ be a countable Markov shift coded by the shift in countably many symbols. Suppose that $\mu$ is a Borel shift-invariant ergodic probability measure on $\Sigma$ such that at least one of the numbers $h_\mu$ or $\lambda_\mu$ is finite, where $h_\mu$ is the entropy of $\mu$ with respect to the natural partition in cylinders of $\Sigma$. Then

$\dim_H(\mu\circ\pi^{-1})=\frac{h_\mu}{\lambda_\mu},$

where $\pi$ is the coding map.

The coding map can be interpreted as a means to go from the symbolic representation of the dynamics to the geometric space. When the local dimension exists and is constant almost everywhere, we say that the measure is exact dimensional.
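The formula $\dim=h_\mu/\lambda_\mu$ can be checked numerically on a toy model (our construction for illustration, not taken from the references): a full-branch piecewise linear map with three branches of lengths $l_i$ carrying a Bernoulli measure with weights $p_i$. Along a typical coding sequence, the ratio $\log\mu(I_n(x))/\log|I_n(x)|$ approaches $h_\mu/\lambda_\mu$:

```python
import math, random

# Toy model: full-branch piecewise linear map with three branches of
# lengths l_i carrying a Bernoulli measure with weights p_i
# (hypothetical values chosen only for illustration).
p = [0.5, 0.3, 0.2]
l = [1 / 3, 1 / 3, 1 / 3]

h = -sum(pi * math.log(pi) for pi in p)                # entropy
lam = -sum(pi * math.log(li) for pi, li in zip(p, l))  # Lyapunov exponent

random.seed(0)
digits = random.choices(range(3), weights=p, k=5000)   # a typical coding
log_mu = sum(math.log(p[d]) for d in digits)    # log mu(I_n(x))
log_len = sum(math.log(l[d]) for d in digits)   # log |I_n(x)|

print(h / lam, log_mu / log_len)  # the two numbers nearly agree
```

Both cylinder measure and cylinder length here shrink at exponential rates $h_\mu$ and $\lambda_\mu$; it is exactly this exponential shrinking that fails in the infinite entropy setting studied below.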

The case $\lambda_\mu=0$ was studied by Ledrappier and Misiurewicz in [LM85], wherein they constructed a map of the interval and a non-atomic ergodic invariant measure which has zero Lyapunov exponent and is such that the local dimension does not exist almost everywhere. More precisely, they show that the lower and upper local dimensions are not equal:

$\underline{d}_\mu(x)=\liminf_{r\to0}\frac{\log\mu(B(x,r))}{\log r}<\limsup_{r\to0}\frac{\log\mu(B(x,r))}{\log r}=\overline{d}_\mu(x)$

almost everywhere. For this construction, the authors consider a class of unimodal maps (Feigenbaum’s maps).

We investigate the Hausdorff dimension of invariant ergodic measures for piecewise expanding maps of the interval with countably many branches. In particular, we focus on maps exhibiting properties similar to those of the Gauss map, and on measures with infinite entropy and infinite Lyapunov exponent. Our main result is (see the next section for the definitions):

###### Theorem.

Let $T$ be a Gauss-like map and $\mu$ be an infinite entropy Gibbs measure satisfying Assumption 1 and such that the decay ratio $s$ exists. Then $\underline{d}_\mu(x)=0$ and $\overline{d}_\mu(x)=s$ for $\mu$-almost every $x$.

We can also compute the almost sure value of the symbolic dimension. The Gibbs assumption on the measure implies that a certain sequence of observables can be seen as a non-integrable stationary ergodic process, and allows us to use tools of infinite ergodic theory developed by Aaronson. In particular, the pointwise behavior of trimmed sums plays a fundamental role in our arguments. We also prove that the packing dimension of such measures is equal to the decay ratio, and conclude that such systems are not exact dimensional. We remark that the methods used in the context of finite entropy fail, as they rely on the fact that the measure and the diameter of the iterates of the natural Markov partition decrease at exponential rates given by $h_\mu$ and $\lambda_\mu$ respectively, enabling the use of coverings by balls at different scales. To tackle this problem, we make use of more refined coverings by balls, which are capable of detecting the asymptotic interaction between the Gibbs measure and the Lebesgue measure.

The study of the Hausdorff dimension of sets whose points have infinite Lyapunov exponent has already been considered: see for instance [FLM10], where the authors compute the Hausdorff dimension of sets with prescribed digits in their continued fraction expansion, or [FSU14], where the authors construct a measure invariant under the Gauss map which gives full measure to the Liouville numbers. Since the Liouville numbers form a zero dimensional set, such a measure is also zero dimensional. Our result shows that this is the case for a large class of measures.

The dimension of Bernoulli measures for the Gauss map was studied by Kifer, Peres and Weiss in [KPW01], where they show that there is a universal constant $\epsilon_0>0$ so that

$\dim_H(\mu_p\circ\pi^{-1})\le 1-\epsilon_0$

for every Bernoulli measure $\mu_p$ on the symbolic space coding the Gauss map, where $\pi$ is the coding map. This inequality holds even when the entropy of the measure is infinite. They also establish a bound for the Hausdorff dimension of infinite entropy Bernoulli measures. Their method relies on showing that the dimension of the sets of points for which the frequency of a sequence of digits in their continued fraction expansion differs from the expected value by a certain threshold is uniformly (with respect to the sequence of digits) bounded away from 1, together with a bound on the dimension of points that lie in unusually short cylinders. This situation has been recently studied by Jurga and Baker (see [Jur18] and [BJ18]) using different methods. Concretely, in [Jur18] the author uses ideas from Hilbert–Birkhoff cone theory and extracts information about the dynamics through the transfer operator. On the other hand, in [BJ18] the authors construct a Bernoulli measure attaining the supremum of the Hausdorff dimensions, where the supremum is taken over all Bernoulli measures. This, in conjunction with the Variational Principle (see [Wal82]), yields their result.

The paper is structured as follows. In section 2 we introduce the notation used throughout the paper as well as the main objects of study, and we state the results of the paper. In section 3 we compute the symbolic dimension and characterize it in terms of the Markov partition. In section 4 we study the consequences of the decay conditions at the level of the asymptotic rate of contraction of the cylinders. In sections 5 and 6 we prove the results for the Hausdorff and the packing dimension respectively. We finish the article by stating some questions of interest that could not be answered with the methods used in this paper.

## 2. Notation and statement of main results

### 2.1. The class of maps

We start by introducing the EMR (Expanding–Markov–Renyi) maps of the interval.

###### Definition 2.1.

We say that a map $T$ of the interval is an EMR map if there is a countable collection of closed intervals $\{I_n\}_{n\in\mathbb{N}}$ (with pairwise disjoint interiors) such that:

1. The map $T$ is $C^2$ on the interior of each $I_n$,

2. Some power of $T$ is uniformly expanding, i.e., there is a positive integer $N$ and a constant $\xi>1$ such that $|(T^N)'(x)|\ge\xi$ for all $x$,

3. The map is Markov and can be coded by a full shift (see next subsection),

4. The map $T$ satisfies Renyi's condition: there is a constant $E>0$ such that

$\sup_{n\in\mathbb{N}}\ \sup_{x,y,z\in I_n}\frac{|T''(x)|}{|T'(y)|\,|T'(z)|}\le E.$

This class of maps was first introduced in [PW99] in the context of the multifractal analysis of the Lyapunov exponent for the Gauss map. Renyi's condition provides good estimates for the Lebesgue measure of the cylinders associated to the Markov structure of the map (see the next subsection). For simplicity, we will assume that the maps are orientation preserving (the orientation reversing case only differs in the relative position of the cylinders). The set of branches must accumulate at at least one point, and we assume that it accumulates at exactly one point: we also assume that the branches accumulate on the left endpoint of $[0,1]$ (the case when the branches accumulate on the right endpoint is analogous). Re-indexing if necessary, we can assume that $I_{n+1}$ lies to the left of $I_n$ for all $n$. Let $r_n=|I_n|$.

###### Definition 2.2.

We say that an EMR map is a Gauss-like map if it satisfies the following conditions:

1. for every ,

2. ,

3. ,

4. for some constants ,

5. $r_n$ decays polynomially as $n$ goes to infinity (see Definition 3.7).

We want to keep in mind piecewise linear maps as the main example, since for this class calculations simplify. We will also keep in mind the example of the Gauss map.
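As a concrete sanity check (an illustration we add here, not part of the paper's arguments), the branches of the Gauss map $T(x)=1/x-\lfloor 1/x\rfloor$ and the polynomial decay of their lengths can be computed directly:

```python
from fractions import Fraction

# Branches of the Gauss map T(x) = 1/x - floor(1/x): I_n = [1/(n+1), 1/n].
def branch_length(n):
    return Fraction(1, n) - Fraction(1, n + 1)  # exactly 1/(n(n+1))

# The lengths decay polynomially: n^2 * r_n -> 1.
for n in (1, 10, 100, 1000):
    print(n, float(n**2 * branch_length(n)))
```

So $r_n\asymp n^{-2}$ for the Gauss map, which is the model case of the polynomial decay required in condition 5 above.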

### 2.2. Markov structure and symbolic coding

We now describe the Markov structure of the maps considered. Given a finite sequence of natural numbers $(a_1,\dots,a_n)$, the $n$-th level cylinder associated to it is the set $I(a_1,\dots,a_n)=I_{a_1}\cap T^{-1}I_{a_2}\cap\dots\cap T^{-(n-1)}I_{a_n}$. Given a point $x$ all of whose iterates lie in $\bigcup_n I_n$, there exists a unique sequence $(a_n)$ such that $T^{n-1}(x)\in I_{a_n}$ for every $n$. We denote this sequence by $(a_n(x))$, or by $(a_n)$ when $x$ is clear from the context. We also denote $I_n(x)=I(a_1(x),\dots,a_n(x))$ and say that $x$ is coded by the sequence $(a_n(x))$.

Let $\Sigma=\mathbb{N}^{\mathbb{N}}$ and let $\sigma:\Sigma\to\Sigma$ be the full shift over $\mathbb{N}$. Then the cylinders in the symbolic space are defined by

$C(a_1,a_2,\dots,a_n)=\{(x_n)\in\Sigma\mid x_j=a_j\text{ for }j=1,\dots,n\}.$

We endow the space $\Sigma$ with the topology generated by the cylinders defined above. Then the map $\pi:\Sigma\to[0,1]$ given by $\pi((a_n))=\bigcap_{n\ge1}I(a_1,\dots,a_n)$ is a continuous bijection onto its image.

Given $x$ with coding sequence $(a_n)$ and $n\in\mathbb{N}$, denote by $I_n^-(x)$ (resp. $I_n^+(x)$) the $n$-th level cylinder immediately to the left (resp. right) of $I_n(x)$. If there is no risk of confusion, we omit the dependence on $x$.

Renyi's condition introduced in the previous subsection implies that the length of each cylinder is comparable to the derivative of the iterates of the map at any point of the cylinder. More precisely, there is a constant $C>1$ such that

$0<C^{-1}\le |I(a_1,\dots,a_n)|\,\big|(T^n)'(x)\big|\le C$

for every finite sequence $(a_1,\dots,a_n)$ and every $x\in I(a_1,\dots,a_n)$.

### 2.3. The class of measures

We start by giving the usual definition of Gibbs measures:

###### Definition 2.3.

Let $\mu$ be an invariant measure with respect to $T$. We say that $\mu$ is a Gibbs measure associated to the potential $\varphi$ if there exist constants $A,B>0$ so that

$A\le\frac{\mu(C(a_1,\dots,a_n))}{\exp(-nP(\log\varphi)+S_n(\log\varphi)(x))}\le B,$

where $x$ is any point in $I(a_1,\dots,a_n)$, $(a_1,\dots,a_n)$ is any sequence in $\mathbb{N}^n$, $S_n(\log\varphi)(x)$ is the Birkhoff sum of $\log\varphi$ at the point $x$, and $P(\log\varphi)$ is a constant (depending on the potential) called the topological pressure of $\log\varphi$.

Throughout this work we will assume that $P(\log\varphi)=0$; otherwise we can replace $\log\varphi$ by the zero pressure potential $\log\varphi-P(\log\varphi)$. It is not obvious that this normalization does not affect our computations, and we will show later how to overcome this difficulty. The sequence $p_n=\mu(I_n)$ will be of particular relevance for our computations.

We can project this measure to the interval by setting $\mu(A)=\mu(\pi^{-1}A)$. We assume these measures are invariant and ergodic with respect to $T$. We will denote by $\mu$ both the measure in the symbolic space and the projected measure.

Our main assumption on the class of measures is that they have infinite entropy. This can be expressed by saying that the potential $\log\varphi$ is not integrable with respect to $\mu$. In fact, by the Gibbs property, the Shannon–McMillan–Breiman entropy can be written as

$h_\mu=-\lim_{n\to\infty}\frac{1}{n}\log\mu(C(a_1,\dots,a_n))=\lim_{n\to\infty}\frac{1}{n}S_n(-\log\varphi)(x)=\infty$

for almost every $x$ if the integral of $-\log\varphi$ is infinite. The last equality is a consequence of Lemma 3.2.

We define the n-th variation of the potential by

 varn(logφ)=sup{|logφ(x)−logφ(y)|∣x,y∈I(a1,…,an),(a1,…,an)∈Nn}.
###### Definition 2.4.

Let $x_n$ be the unique fixed point of $T$ in $I_n$. We then define the decay ratio by

$s=\lim_{n\to\infty}\frac{\log\varphi(x_n)}{\log r_n}=\lim_{n\to\infty}\frac{\log p_n}{\log r_n}.$

The tail decay ratio is defined by

$\hat{s}=\lim_{n\to\infty}\frac{\log\sum_{m\ge n}\varphi(x_m)}{\log\sum_{m\ge n}r_m}=\lim_{n\to\infty}\frac{\log\sum_{m\ge n}p_m}{\log\sum_{m\ge n}r_m}.$

Both expressions for $s$ (and for $\hat{s}$) agree since $\mu$ is a Gibbs measure. Note also that the definitions above are independent of the choice of the point representing each cylinder when the first variation of $\log\varphi$ is finite. By the Cesàro–Stolz theorem we can write the decay ratio as

$s=\lim_{n\to\infty}\frac{\sum_{k=1}^{n}p_k\log p_k}{\sum_{k=1}^{n}p_k\log r_k}.$
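To see the two expressions for $s$ at work, consider a hypothetical infinite entropy weight sequence $p_n\propto 1/(n\log^2(n+1))$ paired with Gauss-like lengths $r_n=n^{-2}$ (our illustrative choice, not taken from the paper). Both the termwise ratio and the Cesàro–Stolz partial-sum ratio drift towards $s=1/2$, although the convergence is only logarithmic:

```python
import math

# Hypothetical infinite entropy weights: p_n ~ 1/(n log^2(n+1)),
# normalized; Gauss-like cylinder lengths r_n = n^{-2}.
N = 10**6
raw = [1.0 / (n * math.log(n + 1) ** 2) for n in range(1, N + 1)]
Z = sum(raw)
p = [w / Z for w in raw]
logr = [-2.0 * math.log(n) for n in range(1, N + 1)]  # log r_n

def term_ratio(n):   # log p_n / log r_n
    return math.log(p[n - 1]) / logr[n - 1]

def stolz_ratio(n):  # Cesaro-Stolz form: ratio of partial sums
    num = sum(p[k] * math.log(p[k]) for k in range(1, n))
    den = sum(p[k] * logr[k] for k in range(1, n))
    return num / den

print(term_ratio(10**3), term_ratio(N))    # decreasing towards 1/2
print(stolz_ratio(10**3), stolz_ratio(N))  # same limit, much more slowly
```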
###### Assumption 1.

Assume that . For the sequence we assume that for every , we have

 0

for some constants .

The second condition prevents the existence of large jumps of the potential along sufficiently sparse subsequences of $\mathbb{N}$. By the Gibbs property, the properties still hold if we replace $\varphi(x_n)$ by $p_n$.

### 2.4. Entropy and Lyapunov exponent

Since our measures are Gibbs and the potential has finite first variation, we can write the entropy of the system simply as

$h_\mu=-\sum_{n=1}^{\infty}q_n\log q_n=-\sum_{n=1}^{\infty}p_n\log p_n.$

We define the Lyapunov exponent as

$\lambda_\mu=\int_0^1\log|T'(x)|\,d\mu(x).$

By the bounded distortion property, we can write the Lyapunov exponent as

$\lambda_\mu=-\sum_{n=1}^{\infty}q_n\log r_n+L,$

where $L$ is a distortion constant (independent of $\mu$). Thus, $\lambda_\mu$ is infinite if and only if the series above is divergent. Throughout this work, we assume that both $h_\mu$ and $\lambda_\mu$ are infinite, and hence we can think of $\lambda_\mu$ as defined by the series above.
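As a hedged numerical illustration of the infinite-entropy, infinite-Lyapunov regime, take again hypothetical weights $p_n\propto 1/(n\log^2(n+1))$ with Gauss-like lengths $r_n=n^{-2}$ (and $q_n$ taken equal to $p_n$, an assumption for the sketch): the partial sums of both series keep growing without bound, at an iterated-logarithm speed:

```python
import math

# Hypothetical infinite entropy weights: p_n proportional to
# 1/(n log^2(n+1)), with Gauss-like lengths r_n = n^{-2}.
N = 10**6
raw = [1.0 / (n * math.log(n + 1) ** 2) for n in range(1, N + 1)]
Z = sum(raw)

def entropy_partial(m):    # partial sums of -sum p_n log p_n
    return sum(-(w / Z) * math.log(w / Z) for w in raw[:m])

def lyapunov_partial(m):   # partial sums of -sum p_n log r_n
    return sum((raw[n - 1] / Z) * 2 * math.log(n) for n in range(1, m + 1))

# Both partial sums keep growing (like log log m): neither series converges.
for m in (10**3, 10**4, 10**5, 10**6):
    print(m, entropy_partial(m), lyapunov_partial(m))
```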

### 2.5. Hausdorff and packing dimension

In this section we introduce the dimension theory elements we will study throughout this work. Recall the diameter of a set is given by

 |U|=sup{|x−y|∣x,y∈U}.

For a cover of a set , its diameter is given by

 diamU=sup{|U|∣U∈U}.
###### Definition 2.5.

Given $X\subset\mathbb{R}$ and $\alpha\ge0$, the $\alpha$-dimensional Hausdorff measure of $X$ is given by

$m(X,\alpha)=\lim_{\delta\to0}\inf_{\mathcal{U}}\sum_{U\in\mathcal{U}}|U|^{\alpha},$

where the infimum is taken over finite or countable covers $\mathcal{U}$ of $X$ with $\operatorname{diam}\mathcal{U}\le\delta$.

It is possible to prove that there exists a number $\alpha_0$ such that $m(X,\alpha)=\infty$ for $\alpha<\alpha_0$ and $m(X,\alpha)=0$ for $\alpha>\alpha_0$, since $m(X,\alpha)$ is decreasing in $\alpha$ for a fixed set $X$.

###### Definition 2.6.

The unique number

 dimHX=inf{α∈[0,∞]∣m(X,α)=0}

is called the Hausdorff dimension of $X$.

We extend the notion of Hausdorff dimension to finite Borel measures on $\mathbb{R}$:

###### Definition 2.7.

Let $\mu$ be a finite Borel measure on $\mathbb{R}$. The Hausdorff dimension of $\mu$ is defined by

$\dim_H\mu=\inf\{\dim_H(Z)\mid\mu(\mathbb{R}\setminus Z)=0\}.$

We now define the analogous notion of packing dimension.

###### Definition 2.8.

We say that a collection of balls is a $\delta$-packing of the set $E$ if the diameter of each ball is less than or equal to $\delta$, the balls are pairwise disjoint, and their centres belong to $E$. For $\alpha\ge0$, the $\alpha$-dimensional pre-packing measure of $E$ is given by

$P(E,\alpha)=\lim_{\delta\to0}\sup\Big\{\sum_{n}\operatorname{diam}(U_n)^{\alpha}\Big\},$

where the supremum is taken over all $\delta$-packings of $E$. The $\alpha$-dimensional packing measure of $E$ is defined by

$p(E,\alpha)=\inf\Big\{\sum_{i}P(E_i,\alpha)\Big\},$

where the infimum is taken over all covers $\{E_i\}$ of $E$. Finally, we define the packing dimension of $E$ by

$\dim_P(E)=\sup\{\alpha\mid p(E,\alpha)=\infty\}=\inf\{\alpha\mid p(E,\alpha)=0\}.$

We extend the notion of packing dimension to finite Borel measures on $\mathbb{R}$.

###### Definition 2.9.

Let $\mu$ be a finite Borel measure on $\mathbb{R}$. The packing dimension of $\mu$ is defined by

$\dim_P\mu=\inf\{\dim_P(Z)\mid\mu(\mathbb{R}\setminus Z)=0\}.$

It is important to remark that the definitions of dimension for measures are not standard. For instance, a different definition often used is given by

$\underline{\dim}_H\mu=\inf\{\dim_H(Z)\mid\mu(Z)>0\},\qquad \underline{\dim}_P\mu=\inf\{\dim_P(Z)\mid\mu(Z)>0\}.$

We refer to these as lower Hausdorff (packing) dimensions.

Bounding the Hausdorff dimension from above, or the packing dimension from below, usually involves the use of a single suitable cover of the space, while for the bounds in the opposite directions we have to deal with every cover of the space. There are several tools to help with this problem, and we will make use of the so-called (local) Mass Distribution Principles. For this, we introduce the notion of local dimension.

###### Definition 2.10.

The lower and upper pointwise dimensions of the measure $\mu$ at a point $x$ are given by

$\underline{d}_\mu(x)=\liminf_{r\to0}\frac{\log\mu(B(x,r))}{\log r},\qquad \overline{d}_\mu(x)=\limsup_{r\to0}\frac{\log\mu(B(x,r))}{\log r}.$

When both limits coincide, we call the common value the pointwise dimension of $\mu$ at $x$ and denote it by $d_\mu(x)$; we say that $\mu$ is exact dimensional if $d_\mu(x)$ exists and is constant almost everywhere.

If $d_\mu(x)=d$, then $\mu(B(x,r))\approx r^{d}$ for small values of $r$. We state now the local version of the Mass Distribution Principle.

###### Proposition 2.11.

Let $\mu$ be a finite Borel measure, $X\subset\mathbb{R}$ and $\alpha\ge0$. Then:

1. If $\underline{d}_\mu(x)\ge\alpha$ for $\mu$-almost every $x$, then $\dim_H\mu\ge\alpha$;

2. If $\underline{d}_\mu(x)\le\alpha$ for every $x\in X$, then $\dim_H X\le\alpha$;

3. If $\overline{d}_\mu(x)\ge\alpha$ for $\mu$-almost every $x$, then $\dim_P\mu\ge\alpha$;

4. If $\overline{d}_\mu(x)\le\alpha$ for every $x\in X$, then $\dim_P X\le\alpha$;

5. We have

$\dim_H\mu=\operatorname{ess\,sup}\{\underline{d}_\mu(x)\mid x\in X\},\qquad \dim_P\mu=\operatorname{ess\,sup}\{\overline{d}_\mu(x)\mid x\in X\}.$
###### Proof.

This follows from Proposition 2.3 of [Fal97]. ∎

In particular, if $\underline{d}_\mu$ is constant almost everywhere, then $\dim_H\mu$ is equal to that constant value. Analogously, if $\overline{d}_\mu$ is constant almost everywhere, then $\dim_P\mu$ is equal to that constant value.

A notion of dimension which is more adapted to the underlying structure of our dynamical system is the symbolic dimension, which we proceed to define.

###### Definition 2.12.

Given $x$, we define the lower symbolic dimension of $\mu$ at $x$ by

$\underline{\delta}(x)=\liminf_{n\to\infty}\frac{\log\mu(I_n(x))}{\log|I_n(x)|},$

and the upper symbolic dimension of at by

$\overline{\delta}(x)=\limsup_{n\to\infty}\frac{\log\mu(I_n(x))}{\log|I_n(x)|}.$

If $\underline{\delta}(x)=\overline{\delta}(x)$, then we define the symbolic dimension of $\mu$ at $x$ as the common value, denote it by $\delta(x)$, and we say that $\mu$ is symbolic exact dimensional if $\delta(x)$ exists and is constant $\mu$-almost everywhere.

### 2.6. Main results

The estimates we prove depend on asymptotic relations between the measure and the length of cylinders defining the system. The main results are then:

###### Theorem 2.13.

Let $T$ be an EMR map, and let $\mu$ be an infinite entropy Gibbs measure satisfying Assumption 1. If the decay ratio exists and is equal to $s$, then

$\underline{\delta}(x)=\overline{\delta}(x)=s\quad\mu\text{-a.e.}$

If we assume in addition that the decay of $r_n$ is polynomial and the measure satisfies the regularity conditions given by Assumption 1, we can compute the local dimensions:

###### Theorem 2.14.

Let $T$ be a Gauss-like map, and let $\mu$ be an infinite entropy Gibbs measure satisfying Assumption 1. If the decay ratio exists and is equal to $s$, then

1. $\underline{d}_\mu(x)=0$ for $\mu$-almost every $x$,

2. $\overline{d}_\mu(x)=s$ for $\mu$-almost every $x$.

Consequently, $\dim_H\mu=0$ and $\dim_P\mu=s$.

## 3. Symbolic dimension

### 3.1. Computation of the symbolic dimension

We now prove that under the above assumptions, the Gibbs measure is symbolic exact dimensional, and that this dimension coincides with the decay ratio. This result does not depend on the decay rate of the lengths of the partition intervals.

In general the Lyapunov exponent majorizes the entropy. In a more general setting, this result is known as Ruelle’s inequality (see [Rue78]).

###### Lemma 3.1.

If $h_\mu=\infty$, then $\lambda_\mu=\infty$.

###### Proof.

This is an immediate consequence of the Volume Lemma (Theorem 1.1): if $\lambda_\mu<\infty$, then $\dim_H(\mu\circ\pi^{-1})=h_\mu/\lambda_\mu=\infty$, which is impossible. ∎

We now prove a well-known fact about non-integrable observables.

###### Lemma 3.2.

Let $f$ be a measurable function, bounded from below, such that $\int f\,d\mu=\infty$. Then

$\lim_{n\to\infty}\frac{1}{n}\sum_{k=0}^{n-1}f\circ T^{k}=\infty$

for almost every point.

###### Proof.

The proof is a standard application of the Monotone Convergence Theorem. Assume $f$ is positive (otherwise, decompose $f$ into its positive and negative parts) and let $M>0$. Then

$\liminf_{n\to\infty}\frac{1}{n}\sum_{k=0}^{n-1}f\circ T^{k}(x)\ \ge\ \lim_{n\to\infty}\frac{1}{n}\sum_{k=0}^{n-1}\min\{f,M\}\circ T^{k}(x)\ =\ \int_0^1\min\{f,M\}(x)\,d\mu(x)$

by Birkhoff's Ergodic Theorem applied to $\min\{f,M\}$. By the Monotone Convergence Theorem,

$\lim_{M\to\infty}\int_0^1\min\{f,M\}(x)\,d\mu(x)=\int_0^1 f\,d\mu=\infty,$

from where we conclude the result. ∎

This result implies in particular that we can assume that the pressure of our potential is zero, since the Birkhoff sums $S_n(-\log\varphi)$ dominate the linear term $nP(\log\varphi)$ when $\log\varphi$ is not integrable.
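The Monotone Convergence step is easy to visualize for the Gauss map: the first continued fraction digit $a_1(x)=\lfloor 1/x\rfloor$ is non-integrable with respect to the Gauss measure, and the integrals of its truncations $\min\{a_1,M\}$ grow without bound. The following sketch (ours, added for illustration; not part of the proof) computes these truncated integrals from the exact Gauss measure of the branches:

```python
import math

# Gauss measure of the branch I_k = [1/(k+1), 1/k]:
# mu(I_k) = log2((k+1)^2 / (k(k+2))).
def mu_I(k):
    return math.log2((k + 1) ** 2 / (k * (k + 2)))

# Integral of the truncated digit observable min{a_1, M} against the
# Gauss measure (the tail beyond 10^5 is negligible for these M).
def truncated_integral(M):
    return sum(min(k, M) * mu_I(k) for k in range(1, 10**5))

# The values grow roughly like log(M)/log(2): no finite limit.
print([round(truncated_integral(M), 2) for M in (10, 100, 1000)])
```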

We now formulate a lemma regarding the metric and measure theoretic properties of the cylinders associated to the map. This will allow us to write geometric quantities in ergodic theoretic terms. Its proof is a standard application of the bounded distortion and Gibbs properties.

###### Lemma 3.3.

For every finite sequence $(a_1,\dots,a_n)$ and every $x\in I(a_1,\dots,a_n)$, the length $|I(a_1,\dots,a_n)|$ is comparable to $r_{a_1}\cdots r_{a_n}$ up to distortion constants, and the measure $\mu(I(a_1,\dots,a_n))$ is comparable to $q_{a_1}\cdots q_{a_n}$ up to constants arising from the Gibbs property.

We proceed to compute the symbolic dimension of our system. This result holds regardless of the decay rate of the sequence .

###### Theorem 3.4.

Let $T$ be an EMR map and $\mu$ a Gibbs measure with infinite entropy satisfying Assumption 1. If the decay ratio $s$ exists, then $\mu$ is symbolic-exact dimensional and for $\mu$-almost every $x$,

 δ(x)=s.
###### Proof.

By Lemma 3.2 applied to the observables $-\log\varphi$ and $-\log r_{a_1}$, together with Lemma 3.3, we have

$\underline{\delta}(x)\le\liminf_{n\to\infty}\frac{S_n(\log\varphi)(x)}{-nD_1-D_2+S_n(\log r_{a_1})(x)}=\liminf_{n\to\infty}\frac{\log(q_{a_1}\cdots q_{a_n})}{\log(r_{a_1}\cdots r_{a_n})},\qquad \underline{\delta}(x)\ge\liminf_{n\to\infty}\frac{S_n(\log\varphi)(x)}{nD_1+D_2+S_n(\log r_{a_1})(x)}=\liminf_{n\to\infty}\frac{\log(q_{a_1}\cdots q_{a_n})}{\log(r_{a_1}\cdots r_{a_n})},$

and analogously for the upper symbolic dimension

$\overline{\delta}(x)=\limsup_{n\to\infty}\frac{\log(q_{a_1}\cdots q_{a_n})}{\log(r_{a_1}\cdots r_{a_n})},$

where $(a_n)$ is the sequence coding $x$. With a similar argument, we can also show that

$\underline{\delta}(x)=\liminf_{n\to\infty}\frac{\log(p_{a_1}\cdots p_{a_n})}{\log(r_{a_1}\cdots r_{a_n})},$

and analogously for the upper symbolic dimension.

For $n,k\in\mathbb{N}$ and $x$, define

$f_{n,k}(x)=\#\{i\in\{1,\dots,n\}\mid k_i(x)=k\},$

that is, the number of times the orbit of $x$ visits the interval $I_k$ in the first $n$ steps. Recall that from the Birkhoff Ergodic Theorem, we have that

$\lim_{n\to\infty}\frac{f_{n,k}}{n}=p_k$

for almost every $x$. In particular, the orbit of almost every $x$ visits every cylinder infinitely many times. Fix $x$ in the set where the convergence holds, and define $m(n)=\max\{k_i(x)\mid 1\le i\le n\}$. The previous remark shows that $m(n)$ is unbounded, and it is clearly non-decreasing. Thus, we can write

$-\log(r_{k_1}\cdots r_{k_n})=-\sum_{j=1}^{n}\log r_{k_j}=-\sum_{j=1}^{m(n)}f_{n,j}\log r_j.$

Given , there exists such that

$\Big|\frac{\log p_k}{\log r_k}-s\Big|<\epsilon$

for every $k\ge n_1$, that is, $r_k^{s+\epsilon}\le p_k\le r_k^{s-\epsilon}$ for $k\ge n_1$. For $n$ large enough so that $m(n)>n_1$, we write

$\frac{\log(p_{k_1}\cdots p_{k_n})}{\log(r_{k_1}\cdots r_{k_n})}=\frac{\sum_{k=1}^{n_1}f_{n,k}(-\log p_k)+\sum_{k=n_1+1}^{m(n)}f_{n,k}(-\log p_k)}{\sum_{k=1}^{n_1}f_{n,k}(-\log r_k)+\sum_{k=n_1+1}^{m(n)}f_{n,k}(-\log r_k)}.$

We analyse the two parts of the sum separately:

$S_1(n)=\frac{\sum_{k=1}^{n_1}f_{n,k}(-\log p_k)}{\sum_{k=1}^{n_1}f_{n,k}(-\log r_k)+\sum_{k=n_1+1}^{m(n)}f_{n,k}(-\log r_k)},\qquad S_2(n)=\frac{\sum_{k=n_1+1}^{m(n)}f_{n,k}(-\log p_k)}{\sum_{k=1}^{n_1}f_{n,k}(-\log r_k)+\sum_{k=n_1+1}^{m(n)}f_{n,k}(-\log r_k)}.$

For $1\le k\le n_1$, the convergence of the frequencies $f_{n,k}/n\to p_k$ gives an $n_2$ such that

$\frac{np_k}{2}\le f_{n,k}\le\frac{3np_k}{2}$

for every $n\ge n_2$ and every $1\le k\le n_1$. Thus, the sums $\sum_{k=1}^{n_1}f_{n,k}(-\log p_k)$ and $\sum_{k=1}^{n_1}f_{n,k}(-\log r_k)$ grow at most linearly in $n$ for $n$ large enough. We will show that the remaining term of the denominator grows faster than linearly.

Given $M>0$, since the Lyapunov exponent is infinite, there exists $n_3>n_1$ such that

$\sum_{k=n_1+1}^{m}p_k(-\log r_k)>2M$

for every $m\ge n_3$. Now, for the indices $k\le m(n/4)$ the frequencies again obey the law of large numbers, so there exists $n_4$ such that

$f_{n,k}\ge\frac{np_k}{2}$

for every $n\ge n_4$ and every $n_1<k\le m(n/4)$. Thus

$\frac{1}{n}\sum_{k=n_1+1}^{m(n)}f_{n,k}(-\log r_k)=\frac{1}{n}\sum_{k=n_1+1}^{m(n/4)}f_{n,k}(-\log r_k)+\frac{1}{n}\sum_{k=m(n/4)+1}^{m(n)}f_{n,k}(-\log r_k)\ \ge\ \frac{1}{n}\sum_{k=n_1+1}^{m(n/4)}\frac{np_k}{2}(-\log r_k)=\frac{1}{2}\sum_{k=n_1+1}^{m(n/4)}p_k(-\log r_k)>M$

for every $n$ large enough. This shows that $\frac{1}{n}\sum_{k=n_1+1}^{m(n)}f_{n,k}(-\log r_k)\to\infty$ as $n\to\infty$. To estimate $S_2(n)$, we note that

$S_2(n)\le(s+\epsilon)\cdot\frac{\sum_{k=n_1+1}^{m(n)}f_{n,k}(-\log r_k)}{\sum_{k=1}^{n_1}f_{n,k}(-\log r_k)+\sum_{k=n_1+1}^{m(n)}f_{n,k}(-\log r_k)}.$

Using the same argument as above, we can show that the denominator of $S_1(n)$ grows faster than linearly, so $\lim_{n\to\infty}S_1(n)=0$. This shows that

$\overline{\delta}(x)\le s.$

The proof of the opposite inequality is analogous. ∎

### 3.2. The decay ratio

We now proceed to study the properties of the decay ratio. In fact, we show that for infinite entropy measures it is completely determined by the partition $\{I_n\}$:

###### Definition 3.5.

The convergence exponent of the partition $\{I_n\}$ of $[0,1]$ is defined by

 s∞=inf{s≥0∣∞∑n=1rsn<∞}.
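For the Gauss-like case $r_n\asymp n^{-2}$ one expects $s_\infty=1/2$, since $\sum_n n^{-2s}$ converges exactly when $s>1/2$. A small numerical sketch (our illustration, not part of the proof) contrasts the tail behavior on the two sides of the critical exponent:

```python
# For r_n = n^{-2}, sum r_n^s = sum n^{-2s} converges iff s > 1/2,
# so the convergence exponent of the partition is s_inf = 1/2.
def block_sum(s, N):
    # partial sum of n^{-2s} over the dyadic block (N, 2N]
    return sum(n ** (-2.0 * s) for n in range(N + 1, 2 * N + 1))

N = 10**5
print(block_sum(0.4, N))  # stays large: the series diverges for s < 1/2
print(block_sum(0.6, N))  # tends to 0: the series converges for s > 1/2
```

Divergence shows up as dyadic blocks with non-vanishing mass, while convergence makes the block sums tend to zero.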
###### Proposition 3.6.

In general, we have that $s_\infty\le s$. Under the assumption that $h_\mu=\infty$, we also have $s\le s_\infty$, and hence $s=s_\infty$.

###### Proof.

Given $\epsilon>0$, there exists $n_1$ such that

$(\epsilon+s)\log r_n<\log p_n$

for every $n\ge n_1$, and thus $r_n^{s+\epsilon}\le p_n$ for every $n\ge n_1$. Summing over $n$ we get

$\sum_{n=1}^{\infty}r_n^{s+\epsilon}=\sum_{n=1}^{n_1-1}r_n^{s+\epsilon}+\sum_{n=n_1}^{\infty}r_n^{s+\epsilon}\le\sum_{n=1}^{n_1-1}r_n^{s+\epsilon}+\sum_{n=n_1}^{\infty}p_n<\infty.$

Hence, $s_\infty\le s+\epsilon$ for every $\epsilon>0$, and so $s_\infty\le s$.

Now, suppose that $s_\infty<s$; then there is $\alpha>0$ such that $s_\infty+\alpha<s$ and

$\sum_{n=1}^{\infty}r_n^{s_\infty+\alpha}<\infty.$

Let $\epsilon>0$ be such that $s-\epsilon>s_\infty+\alpha$; then there is an integer $n_0$ such that

$r_n^{s+\epsilon}\le p_n\le r_n^{s-\epsilon}$

for all $n\ge n_0$. This implies that

$\sum_{n=n_0}^{\infty}p_n(-\log p_n)\le(s+\epsilon)\sum_{n=n_0}^{\infty}r_n^{s-\epsilon}(-\log r_n).$

Recall the one-sided limit criterion for convergence of series: let $(a_n)$ and $(b_n)$ be sequences of positive terms such that

$\limsup_{n\to\infty}\frac{a_n}{b_n}=c\in[0,\infty)$

and $\sum_n b_n<\infty$. Then $\sum_n a_n<\infty$.

Let $f$ be the function defined by

$f(x)=\begin{cases}0,&\text{for }x=0,\\ x^{\epsilon}(-\log x),&\text{for }x>0.\end{cases}$

It is easy to see that $f$ is continuous. Taking $a_n=r_n^{s-\epsilon}(-\log r_n)$ and $b_n=r_n^{s_\infty+\alpha}$ and using the continuity of $f$, we get that

$\limsup_{n\to\infty}\frac{a_n}{b_n}=\lim_{n\to\infty}r_n^{s-\epsilon-s_\infty-\alpha}(-\log r_n)=0.$

We conclude that

 ∞∑n=