Dimension of Gibbs measures with infinite entropy
Abstract.
We study the Hausdorff dimension of Gibbs measures with infinite entropy with respect to maps of the interval with countably many branches. We show that under simple conditions, such measures are symbolic exact dimensional, and we provide an almost sure value for the symbolic dimension. We also show that the lower local dimension is almost surely equal to zero, while the upper local dimension is almost surely equal to the symbolic dimension. In particular, we prove that a large class of Gibbs measures with infinite entropy for the Gauss map have Hausdorff dimension zero and packing dimension equal to the decay ratio, and hence such measures are not exact dimensional.
1. Introduction
In this paper we study the dimension of measures invariant under a certain class of maps of the unit interval: Expanding-Markov-Renyi (EMR) maps. These maps admit representations by means of symbolic dynamics, and satisfy smoothness properties that allow us to use ergodic theoretic methods to study their geometric properties. Given an ergodic T-invariant probability measure , we are interested in the pointwise behavior of the local dimension
Knowledge of the almost sure behavior of the local dimension yields information about the Hausdorff and the packing dimension of the measure. There are two dynamical quantities which are particularly relevant when studying the local dimension of such measures: the metric entropy (or simply the entropy) and the Lyapunov exponent of . The connection between the entropy and the Lyapunov exponent and the local dimension is well understood when the entropy is finite. Our goal is to investigate the case when both of these quantities are infinite.
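In the standard notation for these quantities (a restatement for orientation, consistent with Definition 2.10 below), the lower and upper local dimensions of a measure μ at a point x are

```latex
\underline{d}_\mu(x) = \liminf_{r \to 0} \frac{\log \mu(B(x,r))}{\log r},
\qquad
\overline{d}_\mu(x) = \limsup_{r \to 0} \frac{\log \mu(B(x,r))}{\log r},
```

and in the classical finite-entropy setting the relation alluded to above takes the form $\dim_H(\mu) = h(\mu)/\lambda(\mu)$ when $0 < h(\mu), \lambda(\mu) < \infty$.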
Formulae relating the dynamical invariants and the local dimension have been extensively studied over the last few decades in the finite entropy case. For Bernoulli measures invariant under the Gauss map, Kinney and Pitcher proved in [KP66] that if the measure is defined by a probability vector , the Hausdorff dimension of can be computed with the formula
provided that . In [LM85] the authors proved that for a map where and are piecewise monotonic and the Lyapunov exponent is positive, if is an invariant ergodic probability measure, then we have that
In particular, . Other versions of the formula were proved by Young in [You82] and by Hofbauer and Raith in [HR92], among others. In all of these examples, it is assumed that the entropy is finite. In the context of countable Markov systems, Mauldin and Urbanski proved the following theorem:
Theorem 1.1 (Volume Lemma, [Mu03]).
Let be a countable Markov shift coded by the shift in countably many symbols . Suppose that is a Borel shiftinvariant ergodic probability measure on such that at least one of the numbers or is finite, where is the entropy of with respect to the natural partition in cylinders of . Then
where is the coding map.
The coding map can be interpreted as a means to go from the symbolic representation of the dynamics to the geometric space. When the local dimension exists and is constant almost everywhere, we say that the measure is exact dimensional.
The case when the Lyapunov exponent vanishes was studied by Ledrappier and Misiurewicz in [LM85], wherein they constructed a map of the interval and a nonatomic ergodic invariant measure which has zero Lyapunov exponent and for which the local dimension does not exist almost everywhere. More precisely, they show that the lower local dimension and the upper local dimension are not equal:
almost everywhere. For this construction, the authors consider a class of unimodal maps (Feigenbaum’s maps).
We investigate the Hausdorff dimensions of invariant ergodic measures for piecewise expanding maps of the interval with countably many branches. In particular, we focus on maps exhibiting similar properties to the Gauss map and measures with infinite entropy and infinite Lyapunov exponent. Our main result is (see next section for the definitions):
Theorem.
Let be a Gausslike map and be an infinite entropy Gibbs measure satisfying assumption 1 and such that the decay ratio exists . Then almost everywhere.
We can also compute the almost sure value of the symbolic dimension. The Gibbs assumption on the measure implies that a certain sequence of observables can be seen as a nonintegrable stationary ergodic process, which allows us to use tools of infinite ergodic theory developed by Aaronson. In particular, the pointwise behavior of trimmed sums plays a fundamental role in our arguments. We also prove that the packing dimension of such measures is equal to the decay ratio, and conclude that such systems are not exact dimensional. We remark that the methods used in the context of finite entropy fail here, as they rely on the fact that the measure and diameter of the iterates of the natural Markov partition decrease at an exponential rate given by the entropy and the Lyapunov exponent respectively, enabling the use of coverings by balls of different scales. To tackle this problem, we make use of more refined coverings by balls, which are capable of detecting the asymptotic interaction between the Gibbs measure and the Lebesgue measure.
The study of the Hausdorff dimension of sets whose points have infinite Lyapunov exponent has already been considered: see for instance [FLM10], where the authors compute the Hausdorff dimension of sets with prescribed digits in their continued fraction expansion, or [FSU14], where the authors construct a measure invariant under the Gauss map which gives full measure to the Liouville numbers. Since the Liouville numbers form a zero dimensional set, such a measure is also zero dimensional. Our result shows that this is the case for a large class of measures.
The dimension of Bernoulli measures for the Gauss map was studied by Kifer, Peres and Weiss in [KPW01], where they show that there is a universal constant so that
for every Bernoulli measure on the symbolic space coding the Gauss map, where is the coding map. This inequality holds even when the entropy of the measure is infinite. They also show that for an infinite entropy Bernoulli measure , the Hausdorff dimension satisfies . Their method relies on showing that the dimension of the set of points for which the frequency of a given sequence of digits in their continued fraction expansion differs from the expected value by a certain threshold is uniformly (with respect to the sequence of digits) bounded away from 1, together with a bound on the dimension of the set of points that lie in unusually short cylinders. This situation has been recently studied by Jurga and Baker (see [Jur18] and [BJ18]) using different methods. Concretely, in [Jur18] the author uses ideas from Hilbert-Birkhoff cone theory and extracts information about the dynamics through the transfer operator. On the other hand, in [BJ18] the authors construct a Bernoulli measure such that , where the supremum is taken over all Bernoulli measures. This, in conjunction with the Variational Principle (see [Wal82]), yields their result.
The paper is structured as follows. In section 2 we introduce the notation used throughout the paper as well as the main objects of study, and we state the results of the paper. In section 3 we compute the symbolic dimension and characterize it in terms of the Markov partition. In section 4 we study the consequences of at the level of the asymptotic rate of contraction of the cylinders. In sections 5 and 6 we prove the results for the Hausdorff and the packing dimension respectively. We finish the article by stating some questions of interest that could not be answered with the methods used in this paper.
2. Notation and statement of main results
2.1. The class of maps
We start by introducing the EMR (Expanding-Markov-Renyi) maps of the interval.
Definition 2.1.
We say that a map of the interval is an EMR map if there is a countable collection of closed intervals (with pairwise disjoint interiors) such that:

The map is on ,

Some power of is uniformly expanding, i.e., there is a positive integer and a constant such that for all ,

The map is Markov and can be coded by a full shift (see next subsection),

The map satisfies Renyi’s condition: there is a constant such that
This class of maps was first introduced in [PW99] in the context of the multifractal analysis of the Lyapunov exponent for the Gauss map. Renyi's condition provides good estimates for the Lebesgue measure of the cylinders associated to the Markov structure of the map (see next subsection). For simplicity, we will assume that the maps are orientation preserving (the orientation reversing case only differs in the relative position of the cylinders). The set of branches must accumulate at least at one point, and we assume that it accumulates at exactly one point; we further assume that the branches accumulate at the left endpoint of (the case when the branches accumulate at the right endpoint of is analogous). Reindexing if necessary, we can assume that for all . Let .
Definition 2.2.
We say that an EMR map is a Gausslike map if it satisfies the following conditions:

for every ,

,

,

for some constants ,

decays polynomially as goes to infinity (see definition (3.7)).
We keep in mind piecewise linear maps as the main example, since for this class of maps calculations are simplified. We will also keep in mind the example of the Gauss map.
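To keep the Gauss map example concrete, here is a minimal numerical sketch (our own illustration; `gauss_map` and `branch_length` are hypothetical helper names): the branch domains I_n = [1/(n+1), 1/n] accumulate at 0 and their lengths decay polynomially, like n^(-2).

```python
from fractions import Fraction

def gauss_map(x):
    """One step of the Gauss map T(x) = 1/x - floor(1/x) on (0, 1]."""
    y = 1 / x
    return y - int(y)

def branch_length(n):
    """Length of the n-th branch domain I_n = [1/(n+1), 1/n]."""
    return Fraction(1, n) - Fraction(1, n + 1)   # equals 1/(n*(n+1))

# The lengths |I_n| decay polynomially, consistent with the Gauss-like
# conditions above; on each branch, |T'(x)| = 1/x^2 >= 1.
lengths = [branch_length(n) for n in range(1, 6)]
print(lengths)
```

For instance, `branch_length(1)` returns 1/2, the length of I_1 = [1/2, 1].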
2.2. Markov structure and symbolic coding
We now describe the Markov structure of the maps considered. Given a finite sequence of natural numbers , the nth level cylinder associated to is the set . Let , then given and , there exists a unique sequence such that for every . We denote this sequence by when is clear from the context. We also denote and we say is coded by the sequence .
Let and be the full shift over . Then the cylinders in the symbolic space are defined by
We endow the space with the topology generated by the cylinders defined above. Then the map given by is a continuous bijection.
Given with coding sequence and , denote by (resp if ) the level cylinder on the left (resp right) of . Also, denote by . If there is no risk of confusion, we omit the dependence on .
Renyi’s condition introduced in the previous subsection implies that the length of each cylinder is comparable to the derivative of the iterates of the map at any point of the cylinder. More precisely,
for every finite sequence and .
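For the Gauss map this comparability can be checked by hand: the cylinder of digits (a_1, ..., a_n) has endpoints given by continued-fraction convergents, and its length times |(T^n)'| at an interior point stays between universal constants. A hedged numerical sketch (the helper names `cylinder` and `deriv_Tn` are ours):

```python
from fractions import Fraction

def cylinder(digits):
    """Endpoints of the Gauss-map cylinder [a_1, ..., a_n] via convergents."""
    p_prev, p = Fraction(1), Fraction(0)   # p_{-1}, p_0
    q_prev, q = Fraction(0), Fraction(1)   # q_{-1}, q_0
    for a in digits:
        p_prev, p = p, a * p + p_prev
        q_prev, q = q, a * q + q_prev
    left, right = sorted([p / q, (p + p_prev) / (q + q_prev)])
    return left, right

def deriv_Tn(x, n):
    """|(T^n)'(x)| = prod of 1/x_k^2 along the exact (rational) orbit."""
    d = Fraction(1)
    for _ in range(n):
        d *= 1 / (x * x)
        inv = 1 / x
        x = inv - int(inv)          # exact Gauss map step on rationals
    return d

digits = [1, 2, 3, 4]
left, right = cylinder(digits)
length = right - left
x = (left + right) / 2              # a point inside the cylinder
ratio = length * deriv_Tn(x, len(digits))
print(float(ratio))                 # stays between universal constants
```

Here the product `length * deriv_Tn(...)` is the quantity that Renyi's condition keeps bounded above and below.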
2.3. The class of measures
We start by giving the usual definition of Gibbs measures:
Definition 2.3.
Let be an invariant measure with respect to . We say that is a Gibbs measure associated to the potential if there exist constants so that
where is any point in , is any sequence in , is the Birkhoff sum of at the point , and is a constant (depending on the potential) called the topological pressure of .
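Explicitly, writing $C_n(x)$ for the $n$-th level cylinder containing $x$, the Gibbs property in its standard form (restated here, since the notation above is fixed only implicitly) reads

```latex
K_1 \;\le\; \frac{\mu(C_n(x))}{\exp\bigl(-nP(\phi) + S_n\phi(x)\bigr)} \;\le\; K_2
\qquad \text{for every } n \ge 1,
\qquad S_n\phi(x) = \sum_{k=0}^{n-1} \phi(T^k x),
```

with constants $K_1, K_2 > 0$ independent of $x$ and $n$, and $P(\phi)$ the topological pressure.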
Throughout this work we will assume that ; otherwise we can take the zero pressure potential . It is not obvious that this normalization does not affect our computations, and we will show later how to overcome this difficulty. The sequence will be of particular relevance for our computations.
We can project this measure to by setting . We assume these measures are invariant and ergodic with respect to . We will denote by both the measure in the symbolic space and the projected measure.
Our main assumption on the class of measures is that they have infinite entropy. This can be expressed by saying that the potential is not integrable with respect to . In fact, by the Gibbs property, the ShannonMcMillanBreiman entropy can be written as
for almost every if the integral of is infinite. The last equality is a consequence of Lemma 3.2.
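Spelled out in standard notation (a hedged restatement), the Gibbs property gives, for almost every $x$,

```latex
h(\mu) \;=\; \lim_{n \to \infty} -\frac{1}{n} \log \mu(C_n(x))
\;=\; P(\phi) - \lim_{n \to \infty} \frac{1}{n} S_n\phi(x)
\;=\; +\infty
\qquad \text{when } \int \phi \, d\mu = -\infty ,
```

where the last equality follows by applying Lemma 3.2 to $-\phi$.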
We define the nth variation of the potential by
Definition 2.4.
Let be the unique fixed point of in . We define then the decay ratio by
The tail decay ratio is defined by
Both definitions for and agree since is a Gibbs measure. Note also that the definitions above are independent of the choice of the point representing each cylinder if . By the Cesàro-Stolz theorem we can write the decay ratio as
Assumption 1.
Assume that . For the sequence we assume that for every , we have
for some constants .
The second condition prevents the existence of large jumps for the potential along sufficiently sparse subsequences of . By the Gibbs property, the properties hold if we replace by .
2.4. Entropy and Lyapunov exponent
Since our measures are Gibbs and the potential has finite first variation, we can write the entropy of the system simply as
We define the Lyapunov exponent as
By the bounded distortion property, we can write the Lyapunov exponent as
where is a distortion constant (independent of ). Thus, is infinite if and only if the series above is divergent. Throughout this work, we assume that both numbers and are infinite, and hence we can think of as defined by the series above.
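For the Gauss map with its absolutely continuous invariant (Gauss) measure, the Lyapunov exponent can be evaluated numerically; the closed form π²/(6 log 2) is the classical Khinchin-Lévy value. A sketch (this example has finite entropy and finite Lyapunov exponent, so it is only a sanity check of the definitions, not of the infinite-exponent regime studied here; `lyapunov_gauss` is our own helper name):

```python
import math

def lyapunov_gauss(n=200_000):
    """Approximate int log|T'| dmu for the Gauss map, |T'(x)| = 1/x^2,
    against the Gauss measure dmu = dx / ((1 + x) log 2), by midpoint rule."""
    h = 1.0 / n
    total = 0.0
    for k in range(n):
        x = (k + 0.5) * h
        total += h * (-2.0 * math.log(x)) / ((1.0 + x) * math.log(2))
    return total

approx = lyapunov_gauss()
exact = math.pi ** 2 / (6 * math.log(2))   # Khinchin-Levy value, about 2.37
print(approx, exact)
```

The midpoint rule handles the integrable logarithmic singularity at 0 well enough for a few digits of agreement.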
2.5. Hausdorff and packing dimension
In this section we introduce the dimension theory elements we will study throughout this work. Recall the diameter of a set is given by
For a cover of a set , its diameter is given by
Definition 2.5.
Given and , the dimensional Hausdorff measure of is given by
where the infimum is taken over finite or countable covers of with .
It is possible to prove that there exists a number such that for and for , since is decreasing in for a fixed set .
Definition 2.6.
The unique number
is called the Hausdorff dimension of .
We extend the notion of Hausdorff dimension to finite Borel measures on :
Definition 2.7.
Let be a finite Borel measure on . The Hausdorff dimension of is defined by
We now define the analogous notion of packing dimension:
Definition 2.8.
We say that a collection of balls is a packing of the set if the diameter of the balls is less than or equal to , they are pairwise disjoint and their centres belong to . For , the dimensional prepacking measure of is given by
where the supremum is taken over all packings of . The dimensional packing measure of is defined by
where the infimum is taken over all covers of . Finally, we define the packing dimension of by
We extend the notion of packing dimension to finite Borel measures on .
Definition 2.9.
Let be a finite Borel measure on . The Packing dimension of is defined by
It is important to remark that the definitions of dimension for measures are not standard. For instance, a different definition often used is given by
We refer to these as lower Hausdorff (packing) dimensions.
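One common way of writing the two conventions (a restatement in standard notation; the second is the "lower" variant referred to above) is

```latex
\dim_H(\mu) = \inf\{\dim_H(Z) : \mu(Z) = \mu([0,1])\},
\qquad
\underline{\dim}_H(\mu) = \inf\{\dim_H(Z) : \mu(Z) > 0\},
```

and analogously for the packing dimension.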
Bounding the Hausdorff dimension from above or the packing dimension from below usually involves exhibiting a single suitable cover of the space, while for the opposite bounds we have to deal with every cover of the space. There are several tools to help with this problem, and we will make use of the so-called (local) Mass Distribution Principles. For this, we introduce the notion of local dimension.
Definition 2.10.
The lower and upper pointwise dimensions of the measure at a point are given by
When both limits coincide, we call the common value the pointwise dimension of at and denote it by . We say that is exact dimensional if almost everywhere.
If , then for small values of . We state now the local version of the Mass Distribution Principle.
Proposition 2.11.
Let and , then

If for almost every , then ;

If for every , then ,

If for almost every , then ;

If for every , then ,

We have
Proof.
This follows from Proposition 2.3 of [Fal97]. ∎
In particular, if is constant almost everywhere, then is equal to that constant value. Analogously, if is constant almost everywhere, then is equal to that constant value.
A notion of dimension which is more adapted to the underlying structure of our dynamical system is the symbolic dimension, which we proceed to define.
Definition 2.12.
Given , we define the lower symbolic dimension of at by
and the upper symbolic dimension of at by
If , then we define the symbolic dimension of at as the common value, denote it by , and we say that is symbolic exact dimensional if .
2.6. Main results
The estimates we prove depend on asymptotic relations between the measure and the length of cylinders defining the system. The main results are then:
Theorem 2.13.
Let be an EMR map, and be an infinite entropy Gibbs measure satisfying assumption 1. If the decay ratio exists and it is equal to , then
If we assume that the decay of is polynomial and the measure satisfies the regularity conditions given by assumption 1, we can compute the local dimensions:
Theorem 2.14.
Let be a Gausslike map, and be an infinite entropy Gibbs measure satisfying assumption 1. If the decay ratio exists and it is equal to , then

,

,
Consequently, .
3. Symbolic dimension
3.1. Computation of the symbolic dimension
We now prove that under the above assumptions, the Gibbs measure is symbolic exact dimensional, and that this dimension coincides with the decay ratio. This result does not depend on the decay rate of the lengths of the partition of the interval.
In general the Lyapunov exponent majorizes the entropy. In a more general setting, this result is known as Ruelle’s inequality (see [Rue78]).
Proposition 3.1.
If then .
Proof.
This is an immediate consequence of the Volume Lemma (Theorem 1.1): if , then , which is impossible. ∎
We prove a well-known fact about nonintegrable observables.
Lemma 3.2.
Let be a bounded below measurable function such that . Then
for almost every point.
Proof.
The proof is a standard application of the Monotone Convergence Theorem. Assume is positive (otherwise, decompose into its positive and negative parts) and let . Then
by Birkhoff’s Ergodic Theorem applied to . By the Monotone Convergence Theorem,
from where we conclude the result. ∎
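The truncation step in the proof can be seen numerically in a toy example (our own illustration, not from the paper): take digit probabilities p_n = C/n² and the observable f(n) = n, so that E[f] = Σ C/n diverges while every truncated mean E[min(f, M)] is finite and grows without bound as M grows.

```python
import math

C = 6 / math.pi ** 2                 # normalizes sum of C / n^2 to 1

def truncated_mean(M, tail=10**5):
    """E[min(f, M)] for f(n) = n under p_n = C / n^2, with a tail bound."""
    total = sum(C * min(n, M) / n**2 for n in range(1, tail + 1))
    total += C * M / tail            # integral bound for the remaining tail
    return total

means = [truncated_mean(M) for M in (10, 100, 1000)]
print(means)                         # grows roughly like C * log M
```

This is exactly the mechanism behind Lemma 3.2: the Birkhoff averages of the truncations converge (Birkhoff), and the truncated means increase to infinity (monotone convergence).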
This result implies in particular that we can assume that the pressure of our potential is zero, as dominates when is not integrable.
We formulate a lemma regarding the metric and measure theoretic properties of the cylinders associated to the map. This will allow us to write geometric quantities in ergodic theoretic terms. Its proof is a standard application of the bounded distortion and Gibbs properties.
Lemma 3.3.
For every finite sequence and , we have that
where are distortion constants and are constants arising from the Gibbs property.
We proceed to compute the symbolic dimension of our system. This result holds regardless of the decay rate of the sequence .
Theorem 3.4.
Let be an EMR map and a Gibbs measure with infinite entropy satisfying Assumption 1. Then if the decay ratio exists, we have that is symbolic exact dimensional and for almost every ,
Proof.
By Lemma 3.2 applied to the observables and and Lemma 3.3, we have
and analogously for the upper symbolic dimension
where is the sequence coding . With a similar argument, we can also show that
and analogously for the upper symbolic dimension.
For and , define
that is, the number of times the orbit of visits the interval in the first steps. Recall that by Birkhoff's Ergodic Theorem, we have that
for almost every . In particular, the orbit of almost every visits every cylinder infinitely many times. Fix in the set where the convergence holds, and then define by . The previous remark shows that is unbounded, and it is clearly nondecreasing. Thus, we can write
Given , there exists such that
for every , that is, for . For large enough so that , we write
We analyse the two parts of the sum separately:
For taking there exists such that
for every . Thus, the terms and grow linearly in for large enough. We will show that grows faster than linearly.
Given , since the Lyapunov exponent is infinite, there exists such that
for every . Now, for , take and so there exists such that
for every and . Thus
for every . This shows that as . To estimate , we note that
Using the same argument as above, we can show that grows faster than linearly, so . This shows that
The proof of the opposite inequality is analogous. ∎
3.2. The decay ratio
We now proceed to study the properties of the decay ratio. In fact, we show that for infinite entropy measures, it is completely determined by the properties of the partition :
Definition 3.5.
The convergence exponent of the partition of is defined by
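In its standard form this exponent is $s_* = \inf\{t \ge 0 : \sum_n |I_n|^t < \infty\}$. For the Gauss partition, where |I_n| = 1/(n(n+1)) behaves like n^(-2), the series behaves like Σ n^(-2t), so s_* = 1/2; a numerical sketch (with our own helper name `partial_sum`) shows the dichotomy across the exponent:

```python
def partial_sum(t, N):
    """Partial sum of |I_n|^t for the Gauss partition, |I_n| = 1/(n(n+1))."""
    return sum((1.0 / (n * (n + 1))) ** t for n in range(1, N + 1))

# Above the exponent (t = 0.75) the tail is negligible; below it (t = 0.4)
# the partial sums keep growing as N doubles.
grow_above = partial_sum(0.75, 2 * 10**5) - partial_sum(0.75, 10**5)
grow_below = partial_sum(0.4, 2 * 10**5) - partial_sum(0.4, 10**5)
print(grow_above, grow_below)
```

The two increments bracket the critical exponent t = 1/2 from either side.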
Proposition 3.6.
In general, we have that . Under the assumption that , we also have .
Proof.
Given , there exists such that
for every , and thus for every . Summing over we get
Hence, for every and so .
Now suppose that , and hence there is such that and
Let , then there is an integer such that
for all . This implies that
Recall the one-sided limit comparison criterion for the convergence of series: let and be sequences such that
and . Then .
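In the form used below (a standard comparison statement, restated for convenience):

```latex
\text{If } a_n, b_n > 0, \quad
\limsup_{n \to \infty} \frac{a_n}{b_n} < \infty
\quad \text{and} \quad \sum_{n} b_n < \infty,
\quad \text{then} \quad \sum_{n} a_n < \infty .
```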
Let be the function defined by
It is easy to see that is continuous. Taking and and using the continuity of , we get that
We conclude that