
Zero Krengel Entropy does not kill Poisson Entropy

Abstract.

We prove that the notions of Krengel entropy and Poisson entropy for infinite-measure-preserving transformations do not always coincide: We construct a conservative infinite-measure-preserving transformation with zero Krengel entropy (the induced transformation on a set of measure 1 is the Von Neumann-Kakutani odometer), but whose associated Poisson suspension has positive entropy.

1. Introduction

1.1. Entropy for infinite-measure-preserving transformations

There exist several notions of entropy for infinite-measure-preserving transformations, which elegantly generalize Kolmogorov’s entropy of a probability-preserving transformation. Krengel [4] comes down to the finite-measure case by considering the entropy of the induced transformation on a set of finite measure: The Krengel entropy of a conservative measure-preserving transformation $(X,\mathcal{B},\mu,T)$ is defined as

$$ h_{\mathrm{Kr}}(T,\mu) := \sup_{A\in\mathcal{F}^+} \mu(A)\, h\big(T_A,\mu_A\big), $$

where $\mathcal{F}^+$ is the collection of sets in $\mathcal{B}$ with finite positive measure, $\mu_A := \mu(\,\cdot\cap A)/\mu(A)$ is the normalized probability measure on $A$ obtained by restricting $\mu$ to $A$, and $T_A$ is the induced map on $A$. Recall that this map is defined by

$$ T_A\,x := T^{\varphi_A(x)}x \qquad (x\in A), $$

where $\varphi_A(x) := \min\{n\ge 1 : T^n x\in A\}$ is the first-return-time map associated to $A$. As soon as $T$ is not purely periodic, Krengel proved that

$$ h_{\mathrm{Kr}}(T,\mu) = \mu(A)\, h\big(T_A,\mu_A\big), $$

where $A$ is any finite-measure sweep-out set (i.e. a set such that $\bigcup_{n\ge 0} T^{-n}A = X$ mod $\mu$).

The Parry entropy of an infinite-measure-preserving transformation $(X,\mathcal{B},\mu,T)$ has been defined in [7] as the supremum of the conditional entropy of $T^{-1}\mathcal{C}$ with respect to $\mathcal{C}$, for all $\sigma$-finite sub-$\sigma$-algebras $\mathcal{C}\subset\mathcal{B}$ such that $\mathcal{C}\subset T^{-1}\mathcal{C}$.

Recall now that to each infinite-measure-preserving transformation $(X,\mathcal{B},\mu,T)$ we can associate a probability-preserving transformation $T_*$, called its Poisson suspension, which can be described as follows (we refer to [8] for details): We consider a Poisson process on $(X,\mathcal{B})$ with intensity $\mu$, which we can view as a random collection of particles. These particles are distributed over $X$ in such a way that, denoting by $N(A)$ the random number of particles in any finite-measure set $A$, for any finite collection of pairwise disjoint, finite-measure sets $A_1,\dots,A_k$, the random variables $N(A_1),\dots,N(A_k)$ are independent and follow Poisson distributions with respective parameters $\mu(A_1),\dots,\mu(A_k)$. Then $T_*$ is defined on the canonical space of this Poisson process, and it consists in moving each of these particles individually according to the transformation $T$ on $X$. The Poisson entropy of an infinite-measure-preserving transformation was defined by Roy [8] as the Kolmogorov entropy of its Poisson suspension.
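To make this description concrete, here is a minimal numerical sketch (our illustration, not part of the paper's construction; all function and variable names are ours) of one step of a Poisson suspension, for the translation $x \mapsto x+1$ on the half-line, which preserves the infinite Lebesgue measure:

    import numpy as np

    rng = np.random.default_rng(0)

    # Poisson point process of intensity Lebesgue on [0, window):
    window = 1000.0
    n_particles = rng.poisson(window)        # total mass of the window
    particles = rng.uniform(0.0, window, size=n_particles)

    def suspension_step(config, T):
        """One step of the Poisson suspension T_*: move every particle by T."""
        return T(config)

    # T(x) = x + 1 preserves Lebesgue measure on the half-line (an infinite
    # invariant measure), so counts in disjoint unit intervals stay
    # independent Poisson(1) after the suspension step:
    moved = suspension_step(particles, lambda xs: xs + 1.0)
    counts = np.histogram(moved, bins=np.arange(1.0, 101.0))[0]
    print(counts.mean(), counts.var())       # both close to 1

Counts in disjoint unit intervals remain i.i.d. Poisson(1) after the step; this finite-window picture is the one used repeatedly in Section 4.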

Relations between these notions of entropy are studied in [3]: On large classes of transformations (e.g. quasi-finite transformations, rank-one transformations), it is proved that Poisson entropy is equal to Krengel entropy and to Parry entropy. Moreover, in any case, Parry entropy is dominated by both Krengel and Poisson entropy.

It was asked in [3] whether, for any conservative measure-preserving transformation, these three definitions always coincide. The purpose of the present paper is to show that the answer is negative, by constructing a counterexample.

Theorem 1.1.

There exists a conservative infinite-measure-preserving transformation with zero Krengel entropy (hence zero Parry entropy), but whose associated Poisson suspension has positive entropy.

2. Construction

2.1. Von Neumann-Kakutani odometer

The transformation $T$ is constructed as a tower over the Von Neumann-Kakutani odometer $S$. Let us recall the construction of the latter by cutting and stacking (see Figure 1). We start with the interval $[0,1)$. The first step consists in cutting $[0,1)$ into two sub-intervals $[0,1/2)$ and $[1/2,1)$ of measure $1/2$, and stacking $[1/2,1)$ over $[0,1/2)$. We get a tower of height 2 which we call Tower 1. After step $n$, $[0,1)$ has been cut into $2^n$ sub-intervals which are stacked to get Tower $n$. This means that at this step each point of $[0,1)$ (except those lying on the top of the tower) is mapped by $S$ to the point of $[0,1)$ lying above it. We construct Tower $n+1$ by cutting Tower $n$ into two equal parts. We call $D_{n+1}$ the left half of the top interval of Tower $n$, and we stack the right part of the tower over $D_{n+1}$, thus dividing by 2 the measure of the set where $S$ is not yet defined. Repeating this procedure defines the Von Neumann-Kakutani odometer $S$, which preserves the Lebesgue measure on $[0,1)$. It is well known that the odometer is ergodic and has zero entropy.
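For concreteness (our addition): on each dyadic interval $[1-2^{1-k}, 1-2^{-k})$, $k\ge1$, the odometer acts as the translation $S(x) = x - 1 + 3\cdot 2^{-k}$, which is the "add 1 with carry" map on the binary digits of $x$. A short sketch:

    def odometer(x):
        """Von Neumann-Kakutani odometer on [0,1): add 1 with carry on the
        binary digits of x.  On [1 - 2**(1-k), 1 - 2**(-k)) it translates
        by -1 + 3 * 2**(-k)."""
        k = 1
        while x >= 1.0 - 2.0 ** (-k):      # locate the dyadic interval of x
            k += 1
        return x - 1.0 + 3.0 * 2.0 ** (-k)

    # The orbit of 0 enumerates the dyadic rationals in van der Corput order:
    x, orbit = 0.0, []
    for _ in range(8):
        orbit.append(x)
        x = odometer(x)
    print(orbit)   # [0.0, 0.5, 0.25, 0.75, 0.125, 0.625, 0.375, 0.875]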

Figure 1. First steps in the construction of the Von Neumann-Kakutani odometer by cutting and stacking.

2.2. Construction of $T$

$T$ is constructed on a measure space $X \supset [0,1)$, in such a way that the induced transformation $T_{[0,1)}$ coincides with the odometer $S$ previously defined. $T$ is then completely defined (up to isomorphism) by giving, for each point $x \in [0,1)$, the first return time $\varphi(x)$ of $x$ to $[0,1)$.

We fix an increasing sequence of integers $(L_n)_{n\ge1}$, with $L_n \to +\infty$. For any $n$, we choose a large enough integer $d_n$ (to be made precise later). We define the first return time $\varphi$ to $[0,1)$ so that its restriction to $D_n$ is uniformly distributed on $\{L_n, 2L_n, \dots, d_nL_n\}$ for any $n$. (Here $D_1 := [0,1/2)$, and the sets $D_n$, $n\ge2$, are those introduced in the construction of the odometer; note that the $D_n$ partition $[0,1)$ up to a null set.)

We will see in Section 4 that, by choosing the integers $d_n$ large enough, the entropy of the Poisson suspension of $T$ is positive.
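As a consistency check (our computation, using only the definitions above): since the sets $D_n$ partition $[0,1)$ up to measure zero, with $\mathrm{Leb}(D_n) = 2^{-n}$, and since $\varphi$ is uniform on $\{L_n, 2L_n, \dots, d_nL_n\}$ on $D_n$, the total measure of the space $X$ on which $T$ acts is

$$ \mu(X) = \int_{[0,1)} \varphi \, d\mathrm{Leb} = \sum_{n\ge1} 2^{-n}\, \mathbb{E}\big[\varphi \mid D_n\big] = \sum_{n\ge1} 2^{-n}\, \frac{(d_n+1)L_n}{2}, $$

which is indeed infinite as soon as $d_n$ grows fast enough, as the choices made below will impose.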

3. Poisson approximation lemma

The purpose of this section is to prove the key lemma (Lemma 3.2). This lemma roughly states that when $d$ is large enough, it is almost impossible in the Poisson suspension to keep track individually of the particles when they leave $[0,1)$, if we only have access to the number of particles in $[0,1)$. For this, we compare two processes: The first one is simply an i.i.d. sequence of Poisson random variables, whereas the second one models particles leaving and coming back to $[0,1)$. The comparison between the two processes uses the notion of $\bar d$-distance, of which we recall some properties.

3.1. The $\bar d$-distance

The $\bar d$-distance between two stationary processes was introduced by Ornstein for the proof of the isomorphism theorem of Bernoulli shifts. We refer to [6] or [9] for the properties of this distance which we use later and which we recall here.

Let $\mathbf{X} = (X_k)_{k\in\mathbb{Z}}$ and $\mathbf{Y} = (Y_k)_{k\in\mathbb{Z}}$ be two stationary processes taking values in a countable alphabet $\mathcal{X}$. For any integers $\ell \le m$, we denote by $X_\ell^m$ the finite sequence $(X_\ell, X_{\ell+1}, \dots, X_m)$.

For $n \ge 1$, let $J_n$ be the set of all joinings of $X_1^n$ and $Y_1^n$, that is, probability distributions on $\mathcal{X}^n \times \mathcal{X}^n$ whose marginals are the distributions of $X_1^n$ and $Y_1^n$. We first define, for any $n \ge 1$,

$$ \bar d_n\big( X_1^n, Y_1^n \big) := \min_{\lambda \in J_n} \mathbb{E}_\lambda\big[ d_n(x_1^n, y_1^n) \big], $$

where $d_n$ is the normalized Hamming distance between sequences of length $n$:

$$ d_n\big( x_1^n, y_1^n \big) := \frac{1}{n}\, \#\big\{ 1 \le k \le n : x_k \ne y_k \big\}. $$

Then, the $\bar d$-distance between $\mathbf{X}$ and $\mathbf{Y}$ is defined by

$$ \bar d\big( \mathbf{X}, \mathbf{Y} \big) := \sup_{n \ge 1} \bar d_n\big( X_1^n, Y_1^n \big). $$

It can be shown that $\bar d(\mathbf{X}, \mathbf{Y})$ is also the minimum of $\lambda(X_0 \ne Y_0)$ when $\lambda$ ranges over all stationary joinings of $\mathbf{X}$ and $\mathbf{Y}$.
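As a simple illustration (our addition): if $\mathbf{X}$ and $\mathbf{Y}$ are i.i.d. Bernoulli processes with respective parameters $p \ge q$, then $\bar d(\mathbf{X}, \mathbf{Y}) = p - q$. Indeed, the stationary joining obtained by setting

$$ X_k := \mathbf{1}_{\{U_k \le p\}}, \qquad Y_k := \mathbf{1}_{\{U_k \le q\}}, $$

with $(U_k)$ i.i.d. uniform on $[0,1]$, gives $\lambda(X_0 \ne Y_0) = p - q$, while any joining $\lambda$ satisfies $\lambda(X_0 \ne Y_0) \ge \mathbb{P}(X_0 = 1) - \mathbb{P}(Y_0 = 1) = p - q$.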

The two key properties of the $\bar d$-distance that we shall use are: On the one hand, the fact that entropies of processes close in $\bar d$-distance can be compared (Lemma 3.1 below). On the other hand, a practical tool to estimate the $\bar d$-distance between processes using conditional distributions on the past: If, for all large enough $n$,

(1)  $\operatorname{dist}\Big( \mathcal{L}\big( X_{n+1} \mid X_1^n = x_1^n \big),\ \mathcal{L}\big( Y_{n+1} \mid Y_1^n = y_1^n \big) \Big) < \varepsilon$

for all pasts $x_1^n$ outside a set of measure at most $\varepsilon$ and all pasts $y_1^n$ outside a set of measure at most $\varepsilon$ (here $\operatorname{dist}$ stands for the total variation distance), then $\bar d(\mathbf{X}, \mathbf{Y}) < \varepsilon'$, where $\varepsilon'$ depends only on $\varepsilon$ and goes to 0 with $\varepsilon$. Moreover, the same conclusion holds if we replace in (1) the conditional distributions with respect to the past by conditional distributions with respect to finer $\sigma$-algebras.

Lemma 3.1.

Let $\mathbf{X}$ be a stationary process taking values in a countable alphabet, with finite entropy $h(\mathbf{X})$. For any $\varepsilon > 0$, there exists $\delta > 0$ such that any stationary process $\mathbf{Y}$ taking values in the same alphabet with $\bar d(\mathbf{X}, \mathbf{Y}) < \delta$ satisfies $h(\mathbf{Y}) \ge h(\mathbf{X}) - \varepsilon$.

Proof.

Without loss of generality, we can assume that the alphabet is $\mathbb{N}$. For any integer $K$ we define the process $\mathbf{X}^{(K)}$ taking values in the finite alphabet $\{0, 1, \dots, K\}$ by

$$ X^{(K)}_k := \min(X_k, K). $$

We choose $K$ large enough so that $h(\mathbf{X}^{(K)}) > h(\mathbf{X}) - \varepsilon/2$. Then we use the fact that entropy is a continuous function of processes taking values in a given finite alphabet, when these processes are topologized with the $\bar d$-distance (see e.g. [9], page 100). Therefore we can find $\delta > 0$ such that any process taking values in $\{0,\dots,K\}$ at $\bar d$-distance at most $\delta$ from $\mathbf{X}^{(K)}$ has entropy at least $h(\mathbf{X}) - \varepsilon$. Now, if $\bar d(\mathbf{X}, \mathbf{Y}) < \delta$, then $\bar d(\mathbf{X}^{(K)}, \mathbf{Y}^{(K)}) < \delta$ (where $\mathbf{Y}^{(K)}$ is defined from $\mathbf{Y}$ in a similar way; the inequality holds because truncation, applied coordinatewise, cannot create new disagreements), hence $h(\mathbf{Y}) \ge h(\mathbf{Y}^{(K)}) \ge h(\mathbf{X}) - \varepsilon$. ∎

3.2. Comparison between connected and disconnected processes

Let $\mathcal{A}$ be a finite alphabet, let $P$ and $Q$ be two probability measures on $\mathcal{A}$, and let $\Lambda$ be a joining of $P$ and $Q$, that is, a probability measure on $\mathcal{A}\times\mathcal{A}$ whose marginals are $P$ and $Q$. Let $\lambda$ be some fixed positive real number. We define two processes $\mathbf{Y}$ and $\mathbf{Z}$, indexed by the sites of $\mathbb{Z}$.

The process $\mathbf{Y}$ is constructed from two independent sequences of i.i.d. random variables distributed according to the Poisson distribution of parameter $\lambda$, which can be interpreted as numbers of black and white particles lying on each site of $\mathbb{Z}$. Then to each black (respectively white) particle we randomly and independently associate a label picked in $\mathcal{A}$ according to $P$ (respectively $Q$). For any $i \in \mathbb{Z}$ and any labels $a, b \in \mathcal{A}$, $Y_i^B(a)$ (respectively $Y_i^W(b)$) is the total number of black (respectively white) particles labelled by $a$ (respectively $b$) at position $i$. In other words, the process $\mathbf{Y} = \big( (Y_i^B(a))_{a\in\mathcal{A}}, (Y_i^W(b))_{b\in\mathcal{A}} \big)_{i\in\mathbb{Z}}$ associates in an i.i.d. way to each site a finite sequence of independent random variables, respectively distributed according to the Poisson distributions of parameters $\lambda P(a)$ and $\lambda Q(b)$, $a, b \in \mathcal{A}$.

Let $L$ and $d$ be two integers. The process $\mathbf{Z}$ is also constructed from black and white particles on $\mathbb{Z}$, but which are no longer independent. The number of black particles at each site is given by a sequence of i.i.d. random variables distributed according to the Poisson distribution of parameter $\lambda$. For each black particle at position $i$, we first pick a random integer $r$, uniformly in $\{L, 2L, \dots, dL\}$ and independently of all other particles. Then we link the black particle to a white particle that we put at position $i + r$. For each such couple of black and white particles, a couple of labels $(a, b)$ is picked in $\mathcal{A}\times\mathcal{A}$ with probability $\Lambda(a,b)$: The first label is associated to the black particle and the second label to the white one. Then, for any $i \in \mathbb{Z}$, $Z_i^B(a)$ denotes the number of black particles labelled by $a$ and $Z_i^W(b)$ the number of white particles labelled by $b$ at position $i$.
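The following simulation sketch (ours; names and parameter values are only illustrative) generates finite windows of the two processes; it is not needed for the proof, but it makes the linking mechanism explicit:

    import numpy as np

    rng = np.random.default_rng(1)

    def disconnected(n_sites, lam, P, Q):
        """Process Y: at every site, independent Poisson(lam*P[a]) black and
        Poisson(lam*Q[b]) white counts for each label (independent labelling
        of two independent Poisson(lam) populations)."""
        black = rng.poisson(lam * np.asarray(P), size=(n_sites, len(P)))
        white = rng.poisson(lam * np.asarray(Q), size=(n_sites, len(Q)))
        return black, white

    def connected(n_sites, lam, Joint, L, d):
        """Process Z: each black particle at site i is linked to a white
        particle at site i + j*L, j uniform in {1,...,d}; the label pair
        (a, b) of the couple is drawn from the joining Joint."""
        Joint = np.asarray(Joint, dtype=float)
        n_labels = Joint.shape[0]
        probs = Joint.ravel() / Joint.sum()
        black = np.zeros((n_sites, n_labels), dtype=int)
        white = np.zeros((n_sites, n_labels), dtype=int)
        for i in range(n_sites):
            for _ in range(rng.poisson(lam)):
                j = rng.integers(1, d + 1)                   # jump in {L,...,dL}
                a, b = np.unravel_index(rng.choice(n_labels ** 2, p=probs),
                                        Joint.shape)
                black[i, a] += 1
                if i + j * L < n_sites:                      # partner in window
                    white[i + j * L, b] += 1
        return black, white

    # A joining of P = (0.5, 0.5) and Q = (0.3, 0.7):
    Joint = [[0.2, 0.3], [0.1, 0.4]]
    bY, wY = disconnected(10_000, 1.0, [0.5, 0.5], [0.3, 0.7])
    bZ, wZ = connected(10_000, 1.0, Joint, L=5, d=100)
    print(bY.mean(axis=0), bZ.mean(axis=0))   # both ~ lam*P
    print(wY.mean(axis=0), wZ.mean(axis=0))   # both ~ lam*Q

Site by site, the two processes have the same marginal counts; Lemma 3.2 below states that, as full processes, they are $\bar d$-close when $d$ is large.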

Lemma 3.2.

For any $\varepsilon > 0$, if $d$ is large enough (whatever the value of $L$), then $\bar d(\mathbf{Y}, \mathbf{Z}) < \varepsilon$.

Proof of Lemma 3.2, simple case.

We first prove the lemma in the case where $\mathcal{A}$ is reduced to a singleton. Since all particles have the same label, we just forget it and simply count the numbers $Z_i^B$ and $Z_i^W$ of black and white particles on each site.

We now prove that if $d$ is large enough,

$$ \operatorname{dist}\Big( \mathcal{L}\big( (Z_0^B, Z_0^W) \mid (Z_j^B, Z_j^W)_{j<0} \big),\ \mathrm{Poisson}(\lambda)\otimes\mathrm{Poisson}(\lambda) \Big) < \varepsilon $$

with probability $1-\varepsilon$ on the past; by the criterion (1) and stationarity, this gives the announced bound on $\bar d(\mathbf{Y}, \mathbf{Z})$. Since $Z_0^B$ is Poisson distributed with parameter $\lambda$ and independent from the past, and since $Z_0^B$ is independent from $Z_0^W$ conditionally to the past, it is enough to prove that if $d$ is large enough,

$$ \operatorname{dist}\Big( \mathcal{L}\big( Z_0^W \mid (Z_j^B, Z_j^W)_{j<0} \big),\ \mathrm{Poisson}(\lambda) \Big) < \varepsilon $$

with probability $1-\varepsilon$ on the past.

In fact we rather condition with respect to an enriched past (see Figure 2): Assume that besides the number of black and white particles on each site $j < 0$, we also know which black particles are linked to which white particles among them.

Figure 2. The enriched past: to the left of site $-dL$, no black particle is free, hence none has any influence on site 0; to the right of site $-L$, all black particles are free but have no influence on site 0; in between, free black particles may be linked to a white particle at site 0.

From what we know, we can distinguish two kinds of black particles on negative sites: Those which are linked to a white particle lying on the left of 0, and those, called free particles, whose white particle’s position is unknown. Observe that only free particles may have some influence on $Z_0^W$. Hence, black particles lying on the left of site $-dL$ have no influence on $Z_0^W$, since they are not free. Black particles lying on the right of site $-L$ are free but nevertheless have no influence on $Z_0^W$, since a black particle is linked to a white particle at distance at least $L$.

So it remains to study the influence on $Z_0^W$ of free black particles at sites between $-dL$ and $-L$. Since a white particle at site 0 can only come from a black particle at a site of the form $-jL$, only these sites matter. For $1 \le j \le d$, let us denote by $F_j$ the number of free particles at site $-jL$. Any black particle at site $-jL$ has probability $(d-j+1)/d$ to be free (its white particle must lie at distance at least $jL$). Therefore, $F_j$ follows the Poisson distribution with parameter $\lambda(d-j+1)/d$.

Fix $1 \le j \le d$. Assume there is a free particle at site $-jL$. Since there are $d-j+1$ possible positions for its white particle (the sites $0, L, \dots, (d-j)L$), the latter has probability $1/(d-j+1)$ to lie at site 0. Hence, conditionally to our enriched past, the number of white particles at site 0 can be written as

$$ Z_0^W = \sum_{j=1}^{d} \sum_{k=1}^{F_j} B_{j,k}, $$

where the $B_{j,k}$ are independent Bernoulli random variables with respective parameters $1/(d-j+1)$. The law of such a sum of independent Bernoulli variables is close to a Poisson distribution of parameter $\lambda$ as soon as the sum of the parameters is close to $\lambda$ and all parameters are small enough (see [2], Theorem 23.2 page 312). Therefore we can choose a large enough integer $K$, and $\varepsilon_0 > 0$ small enough, so that, if $W$ is a sum of independent Bernoulli random variables, each with parameter less than $1/K$ and such that the sum of the parameters is within $\varepsilon_0$ of $\lambda$, then

$$ \operatorname{dist}\big( \mathcal{L}(W),\ \mathrm{Poisson}(\lambda) \big) < \varepsilon. $$
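A quantitative version sufficient for this step (our addition; Le Cam's inequality can serve in place of the cited result) reads:

$$ \operatorname{dist}\big( \mathcal{L}(W),\ \mathrm{Poisson}(\lambda) \big) \;\le\; \sum_i p_i^2 \;+\; \Big| \sum_i p_i - \lambda \Big| \;\le\; \frac{1}{K}\sum_i p_i + \varepsilon_0 \;\le\; \frac{\lambda + \varepsilon_0}{K} + \varepsilon_0, $$

where the first term is Le Cam's bound on the total variation distance between $\mathcal{L}(W)$ and $\mathrm{Poisson}\big(\sum_i p_i\big)$, and the second accounts for replacing the parameter $\sum_i p_i$ by $\lambda$ (the total variation distance between two Poisson laws being at most the difference of their parameters). Both terms are small once $K$ is large and $\varepsilon_0$ is small.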

We now have to avoid bad configurations, that is, configurations of the enriched past which have free particles close to site $-dL$, giving rise to Bernoulli variables with large parameters, and configurations such that the sum of the parameters is not close enough to $\lambda$.

Control of the parameters’ size

We compute the probability that no free particle lies at the sites $-jL$ with $d-K < j \le d$ (these are precisely the sites whose free particles give rise to Bernoulli variables with parameter at least $1/K$):

$$ \mathbb{P}\big( F_j = 0 \text{ for all } j > d-K \big) = \exp\Big( -\lambda \sum_{j=d-K+1}^{d} \frac{d-j+1}{d} \Big) = \exp\Big( -\frac{\lambda K(K+1)}{2d} \Big). $$

Under this condition, $Z_0^W$ is (conditionally to the enriched past) a sum of independent Bernoulli variables with parameters smaller than $1/K$. If $d$ is large enough, this happens with probability larger than $1 - \varepsilon/2$.

Control of the parameters’ sum

Conditionally to the enriched past, the sum of the parameters is $\Sigma := \sum_{j=1}^{d} F_j/(d-j+1)$. Since the $F_j$ are independent and Poisson distributed with parameters $\lambda(d-j+1)/d$, the expected value of the sum of the parameters is

$$ \mathbb{E}[\Sigma] = \sum_{j=1}^{d} \frac{1}{d-j+1}\cdot\frac{\lambda(d-j+1)}{d} = \lambda, $$

and its variance is

$$ \operatorname{Var}(\Sigma) = \sum_{j=1}^{d} \frac{1}{(d-j+1)^2}\cdot\frac{\lambda(d-j+1)}{d} = \frac{\lambda}{d}\sum_{m=1}^{d}\frac{1}{m} \le \lambda\,\frac{1+\ln d}{d}. $$

Hence, if $d$ is large enough, by the Chebyshev inequality,

$$ \mathbb{P}\big( |\Sigma - \lambda| > \varepsilon_0 \big) \le \frac{\operatorname{Var}(\Sigma)}{\varepsilon_0^2} < \frac{\varepsilon}{2}. $$

Putting things together, we have proved that, with probability larger than $1-\varepsilon$ on the enriched past, the conditional distribution of the number of white particles at site 0 satisfies

$$ \operatorname{dist}\Big( \mathcal{L}\big( Z_0^W \mid \text{enriched past} \big),\ \mathrm{Poisson}(\lambda) \Big) < \varepsilon. $$

By the criterion (1), applied with conditioning on the enriched past, this proves Lemma 3.2 when $\mathcal{A}$ is reduced to a singleton. ∎

Proof of Lemma 3.2, general case.

We consider the family of independent processes $\mathbf{Z}(a,b)$, $(a,b) \in \mathcal{A}\times\mathcal{A}$, where $\mathbf{Z}(a,b)$ counts the numbers of black and white particles at each site belonging to a pair of black and white particles respectively labelled by $a$ and $b$. Then each $\mathbf{Z}(a,b)$ is a simple-case process, for which the expected number of black particles per site is $\lambda\Lambda(a,b)$. From the proof in the simple case, we know that as soon as $d$ is large enough, the $\bar d$-distance between $\mathbf{Z}(a,b)$ and $\mathbf{Y}(a,b)$ is smaller than $\varepsilon/|\mathcal{A}|^2$, where $\mathbf{Y}(a,b)$ is composed of two independent i.i.d. sequences of Poisson random variables of parameter $\lambda\Lambda(a,b)$. We can recover $\mathbf{Z}$ from this family by

$$ Z_i^B(a) = \sum_{b\in\mathcal{A}} Z_i^B(a,b), \qquad Z_i^W(b) = \sum_{a\in\mathcal{A}} Z_i^W(a,b). $$

On the other hand, $\sum_b Y_i^B(a,b)$ (respectively $\sum_a Y_i^W(a,b)$) has the same distribution as $Y_i^B(a)$ (respectively $Y_i^W(b)$): It is an i.i.d. sequence of Poisson random variables of parameter $\lambda P(a)$ (respectively $\lambda Q(b)$). Summing the $\bar d$-estimates over $a$ and $b$, it follows that $\bar d(\mathbf{Y}, \mathbf{Z}) < \varepsilon$. ∎
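Remark (our formulation). The final summation step rests on the following elementary property of $\bar d$: if $(\mathbf{U}(c))_{c\in C}$ and $(\mathbf{V}(c))_{c\in C}$ are two finite families of processes, each made of mutually independent components, then for the merged processes,

$$ \bar d\Big( \big(\mathbf{U}(c)\big)_{c\in C},\ \big(\mathbf{V}(c)\big)_{c\in C} \Big) \;\le\; \sum_{c\in C} \bar d\big( \mathbf{U}(c), \mathbf{V}(c) \big). $$

Indeed, taking the product over $c\in C$ of nearly optimal stationary joinings yields a stationary joining of the merged processes, and a disagreement at site $i$ of the merged symbols forces a disagreement at site $i$ in at least one component; the bound then follows from the characterization of $\bar d$ through stationary joinings and a union bound.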

4. Positive Poisson entropy

We denote by $\mathbf{Y} = (Y_k)_{k\in\mathbb{Z}}$ the stationary process living in the Poisson suspension of our transformation $T$, defined by

$$ Y_k := N\big([0,1)\big) \circ T_*^{\,k}, \qquad k\in\mathbb{Z}, $$

where, as in Section 1.1, $N(A)$ denotes the number of particles in $A$.

The purpose of this section is to show that the entropy of the process $\mathbf{Y}$ is positive as soon as the $d_n$'s are chosen large enough. This will be proved by showing that the $\bar d$-distance between $\mathbf{Y}$ and an i.i.d. sequence of Poisson random variables with parameter 1 can be made as small as we want. By Lemma 3.1, this will be enough to conclude.

Our strategy is the following: As one goes along in the construction of the return time to $[0,1)$, we define a sequence $(T^{(n)})_{n\ge0}$ of infinite-measure-preserving transformations, approximating the final transformation $T$. Then we consider the process $\mathbf{Y}^{(n)}$ living in the Poisson suspension of $T^{(n)}$:

$$ Y^{(n)}_k := N\big([0,1)\big) \circ \big(T^{(n)}_*\big)^{k}, \qquad k\in\mathbb{Z}. $$

Let $\varepsilon > 0$ be fixed. The transformation $T^{(0)}$ is constructed by stacking infinitely many pairwise disjoint intervals of length 1, $[0,1)$ being one of them, into a doubly infinite tower, each interval being mapped onto the one just above. Therefore, as mentioned previously, $\mathbf{Y}^{(0)}$ is an i.i.d. sequence of Poisson variables with parameter 1. At step $n$ of the construction, we define the return time to $[0,1)$ on the subset $D_n$. This return time will be the same for all transformations $T^{(m)}$, $m \ge n$, and for the final transformation $T$. By choosing the return time adequately, we will make sure that

$$ \bar d\big( \mathbf{Y}^{(n)}, \mathbf{Y}^{(n-1)} \big) \le \frac{\varepsilon}{2^n}, $$

so that for all $n$, by the triangle inequality,

(2)  $\bar d\big( \mathbf{Y}^{(n)}, \mathbf{Y}^{(0)} \big) \le \varepsilon.$

Let us describe the first step. Recall that Tower 1 is of height 2, with basis $D_1 = [0,1/2)$ (see Figure 1). We cut $D_1$ into $d_1$ equal subintervals, and we define on $D_1$ the return time to $[0,1)$ to be $jL_1$ on the $j$-th subinterval. We insert Tower 1 into a doubly infinite tower of intervals of length 1/2, and add spacers between $D_1$ and its image by the odometer $S$: We insert $jL_1 - 1$ spacers of width $1/(2d_1)$ between the $j$-th subinterval of $D_1$ and its image by $S$. The transformation $T^{(1)}$ is defined by mapping each point to the point right above it.

The process $\mathbf{Y}^{(1)}$ counts the number of particles in $[0,1)$ at each time for the Poisson suspension over $T^{(1)}$: We interpret points in $[0,1/2)$ as black particles and points in $[1/2,1)$ as white particles. For the suspension over $T^{(0)}$, black and white particles are independent, whereas for the suspension over $T^{(1)}$, they are linked through the return time to $[0,1)$ of the point corresponding to the black particle. Hence, a direct application of Lemma 3.2 in the simple case with no alphabet, with $\lambda = 1/2$, $L = L_1$ and $d = d_1$, gives $\bar d(\mathbf{Y}^{(1)}, \mathbf{Y}^{(0)}) \le \varepsilon/2$ when $d_1$ is large enough.
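In the notation of Lemma 3.2, this first step can be summarized as follows (our paraphrase): with $\lambda = 1/2$, $L = L_1$, $d = d_1$ and a trivial alphabet,

$$ Y^{(1)}_k = Z^B_k + Z^W_k, \qquad Y^{(0)}_k \overset{(d)}{=} Y^B_k + Y^W_k \sim \mathrm{Poisson}(1), $$

since every visit of a particle to $[0,1)$ is either a visit to $[0,1/2)$ (a black particle) or to $[1/2,1)$ (a white one), and merging the black and white counts at each site cannot increase the $\bar d$-distance.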

Figure 3. First steps (Step 0, Step 1, Step 2) in the construction of the return time to $[0,1)$.

Suppose the return time to $[0,1)$ has already been defined on all $D_m$, $m < n$. Consider Tower $n$. The return time to $[0,1)$ has already been defined on all rungs but the roof and $D_n$ (which is the rung of level $2^{n-1}$). For each point $x$ in $D_n$, let $\vec p(x)$ be the sequence of the return times to $[0,1)$ read when we climb the first half of Tower $n$ before reaching $D_n$, and let $\vec f(x)$ be the sequence of the return times to $[0,1)$ read when we climb the second half of Tower $n$ starting from $D_n$. We denote by $\varphi_n(x)$ the return time to $[0,1)$ on $D_n$, which is to be defined at this step. We want $\varphi_n$ to be independent of $\vec p$ and $\vec f$. To this end, we consider the finite partition of $D_n$ generated by $\vec p$ and $\vec f$. Each atom of this partition is cut into $d_n$ equal pieces, and we define $\varphi_n$ to be $jL_n$ on the $j$-th piece of each atom. Here is how we define the transformation $T^{(n)}$: We insert Tower $n$ into a doubly infinite tower of intervals of length $2^{-n}$, and insert as many spacers as we need between the rungs of Tower $n$ to achieve the already defined return times to $[0,1)$. The transformation $T^{(n)}$ maps each point to the point right above it.

Let us turn to the estimation of $\bar d(\mathbf{Y}^{(n)}, \mathbf{Y}^{(n-1)})$. We want to apply Lemma 3.2: Black particles are points in $D_n$ and white particles are points in the rung just above $D_n$ (the rung of level $2^{n-1}+1$). Let $R_n$ be the maximum value of the already defined return time to $[0,1)$ on the rungs of Tower $n$. We consider the finite alphabet $\mathcal{A}_n := \{1,\dots,R_n\}^{2^{n-1}-1}$. To each point in $D_n$, we associate the label $\vec p$, the sequence of the return times to $[0,1)$ read when we climb the first half of Tower $n$ before reaching $D_n$. To each point in the rung of level $2^{n-1}+1$, we attach the label $\vec f$, which is the sequence of the return times to $[0,1)$ read when we climb the second half of Tower $n$ starting from $D_n$. Let the process $\mathbf{Z}$ (respectively $\mathbf{Y}$) count the numbers of black and white particles together with their label in the suspension over $T^{(n)}$ (respectively $T^{(n-1)}$). These processes are exactly of the form studied in Lemma 3.2, with $\lambda = 2^{-n}$, $L = L_n$, $d = d_n$, and $\Lambda$ the joint distribution of $(\vec p, \vec f)$ on $D_n$.

Now, observe that we can recover $\mathbf{Y}^{(n)}$ from $\mathbf{Z}$, and $\mathbf{Y}^{(n-1)}$ from $\mathbf{Y}$, by the same deterministic rule: a black particle at site $t$ with label $\vec p = (p_1,\dots,p_{2^{n-1}-1})$ accounts for one visit to $[0,1)$ at each of the times

$$ t,\quad t - p_{2^{n-1}-1},\quad t - p_{2^{n-1}-1} - p_{2^{n-1}-2},\quad \dots,\quad t - p_{2^{n-1}-1} - \dots - p_1, $$

and a white particle at site $s$ with label $\vec f = (f_1,\dots,f_{2^{n-1}-1})$ accounts for one visit to $[0,1)$ at each of the times

$$ s,\quad s + f_1,\quad s + f_1 + f_2,\quad \dots,\quad s + f_1 + \dots + f_{2^{n-1}-1}. $$

It follows that, if $\mathbf{Z}$ and $\mathbf{Y}$ coincide on a window of length $w_n := 2^{n-1}R_n$ around site $k$, then the recovered values at time $k$ coincide: the recovery rule is a sliding block code of window length at most $w_n$, so that $\bar d(\mathbf{Y}^{(n)}, \mathbf{Y}^{(n-1)}) \le w_n\, \bar d(\mathbf{Z}, \mathbf{Y})$. By Lemma 3.2, $\bar d(\mathbf{Z}, \mathbf{Y})$ can be made arbitrarily small by choosing $d_n$ large enough. Hence we can assure that $\bar d(\mathbf{Y}^{(n)}, \mathbf{Y}^{(n-1)}) \le \varepsilon/2^n$.

Finally, note that since $(L_n)$ is increasing, the return time to $[0,1)$ on the roof of Tower $n$ (which is the union of the $D_j$, $j > n$) is everywhere at least $L_{n+1}$. Hence, for any $m$, if $n$ is large enough so that $L_{n+1} > 2m$, the distribution of $(Y_{-m},\dots,Y_m)$ coincides with the distribution of $(Y^{(n)}_{-m},\dots,Y^{(n)}_m)$: indeed, within a time window of length $2m+1$, the visits of a particle to $[0,1)$ come from a single climb of Tower $n$, and these climbs are governed by the same return times under $T$ and under $T^{(n)}$. Therefore, by (2), for any $m$,

$$ \bar d_{2m+1}\Big( (Y_{-m},\dots,Y_m),\ (Y^{(0)}_{-m},\dots,Y^{(0)}_m) \Big) \le \varepsilon, $$

which implies

$$ \bar d\big( \mathbf{Y}, \mathbf{Y}^{(0)} \big) \le \varepsilon. $$

Since $\mathbf{Y}^{(0)}$ is an i.i.d. sequence of Poisson random variables with parameter 1, which has finite positive entropy, Lemma 3.1 shows that the entropy of $\mathbf{Y}$, hence that of the Poisson suspension of $T$, is positive provided $\varepsilon$ has been chosen small enough. This completes the proof of Theorem 1.1.

5. Comments and open questions

In view of previously known results on the subject, some comments on the infinite-measure-preserving transformation constructed in Section 2 may be made.

First, although its construction is derived from the standard cutting-and-stacking procedure used to build the most elementary rank-one system (the Von Neumann-Kakutani odometer), the transformation $T$ is not even of finite rank. Indeed, Proposition 10.1 in [3] shows that for finite-rank systems, both the Poisson and the Krengel entropy vanish.

Second, it was also proved in [3] that Krengel and Poisson entropies coincide for quasi-finite transformations, namely transformations for which there exists a sweep-out set of measure 1 such that the return-time partition of this set has finite entropy. There exist only a few examples of transformations for which non-quasi-finiteness has been established: an unpublished example constructed by Ornstein is mentioned by Krengel in [5], and the only published example which we are aware of is a rank-one system, given by Aaronson and Park in [1]. Our construction thus provides a new example of a non-quasi-finite transformation.


After having proved that the different notions of entropy for infinite-measure-preserving transformations do not always coincide, a natural question is whether they are always ordered in the same way: Is it true that Poisson entropy always dominates Krengel entropy? Can we at least decide whether zero Poisson entropy implies zero Krengel entropy? And what about similar questions regarding the comparison between Parry entropy and Poisson entropy? It may be worth recalling here that the equality of Parry entropy and Krengel entropy in the quasi-finite case was proved by Parry in 1969 [7], but the question whether they always coincide is, as far as we know, still open.

Acknowledgements

The construction of the transformation in Section 2 has been inspired by a private communication of Benjamin Weiss concerning the Shannon-McMillan-Breiman theorem.

We are also much indebted to Emmanuel Roy for stimulating conversations on the subject.

References

  1. Jon Aaronson and Kyewon Koh Park, Predictability, entropy and information of infinite transformations, Fund. Math. 206 (2009), 1–21.
  2. Patrick Billingsley, Probability and measure, second ed., Wiley Series in Probability and Mathematical Statistics: Probability and Mathematical Statistics, John Wiley & Sons Inc., New York, 1986.
  3. Élise Janvresse, Tom Meyerovitch, Emmanuel Roy, and Thierry de la Rue, Poisson suspensions and entropy for infinite transformations, Trans. Amer. Math. Soc. 362 (2010), no. 6, 3069–3094.
  4. Ulrich Krengel, Entropy of conservative transformations, Z. Wahrscheinlichkeitstheorie und Verw. Gebiete 7 (1967), 161–181.
  5. Ulrich Krengel, On certain analogous difficulties in the investigation of flows in a probability space and of transformations in an infinite measure space, Functional Analysis (Proc. Sympos., Monterey, Calif., 1969), Academic Press, New York, 1969, pp. 75–91.
  6. Donald S. Ornstein, Ergodic theory, randomness, and dynamical systems, Yale University Press, New Haven, Conn., 1974, James K. Whittemore Lectures in Mathematics given at Yale University, Yale Mathematical Monographs, No. 5.
  7. William Parry, Entropy and generators in ergodic theory, W. A. Benjamin, Inc., New York-Amsterdam, 1969.
  8. Emmanuel Roy, Mesures de Poisson, infinie divisibilité et propriétés ergodiques, Ph.D. thesis, 2005.
  9. Paul C. Shields, The ergodic theory of discrete sample paths, Graduate Studies in Mathematics, vol. 13, American Mathematical Society, Providence, RI, 1996.