Finite, integrable and bounded time embeddings for diffusions
Abstract
We solve the Skorokhod embedding problem (SEP) for a general time-homogeneous diffusion: given a distribution, we construct a stopping time such that the stopped process has that distribution. Our solution method makes use of martingale representations (in a similar way to Bass (In Séminaire de Probabilités XVII. Lecture Notes in Math. 986 (1983) 221–224, Springer), who solves the SEP for Brownian motion) and draws on law uniqueness of weak solutions of SDEs.
Then we ask whether there exist solutions of the SEP which are respectively finite almost surely, integrable or bounded, and when our proposed construction has these properties. We provide conditions that guarantee the existence of finite-time solutions. Then, we fully characterize the distributions that can be embedded with integrable stopping times. Finally, we derive necessary conditions, and sufficient conditions, under which there exists a bounded embedding.
Volume 21, Issue 2 (2015), pages 1067–1088. DOI: 10.3150/14-BEJ598
Stefan Ankirchner (ankirchner@hcm.uni-bonn.de), David Hobson (D.Hobson@warwick.ac.uk) and Philipp Strack (pstrack@uni-bonn.de)
Keywords: bounded time embedding; Skorokhod's embedding theorem
Introduction
Let be a one-dimensional time-homogeneous diffusion, and let be a probability measure on . The Skorokhod embedding problem (SEP) for in is to find a stopping time such that . Our main goals in this article are firstly to construct a solution of the Skorokhod embedding problem, and secondly to discuss when there exists a solution which is finite, integrable or bounded in time, and when our construction has these properties.
Our construction is based on the observation that we can remove the drift of the time-homogeneous diffusion by changing the space variable via a scale function. We can thus simplify the embedding problem to the case where is a local martingale diffusion. We then consider a random variable that has the distribution we want to embed and that can be represented as a Brownian martingale on the time interval . Further, we set up an ODE that uniquely determines a time-change along every path of . We then show, by drawing on a result of uniqueness in law for weak solutions of SDEs, that the time-changed diffusion has the same distribution as the martingale . Thus the time-change provides a solution of the SEP.
Our solution is a generalization of Bass’s solution of the SEP for Brownian motion (see [3]). Bass also starts with the martingale representation of a random variable with the given distribution. By changing the martingale’s clock, he obtains a Brownian motion and an associated embedding stopping time. The time-change is governed by an ODE, a special case of our ODE, which establishes an analytic link between Brownian paths and the embedding stopping time. This link yields embedding stopping times for arbitrary Brownian motions.
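Since the displayed formulas were lost in extraction, the following Monte Carlo sketch uses notation we supply ourselves, and is an illustration of the Bass-type construction rather than the paper's exact scheme. For the target law $\mu$ uniform on $[-1,1]$ embedded in standard Brownian motion, the Bass function $g = F_\mu^{-1}\circ\Phi$ has the closed-form heat extension $a(t,x) = 2\Phi(x/\sqrt{2-t}) - 1$, and the embedding time is obtained by Euler-solving a time-change ODE along each Brownian path.

```python
import numpy as np
from scipy.stats import norm

def psi(t, y):
    """Squared diffusion coefficient of the martingale M_t = a(t, B_t),
    expressed as a function of the martingale's value y (our notation)."""
    z = norm.ppf(np.clip((y + 1.0) / 2.0, 1e-12, 1.0 - 1e-12))
    return (2.0 * norm.pdf(z)) ** 2 / (2.0 - t)

rng = np.random.default_rng(0)
n_paths, n_steps = 4000, 1000
dt = 1.0 / n_steps

# Brownian paths on [0, 1]; for this target the time-change stays well below 1.
W = np.zeros((n_paths, n_steps + 1))
W[:, 1:] = np.cumsum(rng.normal(0.0, np.sqrt(dt), (n_paths, n_steps)), axis=1)

def W_at(Lam):
    """Linear interpolation of each path at its own (random) time Lam[i]."""
    idx = np.minimum((Lam / dt).astype(int), n_steps - 1)
    frac = Lam / dt - idx
    rows = np.arange(n_paths)
    return W[rows, idx] * (1.0 - frac) + W[rows, idx + 1] * frac

Lam = np.zeros(n_paths)
for k in range(n_steps):        # Euler scheme for Lam'(t) = psi(t, W_{Lam(t)})
    Lam += dt * psi(k * dt, W_at(Lam))

W_tau = W_at(Lam)               # W stopped at tau = Lam(1): should be ~ Uniform[-1,1]
print(W_tau.mean(), W_tau.var(), Lam.mean())   # near 0, 1/3, 1/3
```

The sample mean and variance of the stopped values should approximate those of the uniform law, and the mean of the stopping time should approximate the target variance $1/3$, consistent with Wald's identity for an integrable embedding.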
Now consider properties of solutions of the SEP. As is well known from the literature, whether a distribution is embeddable into the diffusion depends on the relation between the support of the distribution and the state space of and the relation between the initial value and the first moment of the distribution. We include in our analysis a general discussion of sufficient and necessary conditions for the existence of finite embedding stopping times, with particular reference to our proposed construction.
Next, we fully determine the collection of distributions that can be embedded in with integrable stopping times. The associated conditions involve an integrability condition on the target distribution which makes use of a function that also appears in Feller’s test for explosions (see, e.g., [10]).
Finally, we address the question of whether a distribution can be embedded in bounded time. Recall that the Root solution ([18]) of the SEP has the property that it minimises uniformly in . The Root solution is of the form where is a ‘barrier’; in particular for some suitably regular function depending on the target law. Hence, a necessary and sufficient condition for there to be an embedding with is that . However, the Root barrier is non-constructive and difficult to analyse (though for some recent progress in this direction see Cox and Wang [7] and Oberhauser and Dos Reis [13]). For this reason, instead of searching for a single set of necessary and sufficient conditions, we limit ourselves to finding separate sets of necessary conditions and sufficient conditions.
Our original motivation in developing a solution of the SEP for diffusions was to study bounded stopping times with the aim of providing simple sufficient conditions for the existence of a bounded embedding. The boundedness (finiteness) of an embedding is an important property of the embedding used to solve the gambling in contests problem of Seel and Strack [19], and is also relevant in the model-independent pricing of variance swaps, see Carr and Lee [4], Hobson [9] and Cox and Wang [7].
Consider for a moment the case where is a real-valued Brownian motion, null at 0. Then it is possible to embed any target probability measure in . Moreover, can be embedded in integrable time if and only if is centred and in , and then . The case of embeddings in bounded time is more subtle. Clearly a necessary condition for there to exist an embedding of in such that is that is smaller than in convex order, where is the law of a standard Gaussian. But this is not sufficient. Let be the uniform measure on . Then is smaller than in convex order if and only if . But any embedding of has , and thus does not satisfy for any . Hence, we would like to find sufficient conditions on such that there exists with . The case where is Brownian motion, possibly with drift, was considered in Ankirchner and Strack [1]. Here we consider general time-homogeneous diffusions.
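For reference, the classical facts alluded to in this paragraph can be written out explicitly, in notation we supply here since the symbols were stripped: for a standard Brownian motion $B$, a centred target law $\mu$ with finite second moment, and an integrable embedding $\tau$,

```latex
\mathbb{E}[\tau] \;=\; \int_{\mathbb{R}} x^2 \, \mu(\mathrm{d}x),
```

while a necessary condition for an embedding with $\tau \le t$ almost surely is the convex-order comparison $\int f \,\mathrm{d}\mu \le \int f \,\mathrm{d}N(0,t)$ for all convex $f$, where $N(0,t)$ is the centred Gaussian law with variance $t$; as the uniform example above shows, this condition is not sufficient.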
The paper is organized as follows. In Section 1, we describe our solution method for the SEP for a diffusion without drift. In this section, we assume that the initial value of the diffusion coincides with the first moment of the distribution to embed (the centred case). In the following Section 2, we briefly explain how to construct solutions if the first moment does not match the initial value (the non-centred case). In Section 3, we collect some general conditions which guarantee that a distribution can be embedded into in finite time. We then consider integrable embeddings in Section 4. Distinguishing between the centred and non-centred cases, we provide sufficient and necessary conditions for the existence of integrable solutions of the SEP. Section 5 discusses bounded embeddings. In Section 6, we explain how one can reduce the SEP for diffusions with drift to the case without drift. Finally, in Section 7 we illustrate our results with some examples.
1 The martingale case
We will argue in Section 6 below that the problem of interest can be reduced to the case in which the process is a continuous local martingale. In this section, we describe a generalisation of the Bass [3] solution of the SEP. The Bass solution is an embedding of in Brownian motion: we consider embeddings in a local martingale diffusion which may be thought of as time-changed Brownian motion.
Consider the time-homogeneous local martingale diffusion , where solves
(1) 
here and is Borel-measurable. We assume that the set of points with coincides with the set of points where is not locally square integrable. Then a result by Engelbert and Schmidt implies that the SDE (1) possesses a weak solution that is unique in law (see, e.g., Theorem 5.5.4 in [10]). We define and so that (to exclude trivialities we assume ) and for ,
(2) 
By our assumption on , is infinite on and .
By Feller’s test, for one, and then every, if and only if . Similarly, if and only if (see, e.g., Theorem 5.5.29 in [10]). Further, by results of Kotani [11], the local martingale is a martingale provided either or and either or .
Note that our assumption that is not locally square integrable at and implies that and are absorbing boundaries if they can be reached in finite time. Then without loss of generality we may assume that on and and is positive on .
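Because the formula in display (2) was lost in extraction, the following sketch adopts one standard convention, which is ours and may differ from the paper's by a constant: for the driftless diffusion $\mathrm{d}X = \eta(X)\,\mathrm{d}W$, set $v(x) = 2\int_{x_0}^{x}(x-y)\,\eta(y)^{-2}\,\mathrm{d}y$; Feller's test then says that a boundary point can be reached in finite time if and only if $v$ stays finite there.

```python
from scipy.integrate import quad

def v(x, x0, eta):
    """Feller-type test function v(x) = 2 * int_{x0}^{x} (x - y)/eta(y)^2 dy
    (our convention); abs() handles both x >= x0 and x < x0."""
    lo, hi = min(x, x0), max(x, x0)
    val, _ = quad(lambda y: 2.0 * abs(x - y) / eta(y) ** 2, lo, hi, limit=200)
    return val

# Brownian motion (eta = 1): v(x) = x^2, finite for every real x but infinite
# at the boundaries +-infinity, so these are not reached in finite time.
print(v(3.0, 0.0, lambda y: 1.0))            # v(3) = 3^2 = 9

# Driftless geometric Brownian motion (eta(y) = y on (0, inf)): v blows up at
# 0, so although X_t -> 0 almost surely, the boundary 0 is never hit in
# finite time.
print([round(v(x, 1.0, lambda y: y), 3) for x in (0.5, 0.1, 0.01, 0.001)])
```

For the second example one can check by hand that $v(x) = 2(-\ln x - (1-x))$, which grows without bound as $x \downarrow 0$, matching the printed values.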
We want to embed a non-Dirac probability measure with . Let and be the extremes of the support of , and let be the distribution function associated with the target law . Moreover, let be the cumulative distribution function of the normal distribution and its density. Define the function . Let be a Brownian motion on a filtration . Notice that has the distribution . In particular, is integrable and .
We define the martingale for . Notice that , has distribution and , where
and is the density of the normal distribution with variance .
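The stripped formulas in this passage can be reconstructed in standard Bass notation (ours; compare [3]): with $F_\nu$ the target distribution function and $\Phi$ the standard normal distribution function,

```latex
g := F_\nu^{-1}\circ \Phi, \qquad
M_t := \mathbb{E}\bigl[g(B_1)\,\big|\,\mathcal{F}_t\bigr] = a(t,B_t), \qquad
a(t,x) := \int_{\mathbb{R}} g(x+y)\,\varphi_{1-t}(y)\,\mathrm{d}y,
```

where $\varphi_{1-t}$ denotes the density of the normal distribution with mean $0$ and variance $1-t$. In particular $M_1 = g(B_1)$ has law $\nu$, which is the martingale representation the construction starts from.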
Since is not a Dirac measure we have that is increasing somewhere, and hence, for all , the mapping is strictly increasing. Thus, we can define the inverse function implicitly by
(3) 
moreover we set . The process solves the SDE
(4) 
Define
(5) 
The candidate embedding which we want to discuss is where solves
(6) 
Note that is increasing so that if is defined on then we can set .
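One consistent way to fill in the stripped displays (5) and (6), in the notation of the Bass set-up above and writing $\eta$ for the diffusion coefficient of $X$, is the following; this is our reconstruction, chosen to match the quadratic-variation computation in the proof of Theorem 1, and the paper's own symbols may differ:

```latex
b(t,\cdot) := a(t,\cdot)^{-1}, \qquad
\psi(t,y) := \bigl(a_x(t, b(t,y))\bigr)^2,
```

so that $M$ is a weak solution of $\mathrm{d}M_t = \sqrt{\psi(t,M_t)}\,\mathrm{d}\beta_t$, while the time-change solves

```latex
\Gamma_0 = 0, \qquad
\frac{\mathrm{d}\Gamma_t}{\mathrm{d}t} = \frac{\psi(t, X_{\Gamma_t})}{\eta(X_{\Gamma_t})^2},
```

and the candidate embedding is $\tau := \Gamma_1$: matching quadratic variations shows that $(X_{\Gamma_t})_{t\le 1}$ solves the same SDE as $(M_t)_{t\le 1}$, which is what uniqueness in law then exploits.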
Theorem 1
If the ODE (6) has a solution on for almost all paths of , then embeds into , that is, the law of is .
Let for all . By interchanging the time-change and integration, see, for example, Proposition V.1.5 in [15], we get
Let , for . Notice that (Proposition V.1.5 in [15]) and then by Lévy’s characterization theorem, is a Brownian motion on . Next, observe that
which shows that solves the SDE (4) with replaced by ; in other words is a weak solution of (4).
It follows directly from Lemma 2(a) in Bass [3] that is Lipschitz continuous in , uniformly in , on compact subsets of . Therefore, the SDE (4) has at most one strong solution on and hence (4) is pathwise unique, from which it follows (see, e.g., Section 5.3 in [10]) that solutions of (4) are unique in law. Hence, for , has the same distribution as , and passing to the limit as tends to 1 we have and hence has law .
Notice that the assumption that is crucial for the conclusion of Theorem 1. Indeed, if , then and solve the same SDE, but with different initial conditions. Hence, one cannot derive that has the same distribution as .
We next aim to show that is a stopping time with respect to , the smallest filtration containing the filtration generated by the martingale and satisfying the usual conditions. To this end we consider, as in [3], the ODE satisfied by the inverse of . The ODE for the inverse is Lipschitz continuous and hence guarantees that Picard iterations converge to a unique solution.
Lemma 1
Assume that there exists a solution of (6) on . Set and define for all . Then a straightforward calculation shows that satisfies (8).
The reverse direction can be shown similarly.
Lemma 2
Suppose the ODE (6) has a solution on for almost all paths of . Then is an stopping time, for all .
Let be a compact subset of . By Lemma 2 of Bass [3], and are Lipschitz continuous on . Moreover, on the function is bounded away from zero and bounded from above. This implies that is Lipschitz continuous on , too.
Define the mapping . Now let be a compact subset of . Then there exists an such that for all and we have
that is, is Lipschitz continuous in the second argument.
We define the Picard iterations and for ,
We have that after the first time where attains . The assumptions on guarantee that is finite, a.s. for each (see, e.g., Section 5.5 in [10]). By standard arguments, one can now show that converges to a limit on , where for all . In particular, is measurable; moreover solves the ODE (8) on .
Now let and . Observe that
The RHS is measurable, which implies that is an stopping time. The limit is also an stopping time.
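The Picard scheme used in this proof can be illustrated on a toy Lipschitz ODE; the code below is an illustration only (the integrand is not the paper's, and the function names are ours), showing how the iterates $x_{k+1}(t) = x_0 + \int_0^t f(s, x_k(s))\,\mathrm{d}s$ converge.

```python
import numpy as np

def picard(f, x0, T=1.0, n=2000, n_iter=25):
    """Picard iteration for x'(t) = f(t, x(t)), x(0) = x0, on a uniform grid,
    with the integral evaluated by the trapezoid rule."""
    t = np.linspace(0.0, T, n + 1)
    x = np.full(n + 1, float(x0))              # zeroth iterate: the constant x0
    for _ in range(n_iter):                    # x_{k+1}(t) = x0 + int_0^t f(s, x_k(s)) ds
        g = f(t, x)
        steps = 0.5 * (g[1:] + g[:-1]) * np.diff(t)
        x = x0 + np.concatenate([[0.0], np.cumsum(steps)])
    return t, x

# Toy Lipschitz example: x' = x, x(0) = 1, whose solution is e^t.
t, x = picard(lambda s, y: y, 1.0)
print(abs(x[-1] - np.e))                       # small: iterates converge to e^t
```

Because the right-hand side is Lipschitz, the iterates contract at rate $T^k/k!$, which is the mechanism behind the measurability argument above: each iterate is an explicit measurable functional of the path.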
Lemma 3
There exists a solution of (6) on for almost all paths of if and only if , a.s. for all . In this case, has the same distribution as .
For all let and . By appealing to uniqueness in law of solutions of (4) one can show, as in the proof of Theorem 1, that and have the same distribution. Moreover, and have the same distribution, and therefore, if and only if .
Recall (Monroe [12]) that a solution of the SEP for in is minimal if whenever is a solution of the SEP for in such that then almost surely. The following result shows that is minimal, provided it exists. In particular, the Bass embedding [3] is minimal.
Proposition 1
Suppose almost surely, for every , or equivalently almost surely for each . Then is a minimal embedding of in .
We have is uniformly integrable (UI). Since, by construction , it follows that is UI. But and for some timechange and some Brownian motion and hence is UI. Monroe [12, Theorem 3] proves that in the Brownian case, if is an embedding of in a Brownian motion and if then is minimal if and only if is UI. Hence, is minimal for in . Since is increasing we can conclude that is minimal for in .
Theorem 2
Suppose . Recall and suppose and . Then exists and is a minimal embedding of in .
For , and since is locally square integrable almost surely. Hence, exists and is finite for each and has law .
Suppose places mass outside . Then it is clear that it is not possible to embed in using any embedding. To see that this holds true for , suppose . Then for each we have and there exists a continuous function such that for . Then, for all such that . Since the set has positive probability, explodes before time 1 with positive probability also.
Henceforth, we will assume that places no mass outside .
Recall that we have assumed that we are given a diffusion with , and that the target measure satisfies and . We call this the centred case. In the next section, we consider what happens if we relax this assumption.
In the case where but , we introduce an embedding which involves running the martingale until it first hits and then using the stopping time defined above, but for started at .
Then in subsequent sections we will ask when there exists a finite (respectively integrable, bounded) embedding, and when , or more generally , has this property.
2 The non-centred case
In this section, we do not assume that and that .
When at least one of and is finite we write . Note that we assume that has support in the state space of .
Proposition 2 (Pedersen and Peskir [14], Cox and Hobson [5])
Suppose . Then for there to be an embedding of in we must have that . In this case is a uniformly integrable martingale.
Suppose . Then there exists an embedding of in if and only if . Conversely, if there exists an embedding of in if and only if .
Finally, suppose . Then we can embed any distribution in .
In the bounded case, the fact that is a bounded local martingale gives that it is a UI martingale, and hence .
For the second case, the upper bound on the state space means that is a submartingale, so that the condition is necessary. Then provided we can run until it first reaches . Note that hits in finite time by the argument in Karatzas and Shreve [10], Section 5.5.C. Then we can embed using the local martingale started from (using, for example, the time defined above, or the Azéma–Yor construction as in Pedersen and Peskir [14]). If is infinite, then we need a different construction, see, for example, Cox and Hobson [5].
For the final case, any distribution can be embedded in . If then we can run until it hits and then consider an embedding for the local martingale started at the mean of the target distribution. If , then we can use the construction in [5], but not the one in this paper.
Let be the first hitting time of by , and more generally let be the first hitting time of by a stochastic process . Suppose and let be the stopping time constructed in the previous section to embed in the timehomogeneous diffusion started at . Then let . By the results of the proposition, provided and both if and if , then is an embedding of .
3 Finite embeddings
3.1 The centred case
Suppose and .
Proposition 3

(i) If , does not hit in finite time and , or if , does not hit in finite time and , then any embedding of has with positive probability.

(ii) Otherwise, either , or does not hit in finite time and , or can hit in finite time, and either , or does not hit in finite time and , or can hit in finite time. Then if is an embedding of we have that is also an embedding of and is finite almost surely.
(i) Suppose is any embedding of in . Then on the set where . Moreover, this set has positive probability by assumption.
(ii) If is an embedding of , then converges almost surely, even on the set . However, if then by the Rogozin trichotomy (see [17]), and does not converge. Hence, we must have .
Otherwise, one or both of is finite. Then converges and so if then either or .
If or is finite but hits neither nor in finite time, then is excluded outside a set of measure zero by the hypothesis that and . Hence, almost surely.
Finally, if can hit either or in finite time then it will do so and .
Corollary 1
If there exists an embedding of in which is finite almost surely then is finite almost surely.
If there is a finite embedding, then we must be in case (ii) of the proposition. Then is finite almost surely. But also so that almost surely.
3.2 The non-centred case
Suppose and are such that an embedding exists (recall Proposition 2). Necessarily we must have that at least one of and is infinite.
Suppose so that and are well defined. Then since is finite almost surely, we have that is finite if and only if is finite almost surely.
Then the result for the non-centred case is identical to both the proposition and the corollary describing the results in the centred case, modulo the substitution of for in Corollary 1.
4 Integrable embeddings
4.1 The centred case
Suppose and .
In this section, we provide an integrability condition on that guarantees that (6) has a solution on and that is integrable. Notice that is twice continuously differentiable on . The second derivative
is positive, which means that is convex. Moreover, is decreasing on and increasing on ; in particular .
Theorem 3
If the function is integrable with respect to , then the ODE (6) has a solution on for almost all paths of and is integrable. In this case, .
Assume first that is integrable with respect to . This means that the random variable is integrable. Let
(9) 
and observe that is bounded away from and for any , and hence , a.s. By Itô’s formula, and using , we get
Taking expectations, the martingale part disappears and we obtain
(10) 
Notice that Jensen’s inequality implies that
Since the family is uniformly integrable, also is uniformly integrable. Therefore we can interchange the expectation operator and the limit on the RHS of (10). By monotone convergence, we can do so also on the LHS and hence we get
Lemma 3 implies that the ODE (6) has a solution on for almost all paths of and that is integrable.
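With the convention $v(x) = 2\int_{x_0}^{x}(x-y)\,\eta(y)^{-2}\,\mathrm{d}y$ (our reconstruction of the stripped display (2); the paper's normalisation may differ), the identity asserted in Theorem 3 reads

```latex
\mathbb{E}[\tau] \;=\; \int v(x)\,\mu(\mathrm{d}x),
```

which is consistent with the Itô computation above: since $v(x_0) = v'(x_0) = 0$ and $\tfrac12\,\eta^2 v'' \equiv 1$, the process $v(X_{t\wedge\tau}) - (t\wedge\tau)$ is a local martingale. For Brownian motion ($\eta \equiv 1$, $x_0 = 0$) this reduces to the familiar $\mathbb{E}[\tau] = \int x^2\,\mu(\mathrm{d}x)$.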
The reverse statement of Theorem 3 holds true as well; that is, if is integrable, then is integrable with respect to . Indeed, we next show that the existence of an arbitrary integrable solution of the SEP implies that is integrable with respect to .
Proposition 4
Any stopping time that solves the SEP satisfies
(11) 
Recall that . Since is absorbing if and similarly if then is absorbing, we have that and is also an embedding of .
Let be an stopping time with . Suppose that is integrable; otherwise the statement is trivial. Let
Notice that if attains the boundary with positive probability in finite time, then the function is finite at . In this case can have mass on . If does not attain the boundary in finite time, then obviously a distribution with mass in cannot be embedded with an integrable stopping time. Similar considerations apply at the right boundary .
Corollary 2
Suppose and . There exists an integrable solution of the SEP if and only if is integrable with respect to . In this case, satisfies (11).
Corollary 3
Suppose and . Whenever there exists an integrable solution of the SEP, then is also an integrable solution.
4.2 The non-centred case
Suppose we are given a local martingale diffusion started at and a measure with .
Recall the definition of in (2). To emphasise the role of the initial point, write for this expression. More generally, for define
(13) 
Then . As , in particular
and . Hence, is finite if and only if is finite.
We state the following theorem in the case which necessitates , and then . There is a corresponding result for in which the condition is replaced by . Note that the limit is well defined because is convex.
Theorem 4
Suppose .
Suppose and . Then is an integrable embedding of .
Conversely, suppose there exists an integrable embedding of in . Then and .
Consider the first part of the theorem. By the comments before the theorem, we may assume that and hence, for started at , . Then, it is sufficient to show that . But
which is finite under the assumption that .
For the converse, suppose that is an integrable embedding. Without loss of generality, we may assume that is minimal; if not we may replace it with a smaller embedding which is also integrable. Then
It remains to show that . Recall that this is equivalent to the condition .
Recall that by the Dubins–Schwarz theorem (Rogers and Williams [16], page 64) we can write for a Brownian motion where . Let . Then and embeds in .
Since is a minimal embedding of in , by Theorem 5 of Cox and Hobson [6]
(14) 
Moreover, by arguments in the proof of Lemma 11 of Cox and Hobson [6], for any stopping time
Hence, is bounded in , and then by Theorem 1 of Azéma et al. [2], is uniformly integrable if and only if . Since is not centred and is not UI, it follows from (14) that . But so .
Then
Then, if it follows that and .
Finally, we consider the case where .
Lemma 4
Suppose . If is an embedding of , then is not integrable.
Observe that and that if then since is convex we must have . Then if is an embedding of