Asymptotic Properties of the Empirical Spatial Extremogram

Yongbum Cho and Richard A. Davis

Department of Statistics, Columbia University

Souvik Ghosh

LinkedIn Corporation

ABSTRACT. The extremogram is a useful tool for measuring extremal dependence and checking model adequacy in a time series. We define the extremogram in the spatial domain when the data are observed on a lattice or at locations distributed as a Poisson point process in $d$-dimensional space. We establish a central limit theorem for the empirical spatial extremogram under mixing and moment conditions. We show these conditions hold for max-moving average processes and Brown-Resnick processes and illustrate the empirical extremogram's performance via simulation. We also demonstrate its practical use with a data set related to rainfall in a region in Florida.

Keywords: extremal dependence; extremogram; max moving average; max stable process; spatial dependence

1 Introduction

Extreme events can affect our lives in many ways. Events like large swings in financial markets or extreme weather conditions such as floods and hurricanes can cause large financial and property losses as well as numerous casualties. Extreme events often appear to cluster, and this has resulted in a growing interest in measuring extremal dependence in many areas, including finance, insurance, and atmospheric science.

Extremal dependence between two random vectors $X$ and $Y$ can be viewed as the probability that $Y$ is extreme given that $X$ belongs to an extreme set. The extremogram, proposed by Richard09, is a versatile tool for assessing extremal dependence in a stationary time series. The extremogram has two main features:

  • It can be viewed as the extreme-value analog of the autocorrelation function of a stationary time series, i.e., extremal dependence is expressed as a function of lag.

  • It allows for measuring dependence between random variables belonging to a large variety of extremal sets. Depending on the choice of sets, many of the commonly used extremal dependence measures - right tail dependence, left tail dependence, or dependence among large absolute values - can be treated as special cases of the extremogram. The flexibility coming from arbitrary choices of extreme sets has made it especially well suited for time series applications such as high-frequency FX rates (Richard09), cross-sectional stock indices (Richard10), and CDS spreads (CDS(extremogram)).

In this paper, we define the notion of the extremogram for random fields defined on $\mathbb{R}^d$ for some $d \ge 1$ and investigate the asymptotic properties of its corresponding empirical estimate. Let $\{X(s) : s \in \mathbb{R}^d\}$ be a stationary random field. For measurable sets $A$ and $B$ bounded away from 0, we define the spatial extremogram as

$$\rho_{A,B}(h) = \lim_{x \to \infty} P\bigl( x^{-1} X(h) \in B \,\bigm|\, x^{-1} X(0) \in A \bigr), \qquad h \in \mathbb{R}^d, \tag{1.1}$$

provided the limit exists. We call (1.1) the spatial extremogram to emphasize that it is defined for a random field in $\mathbb{R}^d$. If one takes $A = B = (1, \infty)$ in the case $d = 1$, then we recover the tail dependence coefficient between $X(0)$ and $X(h)$. For light-tailed time series, such as stationary Gaussian processes, the limit in (1.1) is zero at every non-zero lag, in which case there is no extremal dependence. However, for heavy-tailed processes in either time or space, it is often non-zero for many lags and for most choices of sets $A$ and $B$ bounded away from the origin.
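To make the light-tailed case concrete, here is the standard calculation (ours, not the paper's) showing tail independence for a bivariate Gaussian pair with correlation $r < 1$:

```latex
% For (X_1, X_2) standard bivariate normal with correlation r < 1,
% {X_1 > x, X_2 > x} \subset {X_1 + X_2 > 2x} and X_1 + X_2 ~ N(0, 2(1+r)), so
\[
  P(X_2 > x \mid X_1 > x)
    \;\le\; \frac{P(X_1 + X_2 > 2x)}{P(X_1 > x)}
    \;=\; \frac{\overline{\Phi}\bigl(2x/\sqrt{2(1+r)}\bigr)}{\overline{\Phi}(x)}
    \;\longrightarrow\; 0,
\]
% since 2/\sqrt{2(1+r)} > 1 whenever r < 1, and by Mills' ratio
% \overline{\Phi}(cx)/\overline{\Phi}(x) \to 0 for any constant c > 1.
```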

We will consider estimates of the spatial extremogram under two different sampling scenarios. In the first, observations are taken on the lattice $\mathbb{Z}^d$. Analogous to Richard09, we define the empirical spatial extremogram (ESE) as

$$\hat{\rho}_{A,B}(h) = \frac{N_h^{-1} \sum_{s,\, s+h \in \Lambda_n} I\bigl( a_{m_n}^{-1} X(s+h) \in B,\ a_{m_n}^{-1} X(s) \in A \bigr)}{|\Lambda_n|^{-1} \sum_{s \in \Lambda_n} I\bigl( a_{m_n}^{-1} X(s) \in A \bigr)}, \tag{1.2}$$

where

  • $\Lambda_n = \{1, \ldots, n\}^d$ is the $d$-dimensional cube with side length $n$,

  • $h_1, \ldots, h_K$ are observed lags in $\mathbb{Z}^d$,

  • $\{m_n\}$ is an increasing sequence satisfying $m_n \to \infty$ and $m_n / n \to 0$ as $n \to \infty$,

  • $\{a_m\}$ is a sequence such that $m^d\, P(|X(0)| > a_m) \to 1$,

  • $N_h$ is the number of pairs in $\Lambda_n$ with lag $h$, and

  • $|\Lambda_n|$ is the cardinality of $\Lambda_n$.
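As an illustration (not part of the original paper), the lattice estimator can be sketched in a few lines. Here the sets are taken as $A = B = (u, \infty)$ with $u$ an empirical upper quantile standing in for the normalization, and all function and variable names are ours:

```python
import numpy as np

def empirical_spatial_extremogram(field, lags, q=0.95):
    """Sketch of the lattice ESE with A = B = (u, inf), where u is an
    upper empirical quantile serving as a proxy for the threshold a_m."""
    u = np.quantile(field, q)
    exceed = field > u                      # indicator of an A-exceedance at each site
    denom = exceed.mean()                   # per-site exceedance frequency
    out = {}
    for h in lags:                          # h = (h1, h2), a lag on the Z^2 lattice
        h1, h2 = h
        # align the indicator array with its lag-h shift so that
        # a[r, c] and b[r, c] refer to sites s and s + h
        a = exceed[max(0, -h1):field.shape[0] - max(0, h1),
                   max(0, -h2):field.shape[1] - max(0, h2)]
        b = exceed[max(0, h1):field.shape[0] + min(0, h1),
                   max(0, h2):field.shape[1] + min(0, h2)]
        # ratio of the joint exceedance frequency at lag h to the marginal frequency
        out[h] = (a & b).mean() / denom
    return out
```

On a single realization, plotting these ratios against the lag distance, as in Figure 1, summarizes the empirical extremal dependence.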

In the second case, the data are assumed to come from a stationary random field observed at locations given by the points of a homogeneous Poisson point process on $\mathbb{R}^2$. We define the empirical spatial extremogram as a kernel estimator of the spatial extremogram, in the spirit of the kernel estimate of autocorrelation in space (see lgs). Under suitable growth conditions on the sampling region and restrictions on the kernel function, we show that the weighted estimator is consistent and asymptotically normal.

The organization of the paper is as follows: In Section 2, we present the asymptotic properties of the ESE for both cases described above. Section 3 provides examples illustrating the results of Section 2 together with a simulation study demonstrating the performance of the ESE. In Section 4, the spatial extremogram is applied to a spatial rainfall data set in Florida. The proofs of all the results are in the Appendix.

2 Asymptotics of the ESE

2.1 Definitions and notation

Let $\{X(s) : s \in \mathcal{S}\}$ be a strictly stationary random field, where $\mathcal{S}$ is either $\mathbb{Z}^d$ or $\mathbb{R}^d$. For $\mathbf{s} = (s_1, \ldots, s_k)$, we use $X(\mathbf{s})$ to denote $(X(s_1), \ldots, X(s_k))$. The random field is said to be regularly varying with index $\alpha > 0$ if for any $\mathbf{s}$ the radial part satisfies, for all $t > 0$,

$$\lim_{x \to \infty} \frac{P(\|X(\mathbf{s})\| > tx)}{P(\|X(\mathbf{s})\| > x)} = t^{-\alpha}, \tag{C1}$$

and the angular part is asymptotically independent of the radial part for large values of $\|X(\mathbf{s})\|$, i.e., there exists a random vector $\Theta$ with values in $\mathbb{S}^{k-1}$, the unit sphere in $\mathbb{R}^k$ with respect to $\|\cdot\|$, such that

$$P\left( \frac{X(\mathbf{s})}{\|X(\mathbf{s})\|} \in \cdot \;\middle|\; \|X(\mathbf{s})\| > x \right) \xrightarrow{w} P(\Theta \in \cdot), \qquad x \to \infty, \tag{C2}$$

where $\xrightarrow{w}$ denotes weak convergence. The distribution of $\Theta$ is called the spectral measure of $X(\mathbf{s})$.

An equivalent definition of regular variation is given as follows. There exists a sequence $a_n \to \infty$ and a family of non-null Radon measures $\mu_{\mathbf{s}}$ on the Borel $\sigma$-field of $\overline{\mathbb{R}}^k \setminus \{0\}$ such that for each $\mathbf{s}$, $n^d\, P\bigl( a_n^{-1} X(\mathbf{s}) \in \cdot \bigr) \xrightarrow{v} \mu_{\mathbf{s}}(\cdot)$, where the limiting measure satisfies $\mu_{\mathbf{s}}(tC) = t^{-\alpha} \mu_{\mathbf{s}}(C)$ for $t > 0$. Here, $\xrightarrow{v}$ denotes vague convergence. Under the regular-variation assumption, one can show that the limit in (1.1) is well defined. See Section 6.1 of Resnick(HT) for more details.
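As a simple worked instance (added for illustration), the unit Fréchet law that underlies the max-stable examples of Section 3 is regularly varying with index $\alpha = 1$:

```latex
% Unit Frechet tail: P(Z > x) = 1 - e^{-1/x} \sim 1/x as x -> infinity, hence
\[
  \lim_{x\to\infty} \frac{P(Z > tx)}{P(Z > x)}
  = \lim_{x\to\infty} \frac{1 - e^{-1/(tx)}}{1 - e^{-1/x}}
  = \lim_{x\to\infty} \frac{1/(tx)}{1/x}
  = t^{-1},
  \qquad t > 0,
\]
% i.e., (C1) holds with \alpha = 1, and one may take a_n = n since n\,P(Z > n) \to 1.
```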

2.2 Random fields on a lattice

Let $\{X(s) : s \in \mathbb{Z}^d\}$ be a strictly stationary random field and suppose we have observations $\{X(s) : s \in \Lambda_n\}$. Let $d(\cdot, \cdot)$ be a metric on $\mathbb{Z}^d$. We denote the $\alpha$-mixing coefficient by

$$\alpha_{k,l}(r) = \sup\bigl\{ \alpha\bigl( \sigma(\Lambda_1), \sigma(\Lambda_2) \bigr) : |\Lambda_1| \le k,\ |\Lambda_2| \le l,\ d(\Lambda_1, \Lambda_2) \ge r \bigr\},$$

where for any two $\sigma$-fields $\mathcal{A}$ and $\mathcal{B}$, $\alpha(\mathcal{A}, \mathcal{B}) = \sup\{ |P(A \cap B) - P(A) P(B)| : A \in \mathcal{A},\ B \in \mathcal{B} \}$; here $\sigma(\Lambda_i)$ is the $\sigma$-field generated by $\{X(s) : s \in \Lambda_i\}$ and, for any $\Lambda_1, \Lambda_2 \subset \mathbb{Z}^d$, $d(\Lambda_1, \Lambda_2) = \inf\{ d(s_1, s_2) : s_1 \in \Lambda_1,\ s_2 \in \Lambda_2 \}$.

In order to study the asymptotic properties of (1.2), we impose regular variation and certain mixing conditions on the random field. In particular, we use a big/small block argument: the side length of the big blocks and the distance between big blocks have to be coordinated in the right fashion. To be precise, we assume the following conditions.
 
(M1) Let $B_r$ be the ball of radius $r$ centered at 0, i.e., $B_r = \{s : d(s, 0) \le r\}$. For a fixed choice of sets $A$ and $B$, assume that there exist sequences $m_n \to \infty$ and $r_n \to \infty$ with $m_n / n \to 0$ and $r_n / m_n \to 0$, such that

(2.1)
(2.2)
(2.3)
(2.4)

where satisfies

Condition (2.1) restricts the joint distribution of exceedances as two sets of points become far apart. Conditions (2.2)-(2.4) impose restrictions on the decay rate of the mixing coefficients together with the level of the threshold specified by $a_{m_n}$. These conditions are similar to those in Bolthausen and Richard09.

As in Richard09, the ESE is centered by the pre-asymptotic (PA) extremogram

$$\rho_{A,B,m}(h) = \frac{P\bigl( a_m^{-1} X(h) \in B,\ a_m^{-1} X(0) \in A \bigr)}{P\bigl( a_m^{-1} X(0) \in A \bigr)}, \tag{2.5}$$

where $m = m_n$ is the sequence in (M1). Notice that (2.5) is the ratio of the expected values of the numerator and denominator in (1.2).

Theorem 2.1.

Suppose a strictly stationary regularly varying random field with index $\alpha > 0$ is observed on $\Lambda_n$. For any finite set of non-zero lags $h_1, \ldots, h_K$ in $\mathbb{Z}^d$, assume (M1), where $m_n = n^{\beta}$ for some $\beta \in (0, 1)$. Then

$$\sqrt{n^d / m_n^d}\, \Bigl( \hat{\rho}_{A,B}(h_i) - \rho_{A,B,m_n}(h_i) \Bigr)_{i = 1, \ldots, K} \xrightarrow{d} N(0, \Sigma),$$

where the covariance matrix $\Sigma$ of the limiting normal distribution is specified in Appendix A.

We present the proof of Theorem 2.1 in Appendix A. Examples of heavy-tailed processes satisfying (M1) are presented in Section 3.

Remark 1.

In Theorem 2.1, the pre-asymptotic extremogram can be replaced by the extremogram if

(2.6)

2.3 Random fields on

Now consider the case of a random field defined on $\mathbb{R}^d$, where the sampling locations are given by the points of a Poisson process. In this case, we adopt the ideas from karr and lgs and use a kernel estimate of the extremogram. For convenience, we restrict our attention to $\mathbb{R}^2$. The extension to $\mathbb{R}^d$ is straightforward, but notationally more complex.

Let $\{X(s) : s \in \mathbb{R}^2\}$ be a stationary regularly varying random field with index $\alpha > 0$. Suppose $N$ is a homogeneous 2-dimensional Poisson process with intensity parameter $\nu > 0$ that is independent of $X$. Now consider a sequence of compact and convex sets $S_n$ with Lebesgue measure $|S_n| \to \infty$ as $n \to \infty$. Assume that for each

(2.7)

where ,

(2.8)

and $\partial S_n$ denotes the boundary of $S_n$.

The spatial extremogram in (1.1) is estimated by the ratio of the quantities defined in (2.9) and (2.10):

(2.9)
(2.10)

Here $\{w_n\}$ is a sequence of weight functions, where $w$ on $\mathbb{R}^2$ is a positive, bounded, isotropic probability density function and $b_n$ is the bandwidth satisfying $b_n \to 0$ and $|S_n| b_n^2 \to \infty$. To establish a central limit theorem for the ratio, we derive the asymptotics of the denominator and the numerator. In order to show consistency, we assume the following conditions, which are the non-lattice analogs of (2.1) and (2.2).
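As an illustration only (the names, kernel, and normalization below are ours, not the paper's exact (2.9)-(2.10)), a kernel-weighted estimate for irregularly spaced data can be sketched as follows: pairs of sampled sites are weighted by an isotropic density of the gap between their separation and the target lag length.

```python
import numpy as np

def kernel_extremogram(coords, values, h_grid, q=0.95, bandwidth=0.5):
    """Illustrative kernel estimator of rho_{A,B}(h) for A = B = (u, inf)
    from irregularly spaced data, with an isotropic Gaussian kernel in the lag."""
    u = np.quantile(values, q)
    exceed = values > u
    # all pairwise separations ||s_i - s_j||
    diff = coords[:, None, :] - coords[None, :, :]
    dist = np.hypot(diff[..., 0], diff[..., 1])
    joint = np.outer(exceed, exceed)            # both endpoints exceed u
    iu = ~np.eye(len(values), dtype=bool)       # exclude self-pairs
    est = []
    for h in h_grid:                            # target lag lengths |h|
        w = np.exp(-0.5 * ((dist[iu] - h) / bandwidth) ** 2)
        # kernel-weighted joint exceedance frequency, normalized by the
        # marginal exceedance frequency (a sample analog of the ratio form)
        est.append((w * joint[iu]).sum() / (w.sum() * exceed.mean()))
    return np.array(est)
```

The Gaussian kernel here stands in for the generic isotropic density $w$; any bounded isotropic density with shrinking bandwidth plays the same role.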
 
(M2) There exist an increasing sequence and with and such that

(2.11)
(2.12)
(2.13)

where and .

For a central limit theorem for the estimator, the following conditions are required.
 
(M3) Consider a cube with and for Assume that there exist an increasing sequence with and such that

(2.14)

where is the quantity (2.10) with replaced by on the right-hand side. Further assume

(2.15)

and

(2.16)

Lastly, the proof requires some smoothness of the random field.

Definition 2.2.

A stationary regularly varying random field satisfies a local uniform negligibility condition (LUNC) if for an increasing sequence satisfying and for all , there exists such that

(2.17)
Theorem 2.3.

Let $\{X(s) : s \in \mathbb{R}^2\}$ be a stationary regularly varying random field with index $\alpha > 0$ satisfying LUNC. Assume $N$ is a homogeneous 2-dimensional Poisson process with intensity parameter $\nu > 0$, independent of $X$. Consider a sequence of compact and convex sets $S_n$ with $|S_n| \to \infty$ as $n \to \infty$. Assume conditions (M2) and (M3). Then for any finite set of non-zero lags in $\mathbb{R}^2$,

(2.18)

where the covariance matrix is specified in the proof of Theorem 2.1 in Appendix A.

We present the proof of Theorem 2.3 in Appendix B. As in Remark 1, the pre-asymptotic extremogram can be replaced by the extremogram if the former converges to the latter fast enough.

Remark 2.

In (2.18), the pre-asymptotic extremogram can be replaced by the extremogram if

(2.19)

3 Examples

Here we provide two max-stable processes to illustrate the results of Section 2. For background on max-stable processes, see deHaan(1984) and deHaan(2007). In order to check the mixing conditions, we need the following result from dombry.

Proposition 3.1 (dombry).

Suppose $\{X(s)\}$ is a max-stable random field with unit Fréchet marginals. If $S_1$ and $S_2$ are finite or countable disjoint closed subsets of $\mathbb{R}^d$, and $\mathcal{F}_1$ and $\mathcal{F}_2$ are the respective $\sigma$-fields generated by the process on each set, then

(3.1)

where $\beta$ is the $\beta$-mixing coefficient; we refer to Lemma 2 in christina.

Notice that (3.1) also provides an upper bound for the $\alpha$-mixing coefficient, since the $\alpha$-mixing coefficient is dominated by the $\beta$-mixing coefficient. See Bradley.

3.1 Max Moving Average (MMA)

Let $\{Z_j : j \in \mathbb{Z}^d\}$ be iid unit Fréchet random variables. The max-moving average (MMA) process is defined by

$$X(s) = \max_{j \in \mathbb{Z}^d} c(s - j) Z_j, \tag{3.2}$$

where $c(\cdot) \ge 0$ and $\sum_j c(j) < \infty$. Note that the summability of the coefficients implies the process is well defined. Also, notice that the marginal distributions are again Fréchet, since maxima of independent scaled Fréchet variables are Fréchet. Consider the Euclidean metric $|\cdot|$ on $\mathbb{Z}^d$. With $c$ supported on $\{j : |j| \le 1\}$, the process (3.2) becomes the MMA(1). The extremogram for the MMA(1) with $A = B = (1, \infty)$ is then

(3.3)

Since the process is 2-dependent, the conditions of Theorem 2.1 are easily checked.
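To see the finite-dependence mechanism concretely, a lattice MMA with a finite kernel can be simulated directly from iid unit Fréchet noise (a minimal sketch; the dictionary representation of the kernel and the specific weights in the usage below are illustrative, not the paper's choice):

```python
import numpy as np

def simulate_mma(n, c, rng):
    """Max-moving average on an n x n lattice: X(s) = max_j c(j) Z_{s-j},
    with a finite kernel c given as a dict {lag: weight}."""
    # unit Frechet noise: Z = -1/log(U) for U ~ uniform(0, 1)
    pad = max(max(abs(j1), abs(j2)) for (j1, j2) in c)
    z = -1.0 / np.log(rng.uniform(size=(n + 2 * pad, n + 2 * pad)))
    x = np.zeros((n, n))
    for (j1, j2), w in c.items():
        # take the pointwise max over the shifted, scaled noise fields
        x = np.maximum(x, w * z[pad - j1:pad - j1 + n, pad - j2:pad - j2 + n])
    return x
```

Because the kernel has finite support, $X(s)$ and $X(s')$ are independent once the shifted supports do not overlap, which is what makes the mixing conditions of Theorem 2.1 easy to verify.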

Figure 1 (left) shows the extremogram and the ESE computed from a realization of an MMA(1) generated by rmaxstab in the SpatialExtremes package (http://cran.r-project.org/web/packages/SpatialExtremes/SpatialExtremes.pdf) in R. We use 1600 points on a lattice and set the threshold at the .97 quantile of the process. In the figure, the dots and the bars correspond to the ESE and the extremogram at the observed distances in the sample. The dashed line corresponds to the extremogram, and the two horizontal lines are 95% random permutation confidence bands used to check for the existence of extremal dependence (see Richard10). The bands suggest that extremal dependence vanishes beyond short distances, which is consistent with (3.3).

Figure 1: The extremogram and the ESE from a realization of an MMA(1) (left) and of the process (3.4) (right). For the ESE, the .97 quantile (left) and the (.90, .92, .95, .97) quantiles (right) are used. In both cases, the ESE closely tracks the extremogram. The two horizontal lines are 95% random permutation confidence bands.
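The permutation bands referenced in the caption can be sketched as follows (an illustrative implementation of the idea in Richard10: randomly permuting site values destroys spatial dependence while preserving the marginal distribution, giving a null reference for the ESE):

```python
import numpy as np

def permutation_bands(field, lag, q=0.95, n_perm=99, level=0.95, seed=0):
    """Null distribution of the lag-`lag` ESE under spatial independence,
    obtained by randomly permuting the observations over the lattice."""
    rng = np.random.default_rng(seed)
    h1, h2 = lag

    def ese(f):
        # lattice ESE at one lag with A = B = (u, inf), u an upper quantile
        u = np.quantile(f, q)
        e = f > u
        a = e[max(0, -h1):f.shape[0] - max(0, h1),
              max(0, -h2):f.shape[1] - max(0, h2)]
        b = e[max(0, h1):f.shape[0] + min(0, h1),
              max(0, h2):f.shape[1] + min(0, h2)]
        return (a & b).mean() / e.mean()

    stats = []
    for _ in range(n_perm):
        perm = rng.permutation(field.ravel()).reshape(field.shape)
        stats.append(ese(perm))
    lo, hi = np.quantile(stats, [(1 - level) / 2, (1 + level) / 2])
    return lo, hi
```

ESE values falling outside the band at a given lag are then taken as evidence of extremal dependence at that lag.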

Now consider where . Then the process (3.2) becomes

(3.4)

where . Observe that the process (3.4) is isotropic and that, from Lemma A.1 in jenish, and

(3.5)
(3.6)

where , the number of observations with minimum distance to 0 or equals . For a given , if , there are pairs from both 0 and while as . In other words,

for and .

Using the joint distribution in (3.6) and a Taylor series expansion, the extremogram with is

(3.7)
Example 3.2.

For the process (3.4), the conditions (2.1)-(2.4) in Theorem 2.1 are satisfied if and .

Proof.

Observe that (3.4) is isotropic. By Lemma A.1 in jenish, Thus, (3.1) implies that

for any .

Then (2.2) is satisfied if since

Similarly, (2.3) can be shown. If , (2.4) holds since (3.1) implies

Turning to (2.1), notice from (3.5) and (3.6) that

Hence the term in (2.1) is bounded by

where the second term is 0 since and . Now letting , we obtain (2.1). ∎

Figure 1 (right) shows the extremogram and the ESE from a realization of the process (3.4). Here, thresholds at the (.90, .92, .95, .97) quantiles are used. The dots are the ESE and the dashed lines are the extremogram. The ESE based on the two lower quantiles is close to the extremogram at all observed distances, while the ESE based on the two higher quantiles decays faster at observed distances greater than 3. The two horizontal lines are 95% confidence bands based on random permutations.

3.2 Brown-Resnick process

We begin with the definition of the Brown-Resnick process with Fréchet marginals; details can be found in kabluchko or christina. Consider a stationary Gaussian process $\{W(s)\}$ with mean 0 and variance 1, and use $W_i$ to denote independent replications of $W$. For the correlation function of $W$, assume that there exist normalizing sequences such that

Then, the random fields defined by

(3.8)

converge weakly in the space of continuous functions to the stationary Brown-Resnick process

(3.9)

where $\{\Gamma_i\}$ is an increasing enumeration of the points of a unit rate Poisson process on $(0, \infty)$, and the $W_i$ are iid replications, independent of $\{\Gamma_i\}$, of a Gaussian random field with stationary increments whose covariance function is induced by the correlation function of $W$. Here, $\Phi$ is the cumulative distribution function of the standard normal distribution.

Figure 2: The ESE from a realization of a Brown-Resnick process on a lattice (left) and at non-lattice locations (right). For the lattice case, the ESE with different upper quantiles is presented. For the non-lattice case, the ESE with different bandwidths is displayed. The two horizontal lines are 95% random permutation confidence bands.

The extremogram for the Brown-Resnick process with $A = B = (1, \infty)$ is

$$\rho_{A,B}(h) = 2\left( 1 - \Phi\bigl( \sqrt{\delta(h)} / 2 \bigr) \right), \tag{3.10}$$

where $\delta(h)$ is the variogram of the underlying Gaussian random field. To see (3.10), recall from husler that

As , we assume without loss of generality that . Then we have and

(3.11)

which proves (3.10).
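For a quick numerical check (a sketch under the standard Hüsler-Reiss form of the Brown-Resnick tail dependence, $2(1 - \Phi(\sqrt{\delta}/2))$ with $\delta$ the variogram value at the lag), the curve can be evaluated with only the standard normal cdf:

```python
import math

def br_extremogram(delta):
    """Tail dependence 2 * (1 - Phi(sqrt(delta) / 2)) of a Brown-Resnick
    process at a lag whose variogram value is delta (Husler-Reiss form)."""
    # standard normal cdf via the error function: Phi(z) = (1 + erf(z/sqrt(2))) / 2
    phi = 0.5 * (1.0 + math.erf((math.sqrt(delta) / 2.0) / math.sqrt(2.0)))
    return 2.0 * (1.0 - phi)
```

The value is 1 at $\delta = 0$ (complete dependence at lag zero) and decays to 0 as the variogram grows, matching the qualitative shape of the curves in Figure 2.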

Similar to Lemma 2 in christina, the $\beta$-mixing coefficient of the process is bounded by

(3.12)

In the following examples, the correlation function of the underlying Gaussian process is assumed to have an expansion around zero of the form

(3.13)

where and . For this choice of correlation function, we have as mentioned in christina, Remark 1.

Example 3.3.

Consider the Brown-Resnick process with for and . The conditions of Theorem 2.1 hold if and . In this case, (2.6) is not satisfied for .

Proof.

From (3.12), we have . If , (2.2) holds since

Similarly, (2.3) can be checked. For (2.4), Proposition 3.1 implies that