State space models for non-stationary intermittently coupled systems


Philip G. Sansom, Daniel B. Williamson, David B. Stephenson
University of Exeter, Exeter, United Kingdom.
Abstract

Many time series exhibit non-stationary behaviour that can be explained by intermittent coupling between the observed system and one or more unobserved drivers. We develop Bayesian state space methods for modelling changes to the mean level or temporal correlation structure due to intermittent coupling. Improved system diagnostics and prediction are achieved by incorporating expert knowledge for both the observed and driver processes. Time-varying autoregressive residual processes are developed to model changes in the temporal correlation structure. Efficient filtering and smoothing methods are proposed for the resulting class of models. We develop methodology for evaluating the evidence of unobserved drivers, and for quantifying both their overall effect and their effects during individual events. Methods for evaluating the potential for skilful predictions within coupled periods, and in future coupled periods, are also proposed.

The proposed methodology is applied to the study of winter variability in the dominant pattern of climate variation in the northern hemisphere, the North Atlantic Oscillation. Over 70% of the inter-annual variance in the winter mean level is attributable to an unobserved driver. Skilful predictions for the remainder of the winter season are possible as soon as 30 days after the beginning of the coupled period.

1 Introduction

Intermittently coupled systems can be found in many areas of both the natural and social sciences. For example, many climate processes are only active during certain times of year, e.g., sea ice and snow cover change the interaction between the surface and the atmosphere (Chapin III et al., 2010; Bourassa et al., 2013). Migrating birds and animals only mix at certain times of year, allowing disease transmission between populations (Olsen et al., 2006; Altizer et al., 2011). Empirical models have been applied to forecasting intermittent demand in production economics and operational research (Croston, 1972; Shenstone and Hyndman, 2005). In many cases, it is impossible or impractical to observe or even to physically identify the intermittent component of the system. However, physical reasoning or prior knowledge may support the existence of drivers, and provide information about their behaviour and the timing and duration of coupling, that can be incorporated into time series models. We refer to these unobserved, intermittently coupled components as drivers to distinguish them from explanatory or exogenous variables that are usually observed.

Many natural or artificial systems exhibit non-stationary behaviour, even without intermittent coupling. In coupled systems, we are interested in how much of the variation between coupled periods is attributable to the effect of drivers, long-term variations in the observed process itself, and accumulated short-term variability. The effect of drivers on the observed process will usually be transient, however the drivers themselves may persist in similar states between coupling events. If the drivers are persistent, then learning their states should provide predictability for future coupled periods. If we can learn the states of the drivers quickly enough, then it should be possible to make predictions about the remainder of the coupled period, even when the drivers do not persist between coupling events.

State space models provide a flexible class of models for non-stationary time series (Durbin and Koopman, 2012). An observed process is modelled as dependent on a sequence of latent variables called the state that represent the development of the system under study over time. The state vector typically includes parameters representing the mean level and any periodic components of the system, as well as autoregressive and other behaviours (Harvey, 1989). This representation of a system in terms of physically meaningful components allows us to incorporate prior and physical knowledge in order to separate the effects of drivers from any long-term variability elsewhere in the system. Changes in the behaviour of a system over time (due to drivers or otherwise) can be broadly classified into one of three types: a change in the mean level, a change in the short-term temporal dependence, or a change in the variability of the process. We focus on changes in the mean level and temporal dependence.

There is an extensive literature on modelling non-stationarity in the mean level by state space methods, particularly where the observed process depends linearly on the state parameters and the unexplained variation in both follows a normal distribution (Harvey, 1989; West and Harrison, 1997; Durbin and Koopman, 2012). Time-varying autoregressive (TVAR) models generalise classical autoregressive models to have time-varying coefficients, thus capturing changes in temporal dependence (Subba Rao, 1970; Kitagawa and Gersch, 1985; Prado and West, 1997, 2010). State space TVAR models have been applied in epidemiology, economics, and signal processing, among other disciplines (e.g., Farah et al., 2014; Schäck and Muma, 2016; Abbate and Marcellino, 2017; Goh et al., 2017; Li et al., 2017). However, TVAR models are hard to interpret when the mean of the process is not zero. Additional parameters representing the mean (possibly time-varying) can be included in the model, but like classical autoregressions, they will not be on the same scale as the observed process. This makes it difficult to specify physically meaningful models for the time development of the mean level, or the effect of unobserved drivers on the mean level. In Section 2, we propose a class of models containing residual TVAR processes that capture changes in short-term temporal dependence while maintaining the interpretability of the mean level and potential driver effects.

Smooth changes in either the mean level or the temporal dependence structure can be captured by simple random walk priors on their respective state variables. Rapid changes, such as those that might be expected due to intermittent coupling, often require explicit interventions in the model (Box and Tiao, 1975). Intervention methods were extended to state space models by West and Harrison (1989) and more recently to GARCH processes by Fokianos and Fried (2010) and sequential Monte Carlo methods by Heard and Turcotte (2017). Interventions usually take the form of an unobserved parameter that only enters the model for a pre-specified amount of time. While the exact form of the intervention varies, the associated parameters are usually assumed to be fixed for the length of the intervention. If multiple interventions are required in a single time series, then independent parameters are estimated for each intervention. A more flexible approach is required to capture persistence of the driver states between coupling events and evolution of the drivers whilst coupled. In Section 3, we model unobserved drivers as dynamic latent processes, identifiable through the incorporation of expert judgement about the timing and influence of coupling. Efficient filtering and smoothing methods for the resulting class of models are proposed in Section 4.

Following the technical developments outlined above, we discuss inference and prediction in intermittently coupled systems in Section 5. The proposed methodology is demonstrated by application to diagnosing excess winter time variability in the dominant mode of climate variability in the northern hemisphere, known as the North Atlantic Oscillation (NAO). Section 6 describes the NAO data and our model for excess winter time variability, followed by the results of posterior inference and predictability studies for the NAO. Section 7 concludes with a discussion.

2 Residual TVAR component models

A univariate linear normal TVAR model with non-zero mean has the general form

$y_t = F\mu_t + \sum_{i=1}^{p} \phi_{t,i}\, y_{t-i} + \nu_t, \qquad \nu_t \sim \mathrm{N}(0, V),$
$\mu_t = G\mu_{t-1} + \omega_t, \qquad \phi_t = H\phi_{t-1} + \zeta_t,$

where $\mu_t$ is a $d$-dimensional vector of parameters representing the mean level of the observed process $y_t$ at time $t$, and $\phi_t = (\phi_{t,1}, \ldots, \phi_{t,p})'$ is a $p$-dimensional vector of time-varying autoregressive coefficients. The matrices $F$, $G$ and $H$ are assumed known with dimensions $1 \times d$, $d \times d$ and $p \times p$ respectively. The residual $\nu_t$ represents any variation in the observed process between times $t-1$ and $t$ that is not explained by the mean or autoregressive components. The $d$ and $p$-dimensional innovations $\omega_t$ and $\zeta_t$ represent the unexplained variation in $\mu_t$ and $\phi_t$ respectively between times $t-1$ and $t$. For notational simplicity we restrict ourselves to the homogeneous case where $F$, $G$, $H$ and the innovation variances do not depend on time. The extension to the inhomogeneous case is immediate.

The mean vector $\mu_t$ contains parameters representing the systematic components of the system, e.g., mean level, long-term trends, periodic components, seasonality. The components of the mean vector can be non-stationary, but are expected to vary only slowly over time. Note that the linear combination $F\mu_t$ cannot be directly interpreted as the mean level of the observed process $y_t$, since

$E(y_t) = F\,E(\mu_t) + \sum_{i=1}^{p} E(\phi_{t,i}\, y_{t-i}) \neq F\,E(\mu_t).$

This makes it difficult to interpret $\mu_t$, and to specify our beliefs about its evolution through the matrices $G$ and $W_\mu$, and our prior beliefs about its state at time $t = 0$.

In order to improve the interpretability of the mean component, we propose the following TVAR component structure

$y_t = F\mu_t + x_t + \nu_t, \qquad \nu_t \sim \mathrm{N}(0, V)$ (1a)
$\mu_t = G\mu_{t-1} + \omega_t, \qquad \omega_t \sim \mathrm{N}(0, W_\mu)$ (1b)
$x_t = \sum_{i=1}^{p} \phi_{t,i}\, x_{t-i} + \varepsilon_t, \qquad \varepsilon_t \sim \mathrm{N}(0, W_x)$ (1c)
$\phi_t = H\phi_{t-1} + \zeta_t, \qquad \zeta_t \sim \mathrm{N}(0, W_\phi)$ (1d)

The independent residual $\nu_t$ represents observation or measurement error in the system under study. The structured residual $x_t$ represents short time-scale variability in the observed process and is modelled as a residual TVAR process. The innovation $\varepsilon_t$ represents the variation in $x_t$ between times $t-1$ and $t$ that is not explained by the autoregressive structure. The model proposed in (1) allows the mean component to be interpreted on the natural scale of the observed process, i.e., $E(y_t) = F\mu_t$, provided that the residual TVAR process is stationary. However, standard methods for posterior inference in normal state space models do not apply, due to the non-linear combination of $\phi_t$ and the lagged residuals $x_{t-1}, \ldots, x_{t-p}$. Approximate filtering and smoothing methods for residual TVAR models are proposed in Section 4.
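The structure of model (1) can be illustrated by forward simulation. The following sketch simulates a univariate first-order version with a constant mean level; the parameter values are illustrative assumptions, not those used later in the paper, and the function name is ours.

```python
import numpy as np

def simulate_residual_tvar(n, mu=5.0, phi0=0.6, v=0.25, w_x=1.0, w_phi=1e-5, seed=0):
    # Simulate a univariate instance of model (1):
    #   y_t   = mu + x_t + nu_t,          nu_t   ~ N(0, v)      (1a)
    #   x_t   = phi_t * x_{t-1} + eps_t,  eps_t  ~ N(0, w_x)    (1c)
    #   phi_t = phi_{t-1} + zeta_t,       zeta_t ~ N(0, w_phi)  (1d)
    # The mean state is held constant here (G = I, W_mu = 0) for simplicity.
    rng = np.random.default_rng(seed)
    y = np.empty(n)
    phi, x = phi0, 0.0
    for t in range(n):
        # Random-walk TVAR coefficient, clipped to keep the residual process stable
        phi = np.clip(phi + rng.normal(0.0, np.sqrt(w_phi)), -0.99, 0.99)
        x = phi * x + rng.normal(0.0, np.sqrt(w_x))  # residual TVAR(1) step
        y[t] = mu + x + rng.normal(0.0, np.sqrt(v))  # observation
    return y

y = simulate_residual_tvar(5000)
```

Because the residual TVAR process is (approximately) stationary with mean zero, the sample mean of the simulated series sits close to `mu`, illustrating the interpretability property $E(y_t) = F\mu_t$.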

3 Modelling intermittent coupling

We define an intermittently-coupled system as one where the observed process is influenced by one or more drivers, often unobserved, but only at particular times. Suppose the drivers are believed to influence the mean level of the observed process $y_t$. In general, the effect of coupling on the observed process is expected to be transient, i.e., we expect the observed process to return, either gradually or immediately, to approximately its former mean level once the influence of the drivers is removed. We model this as

$y_t = F\mu_t + \beta_t' z_t + x_t + \nu_t$ (2)

where $z_t$ is a $k$-vector containing the state of the drivers at time $t$, and $\beta_t$ is a $k$-vector of regression coefficients. When $\beta_t \neq 0$, the drivers influence the mean level of the observed process $y_t$. However, when $\beta_t = 0$ we have $E(y_t) = F\mu_t$.

Suppose the drivers $z_t$ were known (observed); then (2) is a dynamic regression model (West and Harrison, 1997, Chapter 9.2). The dynamic coefficients $\beta_t$ quantify the time-varying influence of $z_t$ on $y_t$, and standard methods for state space models allow us to learn about how $\beta_t$ varies over time. When the drivers are unknown, we can make progress by exploiting prior knowledge about the timing of coupling events. The regression coefficients $\beta_t$ are assumed known, and take the form of weights, where the $j$th element of $\beta_t$ is given by:

$\beta_{jt} = \lambda_{jt}$ (3)

where $0 \leq \lambda_{jt} \leq 1$. The weights $\lambda_t$ represent our beliefs about when, and how closely, the observed process $y_t$ and the drivers $z_t$ are coupled. Given $\lambda_t$, we model $z_t$ on the natural scale of the observed process as

$z_t = G_z(t)\, z_{t-1} + \omega_{zt} \quad \text{and} \quad \omega_{zt} \sim \mathrm{N}(0, W_z(t))$ (4)

where $G_z(t)$ and $W_z(t)$ are known matrix-valued functions of time. For the purpose of posterior computation, the driver $z_t$ is now included in the state vector, together with $\mu_t$, $x_t$ and $\phi_t$. We can learn about $z_t$ using state space methods, in the same way as the other parameters $\mu_t$, $x_t$ and $\phi_t$.

A wide variety of behaviours can be modelled through the weights $\lambda_t$, the evolution function $G_z(t)$, and the variance function $W_z(t)$, depending on our beliefs about the drivers and how and when they couple to the observed process $y_t$. Note that the ability to interpret $F\mu_t$ as the expectation of the observed process is central to our ability to specify meaningful beliefs about $z_t$ on a natural scale. It may be difficult to justify specifying complex time-varying forms for the evolution and variance functions $G_z(t)$ and $W_z(t)$, given that they represent the time evolution of unobserved processes about which we have limited information. Most of our prior knowledge is likely to be about the timing of coupling events, and this is expressed primarily through the weights $\lambda_t$.

The approach described above extends naturally to modelling the influence of unobserved drivers on the temporal dependence structure represented by the autoregressive coefficients.

4 State space methods for intermittently coupled systems

The model proposed in Sections 2 and 3 can be written in state space form as

$y_t = F_t \theta_t + \nu_t$ (5a)
$\theta_t = g(\theta_{t-1}) + \omega_t$ (5b)

for $t = 1, \ldots, n$ with $\nu_t \sim \mathrm{N}(0, V)$ and $\omega_t \sim \mathrm{N}(0, W)$, where the state vector $\theta_t$ collects $\mu_t$, $z_t$, the lagged residuals $x_t, \ldots, x_{t-p+1}$, and $\phi_t$. The observation matrix is given by $F_t = (F, \lambda_t', 1, 0_{p-1}', 0_p')$, where $0_p$ is the $p$-vector of zeros. The evolution function $g(\cdot)$ is defined by (1b), (1c), (1d) and (4). The innovation variance $W$ is block diagonal with main diagonal blocks formed from $W_\mu$, $W_z$, $W_x$ and $W_\phi$.

The basic operations on state space models are filtering and smoothing. Filtering refers to sequentially updating our beliefs about the state vector as new data arrive, leading to the sequence of posterior distributions $p(\theta_t \mid y_{1:t})$. Smoothing refers to sampling from the joint distribution $p(\theta_{0:n} \mid y_{1:n})$ of the state parameters given all the data, and evaluating the associated marginals $p(\theta_t \mid y_{1:n})$. Exact analytic recursions are available for filtering and smoothing in normal linear state space models (West and Harrison, 1997, Chapter 4). However, the evolution function in (5) is non-linear due to the product of $\phi_t$ and the lagged residuals in (1c). Exact analytic expressions for filtering and smoothing are not generally possible for non-linear or non-normal models. Sequential Monte Carlo methods, also known as particle filters, provide tools for filtering and smoothing in general non-linear and non-normal models. However, particle filters are computationally expensive, and perform poorly for datasets with large $n$ such as the NAO index analysed in Section 6 (Doucet and Johansen, 2011).

Instead, we propose a linear approximation to the state equation (5b)

$\theta_t \approx g(m_{t-1}) + G_t\,(\theta_{t-1} - m_{t-1}) + \omega_t$ (6)

where $m_{t-1} = E(\theta_{t-1} \mid y_{1:t-1})$ and $G_t$ is the Jacobian of $g(\cdot)$ evaluated at $m_{t-1}$. Equation (6) leads to approximate filtering and smoothing recursions, detailed in Appendix A. In general, the innovation variance $W_\phi$ will be small, and our uncertainty about $\phi_t$ will be only weakly correlated with the other state variables $\mu_t$, $z_t$ and $x_t$, and decrease rapidly with increasing $t$. Since the other components of the evolution function are linear and the observation errors and the joint state innovation are normal, filtering and smoothing based on (6) should be very accurate. Approximations of this type are frequently used as the basis for proposal distributions in sequential Monte Carlo schemes for other classes of model (Doucet et al., 2000). In its most general form, the approximation proposed in (6) leads to the extended Kalman filter (Anderson and Moore, 1979).
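The backbone of the approximate recursions is the standard Kalman filter for linear-normal models; the linearisation (6) reduces each update of the non-linear model to this form. The sketch below is a minimal univariate-observation implementation, not the paper's Appendix A recursions; the function name and the local-level example in the usage note are our own. It also accumulates the one-step-ahead predictive log-likelihood, which is reused for the Bayes' factor in Section 5.

```python
import numpy as np

def kalman_filter(y, F, G, V, W, m0, C0):
    # Forward filtering for the linear-normal state space model
    #   y_t     = F theta_t + nu_t,         nu_t    ~ N(0, V)
    #   theta_t = G theta_{t-1} + omega_t,  omega_t ~ N(0, W)
    # Returns filtered means and sum_t log p(y_t | y_{1:t-1}).
    m, C = np.asarray(m0, float), np.asarray(C0, float)
    ms, loglik = [], 0.0
    for yt in np.atleast_1d(y):
        a = G @ m                      # prior state mean at time t
        R = G @ C @ G.T + W            # prior state variance
        f = F @ a                      # one-step-ahead forecast mean
        Q = F @ R @ F.T + V            # forecast variance
        e = np.atleast_1d(yt) - f      # forecast error
        Qinv = np.linalg.inv(Q)
        A = R @ F.T @ Qinv             # Kalman gain
        m = a + A @ e                  # filtered state mean
        C = R - A @ Q @ A.T            # filtered state variance
        loglik += -0.5 * (len(e) * np.log(2 * np.pi)
                          + np.log(np.linalg.det(Q)) + e @ Qinv @ e)
        ms.append(m.copy())
    return np.array(ms), float(loglik)

# Local-level example: constant observations pull the filtered mean to 1.
F = np.array([[1.0]]); G = np.array([[1.0]])
V = np.array([[0.25]]); W = np.array([[0.01]])
ms, ll = kalman_filter(np.ones(100), F, G, V, W, np.zeros(1), 10.0 * np.eye(1))
```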

5 Posterior inference

Is there evidence of unobserved drivers?

Given data $y_{1:n}$, we want to evaluate the evidence in favour of the existence of intermittently coupled unobserved drivers $z_t$. We do this by evaluating the Bayes' factor

$B = p(y_{1:n} \mid M_1)\, /\, p(y_{1:n} \mid M_0)$ (7)

where $M_1$ is the full model including unobserved drivers $z_t$, and $M_0$ is an alternative model without unobserved drivers. The Bayes' factor summarises the evidence in favour of unobserved drivers as a ratio of likelihoods (Kass and Raftery, 1995). The class of models for intermittent coupling proposed in Section 3 explicitly separates the effects of drivers from the background mean and autoregressive processes. Therefore, our judgements about the fixed parameters $F$, $G$, $H$, $V$, $W_\mu$, $W_x$ and $W_\phi$ should be the same with or without drivers. The natural alternative model is then the model without unobserved drivers given by (1). The likelihoods $p(y_{1:n} \mid M_1)$ and $p(y_{1:n} \mid M_0)$ can be decomposed as

$p(y_{1:n} \mid M) = \prod_{t=1}^{n} p(y_t \mid y_{1:t-1}, M).$ (8)

The filtering recursions in Appendix A include an analytic approximation to $p(y_t \mid y_{1:t-1})$, so the likelihood (8), and hence the Bayes' factor (7), can be easily evaluated.
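Given the per-time-step predictive log-densities from a filtering pass under each model, the prediction decomposition (8) makes the log-Bayes' factor a simple difference of sums. A minimal sketch (function name is ours):

```python
import numpy as np

def log_bayes_factor(pred_loglik_m1, pred_loglik_m0):
    # Prediction decomposition (8): each argument is the sequence of one-step
    # ahead predictive log-densities log p(y_t | y_{1:t-1}, M) from filtering.
    # The log of the Bayes' factor (7) is the difference of the summed
    # log-likelihoods of the two models.
    return float(np.sum(pred_loglik_m1) - np.sum(pred_loglik_m0))
```

On the Kass and Raftery (1995) scale, a log-Bayes' factor above about 5 already constitutes very strong evidence for $M_1$.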

What is the effect of the unobserved drivers?

If the Bayes' factor indicates evidence of the existence of unobserved drivers, then we can use our model to estimate their effect during individual coupled periods. The backward sampling recursions in Appendix A lead to a set of samples $\theta_{0:n}^{(i)}$, $i = 1, \ldots, N$, from the joint posterior $p(\theta_{0:n} \mid y_{1:n})$. From the posterior samples we can make inferences about any function of the state variables for any time period of interest.

The relative contributions of the mean component $F\mu_t$, the structured residual $x_t$ and the drivers $\lambda_t' z_t$ to the variability between coupled periods are of particular interest. The means of the structured residual and driver components during the $j$th coupled period $T_j$ in the $i$th posterior sample are

$\bar{x}_j^{(i)} = \frac{1}{|T_j|} \sum_{t \in T_j} x_t^{(i)} \quad \text{and} \quad \bar{z}_j^{(i)} = \frac{1}{|T_j|} \sum_{t \in T_j} \lambda_t' z_t^{(i)},$ (9)

where $|T_j|$ is the number of time steps in $T_j$. The prior expectation of the structured residual and driver components during any period is zero by (1c) and (4), i.e., $E(\bar{x}_j) = 0$ and $E(\bar{z}_j) = 0$ for all $j$. In general $E(F\mu_t) \neq 0$, so for estimating the contribution of the mean component it will be helpful to remove the sample mean contribution over all $J$ coupled periods, so that

$\bar{\mu}_j^{(i)} = \frac{1}{|T_j|} \sum_{t \in T_j} F\mu_t^{(i)} - \frac{1}{J} \sum_{j'=1}^{J} \frac{1}{|T_{j'}|} \sum_{t \in T_{j'}} F\mu_t^{(i)}$ (10)

is the anomaly in the mean component during period $T_j$ in sample $i$. We can then form summaries of the contribution of each component during a particular coupled period. In particular, the means over all samples,

$\hat{\mu}_j = \frac{1}{N} \sum_{i=1}^{N} \bar{\mu}_j^{(i)}, \quad \hat{x}_j = \frac{1}{N} \sum_{i=1}^{N} \bar{x}_j^{(i)}, \quad \hat{z}_j = \frac{1}{N} \sum_{i=1}^{N} \bar{z}_j^{(i)},$ (11)

summarise the posterior expected contribution of each component to the observed level during period $T_j$.
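The period-and-sample averaging in (9)-(11) reduces to two nested means over sampled trajectories. A minimal sketch, with our own function name and data layout (one array of shape `(n_samples, n_times)` per component):

```python
import numpy as np

def period_means(samples, period):
    # Posterior summaries in the spirit of (9)-(11): `samples` maps a component
    # name to an array of sampled trajectories with shape (n_samples, n_times);
    # `period` is the array of time indices T_j for one coupled period.
    # Each trajectory is averaged over the period, then over posterior samples.
    return {name: float(traj[:, period].mean(axis=1).mean())
            for name, traj in samples.items()}

# Two sampled driver trajectories, averaged over time steps 2-3 of a period.
samples = {"driver": np.array([[1.0, 1.0, 3.0, 3.0],
                               [1.0, 1.0, 5.0, 5.0]])}
out = period_means(samples, np.array([2, 3]))
```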

It would also be helpful to summarise the relative contributions of each component over all coupled periods. We propose performing an analysis of variance of the observed period means using the three component means in (9) and (10) as explanatory variables. The analysis of variance leads to four sums-of-squares for each sample $i$, corresponding to the sums of squared deviations explained by the mean component, short-term variability, the drivers, and observation errors in each sampled trajectory. Posterior summaries over the samples summarise the overall contributions of each component to the variability between coupled periods.
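The decomposition for one sampled trajectory can be sketched as follows. This is a simplified illustration that treats the components as orthogonal, so the sums of squares add up exactly; the regression-based ANOVA described above would additionally account for cross-terms between correlated components. All names are ours.

```python
import numpy as np

def variance_explained(mu_bar, x_bar, z_bar, err_bar):
    # Fraction of the variance of the observed period means attributed to each
    # component, for one sampled trajectory. Each argument holds one
    # component's period means; the observed period means are their sum.
    y_bar = (np.asarray(mu_bar) + np.asarray(x_bar)
             + np.asarray(z_bar) + np.asarray(err_bar))
    total = np.sum((y_bar - y_bar.mean()) ** 2)  # total sum of squares
    return {name: float(np.sum((np.asarray(c) - np.mean(c)) ** 2) / total)
            for name, c in [("mean", mu_bar), ("tvar", x_bar),
                            ("driver", z_bar), ("error", err_bar)]}

# All inter-period variation placed in the driver component.
frac = variance_explained([0, 0, 0, 0], [0, 0, 0, 0],
                          [1, -1, 2, -2], [0, 0, 0, 0])
```

Repeating this over all posterior sample trajectories and summarising the resulting fractions gives posterior distributions of the variance explained by each component.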

Unlike standard intervention models, our method does not assume that the drivers are constant throughout each coupled period. Therefore, we can also estimate the evolution of the drivers relative to the other components within individual coupled periods from the posterior samples. While the mean component will usually be assumed to evolve slowly, it may include rapidly changing systematic components, e.g., to represent seasonality in the data. It can be helpful to remove the effect of any rapidly changing systematic components from the data when examining behaviour within coupled periods, by removing the posterior expectation of the mean component. Adjusted data $\tilde{y}_t$ are given by

$\tilde{y}_t = y_t - F\hat{\mu}_t$ (12)

where $F\hat{\mu}_t$ is estimated by the mean of the posterior samples $F\mu_t^{(i)}$.

Can we make predictions using unobserved drivers?

Knowledge of unobserved drivers should provide useful predictability within coupled periods. The model proposed in Section 3 also allows for dependence between successive coupled periods, so knowledge of the driver during one coupled period may also be useful for predicting the next. Our knowledge of the unobserved driver at time $t$ given all the data up to that time is summarised by the sequence of filtered posterior distributions $p(z_t \mid y_{1:t})$. Plotting the evolution of the moments of the filtered posterior for $z_t$ will reveal how quickly we can learn about the driver during each coupled period, and the level of persistence between coupled periods. The $k$-step ahead forecast distribution given data up to time $t$ can be sampled exactly using the recursions in Appendix A. The correlation between the data and the forecast means provides a simple measure of forecast performance.
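The forecast-performance measure used here is simply the Pearson correlation between observed and forecast period means, one pair per coupled period. A minimal sketch (function name is ours):

```python
import numpy as np

def forecast_skill(obs_means, forecast_means):
    # Pearson correlation between observed period means and the corresponding
    # forecast means over a sample of coupled periods (e.g. one per winter).
    return float(np.corrcoef(obs_means, forecast_means)[0, 1])
```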

6 Application to the North Atlantic Oscillation

We apply the methodology developed in the previous sections to the diagnosis of excess winter variability in the North Atlantic Oscillation (NAO). The NAO describes variations in the pressure difference between the North Atlantic sub-tropical high and polar low, which is strongly associated with the strength of the westerly flow and behaviour of storms (Walker, 1924; Hurrell, 1995). Because of its impact on European climate, the ability to forecast the NAO is currently a topic of great interest for the development of new climate prediction services (Siegert et al., 2016).

Following Mosedale et al. (2006), we construct a simple NAO index as the area-weighted sea level pressure difference between two boxes over the North Atlantic sector, one to the south and one to the north, using pressure data from the ERA-Interim reanalysis (Dee et al., 2011). The resulting daily time series, shown in Fig. 1, spans the period 1 January 1979 to 31 March 2015. Figure 1 reveals that there is much more inter-annual variation in winter than in summer (Folland et al., 2009; Keeley et al., 2009). Climate scientists believe that the NAO may experience seasonally-dependent modulation by slower drivers such as the ocean (Visbeck et al., 2003). Using an empirical ANOVA approach, Keeley et al. (2009) estimated that up to 70% of the inter-annual variance in the December-January-February mean NAO was due to external forcing (i.e., a driver).


Figure 1: The North Atlantic Oscillation. (a) Daily time series (grey) and 90-day moving average (black) of our NAO index; annual time series of (b) the summer (June-July-August) mean NAO, and (c) the winter (December-January-February) mean NAO.

Modelling the NAO

Our full model for the NAO, including components for trend and seasonality, is detailed in Appendix B. We assume the existence of a single unobserved driver $z_t$. Our beliefs about the unknown driver are specified through the weights $\lambda_t$, the evolution function $G_z(t)$ and the variance function $W_z(t)$. The weighting function is shown in Fig. 2. For comparability with Keeley et al. (2009), we set the weight $\lambda_t = 1$ during December, January and February, and $\lambda_t = 0$ outside the extended winter. However, other studies have also found evidence of oceanic forcing of the winter NAO during November and March (e.g., Cassou et al., 2007). To allow for this, we let the weights increase linearly from 0 to 1 during November, and decrease from 1 to 0 during March. The lag-1 autocorrelation between successive winter NAO states was estimated at 0.15 by Stephenson et al. (2000). Not all of the correlation between successive winters will be due to the driver; therefore, we set the evolution function $G_z(t)$ to a constant autoregressive coefficient for all $t$, i.e., we model $z_t$ as a stationary AR(1) process. Fig. 1 suggests that the winter mean NAO varies by up to 10 hPa between successive years. The variance of a stationary AR(1) process with coefficient $\phi$ and innovation variance $W$ is $W/(1 - \phi^2)$. Therefore, assuming a range of approximately two standard deviations, we set $W_z(t)$ accordingly for all $t$.
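The two prior choices above can be sketched in code. The exact ramp endpoints of the weight function are our illustrative reading of the prescription above, and the function names are ours; the AR(1) helper simply inverts the stationary-variance formula to obtain the innovation variance from a target marginal standard deviation.

```python
def coupling_weight(month, day):
    # Prescribed weight lambda_t for the winter NAO driver: 0 outside the
    # extended winter, ramping linearly 0 -> 1 through November, 1 during
    # December-February, and ramping linearly 1 -> 0 through March.
    if month in (12, 1, 2):
        return 1.0
    if month == 11:
        return day / 30.0
    if month == 3:
        return 1.0 - day / 31.0
    return 0.0

def ar1_innovation_variance(sd, phi):
    # A stationary AR(1) process with coefficient phi and innovation variance W
    # has marginal variance W / (1 - phi^2); invert this to choose W from a
    # target marginal standard deviation sd.
    return sd ** 2 * (1.0 - phi ** 2)
```

For example, `ar1_innovation_variance(sd, 0.15)` returns the driver innovation variance implied by a chosen winter-mean standard deviation `sd`.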


Figure 2: NAO driver structure. (a) The prescribed weight $\lambda_t$, and (b) the resulting posterior variance of the unobserved driver $z_t$.

We also need to specify our prior beliefs about the state vector at time $t = 0$, and the observation and state variances $V$ and $W$. Details of the prior modelling for the state vector at time $t = 0$ are given in the supplementary material. The observation variance $V$ and all but two components of the innovation variance $W$ are assumed constant, and chosen based on computer simulation (see supplementary material for details). The residual TVAR innovation and driver variances $W_x$ and $W_z$ were assessed as having the greatest impact on model fit, since they determine the day-to-day variability and the winter mean driver variability respectively. They are also several orders of magnitude larger than the other innovation variances. Therefore, the driver variance $W_z$ and the residual TVAR innovation variance $W_x$ were fixed at the maximum a posteriori estimates obtained by numerical optimisation using uniform priors.

Diagnosing NAO drivers

Figure 3: Contribution of individual model components. Posterior mean estimates of the winter (December-January-February) mean levels of the mean component (white), the residual TVAR component (light grey), and the driver component (dark grey).

Models with and without an unobserved driver were fitted using the filtering recursions in Appendix A. The same priors and innovation variances were used for each, with the exception of the residual TVAR innovation variance $W_x$, which was estimated separately. Analysis of the one-step ahead forecast residuals from the filtering procedure (see Appendix A) revealed no obvious problems with either model. The log-Bayes' factor in favour of the existence of a driver was calculated as 40.4. This represents overwhelming evidence in favour of the existence of an unobserved driver similar to the form specified.

Figure 4: Analysis of variance. Fraction of inter-annual variability in the winter (December-January-February) mean NAO explained by each model component. The shaded box represents the quartiles of the posterior estimate of the variance explained, the whiskers represent an equal-tailed 90% credible interval.

In order to assess the contribution of the driver to the NAO during each individual winter, we sample 1000 plausible trajectories for the state vector by backward sampling. From the samples we compute the summary statistics in (11) for each contiguous winter period (December-January-February). Figure 3 shows the posterior mean contribution of each component to each observed winter mean level. As expected, the contribution of the mean component is generally small compared to the driver and residual TVAR components. The mean component shows evidence of a weak trend from positive to negative contributions over the study period: the mean level of the NAO appears to peak around 1990, then decreases until around 2010, after which it begins to rise again. The driver and TVAR components have the same sign in most years; however, the contribution of the driver component is generally larger than that of the TVAR component.

Figure 4 shows the results of the analysis of variance of the observed winter means. The driver component explains 74% of the observed variation in the winter means, with a 90% equal-tailed credible interval (C.I.) of 56–91%. Accumulated short-term variability captured by the TVAR residuals explains 22% of the inter-annual variability (90% C.I. 5–39%). The mean component explains only 4% of the inter-annual variability (90% C.I. 0–13%). The contribution of measurement error is negligible, as expected given that we are averaging pressure over such large areas. Our estimate of the driver contribution is slightly larger than that of Keeley et al. (2009), despite our model including additional sources of variability in the mean and measurement error components.

A sensitivity analysis was conducted to evaluate the choices made for the innovation variances and . The analysis of variance was insensitive to doubling or halving any of the translation distances on which the chosen innovation variances were based (see supplementary material for details). The likelihoods favoured lower values of all of the innovation variances, indicating that the chosen values were not overly restrictive.

By comparing deseasonalised data (12) to posterior estimates of the driver, we can investigate the evolution of the driver within individual coupling events. Fig. 5 compares the deseasonalised data with posterior estimates of the driver for four of the most extreme NAO winters in the last 30 years. Different behaviour is visible in different years. In 2009–10 (Fig. 5) the observed data are strongly negative during December, January and February, but about average during November and March. In 1992–3 (Fig. 5) and 1995–6 (Fig. 5), the data appear to wander around a fairly constant but non-zero mean level throughout the whole extended winter. In 1988–9 (Fig. 5) the level of the data gradually increases during November and December before levelling out. The dynamic latent process used to model the unobserved driver allows our model to adapt to such variations in behaviour between different coupled periods.


Figure 5: Driver evolution. Deseasonalised observations (black) and posterior samples of the driver (grey) for the extended winters of (a) 1988–9, (b) 1992–3, (c) 1995–6, and (d) 2009–10.

Forecasting the winter NAO


Figure 6: Learning about unobserved drivers. The filtered posterior (a) mean, and (b) variance of the driver $z_t$. Each trajectory represents a different year between 1985 and 2015.

If we can learn about the driver level early enough during the coupled period, then we can use that knowledge to provide more skilful forecasts for the rest of the period. Similarly, if the driver state persists between coupled periods, then we can predict the next coupled period given knowledge of the last. Fig. 6 shows the posterior expectation and variance of the driver obtained from the filtering procedure, i.e., from $p(z_t \mid y_{1:t})$. The expectation of $z_t$ is close to zero at the end of October, changes rapidly during November, and stabilises sometime during December. Thus the current winter provides little information about the next, consistent with the low autocorrelation specified for the driver. Our uncertainty about the driver level peaks at the end of October, when the NAO has been decoupled from the driver for several months. Uncertainty then falls rapidly during November, and starts to flatten out sometime in December as the influence of the driver increases. This suggests that we may be able to provide meaningful forecasts of the remaining winter mean NAO as early as the beginning of December.


Figure 7: Predictability based on the unobserved driver. (a) The correlation between the observations and forecasts with no driver (black) and forecasts including an unobserved driver (grey), initialised on each day of the coupled period, for the mean level over the remainder of the coupled period up to 28 February. (b) Observations (x), forecasts with no driver (black), and forecasts with an unobserved driver (grey) for the mean NAO between 1 December and 28 February each year, given data up to 30 November.

Using the forecasting recursions in Appendix A, we obtained forecasts beginning each day from 1 November to 1 February until the end of the fully coupled period on 28 February for every winter between 1985 and 2014. Fig. 7 shows the correlation over the sample of 30 winters between the forecast means and observation means for the same periods. By the beginning of December, the correlation approaches 0.5 for the 90 day forecast of the mean NAO to 28 February using the model including an unobserved driver. This correlation approaches that achieved by computationally expensive numerical weather prediction models (Scaife et al., 2014; Siegert et al., 2016). The correlation of forecasts from the model without a driver increases gradually until it approaches the model including a driver sometime in January. This is a result of the forecast period becoming shorter. As the forecast period decreases we are essentially predicting weather noise, and so neither model has an advantage.

Fig. 7 compares 90 day forecasts made on 1 December for models with and without an unobserved driver. The forecasts from the model with no driver exhibit very little variation from year to year. In contrast, the forecasts from the model with a driver are successful in predicting some of the variations in the observed winter mean level. The forecasts in Fig. 7 fail to predict the extreme winter of 2009–10. However, in Fig. 5 the deseasonalised data do not become strongly negative until around the end of the first week in December. Since the forecast in Fig. 7 was based on information up to 30 November, it is therefore unsurprising that a fairly normal winter was forecast.

7 Discussion

In this study we have developed Bayesian state space methods for diagnosing intermittently coupled systems. Residual TVAR models are proposed to capture non-stationarity in the temporal dependence structure and improve the interpretability of the mean component and driver effects. Unobserved drivers are modelled as dynamic latent processes incorporating expert knowledge to separate the effects of coupling from those of non-stationarity elsewhere in the system. A linearised approximation is proposed that allows efficient filtering and smoothing for the resulting class of models, without having to develop complicated and expensive sequential Monte Carlo methods. In addition, we develop tools for posterior inference in intermittently coupled systems, including evaluating the evidence of a driver, attribution of historical variation in the system, and demonstrating potential predictability.

In addition to addressing issues of interpretability and non-stationarity, residual TVAR components also permit efficient recursive estimation of the autoregressive parameters without the need for numerical optimisation or MCMC approximation. The linearised approximation proposed in Section 4 can be extended to non-normal models and models with additional non-linearities in either the state or observation functions. However, if an exact solution is required, or the proposed model is significantly non-linear or non-normal, then Sequential Monte Carlo methods can be used (Doucet and Johansen, 2011).
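To illustrate the kind of recursive estimation referred to above, the sketch below updates the coefficients of a time-varying AR(p) process by conjugate Kalman-style recursions, treating the coefficients as a random-walk state with the lagged observations as regressors. This is a minimal sketch under a fixed, known innovation variance, not the full residual TVAR component of the paper; the function name and tuning constants are illustrative.

```python
import numpy as np

def tvar_filter(x, p, W_scale, V):
    """Recursive estimation of time-varying AR(p) coefficients phi_t,
    treated as a random-walk state with regressor F_t = (x_{t-1}, ..., x_{t-p}).
    Conjugate Kalman-style updates; no numerical optimisation or MCMC."""
    n = len(x)
    phi = np.zeros(p)
    C = np.eye(p)                 # initial coefficient uncertainty
    W = np.eye(p) * W_scale       # random-walk innovation variance
    est = np.zeros((n, p))
    for t in range(p, n):
        F = x[t - p:t][::-1]      # lagged values, most recent first
        R = C + W                 # evolve the coefficient distribution
        Q = F @ R @ F + V         # one-step forecast variance of x_t
        A = R @ F / Q             # adaptive gain
        e = x[t] - F @ phi        # one-step forecast residual
        phi = phi + A * e         # update the coefficients
        C = R - np.outer(A, A) * Q
        est[t] = phi
    return est
```

With a small `W_scale` the estimated coefficients adapt slowly, recovering near-constant AR dynamics; larger values track faster changes at the cost of noisier estimates.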

The model proposed for intermittently coupled systems differs from standard intervention analysis by treating the unobserved driver as a continuous process, rather than as a series of independent events. This allows knowledge gained during one coupled period to inform inferences for the next. Modelling the driver as a continuous process also allows the driver level to vary within individual coupled periods, so we can specify a more flexible form for the driver when we are unsure of the exact nature of the interaction.

Modelling the unobserved drivers as dynamic latent processes provides some robustness to misspecification of the coupling period. However, the need to specify a fixed coupling period remains a limitation. The model is not identifiable if we assume the system is permanently coupled. Methods for changepoint analysis may provide a useful avenue of extension to unknown onset time and length of coupled period. Recent advances in multiple changepoint analysis allow for dependence between regimes of unknown onset time and length. Koop and Potter (2007) propose an offline Markov Chain Monte Carlo approach that can be updated using Sequential Monte Carlo methods as new data arrive. The online Bayesian changepoint methods proposed by Fearnhead and Liu (2011) might provide a more efficient approach, but may struggle to identify gradual changes.

In the methodology developed here, we have allowed for non-stationarity in the mean and the temporal dependence structure, but not in the variance. Stochastic volatility models and related ARCH and GARCH models have been widely studied and applied, particularly in economics. Masala (2015) applied a GARCH model to stochastic modelling of the NAO, but found that its performance was poor. Efficient filtering and smoothing is possible for time-varying observation error variance (West and Harrison, 1997, Chapter 10.8). However, fully conjugate models that admit analytic filtering and smoothing for time-varying innovation variance are not possible, even in the linear normal case. Of particular interest are changes in the residual TVAR innovation variance that drives short-term structured variability in the system. Sequential Monte Carlo methods or further approximations are required to model time-varying innovation variances.

The authors gratefully acknowledge the support of the Natural Environment Research Council grant NE/M006123/1.

Appendix A Filtering, smoothing and forecasting

Filtering

The sequence of posterior distributions can be approximated as follows:

Prior to observing , the predictive distributions at time are

with

where

After observing , the posterior distribution of the state vector at time is

with

where and .

The one-step ahead forecast residuals are useful for model checking and are defined as
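The filtering recursions above have the familiar Kalman filter form. As a minimal sketch, the following implements the forward recursions for a generic linear-Gaussian dynamic linear model with constant system matrices; the linearised approximation of Section 4 replaces these with their time-varying counterparts. Function and variable names are illustrative, not the paper's notation for its full model.

```python
import numpy as np

def kalman_filter(y, F, G, V, W, m0, C0):
    """Forward filtering for a linear-Gaussian DLM:
        y_t     = F' theta_t + v_t,     v_t ~ N(0, V),
        theta_t = G theta_{t-1} + w_t,  w_t ~ N(0, W).
    Returns filtered moments (m, C), predictive moments (a, R) and
    the one-step forecast residuals e used for model checking."""
    T, p = len(y), len(m0)
    m = np.empty((T, p)); C = np.empty((T, p, p))
    a = np.empty((T, p)); R = np.empty((T, p, p))
    e = np.empty(T)
    m_prev, C_prev = np.asarray(m0, float), np.asarray(C0, float)
    for t in range(T):
        # Predict: prior moments of theta_t given data up to t-1.
        a[t] = G @ m_prev
        R[t] = G @ C_prev @ G.T + W
        # One-step forecast of y_t and its residual.
        f = F @ a[t]
        Q = F @ R[t] @ F + V
        e[t] = y[t] - f
        # Update: posterior moments of theta_t given y_t.
        A = R[t] @ F / Q                  # Kalman gain
        m[t] = a[t] + A * e[t]
        C[t] = R[t] - np.outer(A, A) * Q
        m_prev, C_prev = m[t], C[t]
    return m, C, a, R, e
```

For example, the local level (random walk plus noise) model corresponds to scalar F = (1) and G = (1).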

Smoothing

The joint posterior can be sampled recursively as follows:

  • Sample from

  • for

    • Sample from

where

and . The quantities , , , and are obtained from the filtering recursions.
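The backward-sampling recursion above can be sketched as follows for a generic linear-Gaussian model, taking as input the filtered moments (m, C) and predictive moments (a, R) produced by a standard forward filter. This is a hedged sketch with illustrative names, not the paper's exact recursion for its full model.

```python
import numpy as np

def backward_sample(m, C, a, R, G, rng=None):
    """Draw one path theta_{1:T} from the joint smoothing distribution
    by backward sampling, given the filtered moments (m, C) and the
    predictive moments (a, R) from a forward linear-Gaussian filter."""
    rng = np.random.default_rng(rng)
    T, p = m.shape
    theta = np.empty((T, p))
    # Start from the filtered distribution at the final time point.
    theta[-1] = rng.multivariate_normal(m[-1], C[-1])
    for t in range(T - 2, -1, -1):
        # Moments of theta_t conditional on theta_{t+1} and the data.
        B = C[t] @ G.T @ np.linalg.inv(R[t + 1])
        h = m[t] + B @ (theta[t + 1] - a[t + 1])
        H = C[t] - B @ R[t + 1] @ B.T
        theta[t] = rng.multivariate_normal(h, (H + H.T) / 2)  # symmetrise
    return theta
```

Repeating the draw gives a sample from the joint posterior of the full state path, rather than only its marginal smoothing distributions.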

Forecasting

The -step ahead forecast distribution given data up to time can be sampled sequentially as

  • Sample from

  • for

    • Sample from

    • Sample from .
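The forecasting recursion above amounts to alternately propagating the state and generating an observation. A minimal sketch for a generic linear-Gaussian model with constant system matrices (names are illustrative):

```python
import numpy as np

def forecast_sample(m_T, C_T, F, G, V, W, k, rng=None):
    """Sample one k-step-ahead trajectory of the observations, given the
    filtered state distribution N(m_T, C_T) at the forecast origin."""
    rng = np.random.default_rng(rng)
    p = len(m_T)
    theta = rng.multivariate_normal(m_T, C_T)   # draw the current state
    y_path = np.empty(k)
    for i in range(k):
        # Propagate the state, then generate the corresponding observation.
        theta = G @ theta + rng.multivariate_normal(np.zeros(p), W)
        y_path[i] = F @ theta + rng.normal(0.0, np.sqrt(V))
    return y_path
```

Averaging many such trajectories approximates the forecast mean used, for example, in the 90-day NAO forecasts of Section 6.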

Appendix B Full model for NAO application

Inspection of the sample periodogram suggests the inclusion of a periodic component in the mean containing annual and semi-annual harmonics. Analysis of the sample autocorrelation and partial-autocorrelation functions of the deseasonalised data suggest a residual TVAR component of order . Examination of the variability of the first differences reveals a strong annual cycle in the day-on-day variation of the NAO. We model this cycle explicitly through a variance function on the TVAR innovations .
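The periodic mean component with annual and semi-annual harmonics can be represented through a harmonic regression vector. The sketch below builds that vector; the function name and the 365.25-day period are illustrative, and note that in the paper's state space formulation the harmonic coefficients themselves evolve dynamically.

```python
import numpy as np

def harmonic_design(day_of_year, harmonics=(1, 2), period=365.25):
    """Regression vector for a periodic mean component: one (cos, sin)
    pair per harmonic, here annual (k=1) and semi-annual (k=2)."""
    cols = []
    for k in harmonics:
        w = 2.0 * np.pi * k * day_of_year / period
        cols += [np.cos(w), np.sin(w)]
    return np.array(cols)
```

The inner product of this vector with the harmonic coefficients gives the seasonal contribution to the mean on a given day.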

The full model used for the analysis of the NAO is summarised below:

(13a)
(13b)
(13c)
(13d)
(13e)
(13f)
(13g)
(13h)

where and

The hyperparameters , and are fixed at the maximum a posteriori estimates obtained by numerical optimisation using uniform priors.

In the notation of Section 2 the mean vector is . The parameter represents the background mean level of the NAO. Any local systematic trend is represented by the parameter . The periodic component is represented by the parameters and . The mean parameters are themselves assumed to be non-stationary, with fixed innovation variances and . See the supplementary information for details of the choice of innovation variances.

References

  • Abbate and Marcellino (2017) Abbate, A. and Marcellino, M. (2017) Point, interval and density forecasts of exchange rates with time varying parameter models. J. R. Statist. Soc. A.
  • Altizer et al. (2011) Altizer, S., Bartel, R. and Han, B. A. (2011) Animal migration and infectious disease risk. Science, 331, 296–302.
  • Anderson and Moore (1979) Anderson, B. D. O. and Moore, J. B. (1979) Optimal Filtering. Prentice-Hall.
  • Bourassa et al. (2013) Bourassa, M. A., Gille, S. T., Bitz, C., Carlson, D., Cerovecki, I., Clayson, C. A., Cronin, M. F., Drennan, W. M., Fairall, C. W., Hoffman, R. N., Magnusdottir, G., Pinker, R. T., Renfrew, I. A., Serreze, M. C., Speer, K., Talley, L. D. and Wick, G. A. (2013) High-latitude ocean and sea ice surface fluxes: Challenges for climate research. Bulletin of the American Meteorological Society, 94, 403–423.
  • Box and Tiao (1975) Box, G. E. P. and Tiao, G. C. (1975) Intervention Analysis with Applications to Economic and Environmental Problems. J. Am. Statist. Ass., 70, 70–79.
  • Cassou et al. (2007) Cassou, C., Deser, C. and Alexander, M. A. (2007) Investigating the impact of reemerging sea surface temperature anomalies on the winter atmospheric circulation over the North Atlantic. Journal of Climate, 20, 3510–3526.
  • Chapin III et al. (2010) Chapin III, F. S., Sturm, M., Serreze, M. C., McFadden, J. P., Key, J. R., Lloyd, A. H., McGuire, A. D., Rupp, T. S., Lynch, A. H., Schimel, J. P., Beringer, J., Chapman, W. L., Epstein, H. E., Euskirchen, E. S., Hinzman, L. D., Jia, G., Ping, C.-L., Tape, K. D., Thompson, C. D. C., Walker, D. A. and Welker, J. M. (2010) Role of Land-Surface Changes in Arctic Summer Warming. Science, 657, 9–13.
  • Croston (1972) Croston, J. D. (1972) Forecasting and Stock Control for Intermittent Demands. Operational Research Quarterly, 23, 289–303.
  • Dee et al. (2011) Dee, D. P., Uppala, S. M., Simmons, A. J., Berrisford, P., Poli, P., Kobayashi, S., Andrae, U., Balmaseda, M. A., Balsamo, G., Bauer, P., Bechtold, P., Beljaars, A. C. M., van de Berg, L., Bidlot, J., Bormann, N., Delsol, C., Dragani, R., Fuentes, M., Geer, A. J., Haimberger, L., Healy, S. B., Hersbach, H., Hólm, E. V., Isaksen, L., Kållberg, P., Köhler, M., Matricardi, M., Mcnally, A. P., Monge-Sanz, B. M., Morcrette, J. J., Park, B. K., Peubey, C., de Rosnay, P., Tavolato, C., Thépaut, J. N. and Vitart, F. (2011) The ERA-Interim reanalysis: Configuration and performance of the data assimilation system. Quarterly Journal of the Royal Meteorological Society, 137, 553–597.
  • Doucet et al. (2000) Doucet, A., Godsill, S. J. and Andrieu, C. (2000) On sequential Monte Carlo sampling methods for Bayesian filtering. Statistics and Computing, 10, 197–208.
  • Doucet and Johansen (2011) Doucet, A. and Johansen, A. M. (2011) A Tutorial on Particle Filtering and Smoothing: Fifteen years later. In The Oxford Handbook of Nonlinear Filtering (eds. D. Crisan and B. Rozovskii), chap. 8.2, 656–705. Oxford University Press.
  • Durbin and Koopman (2012) Durbin, J. and Koopman, S. J. (2012) Time Series Analysis by State Space Methods. Oxford University Press, 2nd edn.
  • Farah et al. (2014) Farah, M., Birrell, P., Conti, S. and Angelis, D. D. (2014) Bayesian Emulation and Calibration of a Dynamic Epidemic Model for A/H1N1 Influenza. J. Am. Statist. Ass., 109, 1398–1411.
  • Fearnhead and Liu (2011) Fearnhead, P. and Liu, Z. (2011) Efficient Bayesian analysis of multiple changepoint models with dependence across segments. Statistics and Computing, 21, 217–229.
  • Fokianos and Fried (2010) Fokianos, K. and Fried, R. (2010) Interventions in INGARCH processes. Journal of Time Series Analysis, 31, 210–225.
  • Folland et al. (2009) Folland, C. K., Knight, J. R., Linderholm, H. W., Fereday, D., Ineson, S. and Hurrell, J. W. (2009) The Summer North Atlantic Oscillation: Past, Present, and Future. Journal of Climate, 22, 1082–1103.
  • Goh et al. (2017) Goh, Y.-H., Goh, Y.-L., Lee, Y.-K. and Ko, Y.-H. (2017) Robust speech recognition system using multi-parameter bidirectional Kalman filter. International Journal of Speech Technology.
  • Harvey (1989) Harvey, A. C. (1989) Forecasting, structural time series models and the Kalman filter. Cambridge University Press.
  • Heard and Turcotte (2017) Heard, N. A. and Turcotte, M. J. M. (2017) Adaptive Sequential Monte Carlo for Multiple Changepoint Analysis. Journal of Computational and Graphical Statistics, 26, 414–423.
  • Hurrell (1995) Hurrell, J. W. (1995) Decadal Trends in the North Atlantic Oscillation: Regional Temperatures and Precipitation. Science, 269, 676–679.
  • Kass and Raftery (1995) Kass, R. E. and Raftery, A. E. (1995) Bayes Factors. J. Am. Statist. Ass., 90, 773–795.
  • Keeley et al. (2009) Keeley, S. P. E., Sutton, R. T. and Shaffrey, L. C. (2009) Does the North Atlantic Oscillation show unusual persistence on intraseasonal timescales? Geophysical Research Letters, 36, L22706.
  • Kitagawa and Gersch (1985) Kitagawa, G. and Gersch, W. (1985) A Smoothness Priors Time-Varying AR Coefficient Modeling of Nonstationary Covariance Time Series. IEEE Transactions on Automatic Control, 30, 48–56.
  • Koop and Potter (2007) Koop, G. and Potter, S. (2007) Estimation and Forecasting in Models with Multiple Breaks. The Review of Economic Studies, 74, 763–789.
  • Li et al. (2017) Li, Y., Cui, W.-G., Luo, M.-L., Li, K. and Wang, L. (2017) High-resolution time-frequency representation of EEG data using multi-scale wavelets. International Journal of Systems Science, 7721, 1–11.
  • Masala (2015) Masala, G. (2015) North Atlantic Oscillation index stochastic modelling. International Journal of Climatology, 35, 3624–3632.
  • Mosedale et al. (2006) Mosedale, T. J., Stephenson, D. B., Collins, M. and Mills, T. C. (2006) Granger causality of coupled climate processes: Ocean feedback on the North Atlantic Oscillation. Journal of Climate, 19, 1182–1194.
  • Olsen et al. (2006) Olsen, B., Munster, V. J., Wallensten, A., Waldenström, J., Osterhaus, A. D. M. E. and Fouchier, R. A. M. (2006) Global patterns of Influenza A Virus in wild birds. Science, 312, 384–388.
  • Prado and West (1997) Prado, R. and West, M. (1997) Exploratory Modelling of Multiple Non-Stationary Time Series: Latent Process Structure and Decompositions. In Modelling Longitudinal and Spatially Correlated Data (eds. T. G. Gregoire, D. R. Brillinger, P. J. Diggle, E. Russek-Cohen, W. G. Warren and R. D. Wolfinger), 349–361. New York, NY: Springer New York.
  • Prado and West (2010) Prado, R. and West, M. (2010) Time Series: Modelling, Computation and Inference. Chapman and Hall/CRC.
  • Scaife et al. (2014) Scaife, A. A., Arribas, A., Blockey, E., Brookshaw, A., Clark, R. T., Dunstone, N. J., Eade, R., Fereday, D., Folland, C. K., Gordon, M., Hermanson, L., Knight, J. R., Lea, D. J., MacLachlan, C., Maidens, A., Martin, M., Peterson, A. K., Smith, D. M., Vellinga, M., Wallace, E., Waters, J. and Williams, A. (2014) Skillful long range prediction of European and North American winters. Geophysical Research Letters, 5, 2514–2519.
  • Schäck and Muma (2016) Schäck, T. and Muma, M. (2016) Robust Nonlinear Causality Analysis of Non-Stationary Multivariate Physiological Time Series. IEEE Transactions on Biomedical Engineering, PP, 1–14.
  • Shenstone and Hyndman (2005) Shenstone, L. and Hyndman, R. J. (2005) Stochastic models underlying Croston’s method for intermittent demand forecasting. Journal of Forecasting, 24, 389–402.
  • Siegert et al. (2016) Siegert, S., Stephenson, D. B., Sansom, P. G., Scaife, A. A., Eade, R. and Arribas, A. (2016) A Bayesian framework for verification and recalibration of ensemble forecasts: How uncertain is NAO predictability? Journal of Climate, 29, 995–1012.
  • Stephenson et al. (2000) Stephenson, D. B., Pavan, V. and Bojariu, R. (2000) Is the North Atlantic Oscillation a random walk? International Journal of Climatology, 20, 1–18.
  • Subba Rao (1970) Subba Rao, T. (1970) The Fitting of Non-Stationary Time-Series Models with Time-Dependent Parameters. J. R. Statist. Soc. B, 32, 312–322.
  • Visbeck et al. (2003) Visbeck, M., Chassignet, E. P., Curry, R. G., Delworth, T. L., Dickson, R. R. and Krahmann, G. (2003) The Ocean’s Response to North Atlantic Oscillation Variability. Atlantic, 15, 2581–2597.
  • Walker (1924) Walker, G. T. (1924) Correlation in seasonal variation of weather, IX, a preliminary study of world weather. Memoirs of the India Meteorological Department, 24, 275–332.
  • West and Harrison (1989) West, M. and Harrison, P. J. (1989) Subjective intervention in formal models. Journal of Forecasting, 8, 33–53.
  • West and Harrison (1997) West, M. and Harrison, P. J. (1997) Bayesian Forecasting and Dynamic Models. Springer-Verlag New York, 2nd edn.