Minding impacting events in a model of stochastic variance


S M Duarte Queirós, E M F Curado and F D Nobre

Centro de Física do Porto, Rua do Campo Alegre 687, 4169-007 Porto, Portugal

Centro Brasileiro de Pesquisas Físicas and National Institute of Science and Technology for Complex Systems, Rua Dr Xavier Sigaud 150, 22290-180 Rio de Janeiro - RJ, Brazil

Present address: Istituto dei Sistemi Complessi - CNR, Via dei Taurini 19, 00185 Roma, Italy

E-mail: sdqueiro@gmail.com

We introduce a generalisation of the well-known ARCH process, widely used for generating uncorrelated stochastic time series with long-term non-Gaussian distributions and long-lasting correlations in the (instantaneous) standard deviation exhibiting a clustering profile. Specifically, inspired by the fact that in a variety of systems impacting events are hardly ever forgotten, we split the process into two different regimes: a first one for regular periods, in which the average volatility of the fluctuations within a certain period of time stays below a certain threshold, and a second one, triggered when the local standard deviation exceeds that threshold. In the former situation we use standard rules for heteroscedastic processes, whereas in the latter the system starts recalling past values that surpassed the threshold. Our results show that for appropriate parameter values the model is able to provide fat-tailed probability density functions and strong persistence of the instantaneous variance, characterised by large values of the Hurst exponent, which are ubiquitous features in complex systems.

05.90.+m, 05.40.-a, 89.65.Gh, 89.65.-s

Heteroscedastic processes, Fat-tail distributions, Perpetual memory

1 Introduction

In recent years the physics community has broadened its goals to matters that some decades ago were too distant from the classical topics of Physics. Despite being apparently at odds with the standard motivations of Physics, this new trend has given an invaluable contribution toward a more connected way of doing Science, thus leading to a better understanding of the world surrounding us [1]. Within this context, the major contribution of physicists is perhaps the quantitative procedure, reminiscent of experimental physics, in which a model is proposed after a series of studies that pave the way to a reliable theory. This path has resulted in a series of findings which have helped such diverse fields as physiology, sociology and economics, among many others [2, 3, 4]. Among these findings, one can mention the identification of non-Gaussian distributions and long-lasting (power-law-like) correlations [5, 6, 7]. In fact, modulo a change of observable, the conjunction of these two empirical features is nearly omnipresent. For this reason, and regardless of the realm of the problem, very similar models have been applied, with particular notoriety for discrete stochastic processes of time-dependent variance based on autoregressive conditional heteroscedastic models [8]. That is to say, most of these models are devised taking into account basically the general features one aims at reproducing, rather than putting in elements that represent the idiosyncrasies of the system one is surveying. For instance, many of the proposals cast aside the cognitive essence prevailing in many of these systems, when it is well known that in real situations this represents a key element of the process [9]. On the other hand, intending to describe long-lasting correlations, long-lasting memories are usually introduced, thus neglecting the fact that we do not traditionally keep in mind every happening.
As a simple example, we are skilled at remembering everyday events for some period. However, we discard that information as time goes by, unless the specific deed either made an impact on us or relates to something that has really touched us somehow. In that case, it is likely that the fact will be remembered forever and recalled under similar or related conditions, which often leads to a collective memory effect [10].

In this work, we make use of the celebrated heteroscedastic model, the ARCH process [11], and modify it with the aim of accommodating cognitive traits that lead to a different behaviour during periods of high agitation or impact. In particular, we want to stress the fact that people tend to recall important periods, no matter when they took place. To that end, we introduce a measure of the local volatility, as well as a volatility threshold, so that the system changes from a normal dynamics, in which it uses the previous values of the variable to determine its next value, to a situation in which it recalls the past and compares the current state with previous states of high volatility, even if that past is distant.

1.1 Standard models of heteroscedasticity

Engle's formulation of an autoregressive conditional heteroscedastic (ARCH) time series [11] represents one of the simplest and most effectual models in Economics and Finance, for which he was awarded the Nobel Memorial Prize in Economic Sciences in 2003 [12]. Explicitly, the ARCH model corresponds to a discrete-time process for a variable $x_t$,

$$x_t = \sigma_t\,\omega_t,$$
with $\omega_t$ an independent and identically distributed random variable with zero mean and unit standard deviation. The quantity $\sigma_t$ represents the time-dependent standard deviation, which we will henceforth call the instantaneous volatility for merely historical reasons. Traditionally, a Gaussian distribution is assigned to the random variable $\omega_t$, but other distributions, namely the truncated $\alpha$-stable Lévy distribution and the $q$-Gaussian (Student-$t$), have been successfully introduced as well [13, 14]. In his seminal paper, Engle suggested that the values of $\sigma_t^2$ could be obtained from a linear function of the past squared values of $x_t$,

$$\sigma_t^2 = a + \sum_{i=1}^{s} b_i\, x_{t-i}^2.$$
In financial practice, viz. the modelling of price fluctuations, the case $s = 1$, the ARCH(1), represents the most studied and applied of all ARCH-like processes. The model has often been applied in cases where it is assumed that the variance of the observable (or of its fluctuation) is a function of the magnitudes of the previous occurrences. From a financial perspective, Engle's proposal has been associated with the relation between the market activity, the deviations from the normal level of volatility, and the previous price fluctuations, by means of an impact function [8]. Alternatively, recent studies convey the thesis that leverage can be responsible for volatility clustering and fat tails in finance [15]. Nonetheless, heteroscedastic ARCH-like processes have repeatedly been used as a forecasting method; in other words, one makes use of the magnitude of previous events in order to indicate (or at least to bound) the upcoming event (see e.g. [16, 17]). With respect to its statistical features, although the time series $x_t$ is completely uncorrelated, the covariance of the squares, $\langle x_t^2\,x_{t'}^2 \rangle$, is not proportional to $\delta_{t\,t'}$. As a matter of fact, for $s = 1$, it is provable that this covariance decays according to an exponential law with a characteristic time $\tau = -1/\ln b_1$. This dependence does not reproduce most of the empirical evidence, particularly that bearing on studies of price fluctuations. In addition, the introduction of a large value of $s$ tends to give rise to implementation problems [18]. Expressly, large values of $s$ augment the difficulty of finding the appropriate set of parameters for the problem under study, as it corresponds to the evaluation of a large number of fitting parameters. Aiming to solve this shortcoming of the original process, the GARCH process was introduced [19] (where G stands for generalised), with Eq. (2) being replaced by,

$$\sigma_t^2 = a + \sum_{i=1}^{s} b_i\, x_{t-i}^2 + \sum_{j=1}^{r} c_j\, \sigma_{t-j}^2.$$
In spite of the fact that the stationarity condition guarantees that the GARCH(1,1) process exactly corresponds to an infinite-order ARCH process, an exponential decay for $\langle x_t^2\,x_{t'}^2 \rangle$ is still found, now with a characteristic time $\tau = -1/\ln(b_1 + c_1)$.
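Since the recursions above are not typeset in this version of the text, a minimal numerical sketch may help fix ideas. The notation ($x_t = \sigma_t\omega_t$ with parameters $a$, $b$, $c$) is the standard Engle-Bollerslev one; this is our illustration, not the authors' code:

```python
import numpy as np

def simulate_arch1(a, b, n, seed=0):
    """ARCH(1): x_t = sigma_t * omega_t with sigma_t^2 = a + b * x_{t-1}^2
    and Gaussian noise omega_t (a minimal sketch in standard notation)."""
    rng = np.random.default_rng(seed)
    x = np.empty(n)
    sigma2 = a / (1.0 - b)           # start at the stationary variance
    for t in range(n):
        x[t] = np.sqrt(sigma2) * rng.standard_normal()
        sigma2 = a + b * x[t] ** 2   # variance used at the next step
    return x

def simulate_garch11(a, b, c, n, seed=0):
    """GARCH(1,1): sigma_t^2 = a + b * x_{t-1}^2 + c * sigma_{t-1}^2;
    the squared-variable covariance decays with time -1/ln(b + c)."""
    rng = np.random.default_rng(seed)
    x = np.empty(n)
    sigma2 = a / (1.0 - b - c)       # stationary variance (needs b + c < 1)
    for t in range(n):
        x[t] = np.sqrt(sigma2) * rng.standard_normal()
        sigma2 = a + b * x[t] ** 2 + c * sigma2
    return x
```

Either simulator reproduces the uncorrelated signal with clustered volatility discussed above; the sample variance converges to $a/(1-\sum b_i - \sum c_j)$.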

Although the instantaneous volatility is time dependent, the ARCH process is actually stationary, with the stationary variance given by,

$$\langle x^2 \rangle = \frac{a}{1 - \sum_{i=1}^{s} b_i}$$
(herein $\langle \cdot \rangle$ represents averages over samples at a specified time and the overbar denotes averages over time in a single sample). Moreover, the process presents a stationary probability density function (PDF), $P(x)$, with a kurtosis larger than that of the distribution of $\omega$. Namely, for $s = 1$ and Gaussian $\omega$, the fourth-order moment is,
$$\langle x^4 \rangle = \frac{3\,a^2\,(1 + b_1)}{(1 - b_1)\,(1 - 3\,b_1^2)} \qquad (3\,b_1^2 < 1).$$
This kurtosis excess is precisely the outcome of the dependence of $\sigma_t$ on time (through the past values of $x$). Correspondingly, when $b_i = 0$, the process reduces to generating a signal with the same PDF as $\omega$, but with a standard deviation equal to $\sqrt{a}$. At this point, it is convenient to say that, for the time being and despite several efforts, there exist only analytical expressions describing the tail behaviour of $P(x)$ or the continuous-time approximation of the ARCH(1) process, with the full analytical formula still unknown [14, 20].

In order to cope with the long-lasting correlations and other features, such as the asymmetry of the distribution and the leverage effect, different versions of the ARCH process have been proposed [8, 18]. To the best of our knowledge, all of them solve the issue of the long-lasting correlations of the volatility by introducing in Eq. (2) an everlasting dependence on the past, with weights given by a slowly decaying function [21, 22]. Most of these generalisations can be encompassed within the fractionally integrated class of processes, the FIGARCH [23, 24, 25]. The idea supporting the introduction of a power law for the functional form of the memory kernel is generally based on the assumption that the agents in the market make use of exponential functions with a broad distribution of relaxation times related to different investment horizons [26, 27]. This type of model has achieved huge popularity in the replication of non-Gaussian time series in several areas, such as biomedicine, climate, engineering, and physics (a few examples can be found in [28, 29, 30, 31, 32, 33]).

As described above, the statistical features of the macroscopic observables are the result of the nature of the interactions between the microscopic elements of the system and of the relation between microscopic and macroscopic observables. In the case of the "financial" ARCH process, it was held that the volatility bears upon the impact of the price fluctuations on the trading activity. On the one hand, it is understood that the impact of the price fluctuations (or trading activity) on the volatility does not merely come from recent price fluctuations; it actually involves past price fluctuations as well. In finance, upgraded versions of heteroscedastic models use multi-scaling, i.e., it is assumed that the price will evolve by modulating the volatility according to the volatility over different scales (days, weeks, months, years, etc.) [34], in order to smooth possible misjudgements about the volatility. However, in practice, these models do not differ much from ARCH-like proposals at the level of the results we are pointing at. Alternatively, it is worthwhile to look upon the ARCH proposal as a forecasting mechanism [16, 17]. In this way, the simplest approach, the ARCH, represents an attempt to foresee future values taking into account only recent observations, whereas models like the FIGARCH bear in mind all the history, weighting each past value according to some kernel functional.

1.2 Minding impacting events

In our case, we want to emphasise the fact that people tend to recall periods of high volatility (i.e., of impact) in the system, no matter when they took place, by changing the surrounding conditions, as agent-based models have suggested [35, 36]. Hence, we introduce a measure of the local volatility,


and a volatility threshold, so that, instead of Eq. (2), the updating of the instantaneous volatility goes as follows:


where the memory kernel of the first branch is exponential [37]. Therefore, if we assume the financial-market perspective, we are implicitly presuming that the characteristic relaxation time is Dirac-delta or at least narrowly distributed, so that the exponential functional is a valid approximation. This approach is supported by recent heuristic studies in which it has been verified that the largest stake of the market capitalisation is managed by a small number of companies that apply very similar strategies [38]. With the second branch of the equation we intend to highlight the difference in behaviour between the "normal" periods of trading and the periods of significant volatility, in which the future depends on the spells of significant volatility in the past as well. The weight values are defined as,


with the first factor being the Heaviside function, selecting the past spells in which the local volatility exceeded the threshold, and the second a factor representing a measure of the similarity (in the volatility space) between the windows with upper limits at the current and at the recalled instants, respectively. Analytically, this is equivalent to mapping segments of the series into vectors and afterwards computing a normalised internal-product-like weight,


where, for the sake of simplicity, we omit the time dependence in the equations, and the normalisation factor is such that the weights sum to one (for a fixed window size).

We are therefore dealing with a model characterised by five parameters, namely: the normal level of volatility and the impact of the observable on the volatility, which were both first introduced by Engle in [11]; the memory characteristic time, put forward in exponential models [37]; and two new parameters, the length of the volatility spell and the volatility threshold, which we will reduce to a single extra parameter. If we think of trading activities, our proposal introduces a key parameter, the volatility threshold, which signals a change in the behaviour of the agents in the market. At present, a significant stake of the trading in financial markets is dominated by short-term positions, and thus a good part of the dynamics of price fluctuations can be described by Eq. (2), or by functions with an exponential kernel. As soon as the market fluctuates excessively, i.e., the volatility soars beyond the threshold, the market changes its trading dynamics. The main forecast references are obviously the periods where the volatility reached high levels and, among those, the periods which are most similar to the present one; this is the rationale described by our Eq. (8). Hence, our proposal is nothing but the use of simple mechanisms that, in a coarse-grained way, govern a good part of our decisions.
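The two-regime update rule can be sketched in code. Since Eqs. (5)-(8) are not typeset here, the concrete choices below (an exponential memory kernel, a mean-|x| local-volatility measure over a window of T steps, and internal-product similarity weights) are our assumptions standing in for the authors' exact expressions:

```python
import numpy as np

def step_volatility(x, a, b, T, theta):
    """One update of sigma_t^2 in the two-regime dynamics (illustrative
    sketch only; kernel and weights are assumed functional forms)."""
    t = len(x)
    # local volatility over the last T steps (assumed mean-|x| measure)
    local_vol = np.mean(np.abs(x[t - T:t])) if t >= T else 0.0
    lags = np.arange(1, min(t, 10 * T) + 1)      # cut-off on the recalled past
    kernel = np.exp(-lags / T)
    if kernel.size:
        kernel = kernel / kernel.sum()           # normalised exponential memory
    if local_vol <= theta or t < 2 * T:
        # "normal" regime: exponentially weighted past squared values
        return a + b * np.sum(kernel * x[t - lags] ** 2)
    # "impact" regime: recall every past instant whose own local volatility
    # also exceeded the threshold, weighted by its similarity to the present
    past_vols = np.array([np.mean(np.abs(x[s - T:s])) for s in range(T, t)])
    recalled = np.flatnonzero(past_vols > theta) + T
    if recalled.size == 0:
        return a + b * np.sum(kernel * x[t - lags] ** 2)
    sim = np.array([np.dot(np.abs(x[s - T:s]), np.abs(x[t - T:t]))
                    for s in recalled])
    sim = sim / sim.sum()                        # normalised similarity weights
    return a + b * np.sum(sim * x[recalled - 1] ** 2)
```

Iterating this update and drawing $x_t = \sigma_t\omega_t$ at each step generates the series; the threshold `theta` plays the role of the volatility threshold discussed above.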

2 Results

2.1 General results

In this section we present the results obtained by the numerical implementation of the model. For comparison, we use the results of a prior model that can be enclosed in the FIGARCH class of processes [25]. There, the adjustment of the parameters comes from the delicate balance between the parameter responsible for introducing deviations of the volatility from its normal level and the parameter controlling the memory. On the one hand, a long memory has the inconvenient effect of making the instantaneous volatility constant, so that after a certain number of time steps the volatility freezes, hence leading to a Gaussian (or nearly Gaussian) distribution of the variable, independently of how large the impact is. On the other hand, a short memory is unable to introduce long-range correlations in the volatility, although it favours larger values of the kurtosis excess. The model we introduce herein is rather more complex. In order to deal with the change of regime, we define a parameter establishing this alteration, and we need to specify the memory and volatility-spell scales. Henceforth, we have assumed these two scales equal, which is very reasonable as it imposes that the time scale over which the volatility is measured and the time scale that the agents in the market use to assess the evolution of the observable are the same. In order to speed up our numerical implementation, we have imposed a cut-off in the computation of the first branch of Eq. (6). This approximation makes the numerical procedure much lighter, with a negligible effect, because the influence of the discarded past is not very relevant in numerical terms (within standard numerical implementation error). In all of our realisations, we have used a normalised level of expected volatility and have defined the volatility threshold in units of it, following a stationary approach as well.

We have adjusted the probability distributions of the variable by means of the distribution,


the behaviour of which follows a power law for large values of the variable, and where the normalisation constant is obtained using Ref. [39], sec. 3.194,


with a second constant representing the previous integral under the corresponding substitution. The fittings of the probability density distribution (9) were obtained using non-linear and maximum-log-likelihood numerical procedures, and the tail exponents were double-checked against the value given by the Hill estimator [40]. As a matter of fact, deviations from the asymptotic exponent have only been perceived for large memories combined with small impacts (slightly larger values) or large impacts (slightly smaller values). In appropriate limits, the PDF (9) corresponds to a $q$-Gaussian (Student-$t$) distribution [41], or reduces to either the Gaussian or the stretched-exponential distribution. Since in the majority of the applications one is interested in the tail behaviour, we have opted to follow the same approach by defining the tail index as,


In spite of the fact that other functional forms could have been used, we have decided on Eq. (9) because of its statistical relevance and simplicity (in comparison with other candidates involving special functions, namely the hypergeometric ones). Moreover, the $q$-Gaussian ($q$-Student) is intimately associated with the long-term distribution of heteroscedastic variables, since it is the exact distribution that results when the volatility follows an inverse-Gamma distribution [33, 42, 43].
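The Hill estimator used above to double-check the tail exponents admits a compact implementation; the sketch below (ours, not the authors' code) follows the standard definition based on the k largest order statistics:

```python
import numpy as np

def hill_tail_index(sample, k):
    """Hill estimator of the tail exponent alpha from the k largest
    magnitudes: alpha = k / sum_{i<=k} log(x_(i) / x_(k+1)),
    with x_(1) >= x_(2) >= ... the descending order statistics."""
    x = np.sort(np.abs(np.asarray(sample, dtype=float)))[::-1]
    return k / np.log(x[:k] / x[k]).sum()
```

In practice one inspects the estimate over a range of k and reads off the plateau; a single k is used here only for brevity.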

Concerning the persistence of the volatility, we have settled on Detrended Fluctuation Analysis (DFA) [44], which describes the scaling with the segment size $s$ of a fluctuation function $F(s)$ related to the average aggregated variance of a time series over segments of size $s$,

$$F(s) \sim s^{H},$$
where $H$ is the Hurst exponent. Although it has been shown that fluctuation-analysis methods can introduce meaningful errors in the Lévy regime [45], we have verified that for our case, which stays within the finite second-order-moment domain, the results of DFA are as reliable as those of other scaling methods.
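A compact first-order DFA, enough to reproduce the scaling analysis described here (a generic textbook implementation, not the authors' code), reads:

```python
import numpy as np

def dfa(series, scales, order=1):
    """Detrended Fluctuation Analysis: returns F(s) for each scale s.
    The Hurst exponent is the slope of log F(s) versus log s."""
    y = np.cumsum(series - np.mean(series))      # integrated profile
    F = []
    for s in scales:
        n_seg = len(y) // s
        segs = y[:n_seg * s].reshape(n_seg, s)
        t = np.arange(s)
        f2 = []
        for seg in segs:
            coef = np.polyfit(t, seg, order)     # local polynomial trend
            f2.append(np.mean((seg - np.polyval(coef, t)) ** 2))
        F.append(np.sqrt(np.mean(f2)))
    return np.array(F)
```

A log-log fit of F(s) against s over the chosen scales gives the Hurst exponent; uncorrelated noise yields a slope close to 1/2.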

Table 1: Critical values from the Kolmogorov-Smirnov test for typical pairs used for adjustments.

Let us now present our results for a small value of the impact parameter, which is able to depict the qualitative behaviour of the model in this regime. This case corresponds to a situation of little deviation from the Gaussian when long-range memory is considered. Accordingly, we can analyse the influence of the threshold and of the memory. Overall, we verify a very sparse deviation from the Gaussian. Keeping the impact fixed and varying the threshold, we understand that for small values of the threshold the distribution of the variable is Gaussian and the Hurst exponent of the volatility stays close to 1/2. It is not hard to grasp this observation if we take into account that, by using small values of the threshold, we are basically employing almost all of the past values, which limits the instantaneous volatility to a constant value after a transient time. As we increase the threshold, we let the dynamics be more flexible, and therefore the volatility is able to fluctuate, resulting in a kurtosis excess. For small values of the threshold, the Hurst exponent is only slightly different from 1/2, and it increases with the threshold. However, because of the small value of the impact, the rise of the threshold leaves the distribution barely distinguishable from a Gaussian. This behaviour is described in Fig. 1. We have obtained a Gaussian distribution and a Hurst exponent close to 1/2 for small values of the parameters. When we augment the value of the threshold, the system is loosened and the instantaneous volatility is able to fluctuate, leading to the emergence of tails and a subtle increase of the Hurst exponent. Increasing both the threshold and the memory, we have achieved large values of the Hurst exponent, but the small value of the impact is not sufficient to induce relevant fluctuations, bringing on a distribution that is almost Gaussian. The distribution fittings were assessed by computing the critical values of the Kolmogorov-Smirnov test [46].

Figure 1: Left column: probability density functions in log-linear scale; right column: fluctuation functions in log-log scale, for the three parameter choices discussed in the text (upper, middle and lower panels). The results have been obtained from series of elements, and the numerical adjustment gave error values never greater than 0.00003, with correlation coefficients never smaller than 0.998.

As we increase the memory, we favour the contribution of the past values of the price dynamics; thus, for the same value of the threshold we are capable of achieving larger values of the kurtosis excess, which we represent by means of the increase of the distribution index. The same occurs for the Hurst exponent. This general scenario is illustrated in Fig. 2, where we present the dependence of the tail index and of the Hurst exponent on the threshold, for different choices of the remaining parameters. Again, the longer the memory, the lower the tail index, because the extension of the memory brings about a weakening of the fluctuations in the volatility. The opposite occurs with the Hurst exponent, which increases towards unity (ballistic regime) as we consider larger memories, for obvious reasons. In all the cases investigated, we verified that both quantities augment with the threshold. The assessment of the numerical adjustments is provided in Tab. 1 in the form of the critical values of the Kolmogorov-Smirnov test [46]. The only case in which we obtained a vanishing critical value (within five-digit precision) corresponds to a tail index quite close to the limit of finite second-order moment (a fat-tailed distribution). At this point it is worth saying that we have investigated the likelihood of other well-known continuous distributions, such as the stretched-exponential, the simple Student-$t$, the Lévy, and the Gaussian. Nonetheless, the fittings carried out with Eq. (9) outperformed every other analysed distribution.

Figure 2: Upper panel: value of the tail index vs the threshold for several parameter choices, according to the adjustment procedures mentioned in the text. Lower panel: Hurst exponent vs the threshold. The results have been obtained from series of elements, and the numerical adjustment gave error values never greater than 0.00003, with correlation coefficients never smaller than 0.9998. Regarding the values of the Hurst exponent, the absolute error has never been significant, with a linear correlation coefficient close to one.

Concerning the instantaneous volatility, we verified that its Dirac-delta distribution starts deforming and short tails appear, as depicted in Fig. 4 (upper panel). Considering this particular case, we can present relevant evidence of the effectiveness of our proposed probability-distribution approach. The empirical distribution function in the upper panel of Fig. 4 may be simply approximated by


with appropriate parameters; in the appropriate limit we recover the homoscedastic-process distribution as a particular case. Recalling that at each time step the conditional distribution is a Gaussian (conditioned on the time-dependent value of the volatility), the long-term distribution is,


which gives (Ref. [39], sec. 3.351),


where $\mathrm{Ei}$ is the Exponential Integral function (see e.g. Ref. [48]). Considering parameter values appropriate to the case shown, and taking a simplifying choice for the remaining constant, we obtain the function presented in Fig. 3 (this curve is represented in the scaled variable, so that the standard deviation becomes equal to one, as in the other depicted distributions), the kurtosis of which follows from Ref. [39], sec. 5.221. The accordance between this distribution and the empirical one is quite remarkable, since it emerges from no numerical adjustment, and it can be further improved by tuning the parameter values. Regardless, this kurtosis value is only slightly larger than that of our numerical adjustment (see Table 1 for the goodness of fit). Furthermore, comparing the distributions by means of the symmetrised Kullback-Leibler divergence, we obtain a value of 0.00014, which is 19 times smaller than the distance between our fitting and a Gaussian. These results show that the PDF of Eq. (9) not only provides a good description of the data, but is much more manageable as well.
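The superstatistical construction in Eq. (14), a Gaussian conditional distribution averaged over the volatility density, is easy to check numerically. The sketch below (our illustration; the volatility densities passed in are arbitrary test functions) mixes Gaussians on a discrete volatility grid:

```python
import numpy as np

def mixture_pdf(x, vol_density, sig):
    """Long-term PDF as a volatility mixture of Gaussians,
    P(x) = int N(x; 0, sigma) p(sigma) dsigma, on a discrete sigma grid."""
    w = vol_density(sig)
    dsig = sig[1] - sig[0]
    w = w / (w.sum() * dsig)                     # normalise p(sigma)
    g = (np.exp(-x[:, None] ** 2 / (2.0 * sig ** 2))
         / (np.sqrt(2.0 * np.pi) * sig))         # conditional Gaussians
    return (g * w).sum(axis=1) * dsig

def kurtosis_of_pdf(x, p):
    """Fourth standardised moment of a symmetric density on grid x."""
    dx = x[1] - x[0]
    m2 = (x ** 2 * p).sum() * dx
    m4 = (x ** 4 * p).sum() * dx
    return m4 / m2 ** 2
```

A narrow volatility density recovers the Gaussian value 3 for the kurtosis, whereas a broad one yields an excess, which is precisely the mechanism behind the fat tails discussed above.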

Figure 3: The points represent the empirical distribution function; the dashed red line is our adjustment with Eq. (9); the green line is the PDF (15); and the dotted cyan line is the Normal distribution.

Cases for which the kurtosis excess is relevant stem from wider distributions of the instantaneous volatility (see the lower panel of Fig. 4). Actually, it is the emergence of larger values of the instantaneous volatility that brings forth the fat tails. Although we have not been successful in describing the whole distribution, we have verified that, below a crossover value of the volatility, the distribution is very well described by a type-2 Gumbel distribution,

$$p(\sigma) = \alpha\,\beta\,\sigma^{-\alpha-1}\,\exp\!\left(-\beta\,\sigma^{-\alpha}\right),$$
and after that value the distribution decreases sharply, according to a power law with a large exponent. We credit this sheer fall to the volatility threshold, which introduces an abrupt change in the dynamical regime of the volatility and thus in its statistics.
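The type-2 Gumbel density and its sampling are straightforward to implement; the sketch below (ours, with hypothetical parameter values) uses the standard form f(x) = a·b·x^(-a-1)·exp(-b·x^(-a)), whose large-x tail is the power law a·b·x^(-a-1) invoked in the argument that follows:

```python
import numpy as np

def type2_gumbel_pdf(x, a, b):
    """Type-2 Gumbel density f(x) = a*b*x^(-a-1)*exp(-b*x^(-a));
    its large-x tail decays as the power law a*b*x^(-a-1)."""
    return a * b * x ** (-a - 1.0) * np.exp(-b * x ** (-a))

def type2_gumbel_sample(a, b, size, rng):
    """Inverse-CDF sampling from F(x) = exp(-b*x^(-a))."""
    u = rng.random(size)
    return (b / -np.log(u)) ** (1.0 / a)
```

Samples drawn this way reproduce the broad volatility histograms of the lower panel of Fig. 4 up to the crossover, beyond which the threshold-induced faster decay takes over.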

Figure 4: Probability density function of the instantaneous volatility for two different cases. Upper panel: a case which leads to a sharply peaked distribution around the normal level of volatility. Lower panel: a case that results in a broader distribution, largely described by a type-2 Gumbel distribution. Beyond a crossover value, the distribution changes its behaviour to a faster decay, represented by the gray symbols. The ANOVA test of the type-2 Gumbel adjustment (up to the crossover) has yielded sums of squares for the error and for the model, with the respective degrees of freedom, as well as the uncorrected and corrected totals. The empirical distribution function has been obtained from series of elements.

In finance, such a cut-off is more than plausible, as real markets do suspend trading when large price fluctuations occur. This also grants feasibility to descriptions based on truncated power-law distributions [6]. Moreover, a fall-off is also present in the corresponding quantity of Fig. 3 in Ref. [47]. It is known that, for heteroscedastic models, the tail behaviour of the long-term distribution is governed by the asymptotic limit of the volatility distribution as the volatility tends to infinity. For the case of distribution (16), this limit is a power law, and therefore we can verify the asymptotic behaviour of the long-term distribution of the variable,

which yields a power-law distribution (applying Ref. [39], sec. 3.326),


For a volatility distribution following an exponential decay, a similar procedure yields,


where $G$ is the Meijer G-function [39, 48]. (In an effort to obtain a full description of the volatility distribution, we also used a functional form allowing for a crossover from a power law to an exponential decay; nonetheless, it did not provide better results.)

It is worth saying that we can reduce the number of parameters, i.e., apply the simple ARCH prescription in the normal regime, and still obtain fat tails and persistence.

2.2 Comparison with a real system

Following this picture, we can now look for a set of parameters that enables us to replicate a historical series, such as the daily adjusted log-index fluctuations of the SP500 stock index between 3rd January 1950 and 12th April 2010 (14380 data points; the adjusted values of the index take into account dividend payments and splits occurring on a particular day), with,

$$x_t = \ln S_t - \ln S_{t-1}.$$
Scanning over a grid of values of the three parameters, we have found a combination that yields values of the tail index and of the Hurst exponent that are in good agreement with a prior analysis of the SP500, which gave compatible values using both a simple Student-$t$ distribution and the PDF of Eq. (9), together with a compatible persistence exponent (see Fig. 5). Comparing the numerical distribution of our model with the data, we obtained a distance and a critical value from the two-sample Kolmogorov-Smirnov test [46], while the comparison between the distribution of the numerical procedure and the adjustment of the SP500 empirical distribution function yielded a similar figure. Once again, we have tested other possible numerical adjustments, and the only other relevant distribution was the stretched exponential, which gave a Hurst exponent different from the empirical one and a significantly larger value of the error measure.
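The two ingredients of this comparison, the daily log-index fluctuations and the two-sample Kolmogorov-Smirnov statistic, can be sketched as follows (our illustration; the empirical-CDF construction is the textbook one):

```python
import numpy as np

def log_returns(prices):
    """Daily log-index fluctuations x_t = ln S_t - ln S_{t-1}."""
    s = np.asarray(prices, dtype=float)
    return np.diff(np.log(s))

def ks_statistic(sample1, sample2):
    """Two-sample Kolmogorov-Smirnov statistic: the largest distance
    between the two empirical cumulative distribution functions."""
    data = np.sort(np.concatenate([sample1, sample2]))
    cdf1 = np.searchsorted(np.sort(sample1), data, side='right') / len(sample1)
    cdf2 = np.searchsorted(np.sort(sample2), data, side='right') / len(sample2)
    return np.max(np.abs(cdf1 - cdf2))
```

Applying `ks_statistic` to the simulated series and to the SP500 log-returns gives the distance that is then compared against the critical value of the test.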

It is worth mentioning that all three parameter values are plausible. First, within an application context, the impact parameter traditionally takes a robustly nonzero value. Second, the memory scale is close to the number of business days in a month and, last but not least, the threshold lies somewhat above the average level of the mean variance presented above. This provides us with a very interesting picture of the dynamics. Specifically, to a relevant approximation, we can describe this particular system as monitoring the magnitude of its past fluctuations on a characteristic scale of one month, from which it computes the level of impact resulting in an excess of volatility. Actually, one-month moving averages are established indicators in quantitative analyses of financial markets. When the volatility over a period of that order of magnitude surpasses the threshold, the system recalls previous periods, no matter how long ago they happened, in which a significant level of volatility excess occurred. Those periods are then averaged in order to determine the level of instantaneous volatility.

Figure 5: Upper panels: on the left, probability density function of the model (full line) and of the daily log-index fluctuations (symbols) in log-linear scale; on the right, the complementary cumulative distribution function for the case shown on the left. Lower panel: fluctuation function for the same parameters (red circles) and for the daily log-index fluctuations (black squares) in log-log scale.

3 Discussion

We have studied a generalisation of the well-known ARCH process, born in a financial context. Our proposal differs from other generalisations, since it adds to heteroscedastic dynamics the ability to reproduce systems in which cognitive traits exist, or systems showing typical cut-off limiting values. In the former case, when present circumstances are close to extreme and impacting events, the dynamics switches to the memory of abnormal events. By poring over the set of parameters of the problem, namely the impact of past values, the memory scale, and the volatility threshold, we have verified that we are able to obtain time series showing fat tails in the probability density function and strong persistence in the magnitudes of the stochastic variable (directly related to the instantaneous volatility), as happens in several processes studied within the context of complexity. In order to demonstrate the usefulness of our model, we have applied it to mimic the fluctuations of the SP500 stock index, and we verified that the best values reproducing the features of its time series correspond to a memory scale close to one business month and a threshold greater than the mean variance of the process, which is in turn much larger than the normal level of volatility, for which trading is not taken into account. Concerning the volatility, we have noticed that, in the cases of interest (i.e., fat tails and strong persistence), its distribution is very well described by a type-2 Gumbel distribution in a large part of the domain, which explains the emergence of the tails.


SMDQ thanks the warm hospitality of the CBPF and its staff during his visits to the institution, sponsored by CNPq and the National Institute of Science and Technology for Complex Systems, and the financial support of the European Commission through the Marie Curie Actions FP7-PEOPLE-2009-IEF (contract nr 250589) in the final part of this work. EMFC and FDN acknowledge the financial support of CNPq and FAPERJ, and of CNPq, respectively. The current version of our work benefited from the comments of anonymous peers, to whom we are grateful.



  • [1] Gell-Mann M (1995) The Quark and the Jaguar: Adventures in the Simple and the Complex. New York: Abacus.
  • [2] Flake G W (2000) The Computational Beauty of Nature: Computer Explorations of Fractals, Chaos, Complex Systems and Adaptation. Cambridge - MA: MIT Press.
  • [3] Gell-Mann M, Tsallis C (2004) Nonextensive Entropy: Interdisciplinary Applications. New York: Oxford University Press.
  • [4] Skjeltorp A T, Vicsek T (2002) Complexity from Microscopic to Macroscopic Scales: Coherence and Large Deviations. Dordrecht: Kluwer Academic Publishers.
  • [5] Bouchaud J P, Potters M (2000) Theory of Financial Risks: From Statistical Physics to Risk Management. Cambridge, UK: Cambridge University Press.
  • [6] Mantegna R N, Stanley H E (1999) An Introduction to Econophysics: Correlations and Complexity in Finance. Cambridge, UK: Cambridge University Press.
  • [7] Voit J (2003) The Statistical Mechanics of Financial Markets. Berlin: Springer-Verlag.
  • [8] Andersen T G, Bollerslev T, Diebold F X (2010) Parametric and nonparametric volatility measurement. In: Aït-Sahalia Y, Hansen L P, editors. Handbook of Financial Econometrics, volume 1, tools and techniques. Amsterdam: Elsevier. pp. 67-102.
  • [9] Khrennikov A Y (2004) Information Dynamics in Cognitive, Psychological, Social, and Anomalous Phenomena. Dordrecht: Kluwer Academic Publishers.
  • [10] Lee S, Ramenzoni VC, Holme P (2010) Emergence of Collective Memories. PLoS ONE 5(9): e12522. doi:10.1371/journal.pone.0012522.
  • [11] Engle R F (1982) Autoregressive conditional heteroscedasticity with estimates of the variance of United Kingdom inflation. Econometrica 50: 987.
  • [12] http://nobelprize.org/nobel_prizes/economics/laureates/.
  • [13] Podobnik B, Ivanov P Ch, Lee Y, Chessa A, Stanley H E (2000) Systems with correlations in the variance: Generating power law tails in probability distributions. Europhys. Lett. 50: 711.
  • [14] Duarte Queirós S M, Tsallis C (2005) Bridging a paradigmatic financial model and nonextensive entropy. Europhys. Lett. 69: 893.
  • [15] Thurner S, Farmer J D, Geanakoplos J (2010) Leverage causes fat tails and clustered volatility. Preprint arXiv:0908.1555 [q-fin.ST]
  • [16] Donaldson R G, Kamstra M (1996) A new dividend forecasting procedure that rejects bubbles in asset prices: the case of 1929’s stock crash. Rev. Financ. Stud. 9: 333.
  • [17] Hauser M A, Kunst R M (2001) Forecasting high-frequency financial data with the ARFIMA-ARCH model. J. Forecasting 20: 501.
  • [18] Bollerslev T, Chou R Y, Kroner K F (1992) ARCH models in finance. J. Econometrics 52: 5.
  • [19] Ding Z, Granger C W J, Engle R F (1993) A long memory property of stock market returns and a new model. J. Emp. Fin. 1: 83.
  • [20] Embrechts P, Klüppelberg C, Mikosch T (1997) Modelling Extremal Events for Insurance and Finance (Applications of Mathematics). Berlin: Springer-Verlag; Nelson D B (1990) ARCH models as diffusion approximations. J. Econometrics 45: 7.
  • [21] Gourieroux C, Montfort A (1996) Statistics and econometric models. Cambridge, UK: Cambridge University Press.
  • [22] Andersen T G, Bollerslev T, Diebold F X (2005) Volatility Forecasting. Penn Institute for Economic Research Working Paper 05-011.
  • [23] Granger C W J, Ding Z (1996) Modeling volatility persistence of speculative returns: a new approach. J. Econometrics 73: 61; Roman H E, Porto M (2008) Fractional brownian motion with stochastic variance: Modeling absolute returns in stock markets. Int. J. Mod. Phys. C 19: 1221.
  • [24] Porto M, Roman H E (2001) Self-generated power-law tails in probability distributions. Phys. Rev. E 63: 036128.
  • [25] Duarte Queirós S M (2007) On a generalised model for time-dependent variance with long-term memory. Europhys. Lett. 80: 30005.
  • [26] Dacorogna M, Gençay R, Müller U, Olsen R, Pictet O (2001) An Introduction to High-Frequency Finance. London: Academic Press.
  • [27] Bouchaud J P (2002) An introduction to statistical finance. Physica A 313: 238.
  • [28] Hoskins B (2006) Predictability of Weather and Climate. In: Palmer T, Hagedorn R, editors. Cambridge, UK: Cambridge University Press; Podobnik B, Ivanov P Ch, Biljakovic K, Horvatic D, Stanley H E, Grosse I (2005) Fractionally integrated process with power-law correlations in variables and magnitudes. Phys. Rev. E 72: 026121.
  • [29] Campbell S, Diebold F X (2005) Weather forecasting for weather derivatives. J. Am. Stat. Ass. 100: 6.
  • [30] Martin-Guerrero J D, Camps-Valls G, Soria-Olivas E, Serrano-Lopez E J, Perez-Ruixo J J, Jimenez-Torres N V (2003) Dosage individualization of erythropoietin using a profile-dependent support vector regression. IEEE Trans. Biomed. Eng. 50: 1136.
  • [31] Gronke P, Brehm J (2002) History, heterogeneity, and presidential approval: a modified ARCH approach. Elect. Stud. 21: 425.
  • [32] Reynolds A M, Mordant N, Crawford A M, Bodenschatz E (2005) On the distribution of Lagrangian accelerations in turbulent flows. New J. Phys. 7: 58; Beck C (2007) Phys. Rev. Lett. 98: 064502.
  • [33] Beck C (2001) Dynamical foundations of nonextensive statistical mechanics. Phys. Rev. Lett. 87: 180601.
  • [34] Zumbach G (2004) Volatility processes and volatility forecast with long memory. Quant. Finance 4: 70.
  • [35] Lux T and Marchesi M (1999) Scaling and Criticality in a Stochastic Multi-Agent Model of a Financial Market. Nature 397: 498; Lux T and Marchesi M (2000) A Micro-Simulation of Interactive Agents. Int. J. Theor. Appl. Fin. 3: 675.
  • [36] Giardina I and Bouchaud J P (2003) Bubbles, crashes and intermittency in agent based market models. Eur. Phys. J. B 31: 421.
  • [37] Dose C, Porto M and Roman H E (2003) Autoregressive processes with anomalous scaling behavior: Applications to high-frequency variations of a stock market index. Phys. Rev. E 67: 067103; Borland L (2004) A multi-time scale non-Gaussian model of stock returns. Preprint arXiv:cond-mat/0412526.
  • [38] Borland L, private communication.
  • [39] Gradshteyn I S, Ryzhik I M (1965) Tables of integrals, series and products. London: Academic Press.
  • [40] Hill B M (1975) A simple general approach to inference about the tail of a distribution. Ann. Stat. 3: 1163; Clementi F, Di Matteo T and Gallegati M (2006) The power-law tail exponent of income distributions. Physica A 370: 49.
  • [41] Tsallis C (1999) Nonextensive statistics: theoretical, experimental and computational evidences and connections. Braz. J. Phys. 29: 1.
  • [42] Beck C and Cohen E G D (2003) Superstatistics. Physica A 322: 267.
  • [43] Queirós S M D and Tsallis C (2005) On the connection between financial processes with stochastic volatility and nonextensive statistical mechanics. Eur. Phys. J. B 48: 139; Queirós S M D (2008) On discrete stochastic processes with long-lasting time dependence in the variance. Eur. Phys. J. B 66: 137.
  • [44] Peng C-K, Buldyrev S V, Havlin S, Simons M, Stanley H E and Goldberger A L (1994) Mosaic organization of DNA nucleotides. Phys. Rev. E 49: 1685.
  • [45] Barunik J, Kristoufek L (2010) On Hurst exponent estimation under heavy-tailed distributions. Physica A 389: 3844.
  • [46] DeGroot M H (1991) Probability and Statistics, 3rd ed. Reading, MA: Addison-Wesley.
  • [47] Borland L and Bouchaud J P (2005) On a multi-timescale statistical feedback model for volatility fluctuations. Preprint arXiv:physics/0507073v1 [physics.soc-ph].
  • [48] Meijer-G function, http://functions.wolfram.com/.