Book summary
Keywords
Automatic Estimation of Trends; Average Conditional Displacement; Discrete Stochastic Processes; Monte Carlo Experiment; Noise Smoothing; Noisy Time Series; Polynomial Fitting; Time Series Partitioning; Trend Estimation Algorithms
1 Introduction
1.1 Discrete Stochastic Processes and Time Series
1.2 Trend Definition and Estimation
1.3 AR(1) Stochastic Process
2 Monte Carlo Experiments
2.1 Monte Carlo Statistical Ensembles
2.2 Numerical Generation of Trends
2.3 Numerical Generation of Noisy Time Series
2.4 Statistical Hypothesis Testing
3 Polynomial Fitting
3.1 Polynomial Fitting
3.2 Polynomial Fitting of Artificial Time Series
3.3 An Astrophysical Example
4 Noise Smoothing
4.1 Repeated Central Moving Average
4.2 Smoothing of Artificial Time Series
4.3 A Financial Example
5 Automatic Estimation of Monotonic Trends
5.1 Average Conditional Displacement (ACD) Algorithm
5.2 Automatic ACD Algorithm
5.3 Evaluation of the ACD Algorithm
5.4 A Climatological Example
5.5 Monotonic Components of Nonmonotonic Trends
6 Estimation of Monotonic Trend Segments from a Noisy Time Series
6.1 Time Scale of Local Extrema
6.2 Local Extrema of Noisy Time Series
6.3 Local Extrema of RCMA Trends
6.4 Significant Local Extrema of a Real Time Series
7 Automatic Estimation of Arbitrary Trends
7.1 Automatic RCMA (AutRCMA)
7.2 Statistical Significance of the Local Extrema of the AutRCMA Trend
Ch. 1 Introduction
A complete presentation of the theory of stochastic processes can be found in any treatise on probability theory or time series analysis. In this introductory chapter we briefly present some basic notions used in the rest of the book. The main methods to estimate trends from noisy time series are introduced in Sect. 1.2. In the last section we discuss the properties of the order one autoregressive stochastic process AR(1), whose serial correlation is described by a single parameter and which is a good first approximation for many noises encountered in real phenomena.
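As a minimal illustration of the AR(1) process discussed in Sect. 1.3, the sketch below generates a stationary AR(1) series x[t] = φ·x[t-1] + ε[t] with Gaussian innovations, started from the stationary distribution. The function name and parameters are ours, not the book's.

```python
import numpy as np

def ar1(n, phi, sigma=1.0, seed=0):
    """Generate a stationary AR(1) series x[t] = phi * x[t-1] + eps[t],
    with eps[t] ~ N(0, sigma^2) and |phi| < 1."""
    rng = np.random.default_rng(seed)
    x = np.empty(n)
    # draw x[0] from the stationary distribution, variance sigma^2 / (1 - phi^2)
    x[0] = rng.normal(0.0, sigma / np.sqrt(1.0 - phi**2))
    for t in range(1, n):
        x[t] = phi * x[t - 1] + rng.normal(0.0, sigma)
    return x
```

The sample lag-1 autocorrelation of a long realization should be close to φ, which is why a single parameter suffices to describe the serial correlation of this noise.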
Ch. 2 Monte Carlo Experiments
In this chapter we design a numerical algorithm to generate nonmonotonic trends with a diversity of shapes comparable to those encountered in practice. This original algorithm is essential for the rest of the book because it provides the numerical trends on which the estimation methods are tested. Finite AR(1) noises are superposed over these trends, so that the resulting artificial time series depend on five independent parameters. For trend estimation algorithms the complexity of the problem is reduced, because the accuracy of the estimated trend depends significantly on only three parameters: the time series length, the noise serial correlation, and the ratio between the amplitudes of the trend variations and the noise fluctuations. Using Monte Carlo experiments we derive the accuracy of a simple method to estimate the serial correlation of an AR(1) noise.
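The construction of an artificial time series can be sketched as trend plus scaled AR(1) noise. The trend shape below (a sum of sines) is an illustrative stand-in, not the book's trend generator; `r` is the ratio between the amplitude of the trend variations and the noise fluctuations.

```python
import numpy as np

def artificial_series(n, phi, r, seed=0):
    """Toy artificial time series: a smooth nonmonotonic trend plus AR(1)
    noise, scaled so that r = (trend amplitude) / (noise std)."""
    rng = np.random.default_rng(seed)
    t = np.linspace(0.0, 1.0, n)
    # illustrative nonmonotonic trend shape (an assumption, not the book's)
    trend = np.sin(2 * np.pi * t) + 0.5 * np.sin(5 * np.pi * t)
    # finite AR(1) noise started from the stationary distribution
    eps = rng.normal(size=n)
    noise = np.empty(n)
    noise[0] = eps[0] / np.sqrt(1.0 - phi**2)
    for k in range(1, n):
        noise[k] = phi * noise[k - 1] + eps[k]
    noise /= noise.std()                        # unit fluctuation amplitude
    trend *= r / (trend.max() - trend.min())    # trend variation amplitude = r
    return trend + noise, trend
```

Varying `n`, `phi`, and `r` over a grid and repeating with many seeds gives a Monte Carlo ensemble on which an estimation method can be scored against the known trend.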
Ch. 3 Polynomial Fitting
In this chapter we analyze the well-known polynomial fitting method by means of Monte Carlo experiments with artificial time series generated by the algorithm presented in the previous chapter. Unlike the theoretical results obtained in mathematical statistics, our conclusions are valid for arbitrary trends, not only for polynomial trends. The accuracy of the estimated polynomial trend depends mainly on the ratio r between the amplitudes of the trend variations and the noise fluctuations. When the noise is small (r > 1), the estimated trend strongly resembles the real trend and the noise serial correlation has a negligible influence on it. Conversely, when the time series is dominated by noise (r < 1), the accuracy decreases significantly and becomes even worse for noises with strong serial correlation. We conclude that polynomial fitting is recommended for time series with small noise and a simple trend with a small number of local extrema. The example from astrophysics shows that the optimum degree of the polynomial trend can be determined by searching for the stochastic model best suited to the noise contained in the time series.
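The degree-selection idea can be illustrated with a crude stand-in for the model search described above: fit polynomials of increasing degree and keep the one whose residuals look most like uncorrelated noise (smallest lag-1 autocorrelation). This simplified criterion is ours, not the book's.

```python
import numpy as np

def poly_trend(y, max_deg=8):
    """Fit polynomials of degree 1..max_deg and return (degree, trend)
    for the fit whose residuals have the smallest |lag-1 autocorrelation|."""
    t = np.linspace(-1.0, 1.0, len(y))   # rescaled time axis for conditioning
    best = None
    for deg in range(1, max_deg + 1):
        coef = np.polyfit(t, y, deg)
        resid = y - np.polyval(coef, t)
        r1 = abs(np.corrcoef(resid[:-1], resid[1:])[0, 1])
        if best is None or r1 < best[0]:
            best = (r1, deg, np.polyval(coef, t))
    return best[1], best[2]
```

An under-fitted polynomial leaves part of the trend in the residuals, which then show strong spurious serial correlation; a degree high enough to capture the trend leaves residuals close to the underlying noise.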
Ch. 4 Noise Smoothing
The central moving average (CMA) is one of the simplest and most widely used methods to filter out the noise fluctuations from a time series; it depends on a single parameter, the semi-length K of the averaging window. We introduce the repeated central moving average (RCMA), which depends on an additional parameter (the number i of averagings) and allows a gradual smoothing of the time series. Using Monte Carlo experiments we analyze the properties of the RCMA with boundary conditions obtained by a normalized form of zero padding of the time series. We show that roughly the same smoothing is obtained either by repeating many times an averaging with a small K or by repeating fewer times an averaging with a large K. We also prove that any form of moving average introduces a spurious serial correlation into the smoothed time series. The accuracy of the trend estimated by the RCMA depends on the ratio r between the amplitudes of the trend variations and noise fluctuations, and on the noise serial correlation, in the same way as the accuracy of the trend estimated by polynomial fitting. The RCMA trend depends mainly not on the number of monotonic segments of the trend, but on the average resolution of the monotonic segments. The real time series of the returns of the S&P500 index is processed by the CMA in order to determine the financial volatility time series. The optimum semi-length of the averaging window is found using the condition that the estimated noise should be uncorrelated.
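The RCMA can be sketched as i passes of a (2K+1)-point centered average. The boundary treatment below, a normalized average over the part of the window inside the series, is one way to implement the normalized zero padding mentioned above; the function name is ours.

```python
import numpy as np

def rcma(y, K, i):
    """Repeated central moving average: apply a (2K+1)-point centered
    average i times.  Near the boundaries the average is renormalized
    over the part of the window that lies inside the series."""
    y = np.asarray(y, dtype=float)
    w = np.ones(2 * K + 1)
    # number of in-range samples under the window at each position
    norm = np.convolve(np.ones(len(y)), w, mode="same")
    z = y.copy()
    for _ in range(i):
        z = np.convolve(z, w, mode="same") / norm
    return z
```

A constant series is left unchanged by this normalization, while the fluctuations of a noisy series are progressively damped as i grows.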
Ch. 5 Automatic Estimation of Monotonic Trends
In this chapter we design an automatic algorithm to estimate monotonic trends over which an arbitrary stationary noise is superposed. It approximates the trend by a piecewise linear curve obtained by dividing the range of the time series values, rather than the time domain, into subintervals. The slope of each linear segment of the estimated trend is proportional to the average one-step displacement of the time series values included in the corresponding subinterval; the method is therefore referred to as average conditional displacement (ACD). Using Monte Carlo experiments we show that for AR(1) noises the accuracy of the ACD algorithm is comparable with that of polynomial fitting and the moving average, but it has the advantage of being automatic. For time series with nonmonotonic trends the ACD algorithm determines one of the possible monotonic components which can be associated with the trend. As an illustration we apply the ACD algorithm to a paleoclimatic time series to determine the periods with a significant monotonic temperature variation.
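The core ACD idea can be sketched as follows: partition the range of values into bins, estimate in each bin the average one-step displacement, and follow the resulting slopes forward in time. This is a heavily simplified illustration of the principle, not the full automatic ACD algorithm with its significance tests; bin count `J` and all names are ours.

```python
import numpy as np

def acd_trend(x, J=10):
    """Simplified average-conditional-displacement sketch: slope in each
    value bin = mean one-step displacement of the samples in that bin."""
    x = np.asarray(x, dtype=float)
    edges = np.linspace(x.min(), x.max(), J + 1)
    disp = x[1:] - x[:-1]
    # bin index of each value (except the last one)
    idx = np.clip(np.searchsorted(edges, x[:-1], side="right") - 1, 0, J - 1)
    slope = np.array([disp[idx == j].mean() if np.any(idx == j) else 0.0
                      for j in range(J)])
    # integrate the conditional slopes forward from the first value
    trend = np.empty_like(x)
    trend[0] = x[0]
    for t in range(1, len(x)):
        j = int(np.clip(np.searchsorted(edges, trend[t - 1], side="right") - 1,
                        0, J - 1))
        trend[t] = trend[t - 1] + slope[j]
    return trend
```

Because the partition is over values rather than time, a monotonic trend maps each bin to a coherent time interval, and the noise displacements average out within each bin.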
Ch. 6 Estimation of Monotonic Trend Segments from a Noisy Time Series
An arbitrary nonmonotonic trend is composed of a succession of monotonic segments delimited by its local extrema. A superposed noise breaks up the monotonic variations of the trend into many small fluctuations, but the global shape of the trend remains recognizable because the trend local extrema have a larger time scale than those induced by the noise. By rigorously defining the time scale of a local extremum we design an automatic algorithm to estimate the trend local extrema from a noisy time series. The estimation accuracy is improved if the noisy time series is first smoothed so that the noise fluctuations are damped. Using the ACD algorithm for monotonic trend estimation presented in the previous chapter we evaluate the significance of the estimated local extrema. As an example we analyze a biophysical time series for which we estimate the large scale monotonic segments of the trend.
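Detecting local extrema and filtering them by scale can be sketched as below: an extremum is a sign change of the one-step increments, and only extrema sufficiently separated from the previously kept one are retained. The `min_scale` separation criterion is a crude proxy for the rigorous time-scale definition described above, not the book's definition.

```python
import numpy as np

def local_extrema(y, min_scale=1):
    """Return indices of local extrema of y, keeping only extrema at
    least min_scale samples after the previously kept one."""
    y = np.asarray(y, dtype=float)
    d = np.sign(np.diff(y))
    # an extremum sits where successive increments change sign
    idx = np.where(d[1:] * d[:-1] < 0)[0] + 1
    kept = []
    for i in idx:
        if not kept or i - kept[-1] >= min_scale:
            kept.append(i)
    return kept
```

On a noisy series this detector fires on every small fluctuation, which is why smoothing before extrema detection, and a significance test afterwards, are both needed.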
Ch. 7 Automatic Estimation of Arbitrary Trends
In this final chapter we transform the RCMA algorithm presented in Chap. 4 into an automatic algorithm. Instead of the two parameters controlling the RCMA we introduce a single parameter, equal to the minimum distance between two successive local extrema of the smoothed time series. Its optimum value is determined as a function of the estimated serial correlation of the noise and of the estimated ratio between the amplitudes of the trend variations and noise fluctuations. The accuracy of the automatic RCMA, measured by Monte Carlo experiments, is only slightly lower than the maximum accuracy obtained by an exhaustive search over all the RCMA trends. As an illustration we use the automatic RCMA to estimate the trend of a financial time series and, by means of the partitioning algorithm presented in Chap. 6, we evaluate the significance of the local extrema of the estimated trend.
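The single-parameter control can be sketched as a stopping rule: keep repeating the central moving average until every pair of successive local extrema of the smoothed series is at least `min_sep` samples apart. This is an illustrative reading of the idea, not the AutRCMA itself, and the data-driven choice of `min_sep` described above is not reproduced here.

```python
import numpy as np

def aut_rcma(y, min_sep, K=2, max_iter=200):
    """Repeat a (2K+1)-point central moving average until all successive
    local extrema of the smoothed series are >= min_sep samples apart."""
    y = np.asarray(y, dtype=float)
    w = np.ones(2 * K + 1)
    norm = np.convolve(np.ones(len(y)), w, mode="same")
    z = y.copy()
    for _ in range(max_iter):
        d = np.sign(np.diff(z))
        ext = np.where(d[1:] * d[:-1] < 0)[0] + 1   # current local extrema
        if len(ext) < 2 or np.min(np.diff(ext)) >= min_sep:
            break
        z = np.convolve(z, w, mode="same") / norm
    return z
```

Noise-induced extrema lie close together and are smoothed away first, so the surviving extrema are candidates for the trend's monotonic segment boundaries.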
Book coordinates
C. Vamoş, M. Crăciun, Automatic Trend Estimation, SpringerBriefs in Physics, Springer, 2012, 131 pp., ISBN 978-94-007-4824-8,
DOI 10.1007/978-94-007-4825-5.
Book Title
Automatic Trend Estimation
Publisher
Springer
Print ISBN
978-94-007-4824-8