APPLIED TIME SERIES ANALYSIS WITH R
Second Edition

Wayne A. Woodward, Southern Methodist University, Dallas, Texas, USA
Henry L. Gray, Southern Methodist University, Dallas, Texas, USA
Alan C. Elliott, Southern Methodist University, Dallas, Texas, USA

CRC Press, Taylor & Francis Group
6000 Broken Sound Parkway NW, Suite 300
Boca Raton, FL 33487-2742

© 2017 by Taylor & Francis Group, LLC
CRC Press is an imprint of Taylor & Francis Group, an Informa business

No claim to original U.S. Government works
Printed on acid-free paper
Version Date: 20160726
International Standard Book Number-13: 978-1-4987-3422-6 (Hardback)

This book contains information obtained from authentic and highly regarded sources. Reasonable efforts have been made to publish reliable data and information, but the author and publisher cannot assume responsibility for the validity of all materials or the consequences of their use. The authors and publishers have attempted to trace the copyright holders of all material reproduced in this publication and apologize to copyright holders if permission to publish in this form has not been obtained. If any copyright material has not been acknowledged, please write and let us know so we may rectify in any future reprint.

Except as permitted under U.S. Copyright Law, no part of this book may be reprinted, reproduced, transmitted, or utilized in any form by any electronic, mechanical, or other means, now known or hereafter invented, including photocopying, microfilming, and recording, or in any information storage or retrieval system, without written permission from the publishers.

For permission to photocopy or use material electronically from this work, please access www.copyright.com (http://www.copyright.com/) or contact the Copyright Clearance Center, Inc. (CCC), 222 Rosewood Drive, Danvers, MA 01923, 978-750-8400. CCC is a not-for-profit organization that provides licenses and registration for a variety of users. For
organizations that have been granted a photocopy license by the CCC, a separate system of payment has been arranged.

Trademark Notice: Product or corporate names may be trademarks or registered trademarks, and are used only for identification and explanation without intent to infringe.

Library of Congress Cataloging-in-Publication Data
Names: Woodward, Wayne A. | Gray, Henry L. | Elliott, Alan C., 1952-
Title: Applied time series analysis, with R / Wayne A. Woodward, Henry L. Gray, and Alan C. Elliott.
Description: Second edition. | Boca Raton : Taylor & Francis, 2017. | “A CRC title.”
Identifiers: LCCN 2016026902 | ISBN 9781498734226 (pbk.)
Subjects: LCSH: Time-series analysis. | R (Computer program language)
Classification: LCC QA280 .W68 2017 | DDC 519.5/502855133 | dc23
LC record available at https://lccn.loc.gov/2016026902

Visit the Taylor & Francis Web site at http://www.taylorandfrancis.com and the CRC Press Web site at http://www.crcpress.com

Contents

Preface for Second Edition
Acknowledgments
1 Stationary Time Series
1.1 Time Series
1.2 Stationary Time Series
1.3 Autocovariance and Autocorrelation Functions for Stationary Time Series
1.4 Estimation of the Mean, Autocovariance, and Autocorrelation for Stationary Time Series
1.4.1 Estimation of μ
1.4.1.1 Ergodicity of X̄
1.4.1.2 Variance of X̄
1.4.2 Estimation of γk
1.4.3 Estimation of ρk
1.5 Power Spectrum
1.6 Estimating the Power Spectrum and Spectral Density for Discrete Time Series
1.7 Time Series Examples
1.7.1 Simulated Data
1.7.2 Real Data
Appendix 1A: Fourier Series
Appendix 1B: R Commands
Exercises
2 Linear Filters
2.1 Introduction to Linear Filters
2.1.1 Relationship between the Spectra of the Input and Output of a Linear Filter
2.2 Stationary General Linear Processes
2.2.1 Spectrum and Spectral Density for a General Linear Process
2.3 Wold Decomposition Theorem
2.4 Filtering Applications
2.4.1 Butterworth Filters
Appendix 2A: Theorem Proofs
Appendix 2B: R Commands
Exercises
3 ARMA Time Series Models
3.1 MA Processes
3.1.1 MA(1) Model
3.1.2 MA(2) Model
3.2 AR Processes
3.2.1 Inverting the Operator
3.2.2 AR(1) Model
3.2.3 AR(p) Model for p ≥ 1
3.2.4 Autocorrelations of an AR(p) Model
3.2.5 Linear Difference Equations
3.2.6 Spectral Density of an AR(p) Model
3.2.7 AR(2) Model
3.2.7.1 Autocorrelations of an AR(2) Model
3.2.7.2 Spectral Density of an AR(2)
3.2.7.3 Stationary/Causal Region of an AR(2)
3.2.7.4 ψ-Weights of an AR(2) Model
3.2.8 Summary of AR(1) and AR(2) Behavior
3.2.9 AR(p) Model
3.2.10 AR(1) and AR(2) Building Blocks of an AR(p) Model
3.2.11 Factor Tables
3.2.12 Invertibility/Infinite-Order AR Processes
3.2.13 Two Reasons for Imposing Invertibility
3.3 ARMA Processes
3.3.1 Stationarity and Invertibility Conditions for an ARMA(p,q) Model
3.3.2 Spectral Density of an ARMA(p,q) Model
3.3.3 Factor Tables and ARMA(p,q) Models
3.3.4 Autocorrelations of an ARMA(p,q) Model
3.3.5 ψ-Weights of an ARMA(p,q)
3.3.6 Approximating ARMA(p,q) Processes Using High-Order AR(p) Models
3.4 Visualizing AR Components
3.5 Seasonal ARMA(p,q) × (Ps,Qs)s Models
3.6 Generating Realizations from ARMA(p,q) Processes
3.6.1 MA(q) Model
3.6.2 AR(2) Model
3.6.3 General Procedure
3.7 Transformations
3.7.1 Memoryless Transformations
3.7.2 AR Transformations
Appendix 3A: Proofs of Theorems
Appendix 3B: R Commands
Exercises
4 Other Stationary Time Series Models
4.1 Stationary Harmonic Models
4.1.1 Pure Harmonic Models
4.1.2 Harmonic Signal-Plus-Noise Models
4.1.3 ARMA Approximation to the Harmonic Signal-Plus-Noise Model
4.2 ARCH and GARCH Processes
4.2.1 ARCH Processes
4.2.1.1 The ARCH(1) Model
4.2.1.2 The ARCH(q0) Model
4.2.2 The GARCH(p0, q0) Process
4.2.3 AR Processes with ARCH or GARCH Noise
Appendix 4A: R Commands
Exercises
5 Nonstationary Time Series Models
5.1 Deterministic Signal-Plus-Noise Models
5.1.1 Trend-Component Models
5.1.2 Harmonic Component Models
5.2 ARIMA(p,d,q) and ARUMA(p,d,q) Processes
5.2.1 Extended Autocorrelations of an ARUMA(p,d,q) Process
5.2.2 Cyclical Models
5.3 Multiplicative Seasonal ARUMA (p,d,q) × (Ps, Ds, Qs)s Process
5.3.1 Factor Tables for Seasonal Models of the Form of Equation 5.17 with s = 4 and s = 12
5.4 Random Walk Models
5.4.1 Random Walk
5.4.2 Random Walk with Drift
5.5 G-Stationary Models for Data with Time-Varying Frequencies
Appendix 5A: R Commands
Exercises
6 Forecasting
6.1 Mean-Square Prediction Background
6.2 Box–Jenkins Forecasting for ARMA(p,q) Models
6.2.1 General Linear Process Form of the Best Forecast Equation
6.3 Properties of the Best Forecast
6.4 π-Weight Form of the Forecast Function
6.5 Forecasting Based on the Difference Equation
6.5.1 Difference Equation Form of the Best Forecast Equation
6.5.2 Basic Difference Equation Form for Calculating Forecasts from an ARMA(p,q) Model
6.6 Eventual Forecast Function
6.7 Assessing Forecast Performance
6.7.1 Probability Limits for Forecasts
6.7.2 Forecasting the Last k Values
6.8 Forecasts Using ARUMA(p,d,q) Models
6.9 Forecasts Using Multiplicative Seasonal ARUMA Models
6.10 Forecasts Based on Signal-Plus-Noise Models
Appendix 6A: Proof of Projection Theorem
Appendix 6B: Basic Forecasting Routines
Exercises
7 Parameter Estimation
7.1 Introduction
7.2 Preliminary Estimates
7.2.1 Preliminary Estimates for AR(p) Models
7.2.1.1 Yule–Walker Estimates
7.2.1.2 Least Squares Estimation
7.2.1.3 Burg Estimates
7.2.2 Preliminary Estimates for MA(q) Models
7.2.2.1 MM Estimation for an MA(q)
7.2.2.2 MA(q) Estimation Using the Innovations Algorithm
7.2.3 Preliminary Estimates for ARMA(p,q) Models
7.2.3.1 Extended Yule–Walker Estimates of the AR Parameters
7.2.3.2 Tsay–Tiao Estimates of the AR Parameters
7.2.3.3 Estimating the MA Parameters
7.3 ML Estimation of ARMA(p,q) Parameters
7.3.1 Conditional and Unconditional ML Estimation
7.3.2 ML Estimation Using the Innovations Algorithm
7.4 Backcasting and Estimating σ̂a²
7.5 Asymptotic Properties of Estimators
7.5.1 AR Case
7.5.1.1 Confidence Intervals: AR Case
7.5.2 ARMA(p,q) Case
7.5.2.1 Confidence Intervals for ARMA(p,q) Parameters
7.5.3 Asymptotic Comparisons of Estimators for an MA(1)
7.6 Estimation Examples Using Data
7.7 ARMA Spectral Estimation
7.8 ARUMA Spectral Estimation
Appendix 7A: R Commands
Exercises
8 Model Identification
8.1 Preliminary Check for White Noise
8.2 Model Identification for Stationary ARMA Models
8.2.1 Model Identification Based on AIC and Related Measures
8.3 Model Identification for Nonstationary ARUMA(p,d,q) Models
8.3.1 Including a Nonstationary Factor in the Model

G(λ)-stationary process, 556–573 GLC process, 574–579 M-stationary process, 549–556 Gabor spectrogram, 502–504 Gabor transform, 502–504 GARMA process, see Long memory models Gaussian process, Gaussian window, 503 Gegenbauer polynomial, 564–565 Gegenbauer process, see Long memory models Generalized ARCH (GARCH), see ARCH and GARCH process Generalized function, 29 Generalized instantaneous period and frequency, 583–584 Generalized linear chirp process, see GLC process Generalized partial autocorrelation (GPAC) array, see Model identification Generalized partial autocorrelations, 362–364 Generalized stationary process, see G-stationary process General linear process (GLP) ARMA(p,q) process, as 134 AR(p) process, as 90–94 definition, 63–64 fractional process, 458 Gegenbauer process, 465–466 MA(q) process, as 84 multivariate, 404–405 spectrum and spectral density, 65 stationary conditions, 64–65 truncated, 478, 484 Generating realizations ARMA, 155–157 GARMA, 478 GLC(p) process, 576–579, 587 GLC process, 574–575 GLC white noise, 506–508 Global temperature data, see Data sets (Hadley) GPAC array, see Model identification H Haar wavelet, 507–508 Harmonic model, stationary, 181–183 Harmonic signal-plus-noise model ARMA approximation, 187–190 stationary model, 181–183 High order AR approximation backcasting, 479 of ARMA(p,q), 146 of harmonic, 187, 190 overfit factor tables, 331–335 spectral estimation, 310–311 High-pass filter, 66 Hilbert space, 76 Hurst effect, 455 I IDWT (Inverse discrete wavelet transform), 513, 515 Impulse
response function, 62, 72 Independence, testing for, 424–427 Influenza data, see Data Sets (flu) Inner product, 540 Innovations, 282, 439, 455 Innovations algorithm, 281–283, 291–292, 301–303 Instantaneous period and frequency, 485–486 Instantaneous spectrum G(λ) process, 563–571, 578–579 GLC(p) process, 587 Interest rate data, 389–400 Interpolation, 552, 560–561, 564, 578–579 Fourier, 560–561 linear, 560–561, 578 Inverse discrete wavelet transform, see IDWT Invertible/Invertibility ARMA(p,q) conditions, 136 factor tables, 137–138 long memory models, 458–460, 465–466, 469, 473–474 MA(q), 131–133 reasons for imposing, 132–133 solution, 231 Inverting the operator, 93–94 Iterated regressions, 285 K Kalman filter/recursion filtering, 433–436 h-step predictions, 434 Kalman gain, 433 missing data, 436–439 prediction, 433–436 smoothing, 433–436 time transformation, 552, 560, 578 k-factor Gegenbauer and GARMA processes, see Long memory models King Kong eats grass data, see Data Sets (kingkong) L Lavon lake data, see Data Sets (lavon) Least squares estimation, see Parameter estimation Linear chirp data, 501, 526–528 Linear chirp process, 573–574 Linear difference equation definition, 102–103 homogeneous with constant coefficients, 102–103 solution, 103 Linear filters band-pass filters, 66 band-stop/notch filters, 66 Butterworth, 69–75 causal linear filter, 61 cutoff frequency, 66–67 first difference filter, 67–68 general linear process, see General linear process input and output spectra, 63 low/high pass filters, 66 squared frequency response function, 67 sum and moving average filters, 68–69 Linear interpolation, 560–561, 578 Ljung–Box test ARMA(p,q), 376–377 multivariate, 419–420 Logarithmic time transformation, 549, 554 Log-likelihood, 288–291 Long-memory processes cyclic long memory, 464 definition, 456 FARMA process definition, 459 properties, 460 fractional process definition, 458 properties, 458 GARMA process definition, 469 properties, 469 Gegenbauer process 
definition, 465 properties, 466–467 k-factor Gegenbauer and GARMA process autocovariance, calculation, 476–478 CO2 data, modeling, 487–490 definition, k-factor GARMA process, 474 definition, k-factor Gegenbauer process, 473 forecasting, 484–485 generating realizations, 478 parameter estimation, 479–480 properties, k-factor GARMA process, 474 properties, k-factor Gegenbauer process, 473 testing for, 486 Lynx data, see Data Sets (lynx, llynx) M MA(1) model autocorrelations, 86–87 realization, 87 spectral density, 86–88 MA(2) model, 88–89 MA(q) models autocorrelations, 84–85 definition, 83–84 parameter estimation, see Parameter Estimation spectral density, 85 variance, 84 Maximum likelihood (ML) estimation, see Parameter estimation Mean square prediction, 230–231 Method-of-moments estimators, 276, 280–281, 301–302 Minimum 10-year temperature data, see Data Sets (freeze) Minkowski inequality, 75 Missing data, 436–439 Mixed model, 134 Model identification ARMA(p,q) models AIC and related measures AIC, 325–327 AICC, 326 BIC, 326 pattern recognition, Box-Jenkins autocorrelations, 354–357 Box-Jenkins procedure, 360–362 partial autocorrelations, 357–360 pattern recognition, corner method, 368 pattern recognition, ESACF, 368 pattern recognition, GPAC generalized partial autocorrelations, 362–364 GPAC array, 364–368 patterns, generalized partial autocorrelations, 364–365 R- and S-arrays, 368 W-statistic, 367–368 ARUMA(p,d,q) models airline data, 348 hidden spectral peaks, 350–353 nonstationary components, identifying, 331–335 overfit factor tables, 332–334 Pennsylvania temperature data, 190, 348 seasonal factors, identifying, 344–350 Tiao–Tsay theorem, 331–335 unit root testing, 341–344 k-factor GARMA(p,q), 479–483 VAR(p) models, 419–420 Mortgage data, see Data Sets (eco.mort30) Mother wavelet 2-dimensional, 534 definition, 506–507 Haar, 507 plots, 509 Moving average constant, 190 MRA, see Multiresolution analysis MRD, see Multiresolution decomposition M-stationary process
continuous, 549–551 discrete, 551–552 Multiple regression, 277–278 Multiplicative seasonal ARUMA(p,d,q) model, see Seasonal models Multiresolution analysis (MRA) definition, 516–518 bumps data, 519–521 Multiresolution decomposition (MRD) definition, 517 bumps data, 519–521 Multivariate stochastic process, 399–400 N Natural frequency, see System frequency Nile river flow data, see Data Sets (nile.min) Non-ergodic process, 17 Nonstationary ARUMA(p,d,q) models, see ARIMA and ARUMA(p,d,q) models Nonstationary models ARIMA and ARUMA(p,d,q), 210–217 deterministic signal-plus-noise, 259–262 G-stationary, 221–222 random walk, 220–221 VARMA, 421–422 Norm, 540 Normality, testing for Anderson–Darling test, 380 Q-Q plot, 380 Shapiro–Wilk test, 380 Nyquist frequency, 28, 559–560 O 1/f process, 456 Observation equation, see State space models Order identification, see Model Identification Orthogonal/orthonormal functions, 540 Orthogonal/orthonormal system, 540 Orthogonal/orthonormal vectors, 540 Orthogonal polynomial, 464 Orthogonal wavelet basis, 510 Orthogonal wavelet family, 508 Overfitting, 331–335 Overfitting seasonal factors, checking for, 344–348 Tiao-Tsay theorem, 331–335 P Parameter estimation asymptotic properties of estimators, 295–302 Burg estimation AR model, 278–280 forward-backward sum of squares, 279 VAR model, 418 confidence intervals on parameters, 296–297, 300–301, 306–307 innovations algorithm maximum likelihood, ARMA, 291–292 moving average parameters, preliminary estimates, 281–284 least squares estimation forward-backward, 279 ordinary, 276–278 unconditional least squares, 288–289 maximum likelihood (ML) estimation ARMA(p,q) conditional and unconditional (exact), 286–291 innovations algorithm, 291–292 k-factor GARMA(p,q), 479–480 VAR(p), 417 method-of-moments (MM) estimation, 276, 280–281 Tsay–Tiao (TT) estimation, 284–285 white noise variance backcasting, 290 innovations algorithm, 282 Yule–Walker estimation AR model, 274–276 Durbin–Levinson algorithm, 
275–276 extended Yule–Walker estimation, 283–284 multivariate, 408, 416–417 Parseval’s equality, 47–48 Partial autocorrelations, 357–360 Pattern recognition, 353–368 Pennsylvania monthly temperature data, see Data Sets (patemp) Period, 24 Periodic function, 24 Periodogram calculation formulas, 46–48 definition, 33 properties, 33 Phase spectrum, 405 π-weights, 235–236, 483–484 Positive definite, 9, 85, 104, 109, 142, 165 Positive semi-definite, 8–9, 19, 276 Power spectrum, see Spectrum and spectral density Preliminary estimates ARMA models, 283–285 AR models, 274–280 MA model, 280–283 Probability limits for forecasts, 243–244 Projection theorem, 230–231, 262–264 Pyramid algorithm, 515–516 Q Q-Q plot, 380 Quadrature spectrum, 405 R Randomness, checking for, 321–324, 375–380, 419–420 Random trend, 207–208 Random walk, 220–221 Random walk with drift, 221 Realization, Residual analysis checking for normality, 380 checking for randomness/white noise, 321–324, 375–380, 419–420 Residuals, calculating, 206, 234, 241, 280, 284–285, 293–294, 376 Roots, see Characteristic equation S Sample autocorrelations definition, 20 Sample autocovariances, 18–19 Sample partial autocorrelations, 357–360 Sample spectrum, 32–33 Sampling and time transformation G(λ) process, 559–560 GLC process, 577–578 M-stationary process, 552 overview, 552 Scaling function, 511, 513, 531 Scalogram/Time scale plot bumps data, 526–527 chirp data, 526–528 compared with wavelet packet transform, 527–533 overview, 524–526 Schur’s lemma, 109, 165 Seasonal models ARMA, multiplicative seasonal, 149–155 ARUMA, multiplicative seasonal airline data, 217–220 definition, 217–218 factor tables, 218–219 forecasting, 255–257 model identification, 344–346 purely seasonal, 151–153 Shapiro–Wilk test, 380 SIC (Schwarz information criterion), 326 Signal-plus-noise model deterministic, 205–206, 208–209 stationary, 185–187 Spectral analysis, see Spectrum and spectral density Spectral
estimation ARMA spectral estimation, 309–313 ARUMA spectral estimation, 313–315 periodogram, 32–33 sample spectrum, 32–33 time varying frequency (TVF) data Gabor spectrum, 502–504 Instantaneous spectrum, 563–564, 586–587 Scalogram, 524–527 Spectrogram, 502–503 Wigner-Ville, 502, 505 VAR spectral estimation, 419 window-based (smooth) methods Bartlett, 34 using GW-WINKS, 35 Parzen, 34 Tukey, 34 Spectrum and spectral density AR(1), 96 AR(2), 109 AR(p), 105 ARMA(p,q), 136–137 covariance stationary model, 26–28 estimation, see Spectral estimation frequency domain, 22–24 general linear process, 65 MA(1), 86 MA(2), 89 MA(q), 85 multivariate, 405–406 stationary harmonic model, 184–185 stationary harmonic signal-plus-noise model, 186 VAR(p) model, 416 Squared coherency, 405 Squared frequency response 2-point sum filter, 68–69 Butterworth filter, 71 definition, 63 first difference filter, 67–68 ideal (brick wall) filter, 67 moving average smoothers, 69 State equation, see State space models State-space models additive AR components, 440–443 AR(1) model, 430 AR(1) plus noise model, 430 AR(p) and VAR(p) models, 430–431 ARMA(p,q) model, 432 description, 429–430 filtering, 433 Kalman filter, 433–439 missing data, 436–439 observation equation, 429–430 parameter estimation, 439–440 prediction, 437 smoothing, 433 state equation, 429 Stationarity covariance, 5–6 second order, strict, weak, wide sense, Stationarity conditions AR(1), 94 AR(2), 109 AR(p), 93 ARMA(p,q), 136 FARMA, 460 Fractional, 458 GARMA, 469 Gegenbauer, 466 k-factor GARMA, 474 k-factor Gegenbauer, 473 Stationary general linear process, see General linear process (GLP) Stationary harmonic models, see Harmonic models, stationary Stationary multivariate time series covariance matrix, 402 cross-correlation, 402 definition, 401–402 general linear process form, 404–405 multivariate white noise, 403–404 Stationary time series autocorrelation, 9–11 autocovariance, 7–9 definition, 5–6 estimation, autocorrelation, 20–22 
autocovariance, 18–19 ergodicity, 12 mean, 12 spectrum and spectral density, 32–35, 46–48 variance, 18 spectral density, 26–28 spectrum, power, 26–28 STFT, see Short time Fourier transform Stochastic linear chirp, 575 Stochastic process definition, linear filter, 62 multivariate/vector valued, 399–401 TVF models, 547, 556, 228, 574 Sum and moving average filters, 68–69 Sunspot data, see Data Sets (ss08, ss08.1850, sunspot.classic) autocorrelation and spectral density, 41–43 modeling, 130–131, 389–394 plot, 42, 390 System frequency ARUMA, 214, 217, 260 definition, 107–108 factor tables, 124–125 forecasting, 261 MA components, 137 seasonal models, 218–219 T Temperature data, see Global temperature data Testing for independence, 424–427 Testing for trend, see Trend, testing for Testing for unit roots, see Unit roots Tiao–Tsay theorem, 331 Time scale plot, see Scalogram/Time scale plot Time series, Time transformation Box-Cox, 556 G-stationary, 548 linear chirp, 577–578 logarithmic, 549 and sampling, 552 Time varying frequency (TVF) data examples of, 222 G-stationary models, 222 overview, 499 traditional spectral analysis linear chirp signal, 501 Parzen window-based spectral density, 500 realization, 500 Transformations AR, 158–161 memoryless, 157–158 Translation index, 506–507 Trend component model, 206–208 Trend, testing for bootstrap approach, 388 Cochrane–Orcutt method, 386–389 other methods, 388–389 ψ-weights AR(1), 94 AR(2), 109–110 ARMA, calculating, 144–146 ARMA, forecast errors, 243–245 Tsay–Tiao (TT) estimation, see Parameter estimation tswge functions estimation routines est.arma.wge, 290, 316–317 est.ar.wge, 280, 290, 315–316 est.farma.wge, 480, 493–494 est.garma.wge, 480, 494–495 est.glambda.wge, 561–562, 588–589 filtering routines butterworth.wge, 74, 78, 590, 592 forecasting routines backcast.wge, 292, 294 fore.arma.wge, 241, 245, 248, 264–265 fore.aruma.wge, 255, 257, 265–266 fore.farma.wge, 484, 495 fore.garma.wge, 484, 496 fore.glambda.wge, 565, 590
fore.sigplusnoise.wge, 260, 266–268 generating realizations gen.arch.wge, 201–202 gen.arima.wge, 217, 222–223 gen.arma.wge, 89, 166–167 gen.aruma.wge, 217, 223–225 gen.garch.wge, 202 gen.garma.wge, 478, 492–493 gen.geg.wge, see gen.garma.wge gen.glambda.wge, 560, 587–588 gen.sigplusnoise.wge, 50–51, 56, 225 Kalman filtering kalman.miss.wge, 439, 450–451 kalman.wge, 436, 449–450 model identification aic.wge, 327, 368–369 aic5.wge, 327, 369–370 plotting and related calculation routines is.glambda.wge, 566, 591 is.sample.wge, 592 parzen.wge, 36, 50 period.wge, 33, 49–50 plots.parzen.wge, 521 plotts.dwt.wge, 48–49 plotts.mra.wge, 521, 542–543 plotts.sample.wge, 51–52, 371 plotts.true.wge, 89, 168–169 true.arma.aut.wge, 89, 167–168 true.arma.spec.wge, 89, 168 true.farma.aut.wge, 478, 490–491 true.garma.aut.wge, 478, 490–491 wv.wge, 505, 541–542 transformation routines artrans.wge, 161, 171–172 trans.to.dual.wge, 589–592 trans.to.original.wge, 566, 589–590 utility routines factor.comp.wge, 149, 171, 444 factor.wge, 125, 132, 137, 152, 169–170, 410 mult.wge, 167–171 psi.weights.wge TVF data, see Time varying frequency data 2-dimensional wavelets, 534–537 U Unconditional maximum likelihood estimation, see Parameter estimation Unconditional sum of squares, 288–290, 292 Unequally spaced data, 552, 560 Unit roots and cointegration, 424, 427–429 models with, 210, 380–383 overfitting, 330–331, 350–353 testing for, 341–344 Univariate components in multivariate models, 399–401, 405–406 V VARMA(p,q) model, 412 VAR(p) model bivariate VAR(1), 407 data fitting model selection, 419 parameters estimation, 419 white noise residuals, 419–420 definition, 408 estimation, 417–418 forecasting, 414–416 forecasting Box–Jenkins style, 404–416 VAR(1) process, 414–416 spectral density estimation, 419 spectrum, 367 Yule–Walker equations, 408–409 VARUMA(p,d,q) model, 421–422 Vector space, 540 Vector-valued (multivariate) time series components, 400 covariance matrix, 401 multivariate white noise,
403–404 nonstationary VARMA processes ARIMA(p,d,q) model, 421–422 description, 421–422 stochastic process, 399 VARMA(p,q) model, 411–412 VAR(p) model, see VAR(p) model VARUMA(p,d,q) model, 371–372, 421–422 amplitude spectrum, 405 cospectrum, 405 cross spectrum, 405 phase spectrum, 405 quadrature spectrum, 405 squared coherency, 405–406 W Wage data in England, see Data Sets (wages) Wavelet analysis 2-dimensional Barbara image, 536–537 DWT, 534–535 XBOX image, 536 applications, 532–534, 537–538 approximation result, 510–512 dilates and translates, 506–508, 511 discrete wavelet transform (DWT), see Discrete wavelet transform (DWT) multiresolution analysis (MRA), 515–521 multiresolution decomposition (MRD), 515–521 scalogram/time-scale plot, 524–527 wavelet packet best basis algorithm, 531 level and transforms, 531–532 modulation parameter, 531 transform, 527–534 wavelet shrinkage, 521–524 wavelet types Daublet family, 508–509 Haar, 507–508 Symlet family, 508–509 Wavelet packet transform, see Wavelet analysis Wavelet shrinkage hard and soft threshold, 522 noisy bumps signal, 523–524 smoothing techniques, 521–523 West Texas crude oil price data, see Data Sets (wtcrude) Whale click data, see Data Sets (whale) White noise, discrete, 6, 40–41 White noise variance, estimation, see Parameter estimation Wigner–Ville spectrum, 505 Window-based methods Bartlett window, 34 Gabor spectrogram, 502–504 Parzen window, 34 Tukey window, 34 Wigner-Ville spectrum, 505 Window function Gaussian window, 503 overview, 502 Wold decomposition theorem, 66 W-statistic, see Model Identification X XBOX image, 536 Y Yule–Walker equations extended, 283–284, 363–364 multivariate, 408–409, 416–417 univariate, 101, 274–276, 357–358 Yule–Walker estimation, see Parameter estimation
... Comprehensive Analysis of Time Series Data: A Summary
Appendix 9A: R Commands
Exercises
10 Vector-Valued (Multivariate) Time Series
10.1 Multivariate Time Series Basics
10.2 Stationary Multivariate Time Series