Time Series Econometrics (Springer Texts in Business and Economics)




Document information

Springer Texts in Business and Economics. More information about this series at http://www.springer.com/series/10099

Klaus Neusser, Time Series Econometrics. Klaus Neusser, Bern, Switzerland.

ISSN 2192-4333, e-ISSN 2192-4341. ISBN 978-3-319-32861-4, e-ISBN 978-3-319-32862-1. DOI 10.1007/978-3-319-32862-1. Library of Congress Control Number: 2016938514. © Springer International Publishing Switzerland 2016. This Springer imprint is published by Springer Nature.

This work is subject to copyright. All rights are reserved by the Publisher, whether the whole or part of the material is concerned, specifically the rights of translation, reprinting, reuse of illustrations, recitation, broadcasting, reproduction on microfilms or in any other physical way, and transmission or information storage and retrieval, electronic adaptation, computer software, or by similar or dissimilar methodology now known or hereafter developed. The use of general descriptive names, registered names, trademarks, service marks, etc. in this publication does not imply, even in the absence of a specific statement, that such names are exempt from the relevant protective laws and regulations and therefore free for general use. The publisher, the authors, and the editors are safe to assume that the advice and information in this book are believed to be true and accurate at the date of publication. Neither the publisher nor the authors or the editors give a warranty, express or implied, with respect to the material contained herein or for any errors or omissions that may have been made. Printed on acid-free paper. The registered company is Springer International Publishing AG Switzerland.

Preface

Over the past decades, time series analysis has experienced a proliferous increase of applications in economics, especially in macroeconomics and finance. Today these tools have become indispensable to any empirically working economist. Whereas in the beginning the transfer of knowledge essentially flowed from the natural sciences, especially statistics and engineering, to economics, over the years theoretical and applied techniques specifically designed for the nature of economic time series and models have been developed. Thereby, the estimation and identification of structural vector autoregressive models, the analysis of integrated and cointegrated time series, and models of volatility have been extremely fruitful and far-reaching areas of research. With the award of the Nobel Prizes to Clive W. J. Granger and Robert F. Engle III in 2003 and to Thomas J. Sargent and Christopher A. Sims in 2011, the field has reached a certain degree of maturity. Thus, the idea suggests itself to assemble the vast amount of material scattered over many papers into a comprehensive textbook.

The book is self-contained and addresses economics students who already have some prerequisite knowledge in econometrics. It is thus suited for advanced bachelor, master's, or beginning PhD students, but also for applied researchers. The book tries to put them in a position to follow the rapidly growing research literature and to implement these techniques on their own. Although the book aims to be rigorous in terms of concepts, definitions, and statements of theorems, not all proofs are carried out. This is especially true for the more technical and lengthy proofs, for which the reader is referred to the pertinent literature.

The book covers approximately a two-semester course in time series analysis and is divided into two parts. The first part treats univariate time series, in particular autoregressive moving-average processes. Most of the topics are standard and can form the basis for a one-semester introductory time series course. This part also contains a chapter on integrated processes and on models of volatility. The latter topics could be included in a more advanced course. The second part is devoted to multivariate time series analysis and in particular to vector autoregressive processes. It can be taught independently of the first part. The identification, modeling, and estimation of these processes form the core of the second part. A special chapter treats the estimation, testing, and interpretation of cointegrated systems. The book also contains a chapter with an introduction to state space models and the Kalman filter. Whereas the book is almost exclusively concerned with linear systems, the last chapter gives a perspective on some more recent developments in the context of nonlinear models. I have included exercises and worked-out examples to deepen the teaching and learning content. Finally, I have produced five appendices which summarize important topics such as complex numbers, linear difference equations, and stochastic convergence.

As time series analysis has become a tremendously growing field with active research in many directions, it goes without saying that not all topics received the attention they deserved and that there are areas not covered at all. This is especially true for the recent advances made in nonlinear time series analysis and in the application of Bayesian techniques. These two topics alone would justify an extra book.

The data manipulations and computations have been performed using the software packages EViews and MATLAB. Of course, there are other excellent packages available. The data for the examples and additional information can be downloaded from my home page www.neusser.ch. To maximize the learning success, it is advised to replicate the examples and to perform similar exercises with alternative data. Interesting macroeconomic time series can, for example, be downloaded from the following home pages:

  • Germany: www.bundesbank.de
  • Switzerland: www.snb.ch
  • United Kingdom: www.statistics.gov.uk
  • United States: research.stlouisfed.org/fred2

The book grew out of lectures which I had the occasion to give over the years in Bern and at other universities. Thus, it is a concern to thank the many students, in particular Philip Letsch, who had to work through the manuscript and who called my attention to obscurities and typos. I also want to thank my colleagues and teaching assistants Andreas Bachmann, Gregor Bäurle, Fabrice Collard, Sarah Fischer, Stephan Leist, Senada Nukic, Kurt Schmidheiny, Reto Tanner, and Martin Wagner for reading the manuscript or parts of it and for making many valuable criticisms and comments. Special thanks go to my former colleague and coauthor Robert Kunst, who meticulously read and commented on the manuscript. It goes without saying that all errors and shortcomings go to my expense.

Klaus Neusser, Bern, Switzerland / Eggenburg, Austria, February 2016

Notation and Symbols

  • r — number of linearly independent cointegration vectors
  • α — n × r loading matrix
  • β — n × r matrix of linearly independent cointegration vectors
  • convergence in distribution
  • convergence in mean square
  • convergence in probability
  • corr(X, Y) — correlation coefficient between random variables X and Y
  • γ_X, γ — covariance function of process {X_t}; covariance function
  • ρ_X, ρ — correlation function of process {X_t}; correlation function
  • ACF — autocorrelation function
  • J — long-run variance
  • α_X, α — partial autocorrelation function of process {X_t}
  • PACF — partial autocorrelation function
  • n — dimension of stochastic process, respectively dimension of state space
  • ∼ — is distributed as
  • sgn — sign function
  • tr — trace of a matrix
  • det — determinant of a matrix
  • ∥·∥ — norm of a matrix
  • ⊗ — Kronecker product
  • ⊙ — Hadamard product
  • vec(A) — stacks the columns of A into a vector
  • vech(A) — stacks the lower triangular part of a symmetric matrix A into a vector
  • GL(n) — general linear group of n × n matrices
  • group of orthogonal n × n matrices
  • L — lag operator
  • Φ(L) — autoregressive polynomial
  • Θ(L) — moving-average polynomial
  • Ψ(L) — causal representation, MA(∞) polynomial
  • Δ — difference operator, Δ = 1 − L
  • p — order of autoregressive polynomial
  • q — order of moving-average polynomial
  • ARMA(p, q) — autoregressive moving-average process of order (p, q)
  • ARIMA(p, d, q) — autoregressive integrated moving-average process of order (p, d, q)
  • d — order of integration
  • I(d) — integrated process of order d
  • VAR(p) — vector autoregressive process of order p
  • ℤ — integer numbers
  • ℝ — real numbers
  • ℂ — complex numbers
  • set of n-dimensional vectors
  • imaginary unit
  • cov(X, Y) — covariance between random variables X and Y
  • expectation operator
  • variance operator
  • Ψ(1) — persistence
  • linear least-squares predictor of X_{T+h} given information up to period T
  • linear least-squares predictor of X_{T+h} using the infinite remote past up to period T
  • P — probability
  • {X_t} — stochastic process
  • WN(0, σ²) — white noise process with mean zero and variance σ²
  • WN(0, Σ) — multivariate white noise process with mean zero and covariance matrix Σ
  • IID(0, σ²) — identically and independently distributed random variables with mean zero and variance σ²
  • IID N(0, σ²) — identically and independently normally distributed random variables with mean zero and variance σ²
  • X_t — time-indexed random variable
  • x_t — realization of random variable X_t
  • f(λ) — spectral density
  • F(λ) — spectral distribution function
  • I_T — periodogram
  • transfer function of filter Ψ
  • VaR — value at risk

Contents

Part I: Univariate Time Series Analysis

Introduction
1.1 Some Examples
1.2 Formal Definitions
1.3 Stationarity
1.4 Construction of Stochastic Processes
1.4.1 White Noise
1.4.2 Construction of Stochastic Processes: Some Examples
1.4.3 Moving-Average Process of Order One
1.4.4 Random Walk
1.4.5 Changing Mean
1.5 Properties of the Autocovariance Function
1.5.1 Autocovariance Function of MA(1) Processes
1.6 Exercises

ARMA Models
2.1 The Lag Operator
2.2 Some Important Special Cases
2.2.1 Moving-Average Process of Order q
2.2.2 First Order Autoregressive Process
2.3 Causality and Invertibility
2.4 Computation of Autocovariance Function
2.4.1 First Procedure
2.4.2 Second Procedure
2.4.3 Third Procedure
2.5 Exercises

Forecasting Stationary Processes
3.1 Linear Least-Squares Forecasts
3.1.1 Forecasting with an AR(p) Process
3.1.2 Forecasting with MA(q) Processes
3.1.3 Forecasting from the Infinite Past
3.2 The Wold Decomposition Theorem
3.3 Exponential Smoothing
3.4 Exercises
3.5 Partial Autocorrelation
3.5.1 Definition
3.5.2 Interpretation of ACF and PACF
3.6 Exercises

Estimation of Mean and ACF
4.1 Estimation of the Mean
4.2 Estimation of ACF
4.3 Estimation of PACF
4.4 Estimation of the Long-Run Variance
4.4.1 An Example
4.5 Exercises

Estimation of ARMA Models
5.1 The Yule-Walker Estimator

Index

[Subject index, Autoregressive final form through Yule-Walker estimator; page references not preserved in this preview]

Footnotes

Subtraction and division can be defined accordingly. A more detailed introduction to complex numbers can be found in Rudin (1976) or any other mathematics textbook.

The notation with "−ϕ_j z^j" instead of "ϕ_j z^j" was chosen to conform to the notation of AR models.

For more detailed presentations see Agarwal (2000), Elaydi (2005), or Neusser (2009).

Sometimes one can find … as the characteristic equation; the roots of the two characteristic equations are then reciprocal to each other.
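The notation section above defines the lag operator L, the AR and MA polynomials Φ(L) and Θ(L), and white noise WN(0, σ²). As a small illustration of that notation (not taken from the book, whose examples use EViews and MATLAB; the parameter values below are made up), the ARMA(1,1) process (1 − φL)X_t = (1 + θL)Z_t can be simulated in a few lines of Python:

```python
import numpy as np

def simulate_arma11(phi, theta, sigma, T, seed=0):
    """Simulate (1 - phi*L) X_t = (1 + theta*L) Z_t with Z_t ~ WN(0, sigma^2).

    Illustrative sketch only; parameter values are assumptions, not from the book.
    """
    rng = np.random.default_rng(seed)
    z = rng.normal(0.0, sigma, size=T + 1)  # white noise Z_0, ..., Z_T
    x = np.zeros(T + 1)
    for t in range(1, T + 1):
        # Rearranged recursion: X_t = phi * X_{t-1} + Z_t + theta * Z_{t-1}
        x[t] = phi * x[t - 1] + z[t] + theta * z[t - 1]
    return x[1:]

x = simulate_arma11(phi=0.8, theta=0.4, sigma=1.0, T=500)
print(round(float(x.mean()), 3))  # sample mean; the population mean is 0
```

With |φ| < 1 the process is causal and stationary, which is the setting of the chapters on ARMA models and forecasting.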

Posted: 03/01/2020, 16:10


Table of contents

  • Frontmatter

  • 1. Univariate Time Series Analysis

    • 1. Introduction and Basic Theoretical Concepts

    • 2. Autoregressive Moving-Average Models

    • 3. Forecasting Stationary Processes

    • 4. Estimation of the Mean and the Autocorrelation Function

    • 5. Estimation of ARMA Models

    • 6. Spectral Analysis and Linear Filters

    • 7. Integrated Processes

    • 8. Models of Volatility

  • 2. Multivariate Time Series Analysis

    • 9. Introduction

    • 10. Definitions and Stationarity

    • 11. Estimation of Mean and Covariance Function

    • 12. Stationary Time Series Models: Vector Autoregressive Moving-Average Processes (VARMA Processes)

    • 13. Estimation of Vector Autoregressive Models

    • 14. Forecasting with VAR Models

    • 15. Interpretation and Identification of VAR Models

    • 16. Cointegration

    • 17. State-Space Models and the Kalman Filter

    • 18. Generalizations of Linear Time Series Models

  • Backmatter
