Stata 11 Time-Series Reference Manual

STATA TIME-SERIES REFERENCE MANUAL
RELEASE 11

A Stata Press Publication
StataCorp LP
College Station, Texas

Copyright © 1985–2009 by StataCorp LP. All rights reserved. Version 11.

Published by Stata Press, 4905 Lakeway Drive, College Station, Texas 77845. Typeset in TeX. Printed in the United States of America.

ISBN-10: 1-59718-063-7
ISBN-13: 978-1-59718-063-4

This manual is protected by copyright. All rights are reserved. No part of this manual may be reproduced, stored in a retrieval system, or transcribed, in any form or by any means (electronic, mechanical, photocopy, recording, or otherwise) without the prior written permission of StataCorp LP unless permitted by the license granted to you by StataCorp LP to use the software and documentation. No license, express or implied, by estoppel or otherwise, to any intellectual property rights is granted by this document.

StataCorp provides this manual "as is" without warranty of any kind, either expressed or implied, including, but not limited to, the implied warranties of merchantability and fitness for a particular purpose. StataCorp may make improvements and/or changes in the product(s) and the program(s) described in this manual at any time and without notice.

The software described in this manual is furnished under a license agreement or nondisclosure agreement. The software may be copied only in accordance with the terms of the agreement. It is against the law to copy the software onto DVD, CD, disk, diskette, tape, or any other medium for any purpose other than backup or archival purposes.

The automobile dataset appearing on the accompanying media is Copyright © 1979 by Consumers Union of U.S., Inc., Yonkers, NY 10703-1057, and is reproduced by permission from CONSUMER REPORTS, April 1979.

Stata and Mata are registered trademarks and NetCourse is a trademark of StataCorp LP. Other brand and product names are registered trademarks or trademarks of their respective companies. For copyright information about the software, type
help copyright within Stata.

The suggested citation for this software is:

StataCorp. 2009. Stata: Release 11. Statistical Software. College Station, TX: StataCorp LP.

Table of contents

intro — Introduction to time-series manual
time series — Introduction to time-series commands

arch — Autoregressive conditional heteroskedasticity (ARCH) family of estimators
arch postestimation — Postestimation tools for arch
arima — ARIMA, ARMAX, and other dynamic regression models
arima postestimation — Postestimation tools for arima

corrgram — Tabulate and graph autocorrelations
cumsp — Cumulative spectral distribution

dfactor — Dynamic-factor models
dfactor postestimation — Postestimation tools for dfactor
dfgls — DF-GLS unit-root test
dfuller — Augmented Dickey–Fuller unit-root test
dvech — Diagonal vech multivariate GARCH models
dvech postestimation — Postestimation tools for dvech

fcast compute — Compute dynamic forecasts of dependent variables after var, svar, or vec
fcast graph — Graph forecasts of variables computed by fcast compute

haver — Load data from Haver Analytics database

irf — Create and analyze IRFs, dynamic-multiplier functions, and FEVDs
irf add — Add results from an IRF file to the active IRF file
irf cgraph — Combine graphs of IRFs, dynamic-multiplier functions, and FEVDs
irf create — Obtain IRFs, dynamic-multiplier functions, and FEVDs
irf ctable — Combine tables of IRFs, dynamic-multiplier functions, and FEVDs
irf describe — Describe an IRF file
irf drop — Drop IRF results from the active IRF file
irf graph — Graph IRFs, dynamic-multiplier functions, and FEVDs
irf ograph — Graph overlaid IRFs, dynamic-multiplier functions, and FEVDs
irf rename — Rename an IRF result in an IRF file
irf set — Set the active IRF file
irf table — Create tables of IRFs, dynamic-multiplier functions, and FEVDs

newey — Regression with Newey–West standard errors
newey postestimation — Postestimation tools for newey

pergram — Periodogram
pperron — Phillips–Perron unit-root test
prais — Prais–Winsten and Cochrane–Orcutt regression
prais postestimation — Postestimation tools for prais

rolling — Rolling-window and recursive estimation

sspace — State-space models
sspace postestimation — Postestimation tools for sspace

tsappend — Add observations to a time-series dataset
tsfill — Fill in gaps in time variable
tsline — Plot time-series data
tsreport — Report time-series aspects of a dataset or estimation sample
tsrevar — Time-series operator programming command
tsset — Declare data to be time-series data
tssmooth — Smooth and forecast univariate time-series data
tssmooth dexponential — Double-exponential smoothing
tssmooth exponential — Single-exponential smoothing
tssmooth hwinters — Holt–Winters nonseasonal smoothing
tssmooth ma — Moving-average filter
tssmooth nl — Nonlinear filter
tssmooth shwinters — Holt–Winters seasonal smoothing

var intro — Introduction to vector autoregressive models
var — Vector autoregressive models
var postestimation — Postestimation tools for var
var svar — Structural vector autoregressive models
var svar postestimation — Postestimation tools for svar
varbasic — Fit a simple VAR and graph IRFs or FEVDs
varbasic postestimation — Postestimation tools for varbasic
vargranger — Perform pairwise Granger causality tests after var or svar
varlmar — Perform LM test for residual autocorrelation after var or svar
varnorm — Test for normally distributed disturbances after var or svar
varsoc — Obtain lag-order selection statistics for VARs and VECMs
varstable — Check the stability condition of VAR or SVAR estimates
varwle — Obtain Wald lag-exclusion statistics after var or svar

vec intro — Introduction to vector error-correction models
vec — Vector error-correction models
vec postestimation — Postestimation tools for vec
veclmar — Perform LM test for residual autocorrelation after vec
vecnorm — Test for normally distributed disturbances after vec
vecrank — Estimate the cointegrating rank of a VECM
vecstable — Check the stability condition of VECM estimates

wntestb — Bartlett's periodogram-based test for white noise
wntestq — Portmanteau (Q) test for white noise

xcorr — Cross-correlogram for bivariate time series

Glossary
Subject and author index

Cross-referencing the documentation

When reading this manual, you will find references to other Stata manuals. For example,

[U] 26 Overview of Stata estimation commands
[R] regress
[D] reshape

The first example is a reference to chapter 26, Overview of Stata estimation commands, in the User's Guide; the second is a reference to the regress entry in the Base Reference Manual; and the third is a reference to the reshape entry in the Data-Management Reference Manual.

All the manuals in the Stata Documentation have a shorthand notation:

[GSM]  Getting Started with Stata for Mac
[GSU]  Getting Started with Stata for Unix
[GSW]  Getting Started with Stata for Windows
[U]    Stata User's Guide
[R]    Stata Base Reference Manual
[D]    Stata Data-Management Reference Manual
[G]    Stata Graphics Reference Manual
[XT]   Stata Longitudinal-Data/Panel-Data Reference Manual
[MI]   Stata Multiple-Imputation Reference Manual
[MV]   Stata Multivariate Statistics Reference Manual
[P]    Stata Programming Reference Manual
[SVY]  Stata Survey Data Reference Manual
[ST]   Stata Survival Analysis and Epidemiological Tables Reference Manual
[TS]   Stata Time-Series Reference Manual
[I]    Stata Quick Reference and Index
[M]    Mata Reference Manual

Detailed information about each of these manuals may be found online at http://www.stata-press.com/manuals/

Title

intro — Introduction to time-series manual

Description

This entry describes this manual and what has changed since Stata 10.

Remarks

This manual documents Stata's time-series commands and is referred to as [TS] in cross-references. After this entry, [TS] time series provides an overview of
the ts commands. The other parts of this manual are arranged alphabetically. If you are new to Stata's time-series features, we recommend that you read the following sections first:

[TS] time series    Introduction to time-series commands
[TS] tsset          Declare a dataset to be time-series data

Stata is continually being updated, and Stata users are always writing new commands. To ensure that you have the latest features, you should install the most recent official update; see [R] update.

What's new

1. New estimation command sspace estimates linear state-space models by maximum likelihood. In state-space models, the dependent variables are linear functions of unobserved states and observed exogenous variables. A few of the many models that can be written in this form are VARMA models, structural time-series models, some linear dynamic models, and some stochastic general-equilibrium models. sspace can estimate the parameters of most linear time-series models with time-invariant parameters because they can be cast as state-space models. sspace can estimate stationary and nonstationary models. For stationary models, sspace uses the Kalman filter to estimate the unobserved states; for nonstationary models, sspace uses the De Jong diffuse Kalman filter. See [TS] sspace.

2. New estimation command dvech estimates diagonal vech multivariate GARCH models. These models allow the conditional variance matrix of the dependent variables to follow a flexible dynamic structure in which each element of the current conditional variance matrix depends on its own past and on past shocks. See [TS] dvech.

3. New estimation command dfactor estimates dynamic-factor models. These multivariate time-series models allow the dependent variables and the unobserved factor variables to have vector autoregressive (VAR) structures and to be linear functions of exogenous variables. See [TS] dfactor.

4. Estimation commands newey, prais, sspace, dvech, and dfactor allow Stata's new factor-variable varlist notation; see [U] 11.4.3 Factor variables. Also, these estimation commands allow the standard set of factor-variable–related reporting options; see [R] estimation options.

5. New postestimation command margins, which calculates marginal means, predictive margins, marginal effects, and average marginal effects, is available after all time-series estimation commands except svar. See [R] margins.

6. New display option vsquish for estimation commands, which allows you to control the spacing in output containing time-series operators or factor variables, is available after all time-series estimation commands. See [R] estimation options.

7. New display option coeflegend for estimation commands, which displays the coefficients' legend showing how to specify them in an expression, is available after all time-series estimation commands. See [R] estimation options.

8. predict after regress now allows time-series operators in option dfbeta(); see [R] regress postestimation. Also allowing time-series operators are the regress postestimation commands estat szroeter, estat hettest, avplot, and avplots. See [R] regress postestimation.

9. Existing estimation commands mlogit, ologit, and oprobit now allow time-series operators; see [R] mlogit, [R] ologit, and [R] oprobit.

10. Existing estimation commands arch and arima now accept maximization option showtolerance; see [R] maximize.

11. Existing estimation command arch now allows you to fit models assuming that the disturbances follow Student's t distribution or the generalized error distribution, as well as the Gaussian (normal) distribution. Specify which distribution to use with option distribution(). You can specify the shape or degree-of-freedom parameter, or you can let arch estimate it along with the other parameters of the model. See [TS] arch.

12. Existing command tsappend is now faster. See [TS] tsappend.

For a complete list of all the new features in Stata 11, see [U] 1.3 What's new.

Also see

[U] 1.3 What's new
[R] intro — Introduction to base reference manual

Title

time series —
Introduction to time-series commands

Description

The Time-Series Reference Manual organizes the commands alphabetically, making it easy to find individual command entries if you know the name of the command. This overview organizes and presents the commands conceptually, that is, according to the similarities in the functions that they perform.

The commands listed under the heading Data-management tools and time-series operators help you prepare your data for further analysis. The commands listed under the heading Univariate time series are grouped together because they are either estimators or filters designed for univariate time series or preestimation or postestimation commands that are conceptually related to one or more univariate time-series estimators. The commands listed under the heading Multivariate time series are similarly grouped together because they are either estimators designed for use with multivariate time series or preestimation or postestimation commands conceptually related to one or more multivariate time-series estimators. Within these three broad categories, similar commands have been grouped together.

xcorr — Cross-correlogram for bivariate time series

Methods and formulas

xcorr is implemented as an ado-file.

The cross-covariance function of lag k for time series x1 and x2 is given by

    Cov{x1(t), x2(t + k)} = R12(k)

This function is not symmetric about lag zero; that is, R12(k) ≠ R12(−k). We define the cross-correlation function as

    ρij(k) = Corr{xi(t), xj(t + k)} = Rij(k) / sqrt{Rii(0) Rjj(0)}

where ρ11 and ρ22 are the autocorrelation functions for x1 and x2, respectively. The sequence ρ12(k) is the cross-correlation function and is drawn for lags k ∈ (−Q, −Q + 1, ..., −1, 0, 1, ..., Q − 1, Q). If ρ12(k) = 0 for all lags, x1 and x2 are not cross-correlated.

References

Box, G. E. P., G. M. Jenkins, and G. C. Reinsel. 1994. Time Series Analysis: Forecasting and Control. 3rd ed. Englewood Cliffs, NJ: Prentice–Hall.
Hamilton, J. D. 1994. Time Series Analysis. Princeton: Princeton University Press.
Newton, H. J. 1988. TIMESLAB: A Time Series Analysis Laboratory. Belmont, CA: Wadsworth.

Also see

[TS] tsset — Declare data to be time-series data
[TS] corrgram — Tabulate and graph autocorrelations
[TS] pergram — Periodogram

Glossary

ARCH model
An autoregressive conditional heteroskedasticity (ARCH) model is a regression model in which the conditional variance is modeled as an autoregressive (AR) process. The ARCH(m) model is

    y_t = x_t β + ε_t
    E(ε_t^2 | ε_{t-1}^2, ε_{t-2}^2, ...) = α_0 + α_1 ε_{t-1}^2 + ... + α_m ε_{t-m}^2

where ε_t is a white-noise error term. The equation for y_t represents the conditional mean of the process, and the equation for E(ε_t^2 | ε_{t-1}^2, ε_{t-2}^2, ...) specifies the conditional variance as an autoregressive function of its past realizations. Although the conditional variance changes over time, the unconditional variance is time invariant because y_t is a stationary process. Modeling the conditional variance as an AR process raises the implied unconditional variance, making this model particularly appealing to researchers modeling fat-tailed data, such as financial data.

ARIMA model
An autoregressive integrated moving-average (ARIMA) model is a time-series model suitable for use with integrated processes. In an ARIMA(p, d, q) model, the data is differenced d times to obtain a stationary series, and then an ARMA(p, q) model is fit to this differenced data. ARIMA models that include exogenous explanatory variables are known as ARMAX models.

ARMA model
An autoregressive moving-average (ARMA) model is a time-series model in which the current period's realization is the sum of an autoregressive (AR) process and a moving-average (MA) process. An ARMA(p, q) model includes p AR terms and q MA terms. ARMA models with just a few lags are often able to fit data as well as pure AR or MA models with many more lags.

ARMAX model
An ARMAX model is a time-series model in which the current period's realization is an ARMA process plus a linear function of a set of exogenous variables. Equivalently, an ARMAX model is a linear regression model in which the error term is specified to follow an ARMA process.
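The ARCH(m) recursion defined above is easy to see numerically. The following is a minimal sketch in Python rather than Stata (the parameter values α_0 = 0.2 and α_1 = 0.5 are arbitrary choices for illustration): each period's error is a standardized shock scaled by the square root of the conditional variance, which in turn depends on the previous squared error.

```python
import math
import random

def simulate_arch1(alpha0, alpha1, n, seed=0):
    """Simulate an ARCH(1) error series: h_t = alpha0 + alpha1 * e_{t-1}^2."""
    rng = random.Random(seed)
    e, h = [], []
    e_prev = 0.0
    for _ in range(n):
        h_t = alpha0 + alpha1 * e_prev ** 2      # conditional variance
        e_t = rng.gauss(0.0, 1.0) * math.sqrt(h_t)
        h.append(h_t)
        e.append(e_t)
        e_prev = e_t
    return e, h

e, h = simulate_arch1(alpha0=0.2, alpha1=0.5, n=5000)
# For a stationary ARCH(1), the (time-invariant) unconditional variance
# is alpha0 / (1 - alpha1) = 0.4, even though h_t changes every period.
sample_var = sum(x * x for x in e) / len(e)
```

The sample variance of the simulated errors hovers near the unconditional value 0.4, while the conditional variance `h` fluctuates with the past shocks, which is exactly the distinction the entry draws.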
autocorrelation function
The autocorrelation function (ACF) expresses the correlation between periods t and t − k of a time series as a function of the time t and the lag k. For a stationary time series, the ACF does not depend on t and is symmetric about k = 0, meaning that the correlation between periods t and t − k is equal to the correlation between periods t and t + k.

autoregressive process
An autoregressive process is a time-series model in which the current value of a variable is a linear function of its own past values and a white-noise error term. A first-order autoregressive process, denoted as an AR(1) process, is

    y_t = ρ y_{t-1} + ε_t

An AR(p) model contains p lagged values of the dependent variable.

Cholesky ordering
Cholesky ordering is a method used to orthogonalize the error term in a VAR or VECM to impose a recursive structure on the dynamic model, so that the resulting impulse–response functions can be given a causal interpretation. The method is so named because it uses the Cholesky decomposition of the error covariance matrix.

Cochrane–Orcutt estimator
This estimator is a linear regression estimator that can be used when the error term exhibits first-order autocorrelation. An initial estimate of the autocorrelation parameter ρ is obtained from OLS residuals, and then OLS is performed on the transformed data y*_t = y_t − ρ y_{t-1} and x*_t = x_t − ρ x_{t-1}.

cointegrating vector
A cointegrating vector specifies a stationary linear combination of nonstationary variables. Specifically, if each of the variables x_1, x_2, ..., x_k is integrated of order one and there exists a set of parameters β_1, β_2, ..., β_k such that z_t = β_1 x_1 + β_2 x_2 + ... + β_k x_k is a stationary process, the variables x_1, x_2, ..., x_k are said to be cointegrated, and the vector β is known as a cointegrating vector.

conditional variance
Although the conditional variance is simply the variance of a conditional distribution, in time-series analysis the conditional variance is often modeled as an autoregressive process, giving rise to ARCH models.
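The geometric decay of a stationary AR(1)'s autocorrelations can be checked empirically. This is a hedged sketch in Python rather than Stata (ρ = 0.7 and the sample size are arbitrary choices): it simulates the AR(1) recursion from the entry above and estimates the sample ACF, which should be close to ρ^k at lag k.

```python
import random

def simulate_ar1(rho, n, seed=1):
    """y_t = rho * y_{t-1} + e_t with standard-normal white noise e_t."""
    rng = random.Random(seed)
    y, y_prev = [], 0.0
    for _ in range(n):
        y_prev = rho * y_prev + rng.gauss(0.0, 1.0)
        y.append(y_prev)
    return y

def acf(y, k):
    """Sample autocorrelation at lag k: Corr(y_t, y_{t-k})."""
    n = len(y)
    mean = sum(y) / n
    var = sum((v - mean) ** 2 for v in y)
    cov = sum((y[t] - mean) * (y[t - k] - mean) for t in range(k, n))
    return cov / var

y = simulate_ar1(rho=0.7, n=5000)
r1, r2 = acf(y, 1), acf(y, 2)   # close to 0.7 and 0.7**2 = 0.49
```

With 5,000 observations the sampling error is small, so `r1` and `r2` sit within a few hundredths of the theoretical values.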
correlogram
A correlogram is a table or graph showing the sample autocorrelations or partial autocorrelations of a time series.

covariance stationarity
A process is covariance stationary if the mean of the process is finite and independent of t, the unconditional variance of the process is finite and independent of t, and the covariance between periods t and t − s is finite and depends on t − s but not on t or s themselves. Covariance-stationary processes are also known as weakly stationary processes.

cross-correlation function
The cross-correlation function expresses the correlation between one series at time t and another series at time t − k as a function of the time t and lag k. If both series are stationary, the function does not depend on t. The function is not symmetric about k = 0: ρ12(k) ≠ ρ12(−k).

difference operator
The difference operator ∆ denotes the change in the value of a variable from period t − 1 to period t. Formally, ∆y_t = y_t − y_{t-1}, and

    ∆²y_t = ∆(y_t − y_{t-1}) = (y_t − y_{t-1}) − (y_{t-1} − y_{t-2}) = y_t − 2y_{t-1} + y_{t-2}

dynamic forecast
A dynamic forecast is one in which the current period's forecast is calculated using forecasted values for prior periods.

dynamic-multiplier function
A dynamic-multiplier function measures the effect of a shock to an exogenous variable on an endogenous variable. The kth dynamic-multiplier function of variable i on variable j measures the effect on variable j in period t + k in response to a one-unit shock to variable i in period t, holding everything else constant.

endogenous variable
An endogenous variable is a regressor that is correlated with the unobservable error term. Equivalently, an endogenous variable is one whose values are determined by the equilibrium or outcome of a structural model.
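The difference-operator identities above can be verified mechanically. A minimal Python sketch (the sample values are arbitrary; the `s` parameter generalizes to the lag-s difference used later for seasonal differencing):

```python
def diff(y, s=1):
    """Apply the lag-s difference operator: y_t - y_{t-s}."""
    return [y[t] - y[t - s] for t in range(s, len(y))]

y = [3.0, 5.0, 4.0, 8.0, 9.0]
d1 = diff(y)          # first differences, Δy_t
d2 = diff(diff(y))    # second differences, Δ²y_t = y_t - 2*y_{t-1} + y_{t-2}
```

Applying `diff` twice reproduces exactly the expanded form y_t − 2y_{t-1} + y_{t-2} from the entry.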
exogenous variable
An exogenous variable is one that is correlated with none of the unobservable error terms in the model. Equivalently, an exogenous variable is one whose values change independently of the other variables in a structural model.

exponential smoothing
Exponential smoothing is a method of smoothing a time series in which the smoothed value at period t is equal to a fraction α of the series value at time t plus a fraction 1 − α of the previous period's smoothed value. The fraction α is known as the smoothing parameter.

forecast-error variance decomposition
Forecast-error variance decompositions measure the fraction of the error in forecasting variable i after h periods that is attributable to the orthogonalized shocks to variable j.

forward operator
The forward operator F denotes the value of a variable at time t + 1. Formally, F y_t = y_{t+1}, and F² y_t = F y_{t+1} = y_{t+2}.

frequency-domain analysis
Frequency-domain analysis is analysis of time-series data by considering its frequency properties. The spectral density and distribution functions are key components of frequency-domain analysis, so it is often called spectral analysis. In Stata, the cumsp and pergram commands are used to analyze the sample spectral distribution and density functions, respectively.

GARCH model
A generalized autoregressive conditional heteroskedasticity (GARCH) model is a regression model in which the conditional variance is modeled as an ARMA process. The GARCH(m, k) model is

    y_t = x_t β + ε_t
    σ_t^2 = γ_0 + γ_1 ε_{t-1}^2 + ... + γ_m ε_{t-m}^2 + δ_1 σ_{t-1}^2 + ... + δ_k σ_{t-k}^2

where the equation for y_t represents the conditional mean of the process and σ_t^2 represents the conditional variance. See [TS] arch or Hamilton (1994, chap. 21) for details on how the conditional variance equation can be viewed as an ARMA process. GARCH models are often used because the ARMA specification often allows the conditional variance to be modeled with fewer parameters than are required by a pure ARCH model. Many extensions to the basic GARCH model exist; see [TS] arch for those that are implemented in Stata. See also ARCH model.
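The exponential-smoothing recursion defined above takes a single pass over the data. A minimal Python sketch (not tssmooth itself; the starting value `s0` stands in for the priming value that tssmooth exponential would compute, and α = 0.5 is an arbitrary choice):

```python
def exp_smooth(y, alpha, s0):
    """Single exponential smoothing: s_t = alpha*y_t + (1 - alpha)*s_{t-1}."""
    s, s_prev = [], s0
    for v in y:
        s_prev = alpha * v + (1 - alpha) * s_prev
        s.append(s_prev)
    return s

y = [10.0, 12.0, 11.0, 13.0]
s = exp_smooth(y, alpha=0.5, s0=10.0)   # [10.0, 11.0, 11.0, 12.0]
```

A larger α tracks the raw series more closely; a smaller α produces a smoother, slower-moving sequence.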
generalized least-squares estimator
A generalized least-squares (GLS) estimator is used to estimate the parameters of a regression function when the error term is heteroskedastic or autocorrelated. In the linear case, GLS is sometimes described as "OLS on transformed data" because the GLS estimator can be implemented by applying an appropriate transformation to the dataset and then using OLS.

Granger causality
The variable x is said to Granger-cause variable y if, given the past values of y, past values of x are useful for predicting y.

Holt–Winters smoothing
A set of methods for smoothing time-series data that assume that the value of a time series at time t can be approximated as the sum of a mean term that drifts over time, as well as a time trend whose strength also drifts over time. Variations of the basic method allow for seasonal patterns in data as well.

impulse–response function
An impulse–response function (IRF) measures the effect of a shock to an endogenous variable on itself or another endogenous variable. The kth impulse–response function of variable i on variable j measures the effect on variable j in period t + k in response to a one-unit shock to variable i in period t, holding everything else constant.

independent and identically distributed
A series of observations is independently and identically distributed (i.i.d.) if each observation is an independent realization from the same underlying distribution. In some contexts, the definition is relaxed to mean only that the observations are independent and have identical means and variances; see Davidson and MacKinnon (1993, 42).

integrated process
A nonstationary process is integrated of order d, written I(d), if the process must be differenced d times to produce a stationary series. An I(1) process y_t is one in which ∆y_t is stationary.

Kalman filter
The Kalman filter is a recursive procedure for predicting the state vector in a state-space model.

lag operator
The lag operator L denotes the value of a variable at time t − 1. Formally, L y_t = y_{t-1}, and L² y_t = L y_{t-1} = y_{t-2}.

moving-average process
A moving-average process is a time-series process in which the current value of a variable is modeled as a weighted average of current and past realizations of a white-noise process and, optionally, a time-invariant constant. By convention, the weight on the current realization of the white-noise process is equal to one, and the weights on the past realizations are known as the moving-average (MA) coefficients. A first-order moving-average process, denoted as an MA(1) process, is

    y_t = θ ε_{t-1} + ε_t

Newey–West covariance matrix
The Newey–West covariance matrix is a member of the class of heteroskedasticity- and autocorrelation-consistent (HAC) covariance matrix estimators used with time-series data that produces covariance estimates that are robust to both arbitrary heteroskedasticity and autocorrelation up to a prespecified lag.

orthogonalized impulse–response function
An orthogonalized impulse–response function (OIRF) measures the effect of an orthogonalized shock to an endogenous variable on itself or another endogenous variable. An orthogonalized shock is one that affects one variable at time t but no other variables. See [TS] irf create for a discussion of the difference between IRFs and OIRFs.
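The MA(1) process defined above has a sharply truncated ACF: the lag-1 autocorrelation is θ/(1 + θ²) and all higher-order autocorrelations are zero. A hedged Python sketch (θ = 0.6 and the sample size are arbitrary choices for illustration):

```python
import random

def simulate_ma1(theta, n, seed=2):
    """MA(1): y_t = theta * e_{t-1} + e_t, with standard-normal white noise."""
    rng = random.Random(seed)
    e_prev, y = rng.gauss(0.0, 1.0), []
    for _ in range(n):
        e_t = rng.gauss(0.0, 1.0)
        y.append(theta * e_prev + e_t)
        e_prev = e_t
    return y

def acf(y, k):
    """Sample autocorrelation at lag k."""
    n = len(y)
    mean = sum(y) / n
    var = sum((v - mean) ** 2 for v in y)
    return sum((y[t] - mean) * (y[t - k] - mean) for t in range(k, n)) / var

theta = 0.6
y = simulate_ma1(theta, n=20000)
rho1_theory = theta / (1 + theta ** 2)   # 0.6/1.36 ≈ 0.441
r1, r3 = acf(y, 1), acf(y, 3)
```

`r1` lands near the theoretical 0.441 while `r3` is statistically indistinguishable from zero, in contrast to the geometric decay of an AR process.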
partial autocorrelation function
The partial autocorrelation function (PACF) expresses the correlation between periods t and t − k of a time series as a function of the time t and lag k, after controlling for the effects of intervening lags. For a stationary time series, the PACF does not depend on t. The PACF is not symmetric about k = 0: the partial autocorrelation between y_t and y_{t-k} is not equal to the partial autocorrelation between y_t and y_{t+k}.

periodogram
A periodogram is a graph of the spectral density function of a time series as a function of frequency. The pergram command first standardizes the amplitude of the density by the sample variance of the time series, and then plots the logarithm of that standardized density. Peaks in the periodogram represent cyclical behavior in the data.

Portmanteau statistic
The portmanteau, or Q, statistic is used to test for white noise and is calculated using the first m autocorrelations of the series, where m is chosen by the user. Under the null hypothesis that the series is a white-noise process, the portmanteau statistic has a χ² distribution with m degrees of freedom.

Prais–Winsten estimator
A Prais–Winsten estimator is a linear regression estimator that is used when the error term exhibits first-order autocorrelation; see also Cochrane–Orcutt estimator. Here the first observation in the dataset is transformed as y*_1 = sqrt(1 − ρ²) y_1 and x*_1 = sqrt(1 − ρ²) x_1, so that the first observation is not lost. The Prais–Winsten estimator is a generalized least-squares estimator.

priming values
Priming values are the initial, preestimation values used to begin a recursive process.

random walk
A random walk is a time-series process in which the current period's realization is equal to the previous period's realization plus a white-noise error term: y_t = y_{t-1} + ε_t. A random walk with drift also contains a nonzero time-invariant constant: y_t = δ + y_{t-1} + ε_t. The constant term δ is known as the drift parameter. An important property of random-walk processes is that the best predictor of the value at time t + 1 is the value at time t plus the value of the drift parameter.
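The random-walk-with-drift recursion above, and the fact that first-differencing it recovers drift plus noise, can be demonstrated with a few lines of Python (the shock and drift values are arbitrary choices for illustration):

```python
def random_walk_with_drift(shocks, delta):
    """y_t = delta + y_{t-1} + e_t, starting from y_0 = 0."""
    y, level = [], 0.0
    for e in shocks:
        level = delta + level + e
        y.append(level)
    return y

shocks = [0.5, -1.0, 2.0, 0.0]
delta = 0.1
y = random_walk_with_drift(shocks, delta)
# First differences recover drift + shock: Δy_t = delta + e_t.
d = [y[0]] + [y[t] - y[t - 1] for t in range(1, len(y))]
```

Each first difference equals δ plus the contemporaneous shock, which is why differencing an I(1) random walk yields a stationary (white-noise-plus-constant) series, and why the best one-step predictor is the current value plus δ.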
recursive regression analysis
A recursive regression analysis involves performing a regression at time t by using all available observations from some starting time t0 through time t, performing another regression at time t + 1 by using all observations from time t0 through time t + 1, and so on. Unlike a rolling regression analysis, the first period used for all regressions is held fixed.

rolling regression analysis
A rolling, or moving window, regression analysis involves performing regressions for each period by using the most recent m periods' data, where m is known as the window size. With a window size of m = 20, at time t the regression is fit using observations for times t − 19 through time t; at time t + 1 the regression is fit using the observations for time t − 18 through t + 1; and so on.

seasonal difference operator
The period-s seasonal difference operator ∆s denotes the difference in the value of a variable at time t and time t − s. Formally, ∆s y_t = y_t − y_{t-s}, and

    ∆s² y_t = ∆s(y_t − y_{t-s}) = (y_t − y_{t-s}) − (y_{t-s} − y_{t-2s}) = y_t − 2y_{t-s} + y_{t-2s}

serial correlation
Serial correlation refers to regression errors that are correlated over time. If a regression model does not contain lagged dependent variables as regressors, the OLS estimates are consistent in the presence of mild serial correlation, but the covariance matrix is incorrect. When the model includes lagged dependent variables and the residuals are serially correlated, the OLS estimates are biased and inconsistent. See, for example, Davidson and MacKinnon (1993, chap. 10) for more information.

serial correlation tests
Because OLS estimates are at least inefficient and potentially biased in the presence of serial correlation, econometricians have developed many tests to detect it. Popular ones include the Durbin–Watson (1950, 1951, 1971) test, the Breusch–Pagan (1980) test, and Durbin's (1970) alternative test. See [R] regress postestimation time series.
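The rolling-window scheme described above, with the glossary's window size of m = 20, can be sketched in Python (this is an illustration of the windowing logic, not Stata's rolling command; the exact-line data are a deliberately trivial choice so every window recovers the same slope):

```python
def ols_slope(x, y):
    """Slope of the least-squares line of y on x (with intercept)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((v - mx) ** 2 for v in x)
    sxy = sum((x[i] - mx) * (y[i] - my) for i in range(n))
    return sxy / sxx

def rolling_slopes(x, y, m):
    """At each time t >= m-1, fit y on x over observations t-m+1 .. t."""
    return [ols_slope(x[t - m + 1 : t + 1], y[t - m + 1 : t + 1])
            for t in range(m - 1, len(x))]

x = list(range(30))
y = [2.0 * v + 1.0 for v in x]     # exact line: slope 2 in every window
slopes = rolling_slopes(x, y, m=20)
```

With 30 observations and m = 20 there are 11 windows; a recursive analysis would instead grow each window from the fixed starting point t0.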
smoothing
Smoothing a time series refers to the process of extracting an overall trend in the data. The motivation behind smoothing is the belief that a time series exhibits a trend component as well as an irregular component and that the analyst is interested only in the trend component. Some smoothers also account for seasonal or other cyclical patterns.

spectral analysis
See frequency-domain analysis.

spectral density function
The spectral density function is the derivative of the spectral distribution function. Intuitively, the spectral density function f(ω) indicates the amount of variance in a time series that is attributable to sinusoidal components with frequency ω. See also spectral distribution function. The spectral density function is sometimes called the spectrum.

spectral distribution function
The (normalized) spectral distribution function F(ω) of a process describes the proportion of variance that can be explained by sinusoids with frequencies in the range (0, ω), where 0 ≤ ω ≤ π. The spectral distribution and density functions used in frequency-domain analysis are closely related to the autocorrelation function used in time-domain analysis; see Chatfield (2004, chap. 6).

spectrum
See spectral density function.

state-space model
A state-space model describes the relationship between an observed time series and an unobservable state vector that represents the "state" of the world. The measurement equation expresses the observed series as a function of the state vector, and the transition equation describes how the unobserved state vector evolves over time. By defining the parameters of the measurement and transition equations appropriately, one can write a wide variety of time-series models in the state-space form.

steady-state equilibrium
The steady-state equilibrium is the predicted value of a variable in a dynamic model, ignoring the effects of past shocks, or, equivalently, the value of a variable, assuming that the effects of past shocks have fully died out and no longer affect the variable of interest.
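The idea that variance concentrates at a series' dominant frequency can be seen with a raw (unstandardized, unlogged) periodogram built from the discrete Fourier transform. This is a from-scratch Python sketch, not the pergram command (the sample size and frequency are arbitrary choices; a pure cosine at an exact Fourier frequency is used so the spectral peak is unambiguous):

```python
import cmath
import math

def periodogram(y):
    """Squared DFT magnitude at Fourier frequencies k/n, for k = 1..n//2."""
    n = len(y)
    pg = []
    for k in range(1, n // 2 + 1):
        z = sum(y[t] * cmath.exp(-2j * math.pi * k * t / n) for t in range(n))
        pg.append(abs(z) ** 2 / n)
    return pg

n = 64
y = [math.cos(2 * math.pi * 8 * t / n) for t in range(n)]  # 8 cycles per n obs
pg = periodogram(y)
peak_k = max(range(len(pg)), key=pg.__getitem__) + 1       # Fourier index of peak
```

All the variance of the cosine piles up at Fourier index 8; pergram additionally standardizes by the sample variance and plots the log of the result.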
no longer affect the variable of interest strict stationarity A process is strictly stationary if the joint distribution of y1 , , yk is the same as the joint distribution of y1+τ , , yk+τ for all k and τ Intuitively, shifting the origin of the series by τ units has no effect on the joint distributions structural model In time-series analysis, a structural model is one that describes the relationship among a set of variables, based on underlying theoretical considerations Structural models may contain both endogenous and exogenous variables SVAR A structural vector autoregressive (SVAR) model is a type of VAR in which short- or long-run constraints are placed on the resulting impulse–response functions The constraints are usually motivated by economic theory and therefore allow causal interpretations of the IRFs to be made time-domain analysis Time-domain analysis is analysis of data viewed as a sequence of observations observed over time The autocorrelation function, linear regression, ARCH models, and ARIMA models are common tools used in time-domain analysis unit-root process A unit-root process is one that is integrated of order one, meaning that the process is nonstationary but that first-differencing the process produces a stationary series The simplest example of a unit-root process is the random walk See Hamilton (1994, chap 15) for a discussion of when general ARMA processes may contain a unit root 530 Glossary unit-root tests Whether a process has a unit root has both important statistical and economic ramifications, so a variety of tests have been developed to test for them Among the earliest tests proposed is the one by Dickey and Fuller (1979), though most researchers now use an improved variant called the augmented Dickey–Fuller test instead of the original version Other common unit-root tests implemented in Stata include the DF–GLS test of Elliot, Rothenberg, and Stock (1996) and the Phillips–Perron (1988) test See [TS] dfuller, [TS] dfgls, and [TS] 
pperron. Variants of unit-root tests suitable for panel data have also been developed; see [XT] xtunitroot.

VAR.  A vector autoregressive (VAR) model is a multivariate regression technique in which each dependent variable is regressed on lags of itself and on lags of all the other dependent variables in the model. Occasionally, exogenous variables are also included in the model.

VECM.  A vector error-correction model (VECM) is a type of VAR that is used with variables that are cointegrated. Although first-differencing variables that are integrated of order one makes them stationary, fitting a VAR to such first-differenced variables results in misspecification error if the variables are cointegrated. See The multivariate VECM specification in [TS] vec intro for more on this point.

white noise.  A variable ut represents a white-noise process if the mean of ut is zero, the variance of ut is σ², and the covariance between ut and us is zero for all s ≠ t. Gaussian white noise refers to white noise in which ut is normally distributed.

Yule–Walker equations.  The Yule–Walker equations are a set of difference equations that describe the relationship among the autocovariances and autocorrelations of an autoregressive moving-average (ARMA) process.

References

Breusch, T. S., and A. R. Pagan. 1980. The Lagrange multiplier test and its applications to model specification in econometrics. Review of Economic Studies 47: 239–253.

Chatfield, C. 2004. The Analysis of Time Series: An Introduction. 6th ed. Boca Raton, FL: Chapman & Hall/CRC.

Davidson, R., and J. G. MacKinnon. 1993. Estimation and Inference in Econometrics. New York: Oxford University Press.

Dickey, D. A., and W. A. Fuller. 1979. Distribution of the estimators for autoregressive time series with a unit root. Journal of the American Statistical Association 74: 427–431.

Durbin, J. 1970. Testing for serial correlation in least-squares regressions when some of the regressors are lagged dependent variables. Econometrica 38: 410–421.

Durbin, J., and G. S. Watson. 1950.
Testing for serial correlation in least squares regression. I. Biometrika 37: 409–428.

Durbin, J., and G. S. Watson. 1951. Testing for serial correlation in least squares regression. II. Biometrika 38: 159–177.

Durbin, J., and G. S. Watson. 1971. Testing for serial correlation in least squares regression. III. Biometrika 58: 1–19.

Elliott, G., T. J. Rothenberg, and J. H. Stock. 1996. Efficient tests for an autoregressive unit root. Econometrica 64: 813–836.

Hamilton, J. D. 1994. Time Series Analysis. Princeton: Princeton University Press.

Phillips, P. C. B., and P. Perron. 1988. Testing for a unit root in time series regression. Biometrika 75: 335–346.

Subject and author index

This is the subject and author index for the Time-Series Reference Manual. Readers interested in topics other than time series should see the combined subject index (and the combined author index) in the Quick Reference and Index. The combined index indexes the Getting Started manuals, the User's Guide, and all the reference manuals except the Mata Reference Manual. Semicolons set off the most important entries from the rest. Sometimes no entry will be set off with semicolons, meaning that all entries are equally important.

A
Abraham, B., [TS] tssmooth, [TS] tssmooth dexponential, [TS] tssmooth exponential, [TS] tssmooth hwinters, [TS] tssmooth shwinters
ac command, [TS] corrgram
add, irf subcommand, [TS] irf add
Adkins, L. C., [TS] arch
Ahn, S. K., [TS] vec intro
Akaike, H., [TS] varsoc
Amemiya, T., [TS] varsoc
Amisano, G., [TS] irf create, [TS] var intro, [TS] var svar, [TS] vargranger, [TS] varwle
Anderson, B. D. O., [TS] sspace
Anderson, T. W., [TS] vec, [TS] vecrank
Ansley, C. F., [TS] arima
A-PARCH, [TS] arch
AR, [TS] dfactor, [TS] sspace
ARCH
  effects, estimation, [TS] arch
  model, [TS] Glossary
  postestimation, [TS] arch postestimation
  regression, [TS] arch
arch command, [TS] arch, [TS] arch postestimation
ARCH models, [TS] dvech
ARIMA
  postestimation, [TS] arima postestimation
  regression, [TS] arima
arima command, [TS] arima, [TS] arima postestimation
ARMA,
[TS] arch, [TS] arima, [TS] dfactor, [TS] Glossary, [TS] sspace
ARMAX, [TS] dfactor, [TS] sspace
ARMAX model, [TS] arima, [TS] Glossary
autocorrelation, [TS] arch, [TS] arima, [TS] corrgram, [TS] dfactor, [TS] Glossary, [TS] newey, [TS] prais, [TS] sspace, [TS] var, [TS] varlmar
autoregressive
  conditional heteroskedasticity, [TS] arch
  integrated moving average, [TS] arch, [TS] arima
  moving average, [TS] arch, [TS] arima
  process, [TS] Glossary
autoregressive conditional heteroskedasticity, [TS] dvech
autoregressive model, [TS] dfactor, [TS] sspace
autoregressive moving average model, [TS] dfactor, [TS] sspace
Aznar, A., [TS] vecrank

B
Bartlett, M. S., [TS] wntestb
Bartlett's
  bands, [TS] corrgram
  periodogram test, [TS] wntestb
Baum, C. F., [TS] arch, [TS] arima, [TS] dfgls, [TS] rolling, [TS] time series, [TS] tsset, [TS] var, [TS] wntestq
Bauwens, L., [TS] dvech
Becketti, S., [TS] corrgram
Bera, A. K., [TS] arch, [TS] varnorm, [TS] vecnorm
Berkes, I., [TS] dvech
Berndt, E. K., [TS] arch, [TS] arima
Black, F., [TS] arch
block exogeneity, [TS] vargranger
Bollerslev, T., [TS] arch, [TS] arima, [TS] dvech
Boswijk, H. P., [TS] vec
Bowerman, B. L., [TS] tssmooth, [TS] tssmooth dexponential, [TS] tssmooth exponential, [TS] tssmooth hwinters, [TS] tssmooth shwinters
Box, G. E. P., [TS] arima, [TS] corrgram, [TS] cumsp, [TS] dfuller, [TS] pergram, [TS] pperron, [TS] wntestq, [TS] xcorr
Breusch, T. S., [TS] Glossary
Brockwell, P. J., [TS] corrgram, [TS] sspace

C
Caines, P. E., [TS] sspace
Casals, J., [TS] sspace
cgraph, irf subcommand, [TS] irf cgraph
Chang, Y., [TS] sspace
Chatfield, C., [TS] arima, [TS] corrgram, [TS] Glossary, [TS] pergram, [TS] tssmooth, [TS] tssmooth dexponential, [TS] tssmooth exponential, [TS] tssmooth hwinters, [TS] tssmooth ma, [TS] tssmooth shwinters
Chatterjee, S., [TS] prais
Cheung, Y.-W., [TS] dfgls
Cholesky ordering, [TS] Glossary
Chou, R. Y., [TS] arch
Christiano, L. J., [TS] irf create, [TS] var svar
Chu-Chun-Lin, S., [TS] sspace
clock time, [TS] tsset
cluster estimator of variance, Prais–Winsten and Cochrane–Orcutt regression, [TS] prais
Cochrane, D., [TS] prais
Cochrane–Orcutt regression, [TS] Glossary, [TS] prais
cointegration, [TS] fcast compute, [TS] fcast graph, [TS] Glossary, [TS] vec, [TS] vec intro, [TS] veclmar, [TS] vecnorm, [TS] vecrank, [TS] vecstable
compute, fcast subcommand, [TS] fcast compute
Comte, F., [TS] dvech
conditional variance, [TS] arch, [TS] Glossary
constrained estimation
  ARCH, [TS] arch
  ARIMA and ARMAX, [TS] arima
  structural vector autoregressive models, [TS] var svar
  vector autoregressive models, [TS] var
  vector error-correction models, [TS] vec
constrained optimization, [TS] arch, [TS] arima, [TS] dfactor, [TS] dvech, [TS] sspace, [TS] var
correlogram, [TS] corrgram, [TS] Glossary
corrgram command, [TS] corrgram
covariance stationarity, [TS] Glossary
Cox, N. J., [TS] tssmooth hwinters, [TS] tssmooth shwinters
create, irf subcommand, [TS] irf create
cross-correlation function, [TS] Glossary, [TS] xcorr
cross-correlogram, [TS] xcorr
ctable, irf subcommand, [TS] irf ctable
cumsp command, [TS] cumsp
cumulative spectral distribution, empirical, [TS] cumsp

D
data manipulation, [TS] tsappend, [TS] tsfill, [TS] tsreport, [TS] tsrevar, [TS] tsset
David, J. S., [TS] arima
Davidson, R., [TS] arch, [TS] arima, [TS] Glossary, [TS] prais, [TS] sspace, [TS] varlmar
Davis, R. A., [TS] corrgram, [TS] sspace
De Jong, P., [TS] dfactor, [TS] sspace, [TS] sspace postestimation
DeGroot, M. H., [TS] arima
Deistler, M., [TS] sspace
describe, irf subcommand, [TS] irf describe
dexponential, tssmooth subcommand, [TS] tssmooth dexponential
dfactor command, [TS] dfactor
dfgls command, [TS] dfgls
dfuller command, [TS] dfuller
diagonal vech GARCH model, [TS] dvech
Dickens, R., [TS] prais
Dickey, D. A., [TS] dfuller, [TS] Glossary, [TS] pperron
Dickey–Fuller test, [TS] dfgls, [TS] dfuller
Diebold, F. X., [TS] arch
difference operator, [TS] Glossary
Diggle, P. J., [TS] arima, [TS] wntestq
Ding, Z., [TS] arch
Doornik, J. A., [TS] vec
double-exponential smoothing, [TS] tssmooth dexponential
drop, irf subcommand, [TS] irf drop
Drukker, D. M., [TS] sspace, [TS] vec
Durbin, J., [TS] Glossary, [TS] prais
Durbin–Watson statistic, [TS] prais
Durlauf, S. N., [TS] vec, [TS] vec intro, [TS] vecrank
dvech command, [TS] dvech
dynamic-factor model, [TS] dfactor, [TS] dfactor postestimation, [TS] sspace
dynamic forecast, [TS] fcast compute, [TS] fcast graph, [TS] Glossary
dynamic-multiplier function, [TS] Glossary, [TS] irf, [TS] irf cgraph, [TS] irf create, [TS] irf ctable, [TS] irf ograph, [TS] irf table, [TS] var intro
dynamic regression model, [TS] arima, [TS] var
dynamic structural simultaneous equations, [TS] var svar

E
EGARCH, [TS] arch
Eichenbaum, M., [TS] irf create, [TS] var svar
eigenvalue stability condition, [TS] varstable, [TS] vecstable
Elliott, G., [TS] dfgls, [TS] Glossary
Enders, W., [TS] arch, [TS] arima, [TS] corrgram
endogenous variable, [TS] Glossary
Engle, R. F., [TS] arch, [TS] arima, [TS] dfactor, [TS] dvech, [TS] vec, [TS] vec intro, [TS] vecrank
Evans, C. L., [TS] irf create, [TS] var svar
exogenous variable, [TS] Glossary
exp list, [TS] rolling
exponential smoothing, [TS] Glossary, [TS] tssmooth, [TS] tssmooth exponential
exponential, tssmooth subcommand, [TS] tssmooth exponential

F
factor model, [TS] dfactor
fcast compute command, [TS] fcast compute
fcast graph command, [TS] fcast graph
feasible generalized least squares, see FGLS
Feller, W., [TS] wntestb
FEVD, [TS] Glossary, [TS] irf, [TS] irf create, [TS] irf ograph, [TS] irf table, [TS] var intro, [TS] varbasic, [TS] vec intro
FGLS, [TS] dfgls, [TS] prais, [TS] var
filters, [TS] tssmooth, [TS] tssmooth dexponential, [TS] tssmooth exponential, [TS] tssmooth hwinters, [TS] tssmooth ma, [TS] tssmooth nl, [TS] tssmooth shwinters
Flannery, B. P., [TS] arch, [TS] arima
forecast, [TS] dfactor postestimation, [TS] dvech postestimation, [TS] sspace postestimation, see forecasting
forecast-error
variance decomposition, see FEVD
forecasting, [TS] arch, [TS] arima, [TS] fcast compute, [TS] fcast graph, [TS] irf create, [TS] tsappend, [TS] tssmooth, [TS] tssmooth dexponential, [TS] tssmooth exponential, [TS] tssmooth hwinters, [TS] tssmooth ma, [TS] tssmooth shwinters, [TS] var, [TS] var intro, [TS] vec, [TS] vec intro
forward operator, [TS] Glossary
frequency-domain analysis, [TS] cumsp, [TS] Glossary, [TS] pergram
Friedman, M., [TS] arima
Fuller, W. A., [TS] dfuller, [TS] Glossary, [TS] pperron

G
Gani, J., [TS] wntestb
GARCH, [TS] arch, [TS] Glossary
garch command, [TS] arch
GARCH models, [TS] dvech
Gardiner, J. S., [TS] tssmooth, [TS] tssmooth dexponential, [TS] tssmooth exponential, [TS] tssmooth hwinters, [TS] tssmooth shwinters
Gardner Jr., E. S., [TS] tssmooth dexponential, [TS] tssmooth hwinters
generalized autoregressive conditional heteroskedasticity, [TS] arch, [TS] dvech
generalized least-squares estimator, [TS] Glossary
Geweke, J., [TS] dfactor
Giannini, C., [TS] irf create, [TS] var intro, [TS] var svar, [TS] vargranger, [TS] varwle
Giles, D. E. A., [TS] prais
GJR, [TS] arch
Glosten, L. R., [TS] arch
Gonzalo, J., [TS] vec intro, [TS] vecrank
Gourieroux, C., [TS] arima
Granger, C. W. J., [TS] arch, [TS] vargranger, [TS] vec, [TS] vec intro, [TS] vecrank
Granger causality, [TS] Glossary, [TS] vargranger
graph, fcast subcommand, [TS] fcast graph
graph, irf subcommand, [TS] irf graph
graphs,
  autocorrelations, [TS] corrgram
  correlogram, [TS] corrgram
  cross-correlogram, [TS] xcorr
  cumulative spectral density, [TS] cumsp
  forecasts, [TS] fcast graph
  impulse–response functions, [TS] irf, [TS] irf cgraph, [TS] irf graph, [TS] irf ograph
  partial correlogram, [TS] corrgram
  periodogram, [TS] pergram
  white-noise test, [TS] wntestb
Greene, W. H., [TS] arch, [TS] arima, [TS] corrgram, [TS] var
Griffiths, W. E., [TS] arch, [TS] prais

H
Hadi, A. S., [TS] prais
Hall, B. H., [TS] arch, [TS] arima
Hall, R. E., [TS] arch, [TS] arima
Hamilton, J. D.,
[TS] arch, [TS] arima, [TS] corrgram, [TS] dfuller, [TS] fcast compute, [TS] Glossary, [TS] irf, [TS] irf create, [TS] pergram, [TS] pperron, [TS] sspace, [TS] sspace postestimation, [TS] time series, [TS] var, [TS] var intro, [TS] var svar, [TS] vargranger, [TS] varnorm, [TS] varsoc, [TS] varstable, [TS] varwle, [TS] vec, [TS] vec intro, [TS] vecnorm, [TS] vecrank, [TS] vecstable, [TS] xcorr
Hannan, E. J., [TS] sspace
Hardin, J. W., [TS] newey, [TS] prais
Harvey, A. C., [TS] arch, [TS] arima, [TS] prais, [TS] sspace, [TS] sspace postestimation, [TS] var svar
Hausman, J. A., [TS] arch, [TS] arima
Haver Analytics, [TS] haver
haver command, [TS] haver
heteroskedasticity
  conditional, [TS] arch
  multiplicative heteroskedastic regression, [TS] arch
Higgins, M. L., [TS] arch
Hildreth, C., [TS] prais
Hildreth–Lu regression, [TS] prais
Hill, R. C., [TS] arch, [TS] prais
Hipel, K. W., [TS] arima
Holt, C. C., [TS] tssmooth, [TS] tssmooth dexponential, [TS] tssmooth exponential, [TS] tssmooth hwinters, [TS] tssmooth shwinters
Holt–Winters smoothing, [TS] Glossary, [TS] tssmooth, [TS] tssmooth dexponential, [TS] tssmooth exponential, [TS] tssmooth hwinters, [TS] tssmooth shwinters
Horváth, L., [TS] dvech
Hubrich, K., [TS] vec intro, [TS] vecrank
hwinters, tssmooth subcommand, [TS] tssmooth hwinters

I
impulse–response functions, see IRF
independent and identically distributed, [TS] Glossary
information criterion, [TS] varsoc
innovation accounting, [TS] irf
integrated process, [TS] Glossary
IRF, [TS] Glossary, [TS] irf, [TS] irf add, [TS] irf cgraph, [TS] irf create, [TS] irf ctable, [TS] irf describe, [TS] irf drop, [TS] irf graph, [TS] irf ograph, [TS] irf rename, [TS] irf set, [TS] irf table, [TS] var intro, [TS] varbasic, [TS] vec intro
  cumulative impulse–response functions, [TS] irf create
  cumulative orthogonalized impulse–response functions, [TS] irf create
  orthogonalized impulse–response functions, [TS] irf create
irf add command, [TS] irf add
irf cgraph command, [TS] irf cgraph
irf commands, introduction, [TS] irf
irf create command, [TS] irf create
irf ctable command, [TS] irf ctable
irf describe command, [TS] irf describe
irf drop command, [TS] irf drop
irf graph command, [TS] irf graph
irf ograph command, [TS] irf ograph
irf rename command, [TS] irf rename
irf set command, [TS] irf set
irf table command, [TS] irf table

J
Jagannathan, R., [TS] arch
Jarque, C. M., [TS] varnorm, [TS] vecnorm
Jarque–Bera statistic, [TS] varnorm, [TS] vecnorm
Jenkins, G. M., [TS] arima, [TS] corrgram, [TS] cumsp, [TS] dfuller, [TS] pergram, [TS] pperron, [TS] xcorr
Jerez, M., [TS] sspace
Johansen, S., [TS] irf create, [TS] varlmar, [TS] vec, [TS] vec intro, [TS] veclmar, [TS] vecnorm, [TS] vecrank, [TS] vecstable
Johnson, L. A., [TS] tssmooth, [TS] tssmooth dexponential, [TS] tssmooth exponential, [TS] tssmooth hwinters, [TS] tssmooth shwinters
Judge, G. G., [TS] arch, [TS] prais

K
Kalman, R. E., [TS] arima
Kalman filter, [TS] arima, [TS] dfactor, [TS] dfactor postestimation, [TS] Glossary, [TS] sspace, [TS] sspace postestimation
  forecast, [TS] dfactor postestimation, [TS] sspace postestimation
  smoothing, [TS] dfactor postestimation, [TS] sspace postestimation
Kim, I.-M., [TS] vec, [TS] vec intro, [TS] vecrank
King, M. L., [TS] prais
King, R. G., [TS] vecrank
Kmenta, J., [TS] arch, [TS] prais, [TS] rolling
Kohn, R., [TS] arima
Kroner, K. F., [TS] arch
kurtosis, [TS] varnorm, [TS] vecnorm

L
lag-exclusion statistics, [TS] varwle
lag operator, [TS] Glossary
lag-order selection statistics, [TS] varsoc; [TS] var, [TS] var intro, [TS] var svar, [TS] vec intro
Lagrange-multiplier test, [TS] varlmar, [TS] veclmar
Lai, K. S., [TS] dfgls
Laurent, S., [TS] dvech
leap seconds, [TS] tsset
Ledolter, J., [TS] tssmooth, [TS] tssmooth dexponential, [TS] tssmooth exponential, [TS] tssmooth hwinters, [TS] tssmooth shwinters
Lee, T.-C., [TS] arch, [TS] prais
Lieberman, O., [TS] dvech
Lilien, D. M., [TS] arch
Lim, G. C., [TS] arch
linear regression, [TS]
newey, [TS] prais, [TS] var, [TS] var intro, [TS] var svar, [TS] varbasic
Ljung, G. M., [TS] wntestq
Lu, J. Y., [TS] prais
Lütkepohl, H., [TS] arch, [TS] dfactor, [TS] dvech, [TS] fcast compute, [TS] irf, [TS] irf create, [TS] prais, [TS] sspace, [TS] sspace postestimation, [TS] time series, [TS] var, [TS] var intro, [TS] var svar, [TS] varbasic, [TS] vargranger, [TS] varnorm, [TS] varsoc, [TS] varstable, [TS] varwle, [TS] vec intro, [TS] vecnorm, [TS] vecrank, [TS] vecstable

M
MA, [TS] arch, [TS] arima, [TS] dfactor, [TS] sspace
ma, tssmooth subcommand, [TS] tssmooth ma
MacKinnon, J. G., [TS] arch, [TS] arima, [TS] dfuller, [TS] Glossary, [TS] pperron, [TS] prais, [TS] sspace, [TS] varlmar
Maddala, G. S., [TS] vec, [TS] vec intro, [TS] vecrank
Magnus, J. R., [TS] var svar
Mandelbrot, B., [TS] arch
Mangel, M., [TS] varwle
maximum likelihood, [TS] var, [TS] var intro, [TS] var svar, [TS] varbasic
McCullough, B. D., [TS] corrgram
McDowell, A. W., [TS] arima
McLeod, A. I., [TS] arima
Meiselman, D., [TS] arima
Miller, J. I., [TS] sspace
Monfort, A., [TS] arima
Montgomery, D. C., [TS] tssmooth, [TS] tssmooth dexponential, [TS] tssmooth exponential, [TS] tssmooth hwinters, [TS] tssmooth shwinters
Moore, J. B., [TS] sspace
moving average, [TS] arch, [TS] arima, [TS] Glossary, [TS] tssmooth, [TS] tssmooth ma
multiplicative heteroskedasticity, [TS] arch
multivariate ARCH, [TS] dvech postestimation
multivariate GARCH, [TS] dvech, [TS] dvech postestimation
multivariate time series, [TS] dfactor, [TS] dvech, [TS] sspace, [TS] var, [TS] var intro, [TS] var svar, [TS] varbasic, [TS] vec, [TS] vec intro, [TS] xcorr

N
NARCH, [TS] arch
NARCHK, [TS] arch
Nelson, D. B., [TS] arch, [TS] arima, [TS] dvech
Neudecker, H., [TS] var svar
Newbold, P., [TS] arima, [TS] vec intro
Newey, W. K., [TS] newey, [TS] pperron
newey command, [TS] newey, [TS] newey postestimation
Newey–West
  covariance matrix, [TS] Glossary
  postestimation, [TS] newey postestimation
  regression, [TS]
newey
Newton, H. J., [TS] arima, [TS] corrgram, [TS] cumsp, [TS] dfuller, [TS] pergram, [TS] wntestb, [TS] xcorr
Ng, S., [TS] dfgls
Nielsen, B., [TS] varsoc, [TS] vec intro
nl, tssmooth subcommand, [TS] tssmooth nl
nonlinear
  estimation, [TS] arch
  smoothing, [TS] tssmooth nl
nonstationary time series, [TS] dfgls, [TS] dfuller, [TS] pperron, [TS] vec intro, [TS] vec
normality test
  after VAR or SVAR, [TS] varnorm
  after VEC, [TS] vecnorm
NPARCH, [TS] arch

O
O'Connell, R. T., [TS] tssmooth, [TS] tssmooth dexponential, [TS] tssmooth exponential, [TS] tssmooth hwinters, [TS] tssmooth shwinters
ograph, irf subcommand, [TS] irf ograph
Olkin, I., [TS] wntestb
Orcutt, G. H., [TS] prais
orthogonalized impulse–response function, [TS] Glossary, [TS] irf, [TS] var intro, [TS] vec, [TS] vec intro
Osterwald-Lenum, M., [TS] vecrank

P
pac command, [TS] corrgram
Pagan, A. R., [TS] Glossary
PARCH, [TS] arch
Park, J. Y., [TS] sspace, [TS] vec, [TS] vec intro, [TS] vecrank
partial autocorrelation function, [TS] corrgram, [TS] Glossary
Paulsen, J., [TS] varsoc, [TS] vec intro
pergram command, [TS] pergram
periodogram, [TS] Glossary, [TS] pergram
Perron, P., [TS] dfgls, [TS] Glossary, [TS] pperron
Phillips, P. C. B., [TS] Glossary, [TS] pperron, [TS] vargranger, [TS] vec, [TS] vec intro, [TS] vecrank
Phillips–Perron test, [TS] pperron
Pierce, D. A., [TS] wntestq
Pisati, M., [TS] time series
Pitarakis, J.-Y., [TS] vecrank
Plosser, C. I., [TS] vecrank
portmanteau statistic, [TS] Glossary
portmanteau test, [TS] corrgram, [TS] wntestq
postestimation command, [TS] fcast compute, [TS] fcast graph, [TS] irf, [TS] vargranger, [TS] varlmar, [TS] varnorm, [TS] varsoc, [TS] varstable, [TS] varwle, [TS] veclmar, [TS] vecnorm, [TS] vecstable
pperron command, [TS] pperron
Prais, S. J., [TS] prais
prais command, [TS] Glossary, [TS] prais, [TS] prais postestimation
Prais–Winsten regression, see prais command
predict, [TS] dvech postestimation, [TS] sspace postestimation
predict command, [TS] dfactor postestimation
Press, W. H., [TS] arch, [TS] arima
priming values, [TS] Glossary

Q
Q statistic, [TS] wntestq

R
random walk, [TS] Glossary
recursive estimation, [TS] rolling
recursive regression analysis, [TS] Glossary
Reinsel, G. C., [TS] arima, [TS] corrgram, [TS] cumsp, [TS] dfuller, [TS] pergram, [TS] pperron, [TS] vec intro, [TS] xcorr
rename, irf subcommand, [TS] irf rename
Robins, R. P., [TS] arch
robust, Huber/White/sandwich estimator of variance
  ARCH, [TS] arch
  ARIMA and ARMAX, [TS] arima
  Newey–West regression, [TS] newey
  Prais–Winsten and Cochrane–Orcutt regression, [TS] prais
rolling command, [TS] rolling
rolling regression, [TS] Glossary, [TS] rolling
Rombouts, J. V. K., [TS] dvech
Room, T., [TS] arima
Rothenberg, T. J., [TS] dfgls, [TS] Glossary, [TS] sspace, [TS] var svar, [TS] vec
Runkle, D. E., [TS] arch

S
SAARCH, [TS] arch
Saikkonen, P., [TS] vec intro, [TS] vecrank
Salvador, M., [TS] vecrank
Samaniego, F. J., [TS] varwle
Sargan, J. D., [TS] prais
Sargent, T. J., [TS] dfactor
Schneider, W., [TS] sspace
Schwert, G. W., [TS] dfgls
seasonal
  ARIMA, [TS] tssmooth, [TS] tssmooth shwinters
  smoothing, [TS] tssmooth, [TS] tssmooth shwinters
seasonal difference operator, [TS] Glossary
selection-order statistics, [TS] varsoc
Serfling, R. J., [TS] irf create
serial correlation, see autocorrelation
set, irf subcommand, [TS] irf set
Shumway, R. H., [TS] arima
shwinters, tssmooth subcommand, [TS] tssmooth shwinters
Silvennoinen, A., [TS] dvech
Sims, C. A., [TS] dfactor, [TS] irf create, [TS] var svar, [TS] vec, [TS] vec intro, [TS] vecrank
skewness, [TS] varnorm
smoothers, [TS] Glossary, [TS] tssmooth, [TS] tssmooth dexponential, [TS] tssmooth exponential, [TS] tssmooth hwinters, [TS] tssmooth ma, [TS] tssmooth nl, [TS] tssmooth shwinters
Sotoca, S., [TS] sspace
spectral distribution, [TS] cumsp, [TS] Glossary, [TS] pergram
spectrum, [TS] Glossary
Sperling, R., [TS] arch, [TS] arima, [TS] dfgls, [TS] wntestq
sspace command, [TS] sspace
stability,
[TS] var, [TS] var intro, [TS] var svar, [TS] vecstable
  after VAR or SVAR, [TS] varstable
  after VEC, [TS] vec, [TS] vec intro
state-space model, [TS] arima, [TS] dfactor, [TS] dfactor postestimation, [TS] Glossary, [TS] sspace, [TS] sspace postestimation
stationary time series, [TS] dfgls, [TS] dfuller, [TS] pperron, [TS] var, [TS] var intro, [TS] vec, [TS] vec intro
steady-state equilibrium, [TS] Glossary
Stock, J. H., [TS] arch, [TS] dfactor, [TS] dfgls, [TS] Glossary, [TS] irf create, [TS] rolling, [TS] sspace, [TS] time series, [TS] var, [TS] var intro, [TS] var svar, [TS] vec, [TS] vec intro, [TS] vecrank
strict stationarity, [TS] Glossary
structural time-series model, [TS] sspace
structural vector autoregression, see SVAR
SUR, [TS] dfactor
SVAR, [TS] Glossary, [TS] var intro, [TS] var svar
  postestimation, [TS] var svar postestimation; [TS] fcast compute, [TS] fcast graph, [TS] irf, [TS] irf create, [TS] vargranger, [TS] varlmar, [TS] varnorm, [TS] varsoc, [TS] varstable, [TS] varwle
svar command, [TS] var svar, [TS] var svar postestimation

T
tables, [TS] irf ctable, [TS] irf table
TARCH, [TS] arch
Teräsvirta, T., [TS] dvech
test,
  Dickey–Fuller, [TS] dfgls, [TS] dfuller
  Granger causality, [TS] vargranger
  Lagrange-multiplier, [TS] varlmar, [TS] veclmar
  normality, [TS] varnorm, [TS] vecnorm
  Wald, [TS] vargranger, [TS] varwle
Teukolsky, S. A., [TS] arch, [TS] arima
Theil, H., [TS] prais
time-domain analysis, [TS] arch, [TS] arima, [TS] Glossary
time-series
  analysis, [TS] pergram
  operators, [TS] tsset
time-varying variance, [TS] arch
tsappend command, [TS] tsappend
Tsay, R. S., [TS] varsoc, [TS] vec intro
tsfill command, [TS] tsfill
tsline command, [TS] tsline
tsreport command, [TS] tsreport
tsrevar command, [TS] tsrevar
tsrline command, [TS] tsline
tsset command, [TS] tsset
tssmooth commands, introduction, [TS] tssmooth
tssmooth dexponential command, [TS] tssmooth dexponential
tssmooth exponential command, [TS] tssmooth exponential
tssmooth hwinters command, [TS]
tssmooth hwinters
tssmooth ma command, [TS] tssmooth ma
tssmooth nl command, [TS] tssmooth nl
tssmooth shwinters command, [TS] tssmooth shwinters

U
unit-root
  models, [TS] vec, [TS] vec intro
  process, [TS] Glossary
  test, [TS] dfgls, [TS] dfuller, [TS] Glossary, [TS] pperron
univariate time series, [TS] arch, [TS] arima, [TS] newey, [TS] prais
unobserved-component model, [TS] dfactor, [TS] sspace

V
VAR, [TS] dfactor, [TS] Glossary, [TS] sspace, [TS] var, [TS] var intro, [TS] var svar, [TS] varbasic
  postestimation, [TS] fcast compute, [TS] fcast graph, [TS] irf, [TS] irf create, [TS] var postestimation, [TS] vargranger, [TS] varlmar, [TS] varnorm, [TS] varsoc, [TS] varstable, [TS] varwle
var command, [TS] var, [TS] var postestimation
varbasic command, [TS] varbasic, [TS] varbasic postestimation
vargranger command, [TS] vargranger
variance decompositions, see FEVD
varlmar command, [TS] varlmar
varnorm command, [TS] varnorm
varsoc command, [TS] varsoc
varstable command, [TS] varstable
varwle command, [TS] varwle
vec command, [TS] vec, [TS] vec postestimation
veclmar command, [TS] veclmar
VECM, [TS] dvech, [TS] Glossary, [TS] vec, [TS] vec intro
  postestimation, [TS] fcast compute, [TS] fcast graph, [TS] irf, [TS] irf create, [TS] varsoc, [TS] vec postestimation, [TS] veclmar, [TS] vecnorm, [TS] vecrank, [TS] vecstable
vecnorm command, [TS] vecnorm
vecrank command, [TS] vecrank
vecstable command, [TS] vecstable
vector autoregression, see VAR
vector autoregressive forecast, [TS] fcast compute, [TS] fcast graph
vector autoregressive moving average model, [TS] dfactor, [TS] sspace
vector error-correction model, see VECM
Vetterling, W. T., [TS] arch, [TS] arima

W
Wald, A., [TS] varwle
Wald test, [TS] vargranger, [TS] varwle
Watson, G. S., [TS] Glossary, [TS] prais
Watson, M. W., [TS] arch, [TS] dfactor, [TS] dfgls, [TS] irf create, [TS] rolling, [TS] sspace, [TS] time series, [TS] var, [TS] var intro, [TS] var svar, [TS] vec, [TS] vec
intro, [TS] vecrank
weighted moving average, [TS] tssmooth, [TS] tssmooth ma
West, K. D., [TS] newey, [TS] pperron
White, H., [TS] newey, [TS] prais
white noise, [TS] Glossary
white-noise test, [TS] wntestb, [TS] wntestq
Wiggins, V. L., [TS] arch, [TS] arima, [TS] sspace
Winsten, C. B., [TS] prais
Winters, P. R., [TS] tssmooth, [TS] tssmooth dexponential, [TS] tssmooth exponential, [TS] tssmooth hwinters, [TS] tssmooth shwinters
wntestb command, [TS] wntestb
wntestq command, [TS] wntestq
Wolfowitz, J., [TS] varwle
Wooldridge, J. M., [TS] arch, [TS] dvech, [TS] prais

X
xcorr command, [TS] xcorr

Y
Yar, M., [TS] tssmooth, [TS] tssmooth dexponential, [TS] tssmooth exponential, [TS] tssmooth hwinters, [TS] tssmooth shwinters
Yule–Walker equations, [TS] corrgram, [TS] Glossary

Z
Zakoian, J. M., [TS] arch
Zellner, A., [TS] prais
