Foundations of Econometrics, Part 10

14.5 Cointegration

Estimation Using an ECM

We mentioned in Section 13.4 that an error correction model can be used even when the data are nonstationary. In order to justify this assertion, we start again from the simplest case, in which the two series y_{t1} and y_{t2} are generated by the two equations (14.45). From the definition (14.41) of the I(0) process v_{t2}, we have

    ∆v_{t2} = (λ_2 − 1) v_{t−1,2} + e_{t2}.    (14.49)

We may invert equations (14.45) as follows:

    v_{t1} = x^{11} y_{t1} + x^{12} y_{t2},   and   v_{t2} = x^{21} y_{t1} + x^{22} y_{t2},    (14.50)

where x^{ij} is the ij-th element of the inverse X^{−1} of the matrix with typical element x_{ij}. If we use the expression for v_{t2} and its first difference given by equations (14.50), then equation (14.49) becomes

    x^{21} ∆y_{t1} = −x^{22} ∆y_{t2} + (λ_2 − 1)(x^{21} y_{t−1,1} + x^{22} y_{t−1,2}) + e_{t2}.

Dividing by x^{21} and noting that the relation between the inverse matrices implies that x^{21} x_{11} + x^{22} x_{21} = 0, we obtain the error-correction model

    ∆y_{t1} = η_2 ∆y_{t2} + (λ_2 − 1)(y_{t−1,1} − η_2 y_{t−1,2}) + e′_{t2},    (14.51)

where, as above, η_2 = x_{11}/x_{21} is the second component of the cointegrating vector, and e′_{t2} = e_{t2}/x^{21}. Although the notation is somewhat different from that used in Section 13.3, it is easy enough to see that equation (14.51) is a special case of an ECM like (13.62). Notice that it must be estimated by nonlinear least squares.

In general, equation (14.51) is an unbalanced regression, because it mixes the first differences, which are I(0), with the levels, which are I(1). But the linear combination y_{t−1,1} − η_2 y_{t−1,2} is I(0), on account of the cointegration of y_{t1} and y_{t2}. The term (λ_2 − 1)(y_{t−1,1} − η_2 y_{t−1,2}) is precisely the error-correction term of this ECM. Indeed, y_{t−1,1} − η_2 y_{t−1,2} is the equilibrium error, and it influences ∆y_{t1} through the negative coefficient λ_2 − 1. The parameter η_2 appears twice in (14.51), once in the equilibrium error, and once as the coefficient of ∆y_{t2}.
The implied restriction is a consequence of the very special structure of the DGP (14.45). It is the parameter that appears in the equilibrium error that defines the cointegrating vector, not the coefficient of ∆y_{t2}. This follows because it is the equilibrium error that defines the long-run relationship linking y_{t1} and y_{t2}, whereas the coefficient of ∆y_{t2} is a short-run multiplier, determining the immediate impact of a change in y_{t2} on y_{t1}. It is usually thought to be too restrictive to require that the long-run and short-run multipliers should be the same, and so, for the purposes of estimation and testing, equation (14.51) is normally replaced by

    ∆y_{t1} = α ∆y_{t2} + δ_1 y_{t−1,1} + δ_2 y_{t−1,2} + e_t,    (14.52)

where the new parameter α is the short-run multiplier, δ_1 = λ_2 − 1, and δ_2 = (1 − λ_2) η_2. Since (14.52) is just a linear regression, the parameter of interest, which is η_2, can be estimated by η̂_2 ≡ −δ̂_2/δ̂_1, using the OLS estimates of δ_1 and δ_2. Equation (14.52) is without doubt an unbalanced regression, and so we must expect that the OLS estimates will not have their usual distributions. It turns out that η̂_2 is a super-consistent estimator of η_2. In fact, it is usually less biased than the estimate obtained from the simple regression of y_{t2} on y_{t1}, as readers are invited to check by simulation in Exercise 14.20.

Copyright © 1999, Russell Davidson and James G. MacKinnon

In the general case, with k cointegrated variables, we may estimate the cointegrating vector using the linear regression

    ∆y_t = X_t γ + ∆Y_{t2} α + δ y_{t−1} + Y_{t−1,2} δ_2 + e_t,    (14.53)

where, as before, X_t is a vector of deterministic regressors, γ is the associated parameter vector, Y_t = [y_t  Y_{t2}] is a 1 × k vector, δ is a scalar, and α and δ_2 are both (k − 1)-vectors. Regression (14.52) is evidently a special case of regression (14.53).
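The mechanics of estimating η_2 from regression (14.52) are easy to illustrate by simulation. The following sketch is not from the text: the DGP, the parameter values (η_2 = 2, λ_2 = 0.5), and all variable names are our own illustrative choices. It generates a cointegrated pair and computes η̂_2 = −δ̂_2/δ̂_1 by OLS:

```python
import numpy as np

rng = np.random.default_rng(42)
n, eta2, lam2 = 5000, 2.0, 0.5

# y2 is a random walk; the equilibrium error u = y1 - eta2*y2
# is a stationary AR(1) with parameter lam2
y2 = np.cumsum(rng.standard_normal(n))
u = np.zeros(n)
eps = rng.standard_normal(n)
for t in range(1, n):
    u[t] = lam2 * u[t - 1] + eps[t]
y1 = eta2 * y2 + u

# regression (14.52): regress dy1_t on dy2_t, y1_{t-1}, y2_{t-1}
X = np.column_stack([np.diff(y2), y1[:-1], y2[:-1]])
alpha_hat, d1_hat, d2_hat = np.linalg.lstsq(X, np.diff(y1), rcond=None)[0]
eta2_hat = -d2_hat / d1_hat
print(eta2_hat)  # should be close to 2.0
```

With a few thousand observations, the estimate is typically very close to the true value, as super-consistency suggests.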
The super-consistent ECM estimator of η_2 is then −δ̂_2/δ̂, constructed from the OLS estimators of δ and δ_2 in the same way as for regression (14.52).

Other approaches

When we cannot, or do not want to, specify an ECM, at least two other methods are available for estimating a cointegrating vector. One, proposed by Phillips and Hansen (1990), is called fully modified estimation. The idea is to modify the OLS estimate of η_2 in equation (14.44) by subtracting an estimate of the bias. The result turns out to be asymptotically multivariate normal, and it is possible to estimate its asymptotic covariance matrix. To explain just how fully modified estimation works would require more space than we have available. Interested readers should consult the original paper or Banerjee, Dolado, Galbraith, and Hendry (1993, Chapter 7).

A second approach, which is due to Saikkonen (1991), is much simpler to describe and implement. We run the regression

    y_t = X_t γ + Y_{t2} η_2 + Σ_{j=−p}^{p} ∆Y_{t+j,2} δ_j + ν_t    (14.54)

by OLS. Observe that regression (14.54) is just regression (14.44) with the addition of p leads and p lags of the first differences of Y_{t2}. As with augmented Dickey-Fuller tests, the idea is to add enough leads and lags so that the error terms appear to be serially independent. Provided that p is allowed to increase at the appropriate rate as n → ∞, this regression yields estimates that are asymptotically efficient.

Inference in Regressions with I(1) Variables

From what we have said so far, it might seem that standard asymptotic results never apply when a regression contains one or more regressors that are I(1). This is true for spurious regressions like (14.12), for unit root test regressions like (14.18), and for error-correction models like (14.52). In all these cases, certain statistics that are computed as ordinary t statistics actually follow nonstandard distributions asymptotically.
However, it is not true that the t statistic on every parameter in a regression that involves I(1) variables follows a nonstandard distribution asymptotically. It is not even true that the t statistic on every coefficient of an I(1) variable follows such a distribution. Instead, as Sims, Stock, and Watson (1990) showed in a famous paper, the t statistic on any parameter that appears only as the coefficient of an I(0) variable, perhaps after the regressors are rearranged, follows the standard normal distribution asymptotically. Similarly, an F statistic for a test of the hypothesis that any set of parameters is zero follows its usual asymptotic distribution if all the parameters can be written as coefficients of I(0) variables at the same time. On the other hand, t statistics and F statistics corresponding to parameters that do not satisfy this condition generally follow nonstandard limiting distributions, although there are certain exceptions that we will not discuss here; see West (1988) and Sims, Stock, and Watson (1990).

We will not attempt to prove these results, which are by no means trivial. Proofs may be found in the original paper by Sims et al., and there is a somewhat simpler discussion in Banerjee, Dolado, Galbraith, and Hendry (1993, Chapter 6). Instead, we will consider two examples that should serve to illustrate the nature of the results. First, consider a simple ECM reparametrized as equation (14.52). When y_{t1} and y_{t2} are not cointegrated, it is impossible to arrange things so that δ_1 is the coefficient of an I(0) variable. Therefore, the t statistic for δ_1 = 0 follows a nonstandard distribution asymptotically. However, when y_{t1} and y_{t2} are cointegrated, the quantity y_{t−1,1} − η_2 y_{t−1,2} is I(0). In this case, therefore, δ_1 is the coefficient of an I(0) variable, and the t statistic for δ_1 = d_1 is asymptotically distributed as N(0, 1), if the true value of δ_1 is the negative number d_1.
We can rewrite equation (14.52) as

    ∆y_{t1} = α ∆y_{t2} − δ_2 (η_1 y_{t−1,1} − y_{t−1,2}) + e_t,    (14.55)

where η_1 = 1/η_2 = −δ_1/δ_2. In equation (14.55), δ_2 is written as the coefficient of a variable that is I(0) if y_{t1} and y_{t2} are cointegrated. It follows that the t statistic for a test that δ_2 is equal to its true (presumably positive) value is asymptotically distributed as N(0, 1).

We have just seen that, when y_{t1} and y_{t2} are cointegrated, equation (14.52) can be rewritten in such a way that either δ_1 or δ_2 is the coefficient of an I(0) variable. Consequently, the t statistic on every coefficient in (14.52) is asymptotically normally distributed. Despite this, it is not the case that an F statistic for a test concerning both δ_1 and δ_2 follows its usual asymptotic distribution under the null hypothesis. This is because we cannot rewrite (14.52) so that both δ_1 and δ_2 are coefficients of I(0) variables at the same time. Indeed, if δ̂_1 and δ̂_2 were jointly asymptotically normal, the ratio η̂_2 would also be asymptotically normal, with the same rate of convergence, in contradiction of the result that η̂_2 is super-consistent.

It is not obvious how it is possible for both δ̂_1 and δ̂_2 to be asymptotically normal, with the usual root-n rate of convergence, while the ratio η̂_2 is super-consistent. The phenomenon is explained by the fact, which we will not attempt to demonstrate in detail here, that the two random variables n^{1/2}(δ̂_1 − δ_1)/δ_1 and n^{1/2}(δ̂_2 − δ_2)/δ_2 tend as n → ∞ to exactly the same random variable, and so differ only at order n^{−1/2}. The two variables are therefore perfectly correlated asymptotically. It is straightforward (see Exercise 14.21) to show that this implies that

    −η̂_2 = δ̂_2/δ̂_1 = δ_2/δ_1 + O(n^{−1}).    (14.56)

This result expresses the super-consistency of η̂_2.
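The near-perfect correlation just described is visible in a small Monte Carlo experiment. The sketch below uses a made-up DGP of our own (η_2 = 2 and λ_2 = 0.5, so that δ_1 = −0.5 and δ_2 = 1); it estimates regression (14.52) repeatedly and computes the sample correlation of the two normalized deviations:

```python
import numpy as np

rng = np.random.default_rng(1)
n, reps = 1000, 300
eta2, lam2 = 2.0, 0.5
d1_true, d2_true = lam2 - 1.0, (1.0 - lam2) * eta2  # -0.5 and 1.0

d1_hats, d2_hats = [], []
for _ in range(reps):
    y2 = np.cumsum(rng.standard_normal(n))
    u = np.zeros(n)
    eps = rng.standard_normal(n)
    for t in range(1, n):
        u[t] = lam2 * u[t - 1] + eps[t]
    y1 = eta2 * y2 + u
    # OLS on regression (14.52)
    X = np.column_stack([np.diff(y2), y1[:-1], y2[:-1]])
    _, d1, d2 = np.linalg.lstsq(X, np.diff(y1), rcond=None)[0]
    d1_hats.append(d1)
    d2_hats.append(d2)

# the two normalized deviations of (14.56)'s derivation
z1 = np.sqrt(n) * (np.array(d1_hats) - d1_true) / d1_true
z2 = np.sqrt(n) * (np.array(d2_hats) - d2_true) / d2_true
corr = np.corrcoef(z1, z2)[0, 1]
print(corr)  # close to 1
```

It is this common random component that cancels out of the ratio −δ̂_2/δ̂_1 and lets η̂_2 converge faster than either estimate separately.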
As a second example, consider the augmented Dickey-Fuller test regression

    ∆y_t = γ + β′ y_{t−1} + δ_1 ∆y_{t−1} + e_t,    (14.57)

which is a special case of equation (14.32). This can be rewritten as

    ∆y_t = γ + β′ y_{t−1} + δ_1 y_{t−1} − δ_1 y_{t−2} + e_t
         = γ + β′ (y_{t−1} − y_{t−2}) + δ_1 y_{t−1} + (β′ − δ_1) y_{t−2} + e_t.    (14.58)

When y_t is I(1), we cannot write this regression in such a way that β′ is the coefficient of an I(0) variable. In the second line of (14.58), it does multiply such a variable, since y_{t−1} − y_{t−2} is I(0), but it also multiplies y_{t−2}, which is I(1). Thus we may expect that the t statistic for β′ = 0 has a nonstandard asymptotic distribution. As we saw in Section 14.3, that is indeed the case, since it follows the Dickey-Fuller τ_c distribution graphed in Figure 14.2. On the other hand, because ∆y_{t−1} is I(0), the t statistic for δ_1 = 0 in equation (14.57) does follow the standard normal distribution asymptotically. Moreover, F tests for the coefficients of more than one lag of ∆y_t to be jointly zero also yield statistics that follow the usual asymptotic F distribution. That is why we can validly use standard tests to decide how many lags of ∆y_{t−1} to include in the test regression (14.33) that is used to perform augmented Dickey-Fuller tests.

Estimation by a Vector Autoregression

The procedures we have discussed so far for estimating and making inferences about cointegrating vectors are all in essence single-equation methods. A very popular alternative to those methods is to estimate a vector autoregression, or VAR, for all of the possibly cointegrated variables. The best-known such methods were introduced by Johansen (1988, 1991) and initially applied by Johansen and Juselius (1990, 1992), and a similar approach was introduced independently by Ahn and Reinsel (1988, 1990). Johansen (1995) provides a detailed exposition.
An advantage of these methods is that they can allow for more than one cointegrating relation among a set of more than two variables. Consider the VAR

    Y_t = X_t B + Σ_{i=1}^{p+1} Y_{t−i} Φ_i + U_t,    (14.59)

where Y_t is a 1 × g vector of observations on the levels of a set of variables, each of which is assumed to be I(1), X_t (which may or may not be present) is a row vector of deterministic variables, such as a constant term and a trend, B is a matrix of coefficients of those deterministic regressors, U_t is a 1 × g vector of error terms, and the Φ_i are g × g matrices of coefficients. The VAR (14.59) is written in levels. It can be reparametrized as

    ∆Y_t = X_t B + Y_{t−1} Π + Σ_{i=1}^{p} ∆Y_{t−i} Γ_i + U_t,    (14.60)

where it is not difficult to verify that

    Γ_p = −Φ_{p+1},   Γ_i = Γ_{i+1} − Φ_{i+1} for i = 1, . . . , p − 1,   and   Π = Σ_{i=1}^{p+1} Φ_i − I_g.

Equation (14.60) is the multivariate analog of the augmented Dickey-Fuller test regression (14.33). In that regression, we tested the null hypothesis of a unit root by testing whether the coefficient of y_{t−1} is 0. In very much the same way, we can test whether and to what extent the variables in Y_t are cointegrated by testing hypotheses about the g × g matrix Π, which is called the impact matrix.

If we assume, as usual, that the differenced variables are I(0), then everything in equation (14.60) except the term Y_{t−1} Π is I(0). Therefore, if the equation is to be satisfied, this term must be I(0) as well. It clearly is so if the matrix Π is a zero matrix. In this extreme case, there is no cointegration at all. However, it can also be I(0) if Π is nonzero but does not have full rank. In fact, the rank of Π is the number of cointegrating relations. To see why this is so, suppose that the matrix Π has rank r, with 0 ≤ r < g. In this case, we can always write

    Π = η α⊤,    (14.61)

where η and α are both g × r matrices.
Recall that the rank of a matrix is the number of linearly independent columns. Here, any set of r linearly independent columns of Π is a set of linear combinations of the r columns of η. See also Exercise 14.19. When equation (14.61) holds, we see that Y_{t−1} Π = Y_{t−1} η α⊤. This term is I(0) if and only if the r columns of Y_{t−1} η are I(0). Thus, for each of the r columns η_i of η, Y_{t−1} η_i is I(0). In other words, η_i is a cointegrating vector. Since the η_i are linearly independent, it follows that there are r independent cointegrating relations.

We can now see just how the number of cointegrating vectors is related to the rank of the matrix Π. In the extreme case in which r = 0, there are no cointegrating vectors at all, and Π = O. When r = 1, there is a single cointegrating vector, which is proportional to η_1. When r = 2, there is a two-dimensional space of cointegrating vectors, spanned by η_1 and η_2. When r = 3, there is a three-dimensional space of cointegrating vectors, spanned by η_1, η_2, and η_3, and so on. Our assumptions exclude the case with r = g, since we have assumed that all the elements of Y_t are I(1). If r = g, every linear combination of these elements would be stationary, which implies that all the elements of Y_t are I(0).

The system (14.60) with the constraint (14.61) imposed can be written as

    ∆Y_t = X_t B + Y_{t−1} η α⊤ + Σ_{i=1}^{p} ∆Y_{t−i} Γ_i + U_t.    (14.62)

Estimating this system of equations yields estimates of the r cointegrating vectors. However, it can be seen from (14.62) that not all of the elements of η and α can be identified, since the factorization (14.61) is not unique for a given Π. In fact, if Θ is any nonsingular r × r matrix,

    η Θ Θ^{−1} α⊤ = η α⊤.    (14.63)

It is therefore necessary to make some additional assumption in order to convert equation (14.62) into an identified model. We now consider the simpler case in which g = 2, r = 1, and p = 0.
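Before specializing to that case, two algebraic facts used above can be checked numerically: that Π = ηα⊤ has rank r, and the non-uniqueness (14.63). This is only a sketch with arbitrary made-up matrices, here with g = 3 and r = 1:

```python
import numpy as np

rng = np.random.default_rng(7)
g, r = 3, 1
eta = rng.standard_normal((g, r))    # hypothetical cointegrating vectors
alpha = rng.standard_normal((g, r))  # hypothetical loadings
Pi = eta @ alpha.T                   # impact matrix

# rank(Pi) equals r, the number of cointegrating relations
print(np.linalg.matrix_rank(Pi))  # 1

# (14.63): for any nonsingular r x r Theta the factors change
# but the product does not, so eta and alpha are not identified
Theta = 2.0 * np.eye(r)
Pi2 = (eta @ Theta) @ (np.linalg.inv(Theta) @ alpha.T)
print(np.allclose(Pi, Pi2))  # True
```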
In this case, the VAR (14.60) becomes

    ∆y_{t1} = X_t b_1 + π_11 y_{t−1,1} + π_21 y_{t−1,2} + u_{t1},
    ∆y_{t2} = X_t b_2 + π_12 y_{t−1,1} + π_22 y_{t−1,2} + u_{t2},    (14.64)

in obvious notation. If one forgets for a moment about the terms X_t b_i, this pair of equations can be deduced from the model (14.36), with π_21 = φ_12, π_12 = φ_21, and π_ii = φ_ii − 1, i = 1, 2. We saw in connection with the system (14.36) that, if y_{t1} and y_{t2} are cointegrated, then the matrix Φ of (14.37) has one unit eigenvalue and the other eigenvalue less than 1 in absolute value. This requirement is identical to requiring the matrix

    [ π_11  π_21 ]
    [ π_12  π_22 ]

to have one zero eigenvalue and the other between −2 and 0. Let the zero eigenvalue correspond to the eigenvector (η_2, 1)⊤. Then it follows that π_21 = −η_2 π_11 and π_22 = −η_2 π_12. Thus the pair of equations corresponding in this special case to the set of equations (14.62), incorporating an identifying restriction, is

    ∆y_{t1} = X_t b_1 + π_11 (y_{t−1,1} − η_2 y_{t−1,2}) + u_{t1},
    ∆y_{t2} = X_t b_2 + π_12 (y_{t−1,1} − η_2 y_{t−1,2}) + u_{t2},    (14.65)

from which it is clear that the cointegrating vector is (1, −η_2)⊤.

Unlike equations (14.64), the restricted equations (14.65) are nonlinear. There are at least two convenient ways to estimate them. One is first to estimate the unrestricted equations (14.64) and then use the GNR (12.53) discussed in Section 12.3, possibly with continuous updating of the estimate of the contemporaneous covariance matrix. Another is to use maximum likelihood, under the assumption that the error terms u_{t1} and u_{t2} are jointly normally distributed. This second method extends straightforwardly to the estimation of the more general restricted VAR (14.62). The normality assumption is not really restrictive, since the ML estimator is a QMLE even when the normality assumption is not satisfied; see Section 10.4.
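As a quick numerical check of the restriction π_21 = −η_2 π_11, one can estimate the first of the unrestricted equations (14.64) by OLS and recover η_2 from the ratio of the estimated coefficients. The DGP below is entirely our own (η_2 = 2.5, equilibrium error an AR(1) with parameter 0.7, no X_t terms); in it, π_12 = π_22 = 0, so the second equation carries no error-correction term and only the first is informative:

```python
import numpy as np

rng = np.random.default_rng(21)
n, eta2 = 3000, 2.5
y2 = np.cumsum(rng.standard_normal(n))
u = np.zeros(n)
eps = rng.standard_normal(n)
for t in range(1, n):
    u[t] = 0.7 * u[t - 1] + eps[t]
y1 = eta2 * y2 + u                  # cointegrating vector (1, -2.5)

# unrestricted OLS of the first equation of (14.64)
X = np.column_stack([y1[:-1], y2[:-1]])
pi11, pi21 = np.linalg.lstsq(X, np.diff(y1), rcond=None)[0]
eta2_hat = -pi21 / pi11
print(eta2_hat)  # should be close to 2.5
```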
Maximum likelihood estimation of a system of nonlinear equations was treated in Section 12.3. We saw there that one approach is to minimize the determinant of the matrix of sums of squares and cross-products of the residuals. The hard work can be restricted to the minimization with respect to η_2, since, for fixed η_2, the regression functions in (14.65) are linear with respect to the other parameters. As functions of η_2, then, the residuals can be written as M_{X,v} ∆y_i, where the y_i, for i = 1, 2, are n-vectors with typical elements y_{ti}, and v is an n-vector with typical element y_{t−1,1} − η_2 y_{t−1,2}, for the given η_2. Here M_{X,v} denotes an orthogonal projection on to S⊥([X v]). For simplicity, we suppose for the moment that X is an empty matrix. The general case will be dealt with in more detail in the next section. Then the determinant that we wish to minimize with respect to η_2 is the determinant of the matrix ∆Y⊤ M_v ∆Y, where ∆Y = [∆y_1 ∆y_2]. A certain amount of algebra (see Exercise 14.22) shows that this determinant is equal to the determinant of ∆Y⊤ ∆Y times the ratio

    κ ≡ (v⊤ M_{∆Y} v) / (v⊤ v).    (14.66)

Since ∆Y⊤ ∆Y depends only on the data and not on η_2, it is enough to minimize κ with respect to η_2. The notation κ is intended to be reminiscent of the notation used in Section 12.5 in the context of LIML estimation, since the algebra of LIML is very similar to that used here. In the present simple case, the first-order condition for minimizing κ reduces to a quadratic equation for η_2. Of the two roots of this equation, we select the one for which the value of κ given by equation (14.66) is smaller; see Exercise 14.23 for details.

As with the other methods we have discussed, estimating a cointegrating vector by a VAR yields a super-consistent estimator.
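To make the minimization of κ concrete, here is a sketch for the case in which X is empty, with a made-up DGP in which η_2 = 1.8. A crude grid search stands in for solving the quadratic first-order condition; the grid limits and sample size are arbitrary choices of ours:

```python
import numpy as np

rng = np.random.default_rng(11)
n, eta2_true = 2000, 1.8
y2 = np.cumsum(rng.standard_normal(n))
u = np.zeros(n)
eps = rng.standard_normal(n)
for t in range(1, n):
    u[t] = 0.6 * u[t - 1] + eps[t]
y1 = eta2_true * y2 + u

dY = np.column_stack([np.diff(y1), np.diff(y2)])  # [dy1 dy2]
Q, _ = np.linalg.qr(dY)                           # orthonormal basis of S(dY)

def kappa(eta2):
    """The ratio (14.66) as a function of eta2."""
    v = y1[:-1] - eta2 * y2[:-1]  # typical element y_{t-1,1} - eta2*y_{t-1,2}
    Mv = v - Q @ (Q.T @ v)        # M_dY v
    return (v @ Mv) / (v @ v)

# grid search in place of the quadratic first-order condition
grid = np.linspace(0.0, 4.0, 4001)
eta2_hat = grid[np.argmin([kappa(e) for e in grid])]
print(eta2_hat)  # should be close to 1.8
```

Away from the true value, v is dominated by a random walk and κ is close to 1; near the true value, v is the stationary equilibrium error and κ drops, which is why the minimizer estimates η_2.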
Bias is in general less than with either the levels estimator (14.46) or the ECM estimator obtained by running regression (14.52). For small sample sizes, there appears to be a tendency for there to be outliers in the left-hand tail of the distribution, leading to a higher variance than with the other two methods. This phenomenon apparently disappears for samples of size greater than about 100, however; see Exercise 14.24.

14.6 Testing for Cointegration

The three methods discussed in the last section for estimating a cointegrating vector can all be extended to provide tests for whether cointegrating relations exist for a set of I(1) variables, and, in the case in which a VAR is used, to determine how many such relations exist. We begin with a method based on the cointegrating regression (14.44).

Engle-Granger Tests

The simplest, and probably still the most popular, way to test for cointegration was proposed by Engle and Granger (1987). The idea is to estimate the cointegrating regression (14.44) by OLS and then subject the resulting estimates of ν_t to a Dickey-Fuller test, which is usually augmented to deal with serial correlation. We saw in the last section that, if the variables Y_t are cointegrated, then the OLS estimator of η_2 from equation (14.44) is super-consistent. The residuals ν̂_t are then super-consistent estimators of the particular linear combination of the elements of Y_t that is I(0). If, however, the variables are not cointegrated, there is no such linear combination, and the residuals, being a linear combination of I(1) variables, are themselves I(1). Therefore, they have a unit root. Thus, when we subject the series ν̂_t to a unit root test, the null hypothesis of the test is that ν_t does have a unit root, that is, that the variables in Y_t are not cointegrated. It may seem curious to have a null hypothesis of no cointegration, but this follows inevitably from the nature of any unit root test.
Recall from the simple model (14.36) that, when there is no cointegration, the matrix Φ of (14.37) is restricted so as to have two unit eigenvalues. The alternative hypothesis of cointegration implies that there is just one, the only constraint on the other eigenvalue being that its absolute value should be less than 1. It is therefore natural from this point of view to have a test with a null hypothesis of no cointegration, with the restriction that there are two unit roots, against an alternative of cointegration, with only one. This feature applies to all the tests for cointegration that we consider.

The first step of the Engle-Granger procedure is to obtain the residuals ν̂_t from regression (14.44). An augmented Engle-Granger (EG) test is then performed in almost exactly the same way as an augmented Dickey-Fuller test, by running the regression

    ∆ν̂_t = X_t γ + β′ ν̂_{t−1} + Σ_{j=1}^{p} δ_j ∆ν̂_{t−j} + e_t,    (14.67)

where p is chosen to remove any evidence of serial correlation in the residuals. As with the ADF test, the test statistic may be either a τ statistic or a z statistic, although the former is more common. We let τ_c(g) denote the t statistic for β′ = 0 in (14.67) when X_t contains only a constant term and the vector η_2 has g − 1 elements to be estimated. Similarly, τ_nc(g), τ_ct(g), and τ_ctt(g) denote t statistics for the same null hypothesis, where the indicated deterministic terms are included in X_t. By the same token, z_nc(g), z_c(g), z_ct(g), and z_ctt(g) denote the corresponding z statistics. As before, these are defined by equation (14.34).

As the above notation suggests, the asymptotic distributions of these test statistics depend on g. When g = 1, we have a limiting case, since there is then only one variable, y_t, which is I(1) under the null hypothesis and I(0) under the alternative.
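The two-step procedure can be sketched in code. Everything below is an illustrative simplification of ours: g = 2, a constant in the first-stage regression and no X_t in (14.67), and a fixed p = 1 rather than a data-dependent choice:

```python
import numpy as np

def eg_tau(y1, y2, p=1):
    """First stage: OLS of y1 on a constant and y2.
    Second stage: t statistic for beta' = 0 in regression (14.67)
    run on the residuals, with p lagged differences and no X_t."""
    n = len(y1)
    Z = np.column_stack([np.ones(n), y2])
    nu = y1 - Z @ np.linalg.lstsq(Z, y1, rcond=None)[0]
    dnu = np.diff(nu)
    Y = dnu[p:]
    X = np.column_stack([nu[p:-1]] + [dnu[p - j:-j] for j in range(1, p + 1)])
    b = np.linalg.lstsq(X, Y, rcond=None)[0]
    resid = Y - X @ b
    s2 = resid @ resid / (len(Y) - X.shape[1])
    se = np.sqrt(s2 * np.linalg.inv(X.T @ X)[0, 0])
    return b[0] / se

rng = np.random.default_rng(5)
n = 1000
y2 = np.cumsum(rng.standard_normal(n))
y1_coint = 2.0 * y2 + rng.standard_normal(n)  # cointegrated with y2
y1_indep = np.cumsum(rng.standard_normal(n))  # an independent random walk

print(eg_tau(y1_coint, y2))  # strongly negative under cointegration
print(eg_tau(y1_indep, y2))  # typically much closer to zero otherwise
```

Because the statistic follows the nonstandard EG distribution, it must be compared with the τ_c(2) critical values discussed below, not with normal ones.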
Not surprisingly, for g = 1, the asymptotic distribution of each of the Engle-Granger statistics is identical to the asymptotic distribution of the corresponding Dickey-Fuller statistic. To see this, note that the residuals ν̂_t are in this case just y_t itself projected off whatever is in X_t. The result then follows from the FWL Theorem, which implies that regressing y_t on X_t and then running regression (14.67) is the same (except for the initial observations) as directly running an ADF testing regression like (14.32).

If there is more than one variable, but some or all of the components of the cointegrating vector are known, then the proper value of g is 1 plus the number of parameters to be estimated in order to estimate η_2. Thus, if all the parameters are known, we have g = 1 whatever the number of variables.

Figure 14.4 shows the asymptotic densities of the τ_c(g) tests for g = 1, . . . , 12. The densities move steadily to the left as g, the number of possibly cointegrated variables, increases. In consequence, the critical values become larger in absolute value, and the power of the test diminishes. The other Engle-Granger tests display similar patterns.

Since a set of g I(1) variables is cointegrated if there is a linear combination of them that is I(0), any g independent linear combinations of the variables

[Figure 14.4: asymptotic densities of the τ_c(g) statistics for g = 1, . . . , 12, together with the N(0, 1) density for comparison. Figure not reproduced here.]
. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 
. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 
. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 
. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 
. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 
. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 
. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 
. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 
[Figure 14.4: Asymptotic densities of Engle-Granger τ c tests. Plot not reproduced; only the figure's text survives extraction: the axis labels τ and f(τ), the curve label τ c (12), and the values −2.861 and −6.112.]

… is also a cointegrated set. In other words, cointegration is a property of the linear space spanned by the variables, not of the particular choice of variables that span the space.

A problem with Engle-Granger test statistics is that they depend on the particular choice of Y t2 in the first-step regression (14.44) or, more precisely, on the linear subspace spanned by the variables in Y t2 . The asymptotic distribution of the test statistic under the null hypothesis is the same regardless of how Y t2 is chosen, but the actual test statistic is not. Consequently, Engle-Granger tests with the same data but different choices of Y t2 can, and often do, lead to quite different inferences.

ECM Tests

A second way to test for cointegration involves the estimation of an error-correction model.
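The sensitivity of the Engle-Granger statistic to the choice of regressand is easy to demonstrate numerically. The sketch below is our own illustration, not code from the book: it runs the two-step procedure with no augmentation lags (a plain Dickey-Fuller regression on the first-step residuals) on one simulated cointegrated pair, once with each variable as regressand.

```python
import numpy as np

def engle_granger_tau(ya, yb):
    """Two-step Engle-Granger statistic: regress ya on a constant and yb,
    then compute the Dickey-Fuller t statistic for the first-step residuals.
    No augmentation lags -- a sketch, not a production implementation."""
    n = len(ya)
    X = np.column_stack([np.ones(n), yb])
    beta, *_ = np.linalg.lstsq(X, ya, rcond=None)
    u = ya - X @ beta                      # first-step residuals
    du, ulag = np.diff(u), u[:-1]
    rho = (ulag @ du) / (ulag @ ulag)      # DF regression: du_t = rho * u_{t-1} + e_t
    e = du - rho * ulag
    s2 = (e @ e) / (len(du) - 1)
    return rho / np.sqrt(s2 / (ulag @ ulag))

rng = np.random.default_rng(42)
y2 = np.cumsum(rng.standard_normal(500))   # an I(1) random walk
y1 = 2.0 * y2 + rng.standard_normal(500)   # cointegrated with y2
tau_a = engle_granger_tau(y1, y2)          # y1 as regressand
tau_b = engle_granger_tau(y2, y1)          # y2 as regressand
```

With strongly cointegrated data, both statistics come out far below any plausible critical value, but they are not equal: the two normalizations span the same subspace yet yield different numbers, which is exactly the problem noted above.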
We can base an ECM test for the null hypothesis that the set of variables Y t = [y t Y t2 ] is not cointegrated on equation (14.53). If no linear combination of the variables in Y t is I(0), then the coefficients δ and δ 2 in that equation must be zero. A suitable test statistic is thus the t statistic for δ = 0. Of course, since the regressor y t−1 is I(1), this ECM statistic does not follow the N(0, 1) distribution asymptotically. Instead, if Y t is a 1 × g vector, it follows the distribution that Ericsson and MacKinnon (2002) call the κ d (g) distribution, where d is one of nc, c, ct, or ctt, depending on which deterministic regressors are included in X t . When g = 1, the asymptotic distribution of the ECM statistic is identical to that of the corresponding Dickey-Fuller τ statistic. This follows immediately …

Copyright © 1999, Russell Davidson and James G. MacKinnon

[…]

−n Σ_{i = r1+1}^{r2} log(1 − λ i ).    (14.74)

This is often called the trace statistic, because it can be thought of as the sum of a subset of the elements on the principal diagonal of the diagonal matrix −n log(I − Λ). Because the impact matrix Π cannot be written as a matrix of coefficients of I(0) variables (recall the discussion in the last section), the distributions of the trace statistic are nonstandard. These … only of theoretical interest. The most convenient test statistics are likelihood ratio (LR) statistics. We saw in the last section that a convenient way to obtain ML estimates of the restricted VAR (14.62) is to minimize the determinant of the matrix of sums of squares and cross-products of the residuals. We now describe how to do this in general, and how to use the result in order to compute estimates of …
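Given the eigenvalues λ i from the reduced-rank (Johansen-type) estimation, the trace statistic (14.74) is a one-line computation. The function and the example eigenvalues below are our own illustration, with the λ i sorted in decreasing order and 0 < λ i < 1:

```python
import math

def trace_statistic(eigvals, n, r1, r2):
    """-n * sum_{i = r1+1}^{r2} log(1 - lambda_i) from equation (14.74),
    using 1-based bounds r1 < r2 on eigenvalues sorted in decreasing order."""
    lams = sorted(eigvals, reverse=True)
    return -n * sum(math.log(1.0 - lam) for lam in lams[r1:r2])

# Hypothetical eigenvalues for a 3-variable system with n = 200 observations:
lams = [0.30, 0.12, 0.03]
stat_r0 = trace_statistic(lams, n=200, r1=0, r2=3)  # H0: no cointegration
stat_r1 = trace_statistic(lams, n=200, r1=1, r2=3)  # H0: at most one relation
```

Because each term −n log(1 − λ i ) is nonnegative, the statistic shrinks as the null allows more cointegrating relations; its nonstandard null distributions must be taken from the appropriate tables or obtained by simulation.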
… (Section 10.5), and the binary response model regression (Section 11.3). We can write any of these artificial regressions as

r(θ) = R(θ)b + residuals,    (15.01)

where θ is a parameter vector of length k, r(θ) is a vector, often but by no means always of length equal to the sample size n, and R(θ) is a matrix with as many rows as r(θ) and k columns. For example, in the case of the GNR, r(θ) is a vector of residuals, …

… a test of the moment condition (15.13) can be based on the following Gauss-Newton regression:

u t (β̂) = X t (β̂)b + cz t (β̂) + residual,    (15.20)

648 Testing the Specification of Econometric Models

where β̂ is the vector of NLS estimates of the parameters, and X t (β̂) is the k-vector of derivatives of x t (β) with respect to the elements of β … u t (β̂), respectively. The derivative of n −1 z ⊤ (β)u(β) with respect to any component β i of the vector β is

n −1 (∂z ⊤ (β)/∂β i ) u(β) − n −1 z ⊤ (β) (∂x(β)/∂β i ).    (15.21)

Since the elements of z(β) are predetermined, so are those of its derivative with respect to β i , and since u(β 0 ) is just the vector of error terms, it follows from a law of large numbers that the first term of expression (15.21) tends to zero …

… In Section 10.3, we first encountered the information matrix equality. This famous result, which is given in equation (10.34), tells us that, for a model estimated by maximum likelihood with parameter vector θ, the asymptotic information matrix, I(θ), is equal to minus the asymptotic Hessian, H(θ). The proof of this result, which was given in Exercises 10.6 and 10.7, depends on the DGP being a special case of …
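Regression (15.20) is straightforward to run. The sketch below is our own minimal illustration with a linear x t (β), so that X t (β̂) is simply the original regressor matrix; the data, the candidate moment variable z, and all names are invented for the example.

```python
import numpy as np

def gnr_moment_test(y, X, z):
    """t statistic for c = 0 in the GNR  u_hat = X b + c z + residual,
    where u_hat are the residuals from regressing y on X (linear case)."""
    n, k = X.shape
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    u = y - X @ beta                       # restricted residuals
    W = np.column_stack([X, z])            # [ X_t(beta_hat)  z_t(beta_hat) ]
    coef, *_ = np.linalg.lstsq(W, u, rcond=None)
    resid = u - W @ coef
    s2 = (resid @ resid) / (n - k - 1)
    cov = s2 * np.linalg.inv(W.T @ W)
    return coef[-1] / np.sqrt(cov[-1, -1])

rng = np.random.default_rng(0)
n = 400
X = np.column_stack([np.ones(n), rng.standard_normal(n)])
z = rng.standard_normal(n)                 # predetermined candidate moment
y = X @ np.array([1.0, 0.5]) + rng.standard_normal(n)
t_c = gnr_moment_test(y, X, z)             # approximately N(0, 1) under the null
```

Because z here really is unrelated to the error terms, the statistic should be an unremarkable standard-normal draw; a large |t| would signal failure of the moment condition.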
… standardized random walk (14.02). Demonstrate that any pair of terms from either sum on the right-hand side of the above expression are uncorrelated. Let the fourth moment of the white-noise process ε t be m 4 . Then show that the variance of Σ_{t=1}^n w t 2 is equal to

((m 4 − 1)/6) n(n + 1)(2n + 1) + (1/3) n 2 (n 2 − 1),

which is of order n 4 as n → ∞. Hint: Use the results of Exercise 14.3.

14.12 Consider the standardized Wiener process …

… variety of appropriate tests of the hypotheses that the levels of consumption and income have unit roots. Repeat the exercise for the logs of these variables.

14.8 Exercises 639

If you fail to reject the hypotheses that the levels or the logs of these variables have unit roots, proceed to test whether they are cointegrated, using two versions of the …

… (Section 7.7), tests of common factor restrictions (Section 7.9), DWH tests (Section 8.7), tests of overidentifying restrictions (Sections 8.6, 9.4, 9.5, 12.4, and 12.5), and the three classical tests for models estimated by maximum likelihood, notably LM tests (Section 10.6). In this chapter, we discuss a number of other procedures that are designed for testing the specification of econometric models. Some of these …

… for various values of the rank r of the impact matrix Π, using ML estimation based on the assumption that the error vector U t is multivariate normal for each t and independent across observations. Null hypotheses for which there are any number of cointegrating relations from 0 to g − 1 can then be tested against alternatives with a greater number of relations, up to a maximum of g. Of course, if there … Recall that the rank of a matrix is the number of linearly independent columns. Here, any set of r linearly independent columns of Π is a set of linear combinations of the r columns of η. See also Section 10.4. Maximum likelihood estimation of a system of nonlinear equations was treated in Section 12.3.
We saw there that one approach is to minimize the determinant of the matrix of sums of squares and cross-products of the residuals. We now describe how to do this in general, and how to use the result in order to compute estimates of sets of cointegrating vectors.
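The variance formula in the random-walk exercise above, as reconstructed here, can be checked mechanically. The sketch below is our own verification: it expands Σ_t w t 2 = Σ_{s,s'} c ss' ε s ε s' with c ss' = n − max(s, s') + 1, uses the uncorrelatedness of the distinct terms, and compares the resulting variance with the closed form.

```python
from itertools import product

def variance_closed_form(n, m4):
    """(m4 - 1) n(n+1)(2n+1)/6 + n^2(n^2 - 1)/3."""
    return (m4 - 1) * n * (n + 1) * (2 * n + 1) / 6 + n**2 * (n**2 - 1) / 3

def variance_direct(n, m4):
    """Var(sum_t w_t^2) from the expansion: each eps_s^2 term has variance
    m4 - 1, each 2*c*eps_s*eps_s' cross term (s < s') has variance 4*c^2."""
    c = lambda s, sp: n - max(s, sp) + 1
    var_squares = (m4 - 1) * sum(c(s, s) ** 2 for s in range(1, n + 1))
    var_cross = 4 * sum(c(s, sp) ** 2
                        for s, sp in product(range(1, n + 1), repeat=2) if s < sp)
    return var_squares + var_cross

for n in (1, 2, 5, 10):
    for m4 in (3.0, 5.0, 9.0):             # m4 = 3 is the Gaussian case
        assert variance_direct(n, m4) == variance_closed_form(n, m4)
```

For n = 10 in the Gaussian case (m 4 = 3) both routes give 4070, and the n 2 (n 2 − 1)/3 term confirms that the variance grows at rate n 4, as the exercise states.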
