Indirect Least Squares (ILS)



4.5 Indirect Least Squares (ILS)

In developing the theory of LIML and LVR estimation we notice that these procedures rely, implicitly, on estimates of the reduced form. In addition, when we examined the identification problem, in Chapter 3, we saw that one characterization of identifiability is intimately related to the question of whether structural parameters can be recovered from reduced form parameters. The method of ILS is a procedure that seeks to obtain estimators of the structural parameters from estimates of reduced form parameters.

Specifically, it poses the question: if an estimate of the reduced form is given, can we obtain estimates of the structural parameters? Can we do so uniquely? What properties can be ascribed to this procedure, as an estimator of the structural form?

For definiteness, let us begin with the first equation. We are given $\hat{\Pi} = (X'X)^{-1}X'Y$ and we are asked if it is possible to estimate the parameters $\beta^{\circ}_{\cdot 1}$, $\gamma_{\cdot 1}$ of the first structural equation and, if so, what are the properties of the estimators thus obtained? From equation (3.19), in Chapter 3, we see that the structural and reduced form parameters obey

$$
\begin{bmatrix} \Pi^{\circ}_{11} & \Pi_{12} \\ \Pi^{\circ}_{21} & \Pi_{22} \end{bmatrix}
\begin{bmatrix} \beta^{\circ}_{\cdot 1} & B^{*}_{12} \\ 0 & B^{*}_{22} \end{bmatrix}
=
\begin{bmatrix} \gamma_{\cdot 1} & C_{12} \\ 0 & C_{22} \end{bmatrix}. \qquad (4.91)
$$

The equations involving the structural parameters of the first equation are

$$
\Pi^{\circ}_{11}\beta^{\circ}_{\cdot 1} = \gamma_{\cdot 1}\,, \qquad \Pi^{\circ}_{21}\beta^{\circ}_{\cdot 1} = 0\,. \qquad (4.92)
$$

For estimation purposes, it is natural to write the analog of Eq. (4.92) as

$$
\hat{\Pi}^{\circ}_{11}\hat{\beta}^{\circ}_{\cdot 1} = \hat{\gamma}_{\cdot 1}\,, \qquad \hat{\Pi}^{\circ}_{21}\hat{\beta}^{\circ}_{\cdot 1} = 0\,, \qquad (4.93)
$$

and, subject to a suitable normalization, seek estimators of the structural parameters, say $\hat{\beta}^{\circ}_{\cdot 1}$ and $\hat{\gamma}_{\cdot 1}$, by solving the system in Eq. (4.93). It is clear that the system is recursive, and that the key to the problem is the solution of $\hat{\Pi}^{\circ}_{21}\hat{\beta}^{\circ}_{\cdot 1} = 0$ for the estimator $\hat{\beta}^{\circ}_{\cdot 1}$. If the first equation is identified, we know that $\operatorname{rank}(\Pi^{\circ}_{21}) = m_1$; the dimension of this matrix is $(G - G_1)\times(m_1 + 1)$, with $G - G_1 \geq m_1$, so that, at first glance, it is not clear how our objective is to be accomplished. Thus, while the identification condition regarding the rank of $\Pi^{\circ}_{21}$ is useful, it is not totally determinative in the present context. Let us examine the matter in some detail. Using the results from Proposition 32, Dhrymes (1984), we obtain

$$
\hat{\Pi}^{\circ}_{21} = \Pi^{\circ}_{21} + (X^{*\prime}N_1X^{*})^{-1}X^{*\prime}N_1V^{\circ}_1\,. \qquad (4.94)
$$

If $\beta^{\circ}_{\cdot 1}$ is the true parameter vector, the identification (rank) condition implies that the (only nonnull) vectors, $p$, satisfying $\Pi^{\circ}_{21}p = 0$ are vectors of the form $p = c\beta^{\circ}_{\cdot 1}$, with $c \neq 0$. The issue with ILS estimation, however, is whether there are any nonnull vectors $p$ such that

$$
\hat{\Pi}^{\circ}_{21}p = 0\,. \qquad (4.95)
$$

In view of Eq. (4.94) we may rewrite Eq. (4.95) as

$$
\Pi^{\circ}_{21}p + (X^{*\prime}N_1X^{*})^{-1}X^{*\prime}N_1V^{\circ}_1p = 0\,. \qquad (4.96)
$$

The representation in Eq. (4.96) shows that, except for the special case of just identification, we cannot confidently assert the existence of a nonnull solution of Eq. (4.95). What is the problem here and why does this not arise with respect to Eq. (4.92)? The answer is rather simple, indeed.

In Eq. (4.92), we know by assumption that we are dealing with a set of equations which is consistent, in the sense of Ch. 3, Dhrymes (1984); this ensures that there exists a unique vector, viz. the true parameter vector $\beta^{\circ}_{\cdot 1}$, that satisfies all of the equations of that set. We do not have a similar assurance, however, in the case of the system in Eq. (4.95). Indeed, typically, we would not expect to find a single vector that satisfies all equations therein. Recall that there are $G - G_1 \geq m_1$ equations in that set; after normalization is imposed there are only $m_1$ free elements in the vector $p$. Unless it is a consistent set of equations, there is no single vector that can satisfy all these equations. At first, therefore, it might appear that no ILS estimators exist, except in the case of just identification. However, there are certain variations of the approach above which we may wish to explore. Thus, after normalization, we could find a nonsingular submatrix of order $m_1$ and, ignoring the remaining $G - G_1 - m_1$ equations in Eq. (4.95), obtain a solution. If the equation is identified, we are assured that there is at least one such submatrix, but there may well be as many as $(G - G_1)!/[m_1!\,(G - G_1 - m_1)!]$ of them. Hence, with this approach, there is at least one, and perhaps many more, ILS estimators for the parameters of the first equation. Evidently, we need not discuss underidentified equations, since it is meaningless to ask for estimators of structural parameters in such a context. Consequently, in the ensuing discussion we shall deal with a system all of whose equations are identified; we shall also conduct our discussion for the general case of the $i$-th equation, using the selection matrix apparatus developed in Chapter 1.
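As an informal illustration of this multiplicity, the following minimal sketch (our own assumed two-equation model and parameter values, not an example from the text) simulates a small overidentified system, estimates the reduced form by OLS, and solves $\hat{\Pi}^{\circ}_{21}\hat{\beta}^{\circ}_{\cdot 1} = 0$ from each admissible row separately, producing distinct candidate "ILS" estimates in a finite sample.

```python
import numpy as np

rng = np.random.default_rng(0)
T = 200

# Assumed illustrative model (not from the text):
#   y1 = beta*y2 + gamma1*x1 + u1             (first equation: m1 = 1, G1 = 1)
#   y2 = alpha*y1 + gamma2*x2 + gamma3*x3 + u2
# x2 and x3 are excluded from the first equation, so G - G1 = 2 > m1 = 1
# and the first equation is overidentified.
beta, alpha, gamma1, gamma2, gamma3 = 0.5, 0.3, 1.0, 1.0, -1.0

Bstar = np.array([[1.0, -alpha],
                  [-beta, 1.0]])               # columns correspond to equations
C = np.array([[gamma1, 0.0],
              [0.0,    gamma2],
              [0.0,    gamma3]])

X = rng.normal(size=(T, 3))                    # predetermined variables x1, x2, x3
U = rng.normal(size=(T, 2))                    # structural disturbances
Y = (X @ C + U) @ np.linalg.inv(Bstar)         # Y B* = X C + U

Pi_hat = np.linalg.solve(X.T @ X, X.T @ Y)     # reduced-form OLS, (X'X)^{-1} X'Y

# Pi_21^o-hat: rows of Pi-hat for the excluded predetermined variables (x2, x3),
# columns for the endogenous variables entering equation 1 (y1, y2).
Pi21_hat = Pi_hat[[1, 2], :]

# Under the normalization beta^o_.1 = (1, -beta)', each row k gives one
# candidate ILS estimate: beta = Pi21_hat[k, 0] / Pi21_hat[k, 1].
candidates = Pi21_hat[:, 0] / Pi21_hat[:, 1]
print("candidate ILS estimates of beta:", candidates)   # generally unequal
print("true beta:", beta)
```

Each candidate is a consistent estimator of $\beta$, but in a finite sample the candidates disagree, which is precisely the arbitrariness noted above.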

The relation between reduced form and structural parameters, in the $i$-th equation, is given by $\Pi b^{*}_{\cdot i} = c_{\cdot i}$. Since

$$
b^{*}_{\cdot i} = L^{\circ}_{1i}\beta^{\circ}_{\cdot i}\,, \qquad c_{\cdot i} = L_{2i}\gamma_{\cdot i}\,, \qquad (4.97)
$$

it is clear that, under the standard normalization, we can rewrite Eq. (4.92) as

$$
\pi_{\cdot i} = \Pi_i\beta_{\cdot i} + L_{2i}\gamma_{\cdot i}\,, \qquad \Pi_i = \Pi L_{1i}\,. \qquad (4.98)
$$

We may render the representation more compact, and actually simplify the discussion, if we use the $S$-notation of earlier chapters; thus, rewrite Eq. (4.98) as

$$
\pi_{\cdot i} = S_i\delta_{\cdot i}\,, \qquad S_i = (\Pi_i, L_{2i})\,, \qquad \delta_{\cdot i} = (\beta_{\cdot i}', \gamma_{\cdot i}')'\,, \qquad (4.99)
$$

and recall, from Chapter 3, that for an identified equation $S_i$ is of full column rank.

Thus, if we are dealing with the true parameters of the model, there exists a unique (the true parameter) vector $\delta_{\cdot i}$ that satisfies the equation above. Indeed, this is so by construction! The difficulty with ILS is that we are not operating with the system in Eq. (4.99). Instead, we replace Eq. (4.99) by

$$
\hat{\pi}_{\cdot i} = \hat{S}_i\delta_{\cdot i}\,, \qquad \hat{S}_i = (\hat{\Pi}_i, L_{2i})\,, \qquad (4.100)
$$

in which the quantities $\Pi_i$ and $\pi_{\cdot i}$ of Eq. (4.99) have been replaced by their OLS estimators, $\hat{\Pi}_i$ and $\hat{\pi}_{\cdot i}$, and we seek an estimator, i.e. a vector, say $\hat{\delta}_{\cdot i}$, satisfying Eq. (4.100). It is this quantity that is traditionally called the ILS estimator. If the equation in question is just identified, evidently $\hat{S}_i$ is a nonsingular matrix; hence, there exists a unique solution, viz.

$$
\hat{\delta}_{\cdot i} = \hat{S}_i^{-1}\hat{\pi}_{\cdot i}\,, \qquad (4.101)
$$

which is defined to be the (unique) ILS estimator of $\delta_{\cdot i}$. It is easily verified that the estimator in Eq. (4.101) also represents the 2SLS estimator of a just identified equation. Thus, in the case of just identification, 2SLS and ILS estimators coincide.
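The verification alluded to in the last sentence may be sketched as follows; we use the representation $\hat{Z}_i = (X\hat{\Pi}_i, X_i) = X\hat{S}_i$ for the matrix of second-stage regressors, an assumption on our part chosen only to match the $S$-notation used here. For a just identified equation $\hat{S}_i$ is square and nonsingular, and since $X'y_{\cdot i} = X'X\hat{\pi}_{\cdot i}$,

$$
(\hat{Z}_i'\hat{Z}_i)^{-1}\hat{Z}_i'y_{\cdot i}
= (\hat{S}_i'X'X\hat{S}_i)^{-1}\hat{S}_i'X'X\hat{\pi}_{\cdot i}
= \hat{S}_i^{-1}(X'X)^{-1}(\hat{S}_i')^{-1}\hat{S}_i'X'X\hat{\pi}_{\cdot i}
= \hat{S}_i^{-1}\hat{\pi}_{\cdot i}\,,
$$

which is the expression in Eq. (4.101).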

What if the equation in question is overidentified? Well, in such a case the $G\times(m_i+G_i)$ matrix, $\hat{S}_i$, has the property

$$
\operatorname{rank}(\hat{S}_i) = m_i + G_i < G.
$$

For Eq. (4.100) to have a solution, it must be a consistent system of equations, in the sense of Ch. 3 of Dhrymes (1984). More specifically, $\hat{\pi}_{\cdot i}$ must lie in the column space of $\hat{S}_i$. Defining $\hat{S}^{\circ}_i = (\hat{\Pi}^{\circ}_i, L_{2i})$, we see that the requirement above implies the existence of a nonnull vector, $c$, such that $\hat{S}^{\circ}_i c = 0$, or alternatively, writing $c = (c_{\cdot 1}', c_{\cdot 2}')'$ conformably with the two blocks of $\hat{S}^{\circ}_i$,

$$
\hat{\Pi}^{\circ}_i c_{\cdot 1} + L_{2i}c_{\cdot 2} = 0\,. \qquad (4.102)
$$

We shall now show that, in general, this is satisfied only with $c = 0$. Since $\hat{\Pi}^{\circ}_i = \Pi^{\circ}_i + (X'X)^{-1}X'V^{\circ}_i$, we may rewrite the condition above as

$$
(X'X)^{-1}X'V^{\circ}_i c_{\cdot 1} = -\left(\Pi^{\circ}_i c_{\cdot 1} + L_{2i}c_{\cdot 2}\right). \qquad (4.103)
$$

But the right member of the equation above is a fixed constant; hence, the left member must also be a fixed constant; this, however, is not possible, since $V^{\circ}_i$ has a nondegenerate distribution and the matrix multiplying $V^{\circ}_i$ is of rank $G\wedge(m_i+1)$. On the other hand, for the choice

$$
c_{\cdot 1} = \beta^{\circ}_{\cdot i}\,, \qquad c_{\cdot 2} = -\gamma_{\cdot i}\,,
$$

i.e. for the true parameter vectors, the condition above reads

$$
(X'X)^{-1}X'V^{\circ}_i\beta^{\circ}_{\cdot i} = (X'X)^{-1}X'u_{\cdot i} = 0\,.
$$

Clearly, this equation is not satisfied for any finite sample size; however, as (the sample size) $T\to\infty$ we have

$$
\left(\frac{X'X}{T}\right)^{-1}\frac{X'u_{\cdot i}}{T} \;\xrightarrow{\ P\ }\; 0\,,
$$

by the standard properties of the GLSEM. What the preceding discussion shows is that, for an overidentified equation, Eq. (4.100) is not a consistent system of equations, for finite $T$. In particular, this means that no vector $\hat{\delta}_{\cdot i}$ exists that satisfies all equations of Eq. (4.100). Thus, strictly speaking in terms of the traditional definition, ILS estimators for the parameters of an overidentified equation do not exist, in the sense of vectors $\hat{\delta}_{\cdot i}$ that satisfy all equations of Eq. (4.100). The fact that, asymptotically, these equations admit of a unique solution means that we could eliminate the "excess" $G - G_i - m_i$ equations in Eq. (4.100) and solve the resulting abbreviated system. The estimates thus obtained are, evidently, consistent.

But the choice of equations to be eliminated is arbitrary, and this procedure has nothing to recommend it in empirical applications. This is particularly so, since with small samples we have no criteria by which to rank the prospective candidates. We are thus led to

Definition 1. In the context of the GLSEM, consider the estimator of the reduced form $\hat{\Pi} = (X'X)^{-1}X'Y$. The ILS estimator of the structural parameters of the $i$-th equation is defined to be the solution of the problem

$$
\min_{\delta_{\cdot i}}\ (\hat{\pi}_{\cdot i} - \hat{S}_i\delta_{\cdot i})'(\hat{\pi}_{\cdot i} - \hat{S}_i\delta_{\cdot i})\,, \qquad \hat{S}_i = (\hat{\Pi}_i, L_{2i})\,,
$$

and is given by

$$
\hat{\delta}_{\cdot i} = (\hat{S}_i)_g\,\hat{\pi}_{\cdot i}\,, \qquad (4.104)
$$

where $(\hat{S}_i)_g$ is the g-inverse of $\hat{S}_i$.$^{6}$

Remark 6. Because, for an identified equation, $\hat{S}_i$ is of full rank, we can write, explicitly,

$$
(\hat{S}_i)_g = (\hat{S}_i'\hat{S}_i)^{-1}\hat{S}_i'\,. \qquad (4.105)
$$

When $\hat{S}_i$ is nonsingular,

$$
(\hat{S}_i)_g = \hat{S}_i^{-1}\,,
$$

and when the equation in question is just identified, the definition of the ILS estimator in terms of Eq. (4.104) corresponds, exactly, to the usual definition. The advantage of the definition in Eq. (4.104) is that it provides a formal definition for the ILS estimator which is always appropriate, provided (all) the equations of the GLSEM are identified.
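To make Definition 1 concrete, the following minimal sketch (the same assumed illustrative model as in the earlier sketch, not an example from the text) computes the g-inverse form $(\hat{S}_1'\hat{S}_1)^{-1}\hat{S}_1'\hat{\pi}_{\cdot 1}$ for an overidentified first equation; with a large sample the estimate is close to the true $(\beta_{\cdot 1}', \gamma_{\cdot 1}')'$, in line with the consistency argument given next.

```python
import numpy as np

rng = np.random.default_rng(1)
T = 5000   # a large sample, to illustrate consistency informally

# Assumed illustrative overidentified model (not from the text):
#   y1 = beta*y2 + gamma1*x1 + u1,   y2 = alpha*y1 + gamma2*x2 + gamma3*x3 + u2.
beta, alpha, gamma1, gamma2, gamma3 = 0.5, 0.3, 1.0, 1.0, -1.0
Bstar = np.array([[1.0, -alpha], [-beta, 1.0]])
C = np.array([[gamma1, 0.0], [0.0, gamma2], [0.0, gamma3]])

X = rng.normal(size=(T, 3))
U = rng.normal(size=(T, 2))
Y = (X @ C + U) @ np.linalg.inv(Bstar)

Pi_hat = np.linalg.solve(X.T @ X, X.T @ Y)     # reduced-form OLS

# S-notation for equation 1: S1-hat = (Pi1-hat, L21), where Pi1-hat is the
# column of Pi-hat belonging to the included right-hand endogenous variable
# (y2) and L21 selects the included predetermined variable (x1).
L21 = np.array([[1.0], [0.0], [0.0]])
S1_hat = np.hstack([Pi_hat[:, [1]], L21])      # G x (m1 + G1) = 3 x 2
pi1_hat = Pi_hat[:, 0]                         # first column of Pi-hat

# Definition 1: delta1-hat = (S1'S1)^{-1} S1' pi1, i.e. the least-squares
# solution of  pi1-hat = S1-hat * delta1.
delta1_hat = np.linalg.solve(S1_hat.T @ S1_hat, S1_hat.T @ pi1_hat)
print("ILS (g-inverse) estimate of (beta, gamma1):", delta1_hat)
print("true values:", (beta, gamma1))
```

Unlike the submatrix-selection device sketched earlier, this least-squares solution uses all $G$ rows of the reduced-form estimate and involves no arbitrary choice of which equations to drop.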

It is easy to establish the consistency of the estimators defined in Eq. (4.104). To do so we note that, since

$$
\operatorname*{plim}_{T\to\infty}\hat{S}_i = S_i\,, \quad S_i = (\Pi_i, L_{2i})\,, \qquad \operatorname*{plim}_{T\to\infty}\hat{\pi}_{\cdot i} = \pi_{\cdot i}\,, \qquad i = 1, 2, 3, \dots, m,
$$

the probability limit of the estimators in Eq. (4.104) is well defined.$^{7}$ Thus,

$$
\operatorname*{plim}_{T\to\infty}\hat{\delta}_{\cdot i} = (S_i)_g\,\pi_{\cdot i} = (S_i'S_i)^{-1}S_i'\pi_{\cdot i}\,. \qquad (4.106)
$$

Now, the true parameter vector, $\delta_{\cdot i}$, satisfies Eq. (4.99); hence, substituting above we find

$$
\operatorname*{plim}_{T\to\infty}\hat{\delta}_{\cdot i} = (S_i'S_i)^{-1}S_i'S_i\delta_{\cdot i} = \delta_{\cdot i}\,, \qquad i = 1, 2, \dots, m, \qquad (4.107)
$$

which concludes the consistency argument.

$^{6}$ For a definition of the g-inverse and related topics see Ch. 3 of Dhrymes (1984).

$^{7}$ It is interesting that the ILS estimator can be derived in a Bayesian context as well. See Zellner (1978), Zellner et al. (1988).

To establish the limiting distribution of such estimators, we observe that, using Eqs. (4.105) and (4.106), we find

$$
\sqrt{T}\,(\hat{\delta}_{\cdot i} - \delta_{\cdot i})
= \sqrt{T}\left[(\hat{S}_i'\hat{S}_i)^{-1}\hat{S}_i' - (S_i'S_i)^{-1}S_i'\right]\pi_{\cdot i}
+ (\hat{S}_i'\hat{S}_i)^{-1}\hat{S}_i'\left(\frac{X'X}{T}\right)^{-1}\frac{X'v_{\cdot i}}{\sqrt{T}}\,. \qquad (4.108)
$$

Consider the matrix in brackets, in the right member of Eq. (4.108). Adding and subtracting $(\hat{S}_i'\hat{S}_i)^{-1}S_i'$, we find

$$
(\hat{S}_i'\hat{S}_i)^{-1}\hat{S}_i' - (S_i'S_i)^{-1}S_i'
= (\hat{S}_i'\hat{S}_i)^{-1}\left[(\hat{S}_i - S_i)' - \left\{\hat{S}_i'(\hat{S}_i - S_i) + (\hat{S}_i - S_i)'S_i\right\}(S_i'S_i)^{-1}S_i'\right]. \qquad (4.109)
$$

Consequently,

$$
\sqrt{T}\left[(\hat{S}_i'\hat{S}_i)^{-1}\hat{S}_i' - (S_i'S_i)^{-1}S_i'\right]\pi_{\cdot i}
= -(\hat{S}_i'\hat{S}_i)^{-1}\hat{S}_i'\left(\frac{X'X}{T}\right)^{-1}\frac{X'V_i\beta_{\cdot i}}{\sqrt{T}}\,, \qquad V_i = VL_{1i}\,. \qquad (4.110)
$$

Thus, Eq. (4.108) can be rewritten as

$$
\sqrt{T}\,(\hat{\delta}_{\cdot i} - \delta_{\cdot i})
= (\hat{S}_i'\hat{S}_i)^{-1}\hat{S}_i'\left(\frac{X'X}{T}\right)^{-1}\frac{X'V^{\circ}_i\beta^{\circ}_{\cdot i}}{\sqrt{T}}
= (\hat{S}_i'\hat{S}_i)^{-1}\hat{S}_i'\left(\frac{X'X}{T}\right)^{-1}\frac{X'Vb^{*}_{\cdot i}}{\sqrt{T}}\,, \qquad (4.111)
$$

where, evidently, $b^{*}_{\cdot i}$ is the $i$-th column of $B^{*}$, subject to the standard normalization. Hence, the ILS estimator for the system as a whole obeys

$$
\sqrt{T}\,(\hat{\delta} - \delta)
= (\hat{S}'\hat{S})^{-1}\hat{S}'\left[I_m\otimes\left(\frac{X'X}{T}\right)^{-1}\right]\frac{1}{\sqrt{T}}\,(I_m\otimes X'V)\operatorname{vec}(B^{*})\,, \qquad (4.112)
$$

where $\hat{\delta} = (\hat{\delta}_{\cdot 1}', \hat{\delta}_{\cdot 2}', \dots, \hat{\delta}_{\cdot m}')'$, $\delta = (\delta_{\cdot 1}', \delta_{\cdot 2}', \dots, \delta_{\cdot m}')'$, and $\hat{S} = \operatorname{diag}(\hat{S}_1, \hat{S}_2, \dots, \hat{S}_m)$.

But

$$
\frac{1}{\sqrt{T}}\,(I_m\otimes X'V)\operatorname{vec}(B^{*})
= \frac{1}{\sqrt{T}}\operatorname{vec}(X'VB^{*})
= \frac{1}{\sqrt{T}}\operatorname{vec}(X'U)
= \frac{1}{\sqrt{T}}\sum_{t=1}^{T}(I_m\otimes x'_{t\cdot})\,u'_{t\cdot}\,, \qquad (4.113)
$$

and Eq. (4.112) can be written, in the more convenient form,

$$
\sqrt{T}\,(\hat{\delta} - \delta)
= (\hat{S}'\hat{S})^{-1}\hat{S}'\left[I_m\otimes\left(\frac{X'X}{T}\right)^{-1}\right]\frac{1}{\sqrt{T}}\sum_{t=1}^{T}(I_m\otimes x'_{t\cdot})\,u'_{t\cdot}\,. \qquad (4.114)
$$

But this represents a variant of the standard limiting distribution problem we have encountered in all simultaneous equations contexts, and which was dealt with adequately in Ch. 2. Consequently, we conclude

$$
\sqrt{T}\,(\hat{\delta} - \delta) \sim N(0,\ C_{ILS}),
$$

where

$$
C_{ILS} = (S'S)^{-1}S'\left(\Sigma\otimes M^{-1}_{xx}\right)S(S'S)^{-1}.
$$
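As a brief indication of how this covariance arises (a sketch of the Chapter 2 argument in the present notation, taking $M_{xx} = \operatorname{plim}_{T\to\infty} X'X/T$ as given; the details there may be arranged differently), note that, asymptotically,

$$
\frac{1}{\sqrt{T}}\operatorname{vec}(X'U) = \frac{1}{\sqrt{T}}\sum_{t=1}^{T}(I_m\otimes x'_{t\cdot})\,u'_{t\cdot} \ \sim\ N(0,\ \Sigma\otimes M_{xx}),
$$

while $(\hat{S}'\hat{S})^{-1}\hat{S}'\left[I_m\otimes(X'X/T)^{-1}\right]$ converges in probability to $(S'S)^{-1}S'(I_m\otimes M^{-1}_{xx})$; hence the limiting covariance matrix implied by Eq. (4.114) is

$$
(S'S)^{-1}S'\left(I_m\otimes M^{-1}_{xx}\right)\left(\Sigma\otimes M_{xx}\right)\left(I_m\otimes M^{-1}_{xx}\right)S(S'S)^{-1}
= (S'S)^{-1}S'\left(\Sigma\otimes M^{-1}_{xx}\right)S(S'S)^{-1} = C_{ILS}\,.
$$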

We have therefore proved

Theorem 2. Consider the standard GLSEM

$$
y_{t\cdot}B^{*} = x_{t\cdot}C + u_{t\cdot}\,, \qquad t = 1, 2, 3, \dots, T, \qquad (4.115)
$$

as in Theorem 1. Let $\hat{\Pi} = (X'X)^{-1}X'Y$ be the OLS estimator of the reduced form, and let $\hat{\delta}_{\cdot i}$ be the indirect least squares (ILS) estimator for the parameters of the $i$-th structural equation, according to Definition 1.

Then,

i. the ILS estimator for the parameters of the $i$-th structural equation can be interpreted as minimizing $(\hat{\pi}_{\cdot i} - \hat{S}_i\delta_{\cdot i})'(\hat{\pi}_{\cdot i} - \hat{S}_i\delta_{\cdot i})$;

ii. the ILS estimator is consistent;

iii. if, by $\hat{\delta}$, we denote the ILS estimator for the entire system, then

$$
\sqrt{T}\,(\hat{\delta} - \delta) \sim N(0,\ C_{ILS}), \qquad C_{ILS} = (S'S)^{-1}S'\left(\Sigma\otimes M^{-1}_{xx}\right)S(S'S)^{-1},
$$

and $S = \operatorname{diag}(S_1, S_2, \dots, S_m)$.

Corollary 1. If the $i$-th equation is just identified then 2SLS and ILS estimators are asymptotically equivalent.

Proof: The $i$-th diagonal block of the covariance matrix in iii above is

$$
(S_i'S_i)^{-1}S_i'\left(\sigma_{ii}M^{-1}_{xx}\right)S_i(S_i'S_i)^{-1} = \sigma_{ii}\left(S_i'M_{xx}S_i\right)^{-1},
$$

the second expression following because, for a just identified equation, $S_i$ is nonsingular; this coincides with the corresponding block of the covariance matrix of the systemwide 2SLS estimator.

Corollary 2. If the $i$-th equation is just identified then ILS and 2SLS estimators are numerically identical.

Proof: The ILS estimator is given, in this case, by

$$
\hat{\delta}_{\cdot i} = \hat{S}_i^{-1}\hat{\pi}_{\cdot i}
= (\hat{S}_i'X'X\hat{S}_i)^{-1}\hat{S}_i'X'X\hat{\pi}_{\cdot i}
= (\hat{S}_i'X'X\hat{S}_i)^{-1}\hat{S}_i'X'y_{\cdot i}\,,
$$

which is exactly the 2SLS estimator.
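A numerical check of this identity, under an assumed just identified two-equation model (our own illustrative setup, not an example from the text), is sketched below; the two estimates agree to machine precision.

```python
import numpy as np

rng = np.random.default_rng(2)
T = 300

# Assumed illustrative just-identified model (not from the text):
#   y1 = beta*y2 + gamma1*x1 + u1     (x2 is the single excluded predetermined
#   y2 = alpha*y1 + gamma2*x2 + u2     variable, so G - G1 = m1 = 1)
beta, alpha, gamma1, gamma2 = 0.5, 0.3, 1.0, 1.0
Bstar = np.array([[1.0, -alpha], [-beta, 1.0]])
C = np.array([[gamma1, 0.0], [0.0, gamma2]])

X = rng.normal(size=(T, 2))
U = rng.normal(size=(T, 2))
Y = (X @ C + U) @ np.linalg.inv(Bstar)
y1 = Y[:, 0]

Pi_hat = np.linalg.solve(X.T @ X, X.T @ Y)      # reduced-form OLS

# ILS: delta1-hat = S1-hat^{-1} pi1-hat, with S1-hat = (Pi1-hat, L21) square.
L21 = np.array([[1.0], [0.0]])
S1_hat = np.hstack([Pi_hat[:, [1]], L21])       # 2 x 2, nonsingular
ils = np.linalg.solve(S1_hat, Pi_hat[:, 0])

# 2SLS: regress y1 on (y2-hat, x1), where y2-hat = X @ Pi-hat[:, 1].
Z_hat = np.column_stack([X @ Pi_hat[:, 1], X[:, 0]])
tsls = np.linalg.solve(Z_hat.T @ Z_hat, Z_hat.T @ y1)

print("ILS :", ils)
print("2SLS:", tsls)
print("max abs difference:", np.max(np.abs(ils - tsls)))   # ~ machine epsilon
```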
