
Econometrics




DOCUMENT INFORMATION

Pages: 387
Size: 2.51 MB
Content

ECONOMETRICS

Bruce E. Hansen
© 2000, 2014
University of Wisconsin, Department of Economics
This Revision: January 3, 2014
Comments Welcome

This manuscript may be printed and reproduced for individual or instructional use, but may not be printed for commercial purposes.

Contents

Preface

1 Introduction
1.1 What is Econometrics?
1.2 The Probability Approach to Econometrics
1.3 Econometric Terms and Notation
1.4 Observational Data
1.5 Standard Data Structures
1.6 Sources for Economic Data
1.7 Econometric Software
1.8 Reading the Manuscript
1.9 Common Symbols

2 Conditional Expectation and Projection
2.1 Introduction
2.2 The Distribution of Wages
2.3 Conditional Expectation
2.4 Log Differences*
2.5 Conditional Expectation Function
2.6 Continuous Variables
2.7 Law of Iterated Expectations
2.8 CEF Error
2.9 Intercept-Only Model
2.10 Regression Variance
2.11 Best Predictor
2.12 Conditional Variance
2.13 Homoskedasticity and Heteroskedasticity
2.14 Regression Derivative
2.15 Linear CEF
2.16 Linear CEF with Nonlinear Effects
2.17 Linear CEF with Dummy Variables
2.18 Best Linear Predictor
2.19 Linear Predictor Error Variance
2.20 Regression Coefficients
2.21 Regression Sub-Vectors
2.22 Coefficient Decomposition
2.23 Omitted Variable Bias
2.24 Best Linear Approximation
2.25 Normal Regression
2.26 Regression to the Mean
2.27 Reverse Regression
2.28 Limitations of the Best Linear Predictor
2.29 Random Coefficient Model
2.30 Causal Effects
2.31 Expectation: Mathematical Details*
2.32 Existence and Uniqueness of the Conditional Expectation*
2.33 Identification*
2.34 Technical Proofs*
Exercises

3 The Algebra of Least Squares
3.1 Introduction
3.2 Random Samples
3.3 Sample Means
3.4 Least Squares Estimator
3.5 Solving for Least Squares with One Regressor
3.6 Solving for Least Squares with Multiple Regressors
3.7 Illustration
3.8 Least Squares Residuals
3.9 Model in Matrix Notation
3.10 Projection Matrix
3.11 Orthogonal Projection
3.12 Estimation of Error Variance
3.13 Analysis of Variance
3.14 Regression Components
3.15 Residual Regression
3.16 Prediction Errors
3.17 Influential Observations
3.18 Normal Regression Model
3.19 CPS Data Set
3.20 Programming
3.21 Technical Proofs*
Exercises

4 Least Squares Regression
4.1 Introduction
4.2 Sample Mean
4.3 Linear Regression Model
4.4 Mean of Least-Squares Estimator
4.5 Variance of Least Squares Estimator
4.6 Gauss-Markov Theorem
4.7 Residuals
4.8 Estimation of Error Variance
4.9 Mean-Square Forecast Error
4.10 Covariance Matrix Estimation Under Homoskedasticity
4.11 Covariance Matrix Estimation Under Heteroskedasticity
4.12 Standard Errors
4.13 Computation
4.14 Measures of Fit
4.15 Empirical Example
4.16 Multicollinearity
4.17 Normal Regression Model
Exercises

5 An Introduction to Large Sample Asymptotics
5.1 Introduction
5.2 Asymptotic Limits
5.3 Convergence in Probability
5.4 Weak Law of Large Numbers
5.5 Almost Sure Convergence and the Strong Law*
5.6 Vector-Valued Moments
5.7 Convergence in Distribution
5.8 Higher Moments
5.9 Functions of Moments
5.10 Delta Method
5.11 Stochastic Order Symbols
5.12 Uniform Stochastic Bounds*
5.13 Semiparametric Efficiency
5.14 Technical Proofs*
Exercises

6 Asymptotic Theory for Least Squares
6.1 Introduction
6.2 Consistency of Least-Squares Estimation
6.3 Asymptotic Normality
6.4 Joint Distribution
6.5 Consistency of Error Variance Estimators
6.6 Homoskedastic Covariance Matrix Estimation
6.7 Heteroskedastic Covariance Matrix Estimation
6.8 Summary of Covariance Matrix Notation
6.9 Alternative Covariance Matrix Estimators*
6.10 Functions of Parameters
6.11 Asymptotic Standard Errors
6.12 t statistic
6.13 Confidence Intervals
6.14 Regression Intervals
6.15 Forecast Intervals
6.16 Wald Statistic
6.17 Homoskedastic Wald Statistic
6.18 Confidence Regions
6.19 Semiparametric Efficiency in the Projection Model
6.20 Semiparametric Efficiency in the Homoskedastic Regression Model*
6.21 Uniformly Consistent Residuals*
6.22 Asymptotic Leverage*
Exercises

7 Restricted Estimation
7.1 Introduction
7.2 Constrained Least Squares
7.3 Exclusion Restriction
7.4 Minimum Distance
7.5 Asymptotic Distribution
7.6 Efficient Minimum Distance Estimator
7.7 Exclusion Restriction Revisited
7.8 Variance and Standard Error Estimation
7.9 Misspecification
7.10 Nonlinear Constraints
7.11 Inequality Restrictions
7.12 Constrained MLE
7.13 Technical Proofs*
Exercises

8 Hypothesis Testing
8.1 Hypotheses
8.2 Acceptance and Rejection
8.3 Type I Error
8.4 t tests
8.5 Type II Error and Power
8.6 Statistical Significance
8.7 P-Values
8.8 t-ratios and the Abuse of Testing
8.9 Wald Tests
8.10 Homoskedastic Wald Tests
8.11 Criterion-Based Tests
8.12 Minimum Distance Tests
8.13 Minimum Distance Tests Under Homoskedasticity
8.14 F Tests
8.15 Likelihood Ratio Test
8.16 Problems with Tests of Nonlinear Hypotheses
8.17 Monte Carlo Simulation
8.18 Confidence Intervals by Test Inversion
8.19 Power and Test Consistency
8.20 Asymptotic Local Power
8.21 Asymptotic Local Power, Vector Case
8.22 Technical Proofs*
Exercises

9 Regression Extensions
9.1 Nonlinear Least Squares
9.2 Generalized Least Squares
9.3 Testing for Heteroskedasticity
9.4 Testing for Omitted Nonlinearity
9.5 Least Absolute Deviations
9.6 Quantile Regression
Exercises

10 The Bootstrap
10.1 Definition of the Bootstrap
10.2 The Empirical Distribution Function
10.3 Nonparametric Bootstrap
10.4 Bootstrap Estimation of Bias and Variance
10.5 Percentile Intervals
10.6 Percentile-t Equal-Tailed Interval
10.7 Symmetric Percentile-t Intervals
10.8 Asymptotic Expansions
10.9 One-Sided Tests
10.10 Symmetric Two-Sided Tests
10.11 Percentile Confidence Intervals
10.12 Bootstrap Methods for Regression Models
Exercises

11 Nonparametric Regression
11.1 Introduction
11.2 Binned Estimator
11.3 Kernel Regression
11.4 Local Linear Estimator
11.5 Nonparametric Residuals and Regression Fit
11.6 Cross-Validation Bandwidth Selection
11.7 Asymptotic Distribution
11.8 Conditional Variance Estimation
11.9 Standard Errors
11.10 Multiple Regressors

12 Series Estimation
12.1 Approximation by Series
12.2 Splines
12.3 Partially Linear Model
12.4 Additively Separable Models
12.5 Uniform Approximations
12.6 Runge's Phenomenon
12.7 Approximating Regression
12.8 Residuals and Regression Fit
12.9 Cross-Validation Model Selection
12.10 Convergence in Mean-Square
12.11 Uniform Convergence
12.12 Asymptotic Normality
12.13 Asymptotic Normality with Undersmoothing
12.14 Regression Estimation
12.15 Kernel Versus Series Regression
12.16 Technical Proofs

13 Generalized Method of Moments
13.1 Overidentified Linear Model
13.2 GMM Estimator
13.3 Distribution of GMM Estimator
13.4 Estimation of the Efficient Weight Matrix
13.5 GMM: The General Case
13.6 Over-Identification Test
13.7 Hypothesis Testing: The Distance Statistic
13.8 Conditional Moment Restrictions
13.9 Bootstrap GMM Inference
Exercises

14 Empirical Likelihood
14.1 Non-Parametric Likelihood
14.2 Asymptotic Distribution of EL Estimator
14.3 Overidentifying Restrictions
14.4 Testing
14.5 Numerical Computation

15 Endogeneity
15.1 Instrumental Variables
15.2 Reduced Form
15.3 Identification
15.4 Estimation
15.5 Special Cases: IV and 2SLS
15.6 Bekker Asymptotics
15.7 Identification Failure
Exercises

16 Univariate Time Series
16.1 Stationarity and Ergodicity
16.2 Autoregressions
16.3 Stationarity of AR(1) Process
16.4 Lag Operator
16.5 Stationarity of AR(k)
16.6 Estimation
16.7 Asymptotic Distribution
16.8 Bootstrap for Autoregressions
16.9 Trend Stationarity
16.10 Testing for Omitted Serial Correlation
16.11 Model Selection
16.12 Autoregressive Unit Roots

17 Multivariate Time Series
17.1 Vector Autoregressions (VARs)
17.2 Estimation
17.3 Restricted VARs
17.4 Single Equation from a VAR
17.5 Testing for Omitted Serial Correlation
17.6 Selection of Lag Length in a VAR
17.7 Granger Causality
17.8 Cointegration
17.9 Cointegrated VARs

18 Limited Dependent Variables
18.1 Binary Choice
18.2 Count Data
18.3 Censored Data
18.4 Sample Selection

19 Panel Data
19.1 Individual-Effects Model
19.2 Fixed Effects
19.3 Dynamic Panel Regression

20 Nonparametric Density Estimation
20.1 Kernel Density Estimation
20.2 Asymptotic MSE for Kernel Estimates

A Matrix Algebra
A.1 Notation
A.2 Matrix Addition
A.3 Matrix Multiplication
A.4 Trace
A.5 Rank and Inverse
A.6 Determinant
A.7 Eigenvalues
A.8 Positive Definiteness
A.9 Matrix Calculus
A.10 Kronecker Products and the Vec Operator
A.11 Vector and Matrix Norms
A.12 Matrix Inequalities

B Probability
B.1 Foundations
B.2 Random Variables
B.3 Expectation
B.4 Gamma Function
B.5 Common Distributions
B.6 Multivariate Random Variables
B.7 Conditional Distributions and Expectation
B.8 Transformations
B.9 Normal and Related Distributions
B.10 Inequalities
B.11 Maximum Likelihood

C Numerical Optimization
C.1 Grid Search
C.2 Gradient Methods
C.3 Derivative-Free Methods

Preface

This book is intended to serve as the textbook for a first-year graduate course in econometrics. It can be used as a stand-alone text, or as a supplement to another text. Students are assumed to have an understanding of multivariate calculus, probability theory, linear algebra, and mathematical statistics. A prior course in undergraduate econometrics would be helpful, but is not required. Two excellent undergraduate textbooks are Wooldridge (2009) and Stock and Watson (2010).
For reference, some of the basic tools of matrix algebra, probability, and statistics are reviewed in the Appendix. For students wishing to deepen their knowledge of matrix algebra in relation to their study of econometrics, I recommend Matrix Algebra by Abadir and Magnus (2005). An excellent introduction to probability and statistics is Statistical Inference by Casella and Berger (2002). For those wanting a deeper foundation in probability, I recommend Ash (1972) or Billingsley (1995). For more advanced statistical theory, I recommend Lehmann and Casella (1998), van der Vaart (1998), Shao (2003), and Lehmann and Romano (2005).

For further study in econometrics beyond this text, I recommend Davidson (1994) for asymptotic theory, Hamilton (1994) for time-series methods, Wooldridge (2002) for panel data and discrete response models, and Li and Racine (2007) for nonparametrics and semiparametric econometrics. Beyond these texts, the Handbook of Econometrics series provides advanced summaries of contemporary econometric methods and theory.

The end-of-chapter exercises are important parts of the text and are meant to help teach students of econometrics. Answers are not provided, and this is intentional.

I would like to thank Ying-Ying Lee for providing research assistance in preparing some of the empirical examples presented in the text.

As this is a manuscript in progress, some parts are quite incomplete, and there are many topics which I plan to add. In general, the earlier chapters are the most complete while the later chapters need significant work and revision.

Chapter 1
Introduction

1.1 What is Econometrics?

The term "econometrics" is believed to have been crafted by Ragnar Frisch (1895-1973) of Norway, one of the three principal founders of the Econometric Society, first editor of the journal Econometrica, and co-winner of the first Nobel Memorial Prize in Economic Sciences in 1969.
It is therefore fitting that we turn to Frisch's own words in the introduction to the first issue of Econometrica to describe the discipline.

    A word of explanation regarding the term econometrics may be in order. Its definition is implied in the statement of the scope of the [Econometric] Society, in Section I of the Constitution, which reads: "The Econometric Society is an international society for the advancement of economic theory in its relation to statistics and mathematics... Its main object shall be to promote studies that aim at a unification of the theoretical-quantitative and the empirical-quantitative approach to economic problems..." But there are several aspects of the quantitative approach to economics, and no single one of these aspects, taken by itself, should be confounded with econometrics. Thus, econometrics is by no means the same as economic statistics. Nor is it identical with what we call general economic theory, although a considerable portion of this theory has a definitely quantitative character. Nor should econometrics be taken as synonymous with the application of mathematics to economics. Experience has shown that each of these three view-points, that of statistics, economic theory, and mathematics, is a necessary, but not by itself a sufficient, condition for a real understanding of the quantitative relations in modern economic life. It is the unification of all three that is powerful. And it is this unification that constitutes econometrics.

    Ragnar Frisch, Econometrica, (1933), 1, pp. 1-2.

This definition remains valid today, although some terms have evolved somewhat in their usage. Today, we would say that econometrics is the unified study of economic models, mathematical statistics, and economic data.

Within the field of econometrics there are sub-divisions and specializations. Econometric theory concerns the development of tools and methods, and the study of the properties of econometric methods.
Applied econometrics is a term describing the development of quantitative economic models and the application of econometric methods to these models using economic data.

1.2 The Probability Approach to Econometrics

The unifying methodology of modern econometrics was articulated by Trygve Haavelmo (1911-1999) of Norway, winner of the 1989 Nobel Memorial Prize in Economic Sciences, in his seminal paper "The probability approach in econometrics", Econometrica (1944). Haavelmo argued that quantitative economic models must necessarily be probability models (by which today we would mean stochastic). Deterministic models are blatantly [...]

[...] but some features are left unspecified. This approach typically leads to estimation methods such as least-squares and the Generalized Method of Moments. The semiparametric approach dominates contemporary econometrics, and is the main focus of this textbook. Another branch of quantitative structural economics is the calibration approach. Similar to the quasi-structural approach, the calibration approach interprets [...]

[...] designed for a specific problem — and not based on a generalizable principle.

Economists typically denote variables by the italicized roman characters y, x, and/or z. The convention in econometrics is to use the character y to denote the variable to be explained, while the characters x and z are used to denote the conditioning (explaining) variables. Following mathematical convention, [...]

[...] computational speed, at the cost of increased time in programming and debugging. As these different packages have distinct advantages, many empirical economists end up using more than one package. As a student of econometrics, you will learn at least one of these packages, and probably more than one.

1.8 Reading the Manuscript

I have endeavored to use a unified notation and nomenclature. The development of the material [...]
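The excerpts above mention least-squares estimation and the convention of denoting the explained variable by y and the explaining variables by x. A minimal numerical sketch of the least-squares estimator, using simulated data rather than anything from the manuscript, might look like this:

```python
import numpy as np

# Simulated data: y is the variable to be explained, the columns of X
# are the conditioning (explaining) variables plus an intercept.
rng = np.random.default_rng(0)
n = 500
x = rng.normal(size=(n, 2))
X = np.column_stack([np.ones(n), x])     # add an intercept column
beta_true = np.array([1.0, 2.0, -0.5])   # made-up coefficients for the demo
y = X @ beta_true + rng.normal(size=n)   # linear model plus a stochastic error

# Least-squares estimator: solve the normal equations (X'X) b = X'y.
beta_hat = np.linalg.solve(X.T @ X, X.T @ y)
print(beta_hat)  # close to beta_true for large n
```

The same estimator is what chapters 3 and 4 of the table of contents develop formally; here it is only a numerical illustration under assumed simulated data.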
[...] single summary measure, and thereby facilitate comparisons across groups. Because of this simplifying property, conditional means are the primary interest of regression analysis and are a major focus in econometrics. Table 2.1 allows us to easily calculate average wage differences between groups. For example, we can see that the wage gap between men and women continues after disaggregation by race, as the [...]

[...] between men and women regardless of educational attainment. In many cases it is convenient to simplify the notation by writing variables using single characters, typically y, x and/or z. It is conventional in econometrics to denote the dependent variable (e.g. log(wage)) by the letter y, a conditioning variable (such as gender) by the letter x, and multiple conditioning variables (such as race, education and [...]

[...] u), which is a function of the argument u. The expression E(y | x = u) is the conditional expectation of y, given that we know that the random variable x equals the specific value u. However, sometimes in econometrics we take a notational shortcut and use E(y | x) to refer to this function. Hopefully, the use of E(y | x) should be apparent from the context.

2.6 Continuous Variables

In the previous sections, [...]

[...] the labor force. The differences between the groups would be direct measurements of the effects of different levels of education. However, experiments such as this would be widely condemned as [...]

[...] Structures

There are three major types of economic data sets: cross-sectional, time-series, and panel. They are distinguished by the dependence structure across observations. Cross-sectional data sets [...] cross-section data. Time-series data are indexed by time. Typical examples include macroeconomic aggregates, prices and interest rates. This type of data is characterized by serial dependence so [...]
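The conditional mean E(y | x) described above is, for a discrete conditioning variable like gender, just the within-group average. A tiny sketch (the numbers are invented for illustration, not the CPS figures behind Table 2.1):

```python
import numpy as np

# Hypothetical log-wage data; y = log(wage), x = gender.
log_wage = np.array([2.9, 3.1, 3.4, 2.7, 3.0, 3.3, 2.5, 2.8])
gender = np.array(["men", "men", "men", "men",
                   "women", "women", "women", "women"])

# E(log(wage) | gender = g) estimated by the sample mean within group g.
cond_mean = {g: log_wage[gender == g].mean() for g in np.unique(gender)}
print(cond_mean)

# The wage gap between the groups is the difference of conditional means,
# the kind of comparison Table 2.1 makes across gender and race groups.
gap = cond_mean["men"] - cond_mean["women"]
```

Because wages are in logs, the gap is approximately a percentage difference, which is why the text works with log differences when comparing groups.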

Date posted: 09/04/2014, 12:51
