2014 Applied Linear Regression



Document information

Applied Linear Regression
Fourth Edition

SANFORD WEISBERG
School of Statistics, University of Minnesota, Minneapolis, MN

Copyright © 2014 by John Wiley & Sons, Inc. All rights reserved.

Published by John Wiley & Sons, Inc., Hoboken, New Jersey. Published simultaneously in Canada.

No part of this publication may be reproduced, stored in a retrieval system, or transmitted in any form or by any means, electronic, mechanical, photocopying, recording, scanning, or otherwise, except as permitted under Section 107 or 108 of the 1976 United States Copyright Act, without either the prior written permission of the Publisher, or authorization through payment of the appropriate per-copy fee to the Copyright Clearance Center, Inc., 222 Rosewood Drive, Danvers, MA 01923, (978) 750-8400, fax (978) 750-4470, or on the web at www.copyright.com. Requests to the Publisher for permission should be addressed to the Permissions Department, John Wiley & Sons, Inc., 111 River Street, Hoboken, NJ 07030, (201) 748-6011, fax (201) 748-6008, or online at http://www.wiley.com/go/permissions.

Limit of Liability/Disclaimer of Warranty: While the publisher and author have used their best efforts in preparing this book, they make no representations or warranties with respect to the accuracy or completeness of the contents of this book and specifically disclaim any implied warranties of merchantability or fitness for a particular purpose. No warranty may be created or extended by sales representatives or written sales materials. The advice and strategies contained herein may not be suitable for your situation. You should consult with a professional where appropriate. Neither the publisher nor author shall be liable for any loss of profit or any other commercial damages, including but not limited to special, incidental, consequential, or other damages.

For general information on our other products and services or for technical support, please contact our Customer Care Department within the United States at (800) 762-2974, outside the United States at (317) 572-3993 or fax (317) 572-4002.

Wiley also publishes its books in a variety of electronic formats. Some content that appears in print may not be available in electronic formats. For more information about Wiley products, visit our web site at www.wiley.com.

Library of Congress Cataloging-in-Publication Data:

Weisberg, Sanford, 1947–
Applied linear regression / Sanford Weisberg, School of Statistics, University of Minnesota, Minneapolis, MN.—Fourth edition.
pages cm
Includes bibliographical references and index.
ISBN 978-1-118-38608-8 (hardback)
1. Regression analysis. I. Title.
QA278.2.W44 2014
519.5′36–dc23
2014026538

Printed in the United States of America
10 9 8 7 6 5 4 3 2

To Carol, Stephanie, and the memory of my parents
Contents

Preface to the Fourth Edition  xv

1  Scatterplots and Regression
   1.1  Scatterplots
   1.2  Mean Functions, 10
   1.3  Variance Functions, 12
   1.4  Summary Graph, 12
   1.5  Tools for Looking at Scatterplots, 13
        1.5.1  Size, 14
        1.5.2  Transformations, 14
        1.5.3  Smoothers for the Mean Function, 14
   1.6  Scatterplot Matrices, 15
   1.7  Problems, 17

2  Simple Linear Regression  21
   2.1  Ordinary Least Squares Estimation, 22
   2.2  Least Squares Criterion, 24
   2.3  Estimating the Variance σ², 26
   2.4  Properties of Least Squares Estimates, 27
   2.5  Estimated Variances, 29
   2.6  Confidence Intervals and t-Tests, 30
        2.6.1  The Intercept, 30
        2.6.2  Slope, 31
        2.6.3  Prediction, 32
        2.6.4  Fitted Values, 33
   2.7  The Coefficient of Determination, R², 35
   2.8  The Residuals, 36
   2.9  Problems, 38

3  Multiple Regression  51
   3.1  Adding a Regressor to a Simple Linear Regression Model, 51
        3.1.1  Explaining Variability, 53
        3.1.2  Added-Variable Plots, 53
   3.2  The Multiple Linear Regression Model, 55
   3.3  Predictors and Regressors, 55
   3.4  Ordinary Least Squares, 58
        3.4.1  Data and Matrix Notation, 60
        3.4.2  The Errors e, 61
        3.4.3  Ordinary Least Squares Estimators, 61
        3.4.4  Properties of the Estimates, 63
        3.4.5  Simple Regression in Matrix Notation, 63
        3.4.6  The Coefficient of Determination, 66
        3.4.7  Hypotheses Concerning One Coefficient, 67
        3.4.8  t-Tests and Added-Variable Plots, 68
   3.5  Predictions, Fitted Values, and Linear Combinations, 68
   3.6  Problems, 69

4  Interpretation of Main Effects  73
   4.1  Understanding Parameter Estimates, 73
        4.1.1  Rate of Change, 74
        4.1.2  Signs of Estimates, 75
        4.1.3  Interpretation Depends on Other Terms in the Mean Function, 75
        4.1.4  Rank Deficient and Overparameterized Mean Functions, 78
        4.1.5  Collinearity, 79
        4.1.6  Regressors in Logarithmic Scale, 81
        4.1.7  Response in Logarithmic Scale, 82
   4.2  Dropping Regressors, 84
        4.2.1  Parameters, 84
        4.2.2  Variances, 86
   4.3  Experimentation versus Observation, 86
        4.3.1  Feedlots, 87
   4.4  Sampling from a Normal Population, 89
   4.5  More on R², 91
        4.5.1  Simple Linear Regression and R², 91
        4.5.2  Multiple Linear Regression and R², 92
        4.5.3  Regression through the Origin, 93
   4.6  Problems, 93
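The chapters above develop simple and multiple linear regression around ordinary least squares. As a minimal sketch of that workflow in R, assuming the book's companion alr4 package and its Heights data (mothers' and daughters' heights); any data frame with a numeric response and predictor would do:

    # Scatterplot of response versus predictor, as in Chapter 1
    library(alr4)                       # companion package; provides the Heights data
    plot(dheight ~ mheight, data = Heights,
         xlab = "Mother's height (in)", ylab = "Daughter's height (in)")

    # Ordinary least squares fit of the simple regression mean function
    # E(dheight | mheight) = beta0 + beta1 * mheight, as in Chapter 2
    m1 <- lm(dheight ~ mheight, data = Heights)
    summary(m1)   # estimates, standard errors, t-tests, R^2
    confint(m1)   # confidence intervals for intercept and slope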
Subject Index (excerpt, pp. 336–340)

Mean functions (continued): regression, 10–12; scaled power transformations, 189–190; simple linear regression, 21–22; least squares estimates, 29; regressor addition, 51–55; smoothers, 14–15
Mean shift outlier model, regression diagnostics, 214–218
Mean square, 26, 134–138
Means comparison: analysis of variance, 142; level means, 102–103
Measure, correlate, predict method, 154–155
Missing data, 119–122: missing at random (MAR), 121–122; multiple imputation, 122
Misspecified variance, 162–167: accommodation, 163–164; constant variance test, 164–167
Mixed models, 169–171
Model averaging, 247
Multilevel and hierarchical models, 171
Multiple comparisons, 102, 108
Multiple correlation coefficient. See Coefficient of determination
Multiple linear regression, 51–69: coefficient of determination (R²), 66–67, 92–93; collinearity, 79–81; delta method, 173–174; factors, 98–108; model, 55; ordinary least squares, 58–68; overall F-test, 136; predictions, fitted values, and linear combinations, 68–69; regressors, 51–58; residual plots, 210; transformations, 193–196
Multiple testing, 150
Multiplicative error, 187–188
Multistage sample surveys, 161–162
Multivariate normality, 89–91

Natural logarithms. See Logarithms
Neural networks, 247
Newton–Raphson algorithm, 311
Noncentrality parameter, power and non-null distributions, 143–145
Nonconstant variance: regression diagnostics, 213–214; tests for, 164–167
Nonlinear regression, 252–269: bootstrap inference, 262–265; large sample inference, 256–257; literature sources, 265; mean function estimation, 253–256; starting values, 257–262
Non-null distributions, analysis of variance, 143–145
Nonparametric estimation, mean functions, 10–12
Nonpositive variables, transformation, 198–199
Normal distribution: multivariate, 89–91; sampling from, 89–91
Normal equations, 293
Normal probability plot, 225–226
Normality: Box–Cox transformation to, 191; power transformations to, 195–196
Normality assumption, regression diagnostics, 225–226
Notation: aic, 238; anova, 139; bic, 239; case; correlation ρ, 292; covariance Cov, 291; df, 26; expectation E, 290; gls, 168; hats, 22; hii, 207; NID, 29; ols, 22; p′, 64; predictor, 16; R²Y,X, 236; regressor, 16; RSS, 24; rxy, 23; SD, 23; se, 28; SSreg, 35; SXX, 23; sxy, 23; SXY, 23; SYY, 23; typewriter font; variance VAR, 291; wls, 156; x̄, 23
Null plot: characteristics, 14; simple linear regression, 36–38

Observational data, 75
Odds of success, binomial regression, 273–277
Offset, 249
One-dimensional estimation, linearly related regressors, 194–195
One-factor model, one-way anova, 99–102
Ordinary least squares (ols) estimation, 22, 24–26, 58–68: computing formulas, 61; matrix version, 304; misspecified variances, 163–167; nonlinear regression, 258–259; properties, 27–29, 305–307
Orthogonal factors, 141–142
Orthogonal polynomials, 112–113
Orthogonal projection, 206–208
Outliers, 214–218: scatterplots, 4–5, 13
Overall F-test: multiple regression, 136; simple regression, 135–136
Overparameterized mean function: one-factor models, 100–102; parameter estimates, 78–79

Pairwise comparisons, 102–103
Parameters, 73–93, 95–114: aliased, 78; collinearity, 79–81; F-tests, 138; intercept, 10, 21; multiple regression model, 55; not the same as estimates, 24; partial slope, 73; rank deficient or overparameterized mean functions, 78–79; signs of estimates, 75; simple linear regression, 21–22; slope, 10, 21; variable selection and assessment of, 235–237
Partial R², 236
Partial slope, 73
Pearson residuals, 208: Poisson and binomial regression, 284–285
Pearson's χ², 283
Per-test error rate, 150
Poisson distribution, 271–272: generalized linear models, 283–285; variance stabilizing transformations, 171–172
Poisson regression, 270–289: deviance, 277–279; goodness of fit tests, 282–283
Polynomial regressors, 109–113: multiple predictors, 111–112; multiple regression model, 56; numerical issues, 112–113
Power calculators, 144
Power family: modified power family, 190–191; scaled power transformations, 188–190; transformations, 186–188
Power of the test, analysis of variance, 143–145
Predicted residual (PRESS residual), 230
Prediction, 32–34: weighted least squares, 159
Predictor variables (see also Regressors): active vs. inactive, 235; complex regressors, 98–122; principal components, 117–119; discovery, 238–245; experimentation vs. observation, 86–89; multiple linear regression, 55–58, 68–69; one-factor models, 100–102; polynomial regression, 109–113; scatterplots, 2–5; matrix, 16–17; selection methods, 234–251; single variable transformation, 188–190; transformations, 193–196; automatic selection, 195–196
Principal component analysis: complex regressors, 116–119; multiple regression model, predictors and regressors, 57
Probability plot, 225–226
p-value: hypothesis testing, 133; interpretation, 146–147; means comparison, 103; outlier tests, 217–218; power and non-null distributions, 144–145; Wald tests, 145–146

QR factorization, 228, 307–308
Quadratic regression, 109–113: curvature testing with, 212–213; delta method for a maximum or minimum, 174

R packages: alr4, ii, 290; car, 140; effects, 108, 153; lsmeans, 153; nlme, 168
R². See Coefficient of determination
Random coefficients model, 170–171
Random forests, 247
Random vectors, 303
Range rule, power transformations, 188
Rank deficient mean function, 78–79
Regression coefficients: complex models, 98–113; interpretation, 73–91
Regression diagnostics, 204–233: hat matrix, 205; weighted hat matrix, 208; influential cases, 218–225; added-variable plots, 224–225; Cook's distance, 220–221; nonconstant variance, 213–214; normality assumption, 225–226; outliers, 214–218; level significance, 217–218; methodology, 218; test, 215–216; weighted least squares, 216; Poisson and binomial regression, 284–285; residuals, 204–212; curvature testing, 212–213; error vectors, 205–206; hat matrix, 206–208; plots of, 209–210; weighted hat matrix, 208
Regression through the origin, 93
Regressors, 16, 51, 55–58: class variable, 101; collinear, 79; dropping, 84; dummy variables, 56, 100; effects coding, 125; factors, 98–109; intercept, 56; linearly dependent, 78; linearly related, 194–195; polynomial, 56, 109–113; principal component, 116–119; splines, 113–116; transformed predictors, 56
Regularized methods, 244–245
Reliability of hypothesis testing, 148
Repeated measures, 171
Reported significance levels, 149
Research findings, test interpretation, 147–148
Residual mean square, 26–27
Residual plots, 166, 209–226
Residual sampling, bootstrap analysis, 179
Residual variance, 90–91
Residuals, 23, 25, 35–38, 204–218: Pearson, 208; predicted, 230; standardized, 216; studentized, 216; supernormality, 225–226; weighted, 156
Response variable: logarithmic scale, 82–83; scatterplots, 2–5; transformations, 196–198

Sample surveys, 161–162
Sampling weight, 162
Sandwich estimator, 163–167
Scad, 244
Scaled power transformations, 189–190: Box–Cox method, 191
Scatterplot
Scatterplot matrix, 15–17
Score test, nonconstant variance, 166–167
Score vector, 254–256
Second-order mean function: analysis of variance, 141–142; polynomial regressors, 111–113
Segmented regression, 263–265
Separated points, scatterplots, 4–5
Sequential analysis of variance (Type I), 140–141
Signs of parameter estimates, 75
Single coefficient hypotheses, 133: multiple linear regression, 68–69; Wald tests, 145–146
Single linear combination, Wald tests, 146
Size, scatterplots, 14
Slices, scatterplots, 4–5
Slope: parameter estimates, 73–83; simple linear regression, 21–22
Smoothers: loess, 14, 296–298; splines, 113–116
Sparsity principle, 244–245
Spectral decomposition, 309
Splines, 113–116
Square-root transformation, variance stabilization, 172
Stacking the deck, hypothesis testing, 149–150
Standard deviation, simple linear regression, 29–30
Standard error of prediction, 33, 68, 159: bootstrap analysis, 176–179; delta method, 172–174
Standard error of regression, 29–30, 61
Starting values, nonlinear regression, 257–262
Statistical error, 21–22
Stepwise regression, 238, 239–245
Stratified random sample, sample surveys, 161–162
Summary graph, 12–14
Sums of squares: regression, 35, 63, 134; residual, 22, 24, 63; total, 35
Superpopulation, sample surveys, 162
Symbols, definitions table, 23

Taylor series approximation, 254–256
Term. See Regressors
Test interpretation, 146–150: bootstrap analysis, 179; Poisson and binomial regression, 284; regression diagnostics, outliers, 215–218
Test statistics, power transformations, automatic predictor selection, 195–196
Third-order mean function, 109
Transformation family, 186–188
Transformations, 56, 185–203: additive models, 199; automatic predictor selection, 195–196; basic power transformation, 186; basic principles, 185–186; Box–Cox method, 190–191, 194–199, 312–314; linearly related regressors, 194–195; log rule, 188; modified power, 190; methodology and examples, 191–196; multivariate, 195; nonpositive variables, 198–199; power transformations, 186–188; range rule, 188; response, 196–198; scaled power, 189, 252; scatterplots, 14; single predictor variable, 188–190; variance stabilization, 171–172; Yeo–Johnson, 198–199
True discovery, hypothesis testing, 147–148
t-Tests: misspecified variances, 163–167; multiple linear regression, 68; one-factor models, 102; main effects model, 107–108; Poisson and binomial regression, 284; regression diagnostics, outliers, 217–218; simple linear regression, 30–34; two sample, 44
Tukey's test for nonadditivity, 212–213
Type II analysis of variance, 140–141

Uncorrected sums of squares, 61–62
Uncorrelated data, scatterplots, 8–9
Unexplained variation: multiple linear regression, coefficient of determination (R²), 67–68; simple linear regression, coefficient of determination (R²), 35–36
Univariate summary statistics: multiple regression, 57–58; simple linear regression, 23–24

Validation set, variable selection, 247
Variable selection, 234–251: discovery, 237–245; information criteria, 238–239; regularized methods, 244–245; stepwise regression, 239–244; subset selection, 245; parameter assessment, 235–237; Poisson and binomial regression, 285; prediction, model selection for, 245–248; cross-validation, 247; professor ratings, 247–248
Variance estimation: bootstrap method, 174–179; nonlinear parameter functions, 178; regression inference, no normality, 175–178; residual bootstrap, 179; delta method, 174; multiple linear regression, 66; nonlinear regression, 253–256; simple linear regression, 26–27; tests, 179
Variance inflation factor, 249
Variances: general correlation structures, 168–169; misspecified variance, 162–167; accommodation, 163–164; constant variance test, 164–167; mixed models, 169–171; multiple linear regression, 58–59; overview, 156–179; Poisson and binomial regression, 284; scatterplots, 12–14; simple linear regression, 21–22; stabilizing transformations, 171–172; weighted least squares, 156–162

Wald tests, 133, 145–146: likelihood ratio test comparison, 146; single coefficient hypotheses, 145–146
Weighted least squares (wls): constant variance test, 166–167; regression diagnostics, outliers, 216; weighted hat matrix, residuals, 208; variances, 156–162; group means weighting, 159–161; sample surveys, 161–162
Wilkinson–Rogers notation, 101, 106–109, 139, 151, 259: binomial regression, 276–277
Working residual, nonlinear mean function estimation, 255
W statistic, regression diagnostics, 226

Yeo–Johnson transformation, nonpositive variables, 198–199

Zipf's law, 48

... of regression methodology called linear regression. This method is the most commonly used in regression, and virtually all other regression methods build upon an understanding of how linear regression ...
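Among the entries indexed above, the transformation machinery (power transformations, the Box–Cox method, response transformations) is the most algorithmic. A minimal sketch, assuming R with the car package (listed in the index among the book's R packages) and using the built-in trees data as a stand-in for the book's own examples:

    # Box-Cox choice of a normalizing power transformation of the response
    library(car)                           # provides powerTransform()

    m  <- lm(Volume ~ Girth + Height, data = trees)
    p1 <- powerTransform(m)                # ML estimate of the power lambda
    summary(p1)                            # lambda-hat, with tests of lambda = 0 (log) and lambda = 1

    # Refit on the transformed scale, e.g., log(Volume) when lambda is near 0
    m_log <- update(m, log(Volume) ~ .)
    summary(m_log)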

Posted: 09/08/2017, 10:32


Table of contents

  • Cover

  • Title page

  • Copyright page

  • Dedication

  • Contents

  • Preface to the Fourth Edition

  • CHAPTER 1: Scatterplots and Regression

    • 1.1 Scatterplots

    • 1.2 Mean Functions

    • 1.3 Variance Functions

    • 1.4 Summary Graph

    • 1.5 Tools for Looking at Scatterplots

      • 1.5.1 Size

      • 1.5.2 Transformations

      • 1.5.3 Smoothers for the Mean Function

  • 1.6 Scatterplot Matrices

  • 1.7 Problems

  • CHAPTER 2: Simple Linear Regression

    • 2.1 Ordinary Least Squares Estimation

    • 2.2 Least Squares Criterion

    • 2.3 Estimating the Variance σ² (see the sketch after this list)

    • 2.4 Properties of Least Squares Estimates

    • 2.5 Estimated Variances
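The preview's outline stops at Section 2.5. As a small worked illustration of Section 2.3 (estimating the variance σ²), a sketch in R using the built-in cars data as a stand-in; the estimator shown is the standard residual mean square:

    # sigma^2 is estimated by the residual mean square RSS / (n - 2)
    m   <- lm(dist ~ speed, data = cars)       # stand-in simple linear regression
    rss <- sum(residuals(m)^2)                 # residual sum of squares, RSS
    sigma2_hat <- rss / (nrow(cars) - 2)       # divide by df = n - 2
    all.equal(sqrt(sigma2_hat), sigma(m))      # agrees with lm's "residual standard error"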
