A SECOND COURSE IN STATISTICS: REGRESSION ANALYSIS
Seventh Edition

William Mendenhall, University of Florida
Terry Sincich, University of South Florida

Prentice Hall
Boston Columbus Indianapolis New York San Francisco Upper Saddle River
Amsterdam Cape Town Dubai London Madrid Milan Munich Paris Montreal Toronto
Delhi Mexico City Sao Paulo Sydney Hong Kong Seoul Singapore Taipei Tokyo

Editor in Chief: Deirdre Lynch
Acquisitions Editor: Marianne Stepanian
Associate Content Editor: Dana Jones Bettez
Senior Managing Editor: Karen Wernholm
Associate Managing Editor: Tamela Ambush
Senior Production Project Manager: Peggy McMahon
Senior Design Supervisor: Andrea Nix
Cover Design: Christina Gleason
Interior Design: Tamara Newnam
Marketing Manager: Alex Gay
Marketing Assistant: Kathleen DeChavez
Associate Media Producer: Jean Choe
Senior Author Support/Technology Specialist: Joe Vetere
Manufacturing Manager: Evelyn Beaton
Senior Manufacturing Buyer: Carol Melville
Production Coordination, Technical Illustrations, and Composition: Laserwords Maine
Cover Photo Credit: Abstract green flow, ©Oriontrail/Shutterstock

Many of the designations used by manufacturers and sellers to distinguish their products are claimed as trademarks. Where those designations appear in this book, and Pearson was aware of a trademark claim, the designations have been printed in initial caps or all caps.

Library of Congress Cataloging-in-Publication Data
Mendenhall, William
A second course in statistics : regression analysis / William Mendenhall, Terry Sincich. –7th ed.
p. cm.
Includes index.
ISBN 0-321-69169-5
1. Commercial statistics. 2. Statistics. 3. Regression analysis. I. Sincich, Terry. II. Title.
HF1017.M46 2012
519.5'36–dc22
2010000433

Copyright © 2012, 2003, 1996 by Pearson Education, Inc. All rights reserved. No part of this publication may be reproduced, stored in a retrieval system, or transmitted, in any form or by any means, electronic, mechanical, photocopying, recording, or otherwise, without the prior written
permission of the publisher. Printed in the United States of America. For information on obtaining permission for use of material in this work, please submit a written request to Pearson Education, Inc., Rights and Contracts Department, 501 Boylston Street, Suite 900, Boston, MA 02116, fax your request to 617-671-3447, or e-mail at http://www.pearsoned.com/legal/permissions.htm

10—EB—14 13 12 11 10

ISBN-10: 0-321-69169-5
ISBN-13: 978-0-321-69169-9

Contents

Preface

1 A Review of Basic Concepts (Optional)
1.1 Statistics and Data
1.2 Populations, Samples, and Random Sampling
1.3 Describing Qualitative Data
1.4 Describing Quantitative Data Graphically
1.5 Describing Quantitative Data Numerically
1.6 The Normal Probability Distribution
1.7 Sampling Distributions and the Central Limit Theorem
1.8 Estimating a Population Mean
1.9 Testing a Hypothesis About a Population Mean
1.10 Inferences About the Difference Between Two Population Means
1.11 Comparing Two Population Variances

2 Introduction to Regression Analysis
2.1 Modeling a Response
2.2 Overview of Regression Analysis
2.3 Regression Applications
2.4 Collecting the Data for Regression

3 Simple Linear Regression
3.1 Introduction
3.2 The Straight-Line Probabilistic Model
3.3 Fitting the Model: The Method of Least Squares
3.4 Model Assumptions
3.5 An Estimator of σ²
3.6 Assessing the Utility of the Model: Making Inferences About the Slope β1
3.7 The Coefficient of Correlation
3.8 The Coefficient of Determination
3.9 Using the Model for Estimation and Prediction
3.10 A Complete Example
3.11 Regression Through the Origin (Optional)

CASE STUDY Legal Advertising—Does It Pay?
4 Multiple Regression Models
4.1 General Form of a Multiple Regression Model
4.2 Model Assumptions
4.3 A First-Order Model with Quantitative Predictors
4.4 Fitting the Model: The Method of Least Squares
4.5 Estimation of σ², the Variance of ε
4.6 Testing the Utility of a Model: The Analysis of Variance F-Test
4.7 Inferences About the Individual β Parameters
4.8 Multiple Coefficients of Determination: R² and R²a
4.9 Using the Model for Estimation and Prediction
4.10 An Interaction Model with Quantitative Predictors
4.11 A Quadratic (Second-Order) Model with a Quantitative Predictor
4.12 More Complex Multiple Regression Models (Optional)
4.13 A Test for Comparing Nested Models
4.14 A Complete Example

CASE STUDY Modeling the Sale Prices of Residential Properties in Four Neighborhoods

5 Principles of Model Building
5.1 Introduction: Why Model Building Is Important
5.2 The Two Types of Independent Variables: Quantitative and Qualitative
5.3 Models with a Single Quantitative Independent Variable
5.4 First-Order Models with Two or More Quantitative Independent Variables
5.5 Second-Order Models with Two or More Quantitative Independent Variables
5.6 Coding Quantitative Independent Variables (Optional)
5.7 Models with One Qualitative Independent Variable
5.8 Models with Two Qualitative Independent Variables
5.9 Models with Three or More Qualitative Independent Variables
5.10 Models with Both Quantitative and Qualitative Independent Variables
5.11 External Model Validation (Optional)

6 Variable Screening Methods
6.1 Introduction: Why Use a Variable-Screening Method?
6.2 Stepwise Regression
6.3 All-Possible-Regressions Selection Procedure
6.4 Caveats

CASE STUDY Deregulation of the Intrastate Trucking Industry

7 Some Regression Pitfalls
7.1 Introduction
7.2 Observational Data versus Designed Experiments
7.3 Parameter Estimability and Interpretation
7.4 Multicollinearity
7.5 Extrapolation: Predicting Outside the Experimental Region
7.6 Variable Transformations

8 Residual Analysis
8.1 Introduction
8.2 Regression Residuals
8.3 Detecting Lack of Fit
8.4 Detecting Unequal Variances
8.5 Checking the Normality Assumption
8.6 Detecting Outliers and Identifying Influential Observations
8.7 Detecting Residual Correlation: The Durbin–Watson Test

CASE STUDY An Analysis of Rain Levels in California

CASE STUDY An Investigation of Factors Affecting the Sale Price of Condominium Units Sold at Public Auction

9 Special Topics in Regression (Optional)
9.1 Introduction
9.2 Piecewise Linear Regression
9.3 Inverse Prediction
9.4 Weighted Least Squares
9.5 Modeling Qualitative Dependent Variables
9.6 Logistic Regression
9.7 Ridge Regression
9.8 Robust Regression
9.9 Nonparametric Regression Models

10 Introduction to Time Series Modeling and Forecasting
10.1 What Is a Time Series?
10.2 Time Series Components
10.3 Forecasting Using Smoothing Techniques (Optional)
10.4 Forecasting: The Regression Approach
10.5 Autocorrelation and Autoregressive Error Models
10.6 Other Models for Autocorrelated Errors (Optional)
10.7 Constructing Time Series Models
10.8 Fitting Time Series Models with Autoregressive Errors
10.9 Forecasting with Time Series Autoregressive Models
10.10 Seasonal Time Series Models: An Example
10.11 Forecasting Using Lagged Values of the Dependent Variable (Optional)

CASE STUDY Modeling Daily Peak Electricity Demands

11 Principles of Experimental Design
11.1 Introduction
11.2 Experimental Design Terminology
11.3 Controlling the Information in an Experiment
11.4 Noise-Reducing Designs
11.5 Volume-Increasing Designs
11.6 Selecting the Sample Size
11.7 The Importance of Randomization

12 The Analysis of Variance for Designed Experiments
12.1 Introduction
12.2 The Logic Behind an Analysis of Variance
12.3 One-Factor Completely Randomized Designs
12.4 Randomized Block Designs
12.5 Two-Factor Factorial Experiments
12.6 More Complex Factorial Designs (Optional)
12.7 Follow-Up Analysis: Tukey’s Multiple Comparisons of Means
12.8 Other Multiple Comparisons Methods (Optional)
12.9 Checking ANOVA Assumptions

CASE STUDY Reluctance to Transmit Bad News: The MUM Effect

Appendix A Derivation of the Least Squares Estimates of β0 and β1 in Simple Linear Regression

Appendix B The Mechanics of a Multiple Regression Analysis
B.1 Introduction
B.2 Matrices and Matrix Multiplication
B.3 Identity Matrices and Matrix Inversion
B.4 Solving Systems of Simultaneous Linear Equations
B.5 The Least Squares Equations and Their Solutions
B.6 Calculating SSE and s²
B.7 Standard Errors of Estimators, Test Statistics, and Confidence Intervals for β0, β1, …, βk
B.8 A Confidence
Interval for a Linear Function of the β Parameters; a Confidence Interval for E(y)
B.9 A Prediction Interval for Some Value of y to Be Observed in the Future

Appendix C A Procedure for Inverting a Matrix

Appendix D Useful Statistical Tables
Table D.1 Normal Curve Areas
Table D.2 Critical Values for Student’s t
Table D.3 Critical Values for the F Statistic: F.10
Table D.4 Critical Values for the F Statistic: F.05
Table D.5 Critical Values for the F Statistic: F.025
Table D.6 Critical Values for the F Statistic: F.01
Table D.7 Random Numbers
Table D.8 Critical Values for the Durbin–Watson d Statistic (α = .05)
Table D.9 Critical Values for the Durbin–Watson d Statistic (α = .01)
Table D.10 Critical Values for the χ² Statistic
Table D.11 Percentage Points of the Studentized Range, q(p, v), Upper 5%
Table D.12 Percentage Points of the Studentized Range, q(p, v), Upper 1%

Appendix E File Layouts for Case Study Data Sets

Answers to Selected Odd-Numbered Exercises

Index

Technology Tutorials (on CD)

Preface

Overview

This text is designed for two types of statistics courses. The early chapters, combined with a selection of the case studies, are designed for use in the second half of a two-semester (two-quarter) introductory statistics sequence for undergraduates with statistics or nonstatistics majors. Or, the text can be used for a course in applied regression analysis for masters or PhD students in other fields. At first glance, these two uses for the text may seem inconsistent. How could a text be appropriate for both undergraduate and graduate students?
The answer lies in the content. In contrast to a course in statistical theory, the level of mathematical knowledge required for an applied regression analysis course is minimal. Consequently, the difficulty encountered in learning the mechanics is much the same for both undergraduate and graduate students. The challenge is in the application: diagnosing practical problems, deciding on the appropriate linear model for a given situation, and knowing which inferential technique will answer the researcher’s practical question. This takes experience, and it explains why a student with a nonstatistics major can take an undergraduate course in applied regression analysis and still benefit from covering the same ground in a graduate course.

Introductory Statistics Course

It is difficult to identify the amount of material that should be included in the second semester of a two-semester sequence in introductory statistics. Optionally, a few lectures should be devoted to Chapter 1 (A Review of Basic Concepts) to make certain that all students possess a common background knowledge of the basic concepts covered in a first-semester (first-quarter) course. Chapter 2 (Introduction to Regression Analysis), Chapter 3 (Simple Linear Regression), Chapter 4 (Multiple Regression Models), Chapter 5 (Principles of Model Building), Chapter 6 (Variable Screening Methods), Chapter 7 (Some Regression Pitfalls), and Chapter 8 (Residual Analysis) provide the core for an applied regression analysis course. These chapters could be supplemented by the addition of Chapter 10 (Time Series Modeling and Forecasting), Chapter 11 (Principles of Experimental Design), and Chapter 12 (The Analysis of Variance for Designed Experiments).

Applied Regression for Graduates

In our opinion, the quality of an applied graduate course is not measured by the number of topics covered or the amount of material memorized by the students. The measure is how well they can apply the techniques covered in the course to the solution of real problems
encountered in their field of study. Consequently, we advocate moving on to new topics only after the students have demonstrated ability (through testing) to apply the techniques under discussion. In-class consulting sessions, where a case study is presented and the students have the opportunity to

Answers to Selected Exercises

the model (c) reject H0: β1 = β2 = β3 = 0 (d) reject H0: β1 = 0 (e) fail to reject H0: β2 = 0 (f) fail to reject H0: β3 = 0
4.5 (a) ŷ = 3.70 + .34x1 + .49x2 + .72x3 + 1.14x4 + 1.51x5 + .26x6 − .14x7 − .10x8 − .10x9 (b) t = −1.00, fail to reject H0 (c) 1.51 ± .098
4.7 (a) for every 1-unit increase in proportion of block with low density (x1), population density will increase by 2; for every 1-unit increase in proportion of block with high density (x2), population density will increase by (b) 68.6% of the sample variation in population density is explained by the model (c) H0: β1 = β2 = 0 (d) F = 133.27 (e) reject H0
4.9 (a) ŷ = 1.81231 + .10875x1 + .00017x2 (b) for every 1-mile increase in road length (x1), number of crashes increases by .109; for every one-vehicle increase in AADT (x2), number of crashes increases by .00017 (c) .109 ± .082 (d) .00017 ± .00008 (e) ŷ = 1.20785 + .06343x1 + .00056x2; for every 1-mile increase in road length (x1), number of crashes increases by .063; for every one-vehicle increase in AADT (x2), number of crashes increases by .00056; .063 ± .046; .00056 ± .00031
4.11 (a) E(y) = β0 + β1x1 + β2x2 + β3x3 + β4x4 (b) ŷ = 21,087.95 + 108.45x1 + 557.91x2 − 340.17x3 + 85.68x4 (d) t = 1.22, p-value = .238, fail to reject H0 (e) R² = .912, R²a = .894; R²a (f) F = 51.72, reject H0
4.13 (a) E(y) = β0 + β1x1 + β2x2 + β3x3 + β4x4 + β5x5 (b) ŷ = 13,614 + .09x1 − 9.20x2 + 14.40x3 + .35x4 − .85x5 (d) 458.8; ≈ 95% of sample heat rates fall within 917.6 kJ/kwhr of model predicted values (e) .917; 91.7% of the sample variation in heat rate is explained by the model (f) yes, F = 147.3, p-value ≈ 0
4.15 (a) F = 1.056; do not reject H0 (b) .05; 5% of the
sample variation in IQ is explained by the model
4.17 (a) 36.2% of the sample variation in active caring score is explained by the model (b) F = 5.11, reject H0
4.21 (a) (2.68, 5.82) (b) (−3.04, 11.54)
4.23 95% PI for y when x1 = 23.755, x2 = 90.662, x3 = 25.0: (24.03, 440.64)
4.25 (−1.233, 1.038)
4.27 (a) E(y) = β0 + β1x1 + β2x2 + β3x1x2 (b) linear relationship between number of defects and turntable speed depends on blade position (c) β3 < 0
4.29 (a) E(y) = β0 + β1x1 + β2x2 + β3x1x2 (b) linear relationship between negative feelings score and number ahead in line depends on number behind in line (c) fail to reject H0: β3 = 0 (d) β1 > 0; β2 < 0
4.31 (a) E(y) = β0 + β1x1 + β2x2 + β3x3 + β4x1x3 + β5x2x3 (b) ŷ = 10,845 − 1280.0x1 + 217.4x2 − 1549.2x3 − 11.0x1x3 + 19.98x2x3 (c) t = −.93, p-value = .355, fail to reject H0 (d) t = 1.78, p-value = .076, fail to reject H0 (e) no interaction
4.33 (a) slope depends on x2 and x5 (b) F = 5.505, R² = .6792, s = .505; yes
4.35 (a) E(y) = β0 + β1x1 + β2x1² (b) β2 (c) negative
4.37 (a) yes (b) t = 2.69, p-value = .031, reject H0
4.39 (a) E(y) = β0 + β1x + β2x² (b) positive (c) no; E(y) = β0 + β1x1
4.41 (a) curvilinear trend (b) ŷ = 1.01 − 1.17x + .29x² (c) yes; t = 7.36, p-value ≈ 0, reject H0
4.43 (a) E(y) = β0 + β1x1 + β2x2 (b) E(y) = β0 + β1x1 + β2x2 + β3x3 + β4x4
4.45 (a) E(y) = β0 + β1x1, where x1 = {1 if A, 0 if B} (b) E(y) = β0 + β1x1 + β2x2 + β3x3, where x1 = {1 if A, 0 if not}, x2 = {1 if B, 0 if not}, x3 = {1 if C, 0 if not}; β0 = μD, β1 = μA − μD, β2 = μB − μD, β3 = μC − μD
4.47 (c) Parallel lines
4.49 (b) second-order (c) different shapes (d) yes (e) shift curves along the x1-axis
4.51 (a) method: x1 = {1 if manual, 0 if automated}; soil: x2 = {1 if clay, 0 if not}, x3 = {1 if gravel, 0 if not}; slope: x4 = {1 if east, 0 if not}, x5 = {1 if south, 0 if not}, x6 = {1 if west, 0 if not}, x7 = {1 if southeast, 0 if not} (b) E(y) = β0 + β1x1; β0 = μAuto, β1 = μManual − μAuto (c) E(y) = β0 + β1x2 + β2x3; β0 = μSand
, β1 = μClay − μSand, β2 = μGravel − μSand (d) E(y) = β0 + β1x4 + β2x5 + β3x6 + β4x7; β0 = μSW, β1 = μE − μSW, β2 = μS − μSW, β3 = μW − μSW, β4 = μSE − μSW
4.53 (a) E(y) = β0 + β1x1 + β2x2, where x1 = {1 if Full solution, 0 if not}, x2 = {1 if Check figures, 0 if not} (b) β1 (c) ŷ = 2.433 − .483x1 + .287x2 (d) F = .45, p-value = .637, fail to reject H0
4.55 (a) reject H0: β1 = β2 = β12 = 0; sufficient evidence to indicate the model is adequate for predicting log of card price (b) fail to reject H0: β1 = 0 (c) reject H0: β3 = 0 (d) E[ln(y)] = β0 + β1x4 + β2x5 + β3x6 + ⋯ + β9x12 + β10x4x5 + β11x4x6 + ⋯ + β17x4x12
4.57 (a) Model 1: t = 2.58, reject H0; Model 2: reject H0: β1 = 0 (t = 3.32), reject H0: β2 = 0 (t = 6.47), reject H0: β3 = 0 (t = −4.77), do not reject H0: β4 = 0 (t = .24); Model 3: reject H0: β1 = 0 (t = 3.21), reject H0: β2 = 0 (t = 5.24), reject H0: β3 = 0 (t = −4.00), do not reject H0: β4 = 0 (t = 2.28), do not reject H0: β5 = 0 (t = .014) (c) Model
4.59 (a) ŷ = 80.22 + 156.5x1 − 42.3x1² + 272.84x2 + 760.1x1x2 + 47.0x1²x2 (b) yes, F = 417.05, p-value ≈ 0 (c) no; fail to reject H0: β2 = 0 and H0: β5 = 0
4.61 Nested models: a and b, a and d, a and e, b and c, b and d, b and e, c and e, d and e
4.63 (a) 10.1% (55.5%) of the sample variation in aggression score is explained by Model 1 (Model 2) (b) H0: β5 = β6 = β7 = β8 = 0 (c) yes (d) reject H0 (e) E(y) = β0 + β1x1 + β2x2 + β3x3 + β4x4 + β5x5 + β6x6 + β7x7 + β8x8 + β9x5x6 + β10x5x7 + β11x5x8 + β12x6x7 + β13x6x8 + β14x7x8 (f) fail to reject H0: β9 = β10 = ⋯ = β14 = 0
4.65 (a) E(y) = β0 + β1x1 + β2x2 + ⋯ + β11x11 (b) E(y) = β0 + β1x1 + β2x2 + ⋯ + β11x11 + β12x1x9 + β13x1x10 + β14x1x11 + β15x2x9 + β16x2x10 + β17x2x11 + ⋯ + β35x8x11 (c) Test H0: β12 = β13 = β14 = ⋯ = β35 = 0
4.67 (a) E(y) = β0 + β1x1 + β2x2 + ⋯ + β10x10 (b) H0: β3 = β4 = ⋯ = β10 = 0 (c) reject H0 (e) 14 ± 5.88 (f) yes (g) E(y) = β0 + β1x1 + β2x2 + ⋯ + β10x10 + β11x2x1 + β12x2x3 + ⋯ + β19x2
x10 (h) H0: β11 = β12 = ⋯ = β19 = 0; partial (nested model) F-test
4.69 (a) multiple t-tests result in an increased Type I error rate (b) H0: β2 = β5 = 0 (c) fail to reject H0
4.71 (a) yes; t = 5.96, reject H0 (b) t = .01, do not reject H0 (c) t = 1.91, reject H0
4.73 (a) estimate of β1 (b) prediction equation less reliable for values of x’s outside the range of the sample data
4.75 (a) E(y) = β0 + β1x1 + β2x2 + β3x3 + β4x4 + β5x5 (b) F = 34.47, reject H0 (c) E(y) = β0 + β1x1 + β2x2 + ⋯ + β7x7 (d) 60.3% of the sample variation in GSI is explained by the model (e) reject H0 for both
4.77 (a) E(y) = β0 + β1x1 + β2x2 + β3x1x2 (b) p-value = .02, reject H0 (c) impact of intensity on test score depends on treatment and is measured by (β2 + β3x1), not by β2 alone
4.79 (a) Negative (b) F = 1.60, fail to reject H0 (c) F = 1.61, fail to reject H0
4.81 yes; t = 4.20, p-value = .004
4.83 (a) E(y) = β0 + β1x1 + β2x2, where x1 = {1 if Communist, 0 if not}, x2 = {1 if Democratic, 0 if not} (b) β0 = μDictator, β1 = μCommunist − μDictator, β2 = μDemocratic − μDictator
4.85 no; income (x1) not significant, air carrier dummies (x3–x30) are significant
4.87 (a) quantitative (b) quantitative (c) qualitative (d) E(y) = β0 + β1x1 + β2x2 + β3x3 + β4x4, where x1 = {1 if Benzene, 0 if not}, x2 = {1 if Toluene, 0 if not}, x3 = {1 if Chloroform, 0 if not}, x4 = {1 if Methanol, 0 if not} (e) β0 = μA, β1 = μB − μA, β2 = μT − μA, β3 = μC − μA, β4 = μM − μA (f) F-test with H0: β1 = β2 = β3 = β4 = 0
4.89 (a) H0: β5 = β6 = β7 = β8 = 0; males: F > 2.37; females: F > 2.45 (c) reject H0 for both

Chapter 5
5.1 (a) Qualitative (b) Quantitative (c) Quantitative
5.3 Gender: qualitative; Testimony: qualitative
5.5 (a) Quantitative (b) Quantitative (c) Qualitative (d) Qualitative (e) Qualitative (f) Quantitative (g) Qualitative (h) Qualitative (i) Qualitative (j) Quantitative (k) Qualitative
5.7 (a) (i) 1; (ii) 3; (iii) 1; (iv) (b) (i) E(y) = β0 + β1x; (ii) E(y) = β0 + β1x + β2x² + β3x³; (iii) E(y)
= β0 + β1x; (iv) E(y) = β0 + β1x + β2x² (c) (i) β1 > 0; (ii) β3 > 0; (iii) β1 < 0; (iv) β2 < 0
5.9 E(y) = β0 + β1x + β2x² + β3x³
5.11 E(y) = β0 + β1x + β2x²
5.15 (a) E(y) = β0 + β1x1 + β2x2 + β3x1x2 + β4x1² + β5x2² (b) E(y) = β0 + β1x1 + β2x2 (c) E(y) = β0 + β1x1 + β2x2 + β3x1x2 (d) β1 + β3x2 (e) β2 + β3x1
5.17 (a) E(y) = β0 + β1x1 + β2x1² + β3x2 + β4x2² + β5x4 + β6x4² + β7x1x2 + β8x1x4 + β9x2x4 (b) ŷ = 10,283.2 + 276.8x1 + 990x1² + 3325.2x2 + 266.6x2² + 1301.3x4 + 40.22x4² + 41.98x1x2 + 15.98x1x4 + 207.4x2x4; yes, F = 613.278, reject H0 (c) F = 108.43, reject H0
5.19 (a) E(y) = β0 + β1x1 + β2x2 + β3x1x2 + β4x1² + β5x2² (b) ŷ = 15,583 + .078x1 − .523x2 + .0044x1x2 − .0000002x1² + 8.84x2² (c) F = 93.55, reject H0 (f) graphs have similar shape
5.21 (a) u = (x − 85.1)/14.81 (b) −.668, .446, 1.026, −1.411, −.223, 1.695, −.527, −.338 (c) .9967 (d) .376 (e) ŷ = 110.953 + 14.377u + 7.425u²
5.23 (a) .975, .928, .987 (b) u = (x − 5.5)/3.03 (c) 0, .923, 0; yes
5.25 (a) E(y) = β0 + β1x1 + β2x2 + β3x3, where x1 = {1 if low, 0 if not}, x2 = {1 if moderate, 0 if not}, x3 = {1 if high, 0 if not} (b) β0 = μNone, β1 = μLow − μNone, β2 = μMod − μNone, β3 = μHigh − μNone (c) F-test of H0: β1 = β2 = β3 = 0
5.27 (a) E(y) = β0 + β1x1 + β2x2 + β3x3 + β4x1x2 + β5x1x3, where x1 = {1 if manual, 0 if automated}, x2 = {1 if clay, 0 if not}, x3 = {1 if gravel, 0 if not} (b) μAutomated/Sand (c) β0 + β1 + β2 + β4 (d) β1
5.29 (a) E(y) = β0 + β1x1 + β2x2 + β3x1x2, where x1 = {1 if common, 0 if ambiguous}, x2 = {1 if low, 0 if high} (b) β̂0 = 6.1, β̂1 = 2, β̂2 = 11.9, β̂3 = −10.4 (c) t-test for H0: β3 = 0
5.31 (a) E(y) = β0 + β1x1 + β2x1² + β3x2 + β4x3 + β5x1x2 + β6x1x3 + β7x1²x2 + β8x1²x3, where x1 = level of bullying, x2 = {1 if low, 0 if not}, x3 = {1 if neutral, 0 if not} (b) β0 + 25β1 + 625β2 + β3 + 25β5 + 625β7 (c) nested F-test of H0: β2 = β7 = β8 = 0 (d) E(y) = β0 + β1x1 + β2x2 + β3x3 + β4x1x2 + β5x1x3 (e) low: β1 + β4; neutral: β1 + β5;
high: β1
5.33 (a) E(y) = β0 + β1x1 + β2x2 + β3x3 + β4x4 + β5x5 + β6x6 + β7x7 + β8x1x2 + β9x1x3 + β10x1x4 + β11x1x5 + β12x1x6 + β13x1x7 + β14x2x4 + β15x2x5 + β16x2x6 + β17x2x7 + β18x3x4 + β19x3x5 + β20x3x6 + β21x3x7 + β22x1x2x4 + β23x1x2x5 + β24x1x2x6 + β25x1x2x7 + β26x1x3x4 + β27x1x3x5 + β28x1x3x6 + β29x1x3x7, where x1 = {1 if manual, 0 if automated}, x2 = {1 if clay, 0 if not}, x3 = {1 if gravel, 0 if not}, x4 = {1 if East, 0 if not}, x5 = {1 if South, 0 if not}, x6 = {1 if West, 0 if not}, x7 = {1 if Southeast, 0 if not} (b) μAutomated/Sand/SW (c) β0 + β1 + β2 + β4 + β8 + β10 + β14 + β22 (d) β1 (e) β8 = β9 = β22 = β23 = β24 = β25 = β26 = β27 = β28 = β29 = 0
5.35 (a) E(y) = β0 + β1x1 + β2x2 + β3x1x2 (c) β1 + β3 (d) no; F = .26, p-value = .857, do not reject H0 (e) E(y) = β0 + β1x1 + β2x1² + β3x2 + β4x1x2 + β5x1²x2
5.37 (a) E(y) = β0 + β1x1 + β2x2 + β3x3 (b) E(y) = β0 + β1x1 + β2x2 + β3x3 + β4x1x2 + β5x1x3 (c) AL: β1; TDS: β1 + β4; FE: β1 + β5 (d) nested F-test of H0: β4 = β5 = 0
5.39 (a) Qualitative (b) Quantitative (c) Quantitative
5.41 (a) Quantitative (b) Quantitative (c) Qualitative (d) Qualitative (e) Qualitative (f) Qualitative (g) Quantitative (h) Qualitative
5.43 (a) E(y) = β0 + β1x1 + β2x2 + β3x3 + β4x4 (b) μBA − μN (c) μE − μN (d) μLAS − μN (e) μJ − μN (f) E(y) = (β0 + β5) + β1x1 + β2x2 + β3x3 + β4x4 (g) μBA − μN (h) μE − μN (i) μLAS − μN (j) μJ − μN (k) μF − μM (l) Reject H0: β5 = 0; gender has an effect
5.45 (a) E(y) = β0 + β1x1 + β2x2 + β3x3, where x1 = {1 if boy, 0 if girl}, x2 = {1 if youngest third, 0 if not}, x3 = {1 if middle third, 0 if not} (b) β0 = μGirls/Oldest, β1 = μBoys − μGirls, β2 = μYoungest − μOldest, β3 = μMiddle − μOldest (c) E(y) = β0 + β1x1 + β2x2 + β3x3 + β4x1x2 + β5x1x3 (d) .21, −.05, .06, −.03, .11, .20 (e) nested F-test of H0: β4 = β5 = 0
5.47 (a) E(y) = β0 + β1x1 + β2x2 + β3x3 + β4x4, where x1 = {1 if P1, 0 if P2}, x2 = {1
if L1, 0 if not}, x3 = {1 if L2, 0 if not}, x4 = {1 if L3, 0 if not} (b) 8; E(y) = β0 + β1x1 + β2x2 + β3x3 + β4x4 + β5x1x2 + β6x1x3 + β7x1x4 (c) F = 2.33, do not reject H0
5.49 (a) E(y) = β0 + β1x1 + β2x2, where x1 = years of education, x2 = {1 if Certificate, 0 if not} (b) E(y) = β0 + β1x1 + β2x2 + β3x1x2 (c) E(y) = β0 + β1x1 + β2x1² + β3x2 + β4x1x2 + β5x1²x2
5.51 (a) F = 8.79, p-value = .0096; reject H0 (b) DF-2: 2.14; blended: 4.865; adv timing: 7.815
5.53 (a) E(y) = β0 + β1x1 + β2x2 + β3x3, where x1 = {1 if Low, 0 if not}, x2 = {1 if Moderate, 0 if not}, x3 = {1 if producer, 0 if consumer} (b) β0 = μHigh/Cons, β1 = μLow − μHigh, β2 = μMod − μHigh, β3 = μProd − μCons (c) E(y) = β0 + β1x1 + β2x2 + β3x3 + β4x1x3 + β5x2x3 (d) β0 = μHigh/Cons, β1 = (μLow − μHigh) for consumers, β2 = (μMod − μHigh) for consumers, β3 = (μProd − μCons) for high, β4 = (μLow − μHigh)Prod − (μLow − μHigh)Cons, β5 = (μMod − μHigh)Prod − (μMod − μHigh)Cons (e) nested F-test of H0: β4 = β5 = 0
5.55 (a) E(y) = β0 + β1x1 + β2x2 + β3x3 (b) t-test of H0: β3 = 0 vs Ha: β3 < 0 (c) E(y) = β0 + β1x1 + β2x2 + β3x1x2 + β4x1² + β5x2² + β6x3 + β7x1x3 + β8x2x3 + β9x1x2x3 + β10x1²x3 + β11x2²x3 (d) nested F-test of H0: β1 = β3 = β4 = β7 = β9 = β10 = 0

Chapter 6
6.1 (a) x2 (b) yes (c) fit all possible 2-variable models, E(y) = β0 + β1x2 + β2xj
6.3 (a) 11 (b) 10 (c) (d) E(y) = β0 + β1x11 + β2x4 + β3x2 + β4x7 + β5x10 + β6x1 + β7x9 + β8x3 (e) 67.7% of sample variation in satisfaction is explained by the model (f) no interactions or higher-order terms tested
6.5 (a) x5, x6, and x4 (b) no (c) E(y) = β0 + β1x4 + β2x5 + β3x6 + β4x4x5 + β5x4x6 + β6x5x6 (d) nested F-test of H0: β4 = β5 = β6 = 0 (e) consider interaction and higher-order terms
6.7 (a) (i) 4; (ii) 6; (iii) 4; (iv) (b) (i) 213, 193.8, 2.5, 10,507; (ii) 247, 189.1, 2.3, 10,494; (iii) 266, 188.2, 3.1, 10,489; (iv) 268, 191.7, 5.0, 10,710 (d) x2, x3, x4
6.9 yes; DOT estimate and
low-bid-estimate ratio
6.11 Stepwise: well depth and percent of land allocated to industry

Chapter 7
7.1 model less reliable
7.3 (a) x = ln(p) (b) yes, t = −15.89 (c) (924.5, 975.5)
7.5 (a) no (b) no
7.7 (a) no (b) yes
7.9 Unable to test model adequacy since df(Error) = n − 3 = 0
7.11 (a) ŷ = 2.74 + .80x1; yes, t = 15.92 (b) ŷ = 1.66 + 12.40x2; yes, t = 11.76 (c) ŷ = −11.80 + 25.07x3; yes, t = 2.51 (d) yes
7.13 (a) multicollinearity (b) no, β3 not estimable
7.15 no multicollinearity, use DOTEST and LBERATIO
7.17 Two levels each; n >
7.19 yes, high VIFs for Inlet-temp, Airflow and Power; include only one of these variables
7.21 (a) .0025; no (b) .434; no (c) no (d) ŷ = −45.154 + 3.097x1 + 1.032x2, F = 39,222.34, reject H0: β1 = β2 = 0; R² = .9998 (e) −.8998; high correlation (f) no
7.23 df(Error) = 0, s undefined, no test of model adequacy

Chapter 8
8.1 (a) ŷ = 2.588 + .541x (b) −.406, −.206, −.047, .053, .112, .212, .271, .471, .330, .230, −.411, −.611 (c) Yes; needs curvature
8.3 (a) ŷ = 40.35 − .207x (b) −4.64, −3.94, −1.83, .57, 2.58, 1.28, 4.69, 4.09, 4.39, 2.79, .50, 1.10, −6.09, −5.49 (c) Yes; needs curvature (d) ŷ = −1051 + 66.19x − 1.006x²; yes, t = −11.80
8.5 ŷ = 30,856 − 191.57x; yes, quadratic trend; yes
8.7 (a) −389, −178, 496, …, 651 (b) No trends (c) No trends (d) No trends (e) No trends
8.9 Yes; assumption of constant variance appears satisfied; add curvature
8.11 Yes
8.13 (a) Yes; assumption of equal variances violated (b) Use transformation y* = √y
8.15 (a) ŷ = .94 − .214x (b) 0, .02, −.026, .034, .088, −.112, −.058, .002, .036, .016 (c) Unequal variances (d) Use transformation y* = sin⁻¹√y (e) ŷ* = 1.307 − .2496x; yes
8.17 (a) Lagos: −.223 (b) yes
8.19 No; remove outliers or use a normalizing transformation
8.21 Residuals are approximately normal
8.23 No outliers
8.25 Observations #8 and #3 are influential
8.27 No outliers
8.29 Several, including wells 4567 and 7893; both influential; possibly delete
8.31 Inflated t-statistics for testing model parameters
8.33
(b) Model adequate at α = .05 for all banks except bank (c) Reject H0 (two-tailed at α = .05) for banks 2, 5; fail to reject H0 for banks 4, 6, 8; test inconclusive for banks 1, 3, 7,
8.35 (a) Yes; residual correlation (b) d = .058, reject H0 (c) Normal errors
8.37 (a) ŷ = 1668.44 + 105.83t; yes, t = 2.11, reject H0 (b) Yes (c) d = .845, reject H0
8.39 (a) Misspecified model; quadratic term missing (b) Unequal variances (c) Outlier (d) Unequal variances (e) Nonnormal errors
8.41 Assumptions reasonably satisfied
8.43 Assumptions reasonably satisfied
8.45 (a) ŷ = −3.94 + .082x (b) R² = .372; F = 2.96, p-value = .146, model not useful (d) Obs. for horse #3 is outside 2s interval (e) Yes; R² = .970; F = 130.71, p-value = 0
8.47 Possible violation of constant variance assumption
8.49 Violation of constant variance assumption; use variance-stabilizing transformation ln(y)

Chapter 9
9.1 (a) E(y) = β0 + β1x1 + β2(x1 − 15)x2, where x1 = x and x2 = {1 if x1 > 15, 0 if not} (b) x ≤ 15: y-intercept = β0, slope = β1; x > 15: y-intercept = β0 − 15β2, slope = β1 + β2 (c) t-test for H0: β2 = 0
9.3 (a) E(y) = β0 + β1x1 + β2(x1 − 320)x2 + β3x2, where x1 = x and x2 = {1 if x1 > 320, 0 if not} (b) x ≤ 320: y-intercept = β0, slope = β1; x > 320: y-intercept = β0 − 320β2 + β3, slope = β1 + β2 (c) nested F-test for H0: β2 = β3 = 0
9.5 (a) and (b) E(y) = β0 + β1x1 + β2(x1 − 4)x2 + β3(x1 − 7)x3, where x1 = x, x2 = {1 if x1 > 4, 0 if not}, x3 = {1 if x1 > 7, 0 if not} (c) x ≤ 4: β1; 4 < x ≤ 7: (β1 + β2); x > 7: (β1 + β2 + β3) (d) for every 1-point increase in performance over range x ≤ 4, satisfaction increases 5.05 units
9.7 (a) yes; 3.55 (b) E(y) = β0 + β1x1 + β2(x1 − 3.55)x2, where x1 = load and x2 = {1 if load > 3.55, 0 if not} (c) ŷ = 2.22 + .529x1 + 2.63(x1 − 3.55)x2; R² = .994, F = 1371, p-value = 0, reject H0
9.9 135.1 ± 74.1
9.11 1.93 ± 16.45
9.13 (a) ŷ = −2.03 + 6.06x (b) t = 10.35, reject H0 (c) 1.985 ± .687
9.15 (a) ŷ = −3.367 + .194x; yes, t = 4.52
(p-value = .0006) (b) Residuals: −1.03, 6.97, −5.03, −2.03, 1.97, −6.73, 3.27, −5.73, 9.27, −1.73, −9.43, 12.57, −8.43, 2.57, 3.57; as speed increases, variation tends to increase (c) wi = 1/xi²; x = 100: 20.7; x = 150: 44.3; x = 200: 84.3 (d) ŷ = −3.057 + .192x; s = .044
9.17 (a) ŷ = 140.6 − .67x; assumption violated (b) w = 1/x̄
9.19 violation of constant error variance assumption; violation of assumption of normal errors; predicted y is not bounded between 0 and 1
9.21 (a) β1 = change in P(hired) for every 1-year increase in higher education; β2 = change in P(hired) for every 1-year increase in experience; β3 = P(hired)Males − P(hired)Females (b) ŷ = −.5279 + .0750x1 + .0747x2 + .3912x3 (c) F = 21.79, reject H0 (d) yes; t = 4.01 (e) (−.097, .089)
9.23 (a) P(Maryland nappe) (b) π* = β0 + β1x, where π* = ln{π/(1 − π)} (c) change in log-odds of a Maryland nappe for every 1-degree increase in FIA (d) exp(β0 + 80β1)/{1 + exp(β0 + 80β1)}
9.25 (a) χ² = 20.43, reject H0 (b) Yes; χ² = 4.63 (c) (.00048, .40027)
9.27 (a) P(landfast ice) (b) π* = β0 + β1x1 + β2x2 + β3x3, where π* = ln{π/(1 − π)} (c) π̂* = .30 + 4.13x1 + 47.12x2 − 31.14x3 (d) χ² = 70.45, reject H0 (e) π* = β0 + β1x1 + β2x2 + β3x3 + β4x1x2 + β5x1x3 + β6x2x3 (f) π̂* = 6.10 − 3.00x1 + 10.56x2 − 39.69x3 + 50.49x1x2 − 6.14x1x3 + 56.24x2x3 (g) χ² = 32.19, reject H0; interaction model is better

Chapter 10
10.1 (a) yes; yes (b) 366.3, 335.8, 310.8, 284.0, 261.5, 237.0, 202.3, 176.8, 155.5 (c) yes (d) 81.9 (e) 113.5 (f) Quarter 1: 94.2; Quarter 2: 107.8
10.3 (a) Moving average: 16.2; Exponential smoothing: 107; Holt–Winters: 35.9 (b) Moving average: 16.2; Exponential smoothing: 109.4; Holt–Winters: 43.9 (c) Moving average
10.5 (a) Yes (b) forecast = 231.7 (c) forecast = 205.4 (d) forecast = 232.8
10.7 (a) yes (b) 2006: 495; 2007: 540; 2008: 585 (d) 2006: 435.6; 2007: 435.6; 2008: 435.6 (e) 2006: 480.85; 2007: 518.1; 2008: 555.35 (f) moving average most accurate
10.9 (a) β̂0 = 4.36: estimated
price is $4.36 in 1990; β̂1 = .455: for each additional year, price increases by $.455 (b) t = 8.43, p-value = 0, reject H0 (c) 2008: (9.742, 15.357); 2009: (10.151, 15.858) (d) extrapolation; no account taken of cyclical trends
10.11 (a) E(yt) = β0 + β1t + β2Q1 + β3Q2 + β4Q3 (b) ŷt = 119.85 + 16.51t + 262.34Q1 + 222.83Q2 + 105.51Q3; F = 117.82, p-value = 0, reject H0 (c) independent error (d) Q1: 728.95, (662.8, 795.1); Q2: 705.95, (639.8, 772.1); Q3: 605.15, (539.0, 671.3); Q4: 516.15, (450.0, 582.3)
10.13 (a) yes (b) ŷt = 39.49 + 19.13t − 1.315t² (d) (−31.25, 48.97)
10.15 (a) no, t = −1.39 (b) 2003.48 (c) no, t = −1.61 (d) 1901.81
10.17 (a) 0, 0, 0, .5, 0, 0, 0, .25, 0, 0, 0, .125, 0, 0, 0, .0625, 0, 0, 0, .03125 (b) .5, .25, .125, .0625, .03125, .0156, ...
10.19 Rt = φ1Rt−1 + φ2Rt−2 + φ3Rt−3 + φ4Rt−4 + εt
10.21 (a) E(yt) = β0 + β1x1t + β2x2t + β3x3t + β4t (b) E(yt) = β0 + β1x1t + β2x2t + β3x3t + β4t + β5x1tt + β6x2tt + β7x3tt (c) Rt = φRt−1 + εt
10.23 (a) E(yt) = β0 + β1[cos(2π/365)t] + β2[sin(2π/365)t] (c) E(yt) = β0 + β1[cos(2π/365)t] + β2[sin(2π/365)t] + β3t + β4t[cos(2π/365)t] + β5t[sin(2π/365)t] (d) no; Rt = φRt−1 + εt
10.25 (a) yt = β0 + β1t + φRt−1 + εt (b) ŷt = 11,374 + 160.23t + .3743R̂t−1 (d) R² = .9874, s = 115.13
10.27 (a) yes, upward trend (b) yt = β0 + β1t + β2t² + φRt−1 + εt (c) ŷt = 263.14 + 1.145t + .056t² + .792R̂t−1; R² = .747; t = 3.66, p-value = .0004, reject H0
10.29 (a) F49 = 336.91; F50 = 323.41; F51 = 309.46 (b) t = 49: 336.91 ± 6.48; t = 50: 323.41 ± 8.34; t = 51: 309.46 ± 9.36
10.35 (a) 2136.2 (b) (1404.3, 3249.7) (c) 1944; (1301.8, 2902.9)
10.37 (a) yes; possibly (b) 64 (d) 86.6 (e) 73.5 (f) yt = β0 + β1t + β2S1 + β3S2 + β4S3 + φRt−1 + εt, where S1 = {1 if Jan/Feb/Mar, 0 if not}, S2 = {1 if Apr/May/Jun, 0 if not}, S3 = {1 if Jul/Aug/Sep, 0 if not} (g) ŷt = 101.69 − 1.47t + 13.32S1 + 29.94S2 + 32.45S3 − .543R̂t−1 (h) 96.1 ± 11
10.39 (a) yes, curvilinear trend (b) yt = β0 + β1t + β2t² +
εt (c) ŷt = 189.03 + 1.38t − .041t² (e) yes (f) Durbin–Watson test; d = .96, reject H0 (g) yt = β0 + β1t + β2t² + φRt−1 + εt (h) ŷt = 188.7 + 1.39t − .04t² + .456R̂t−1
10.41 (a) E(yt) = β0 + β1t + β2t² + β3x, where x = {1 if Jan–Apr, 0 if not} (b) add interaction terms
10.43 (a) μPost − μPre (b) μPre (c) −.55 (d) 2.53

Chapter 11

11.1 (a) Noise (variability) and volume (n) (b) remove noise from an extraneous source of variation
11.3 (a) cockatiel (b) yes; completely randomized design (c) experimental group (d) 1, 2, 3 (e) 3 (f) total consumption (g) E(y) = β0 + β1x1 + β2x2, where x1 = {1 if group 1, 0 if not}, x2 = {1 if group 2, 0 if not}
11.5 (a) yB1 = β0 + β2 + β4 + εB1; yB2 = β0 + β2 + β5 + εB2; ... ; yB,10 = β0 + β2 + εB,10; ȳB = β0 + β2 + (β4 + β5 + ··· + β12)/10 + ε̄B (b) yD1 = β0 + β4 + εD1; yD2 = β0 + β5 + εD2; ... ; yD,10 = β0 + εD,10; ȳD = β0 + (β4 + β5 + ··· + β12)/10 + ε̄D
11.7 ability to investigate factor interaction
11.9 (a) students (b) yes; factorial design (c) class standing and type of preparation (d) class standing: low, medium, high; type of preparation: review session and practice exam (e) (low, review), (medium, review), (high, review), (low, practice), (medium, practice), (high, practice) (f) final exam score
11.11 (a) training method, practice session, task consistency (b) task consistency is QL, others are QN (c) 48; (CC/1/100), (AC/1/100), (CC/2/100), (AC/2/100), ..., (CC/6/33), (AC/6/33)
11.13 (a) E(y) = β0 + β1x1 + β2x2 + β3x3 + β4x1x2 + β5x1x3, where x1 is the dummy variable for QL factor A; x2, x3 are dummy variables for QL factor B (b) E(y) = β0 + β1x1 + β2x2 + β3x3 + β4x4 + β5x5 + β6x1x2 + β7x1x3 + β8x1x4 + β9x1x5 + β10x2x4 + β11x2x5 + β12x3x4 + β13x3x5 + β14x1x2x4 + β15x1x2x5 + β16x1x3x4 + β17x1x3x5, where x1 = QN factor A; x2, x3 are dummy variables for QL factor B; x4, x5 are dummy variables for QL factor C
11.15 Cannot investigate factor interaction
11.17 11
11.19 sample size (n) and
standard deviation of the estimator
11.21 Step
11.23 treatments: A1B1, A1B2, A1B3, A1B4, A2B1, A2B2, A2B3, A2B4
11.25 E(y) = β0 + β1x1 + β2x2 + β3x3 + β4x4 + β5x5; 10
11.27 (a) Sex and weight (b) Both qualitative (c) 4; (ML), (MH), (FL), and (FH)

Chapter 12

12.3 (a) E(y) = β0 + β1x, where x = {1 if treatment 1, 0 if treatment 2} (b) ŷ = 10.667 − 1.524x; t = −1.775, fail to reject H0
12.5 (a) t = −1.78; do not reject H0 (c) Two-tailed
12.7 (a) completely randomized design (b) colonies 3, 6, and 12; energy expended (c) H0: μ1 = μ2 = μ3 = μ4 (d) Reject H0
12.9 (a) H0: μ1 = μ2 = μ3 (b) E(y) = β0 + β1x1 + β2x2 (c) Reject H0 (d) Fail to reject H0
12.11 (a)
  Source    df      SS      MS      F
  Solvent    2  3.3054  1.6527  24.51
  Error     29  1.9553  0.0674
  Total     31  5.2607
(b) F = 24.51, reject H0
12.13 F = 7.69, reject H0
12.15 F = 9.97, reject H0
12.17 (a) H0: μ1 = μ2 = μ3 (b)
  Source    df       SS     MS     F
  Level      2    6.643  3.322  0.45
  Error     72  527.357  7.324
  Total     74  534.000
(c) fail to reject H0
12.19 (a) same subjects used in parts A and B (b) response = WTP; treatments = A and B; blocks = subjects (c) treatments: H0: μA = μB
12.21 No evidence of a difference among the three plant session means; F = .019
12.23 (a) same genes examined across all three conditions (b) H0: μFull-Dark = μTR-Light = μTR-Dark (c) F = 5.33, reject H0
12.25 (a) E(y) = β0 + β1x1 + β2x2 + β3x3 + β4x4 + β5x5 + ··· + β10x10, where x1, x2, and x3 are dummy variables for interventions (treatments), x4, x5, ..., x10 are dummy variables for boxers (blocks) (b) E(y) = β0 + β4x4 + β5x5 + ··· + β10x10, where x4, x5, ..., x10 are dummy variables for boxers (blocks) (c) E(y) = β0 + β1x1 + β2x2 + β3x3, where x1, x2, and x3 are dummy variables for interventions (treatments)
12.27 Yes, F = 34.12
12.29 (a) factorial design (b) level of coagulant (5, 10, 20, 50, 100, and 200); acidity level (4, 5, 6, 7, 8, and 9); 36 combinations
12.31 (a) df(AGE) = 2, df(BOOK) = 2, df(AGE × BOOK) = 4,
df(ERROR) = 99 (b) 3 × 3 = 9 (c) reject H0; sufficient evidence of interaction (d) no
12.33 (a) × = (b) the difference between mean FSS values for normal sleepers and insomniacs is independent of education level (c) μInsomnia > μNormal (d) Mean FSS values at different education levels are significantly different
12.35 (a) E(y) = β0 + β1x1 + β2x2 + β3x3 + β4x4 + β5x1x3 + β6x1x4 + β7x2x3 + β8x2x4, where x1 and x2 are dummy variables for Antimony, x3 and x4 are dummy variables for Method (b)
  Source          df      SS     MS      F
  Amount           3  104.19  34.73  20.12
  Method           3   28.63   9.54   5.53
  Amount×Method    9   25.13   2.79   1.62
  Error           32   55.25   1.73
  Total           47  213.20
(c) Do not reject H0; F = 1.62 (d) The difference in mean shear strengths for any two levels of antimony amount does not depend on cooling method (e) Amount: reject H0, F = 20.12; Method: reject H0, F = 5.53
12.37 (a) yes (b) reject H0; evidence of interaction (c) no
12.39 Interaction: fail to reject H0, F = 1.77; Preparation: reject H0, F = 14.40; Standing: fail to reject H0, F = 2.17
12.41 (a) E(y) = β0 + β1x1 + β2x2 + β3x1x2, where x1 = {1 if Low load, 0 if High load}, x2 = {1 if Ambiguous, 0 if Common} (b) β̂0 = ȳHigh/Common = 6.3, β̂1 = ȳLow/Common − ȳHigh/Common = 1.5, β̂2 = ȳAmbig/High − ȳCommon/High = −.2, β̂3 = (ȳLow/Common − ȳHigh/Common) − (ȳLow/Ambig − ȳHigh/Ambig) = −10.2 (c) 9,120.25 (d) SS(Load) = 1,122.25; SS(Name) = 625; SS(Load × Name) = 676 (e) 5,400; 2,166; 2,166; 2,400 (f) 12,132 (g) 14,555.25 (h)
  Source       df         SS        MS     F
  Load          1   1,122.25  1,122.25  8.88
  Name          1     625.00    625.00  4.95
  Load×Name     1     676.00    676.00  5.35
  Error        96  12,132.00   126.375
  Total        99  14,555.25
12.43 (a) dissolution time (b) Binding agent (khaya gum, PVP), Concentration (.5%, 4%), and Relative density (low, high) (c) factorial design (d) 8 (e) E(y) = β0 + β1x1 + β2x2 + β3x3 + β4x1x2 + β5x1x3 + β6x2x3 + β7x1x2x3, where x1 = {1 if khaya gum, 0 if PVP}, x2 = {1 if .5%, 0 if 4%}, and x3 = {1 if low density, 0 if high density} (f) yes
12.45 (a) E(y) = β0 + β1x1 + β2x2 + β3x3 + β4x4 + β5x1x2 + β6x1x3 + β7x1x4 + β8x2x3 + β9x2x4 + β10x3x4 + β11x1x2x3 + β12x1x2x4 + β13x2x3x4 + β14x1x2x3x4, where x1 = {1 if high level of Agent-to-mineral, 0 if not}, x2 = {1 if high level of Collector-to-mineral, 0 if not}, x3 = {1 if high level of Liquid-to-solid, 0 if not}, x4 = {1 if foaming agent SABO, 0 if PO} (b) df(Error) = 0 (c) E(y) = β0 + β1x1 + β2x2 + β3x3 + β4x4 + β5x1x2 + β6x1x3 + β7x1x4 + β8x2x3 + β9x2x4 + β10x3x4 (d) ŷ = 7.03 + .205x1 + .327x2 + .12x3 − 1.09x4 − .038x1x2 + .137x1x3 + .183x1x4 + .042x2x3 + .428x2x4 + .282x3x4 (e) only significant interaction is Collector × Foaming agent (f) perform main-effect tests for Agent-to-mineral mass ratio (not significant) and Liquid-to-solid ratio (not significant)
12.47 (a) E(y) = β0 + β1x1 + β2x1² (b) E(y) = (β0 + β3) + (β1 + β6)x1 + (β2 + β9)x1² (c) E(y) = (β0 + β3 + β4 + β5) + (β1 + β6 + β7 + β8)x1 + (β2 + β9 + β10 + β11)x1² (d) ŷ = 31.15 + .153x1 − .00396x1² + 17.05x2 + 19.1x3 − 14.3x2x3 + .151x1x2 + .017x1x3 − .08x1x2x3 − .00356x1²x2 + .0006x1²x3 + .0012x1²x2x3 (e) Rolled/inconel: ŷ = 53 + .241x1 − .00572x1²; Rolled/incoloy: ŷ = 50.25 + .17x1 − .00336x1²; Drawn/inconel: ŷ = 48.2 + .304x1 − .00752x1²; Drawn/incoloy: ŷ = 31.15 + .153x1 − .00396x1²
12.49 (a) 12 (b) μFine > (μMedium, μCoarse)
12.53 (a) Policy 1 mean differs from each of policies 3–18; 2 differs from 4–18; 3 differs from 5–18; 4 differs from 8–18; 5, 6, and 7 differ from 9–18; 8 differs from 12–18; 9, 10, and 11 differ from 16–18 (b) Yes
12.55 ω = 1.82; (μ5, μ3, μ0) > μ10
12.57 (a) μWhat-B > (μHow-A, μWho-B, μWho-A); (μWho-C, μHow-C) > (μWho-B, μWho-A) (b) probability of making at least one Type I error in the multiple comparisons
12.59 (a) yes (b) yes (c) no (d) .05 (e) no significant differences in means found for girls
12.61 (a) Reject H0 (b) μQ > (μS, μC)
12.63 (a) mean number of alternatives differs for the emotional states (b) design not balanced (c) probability of making at least one Type I
error (d) μGuilt > (μNeutral, μAngry)
12.65 (μMRB-2, μMRB-3) > (μSD, μSWRA); μMRB-1 > μSWRA
12.67 variance assumption violated
12.69 approximately satisfied
12.71 approximately satisfied
12.73 (a) completely randomized (b) A/R, A/P, Control (c) df(Groups) = 2, df(Error) = 42, SSE = 321.47, MST = 35.755, F = 4.67 (d) reject H0: μA/R = μA/P = μControl (e) partially; (μA/R, μControl)(μF, μS); μA > μS
12.77 (a) F = 0.61, fail to reject H0 (b) F = 0.61, fail to reject H0
12.79 (a) Evidence of N × I interaction; ignore tests for main effects (b) Agree; interaction implies differences among N means depend on level of I
12.81 (a) Five insecticides; seven locations (b) Fail to reject H0 at α = .10, F = 2.11; no
12.83 (a) df(A) = 1, df(C) = 2, df(A × C) = 2, df(Total) = 134 (b) Reject H0 (c) Reject H0 (d) main-effect tests assume no interaction
12.85 (a) luckiness (lucky, unlucky, and uncertain); competition (competitive and noncompetitive) (b) Interaction: F = .72, do not reject H0; Luckiness: F = 1.39, do not reject H0; Competition: F = 2.84, do not reject H0
12.87 (a) Factorial (b) No replications (c) E(y) = β0 + β1x1 + β2x2 + β3x1x2 + β4x1² + β5x2² (d) Test H0: β3 = 0 (e) ŷ = −384.75 + 3.73x1 + 12.72x2 − .05x1x2 − .009x1² − .322x2²; t = −2.05, reject H0 (p-value = .07)
12.89 (a) (Down, left), (Down, right), (Control, left), (Control, right) (d) reject H0 (e) μDL
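Several of the answers above (e.g., Exercises 8.35, 8.37, and 10.39) report a Durbin–Watson statistic d = Σ(êt − êt−1)²/Σêt² for detecting first-order residual autocorrelation. A minimal sketch in Python, using made-up residuals (not data from any exercise), shows the computation:

```python
def durbin_watson(residuals):
    """Durbin-Watson statistic: d = sum((e_t - e_{t-1})^2) / sum(e_t^2).

    Values near 2 suggest no first-order autocorrelation;
    values near 0 suggest positive autocorrelation.
    """
    num = sum((residuals[t] - residuals[t - 1]) ** 2
              for t in range(1, len(residuals)))
    den = sum(e * e for e in residuals)
    return num / den

# Hypothetical residuals with long same-sign runs, as when a trend is ignored
resid = [1.0, 1.2, 1.1, 0.9, -0.8, -1.0, -1.1, -0.9]
print(round(durbin_watson(resid), 3))  # prints 0.378 (small d: positive autocorrelation)
```

A small d, as here, is compared against the lower tabled bound dL to decide whether to reject H0 of uncorrelated errors, as in Exercise 10.39(f).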