Statistics for Business and Economics, 7th Edition
Chapter 12: Multiple Regression
Copyright © 2010 Pearson Education, Inc., publishing as Prentice Hall

Chapter Goals
After completing this chapter, you should be able to:
- Apply multiple regression analysis to business decision-making situations
- Analyze and interpret the computer output for a multiple regression model
- Perform a hypothesis test for all regression coefficients or for a subset of coefficients
- Fit and interpret nonlinear regression models
- Incorporate qualitative variables into the regression model by using dummy variables
- Discuss model specification and analyze residuals

12.1 The Multiple Regression Model
Idea: examine the linear relationship between a dependent variable (Y) and two or more independent variables (Xi).
Multiple regression model with k independent variables:

    Y = β0 + β1X1 + β2X2 + ... + βkXk + ε

where β0 is the Y-intercept, β1, ..., βk are the population slopes, and ε is the random error.

Multiple Regression Equation
The coefficients of the multiple regression model are estimated using sample data. The multiple regression equation with k independent variables is

    ŷi = b0 + b1x1i + b2x2i + ... + bkxki

where ŷi is the estimated (or predicted) value of y, b0 is the estimated intercept, and b1, ..., bk are the estimated slope coefficients. In this chapter we will always use a computer to obtain the regression slope coefficients and other regression summary measures.

Multiple Regression Equation (continued)
Two-variable model: ŷ = b0 + b1x1 + b2x2.
[Figure: the fitted plane ŷ = b0 + b1x1 + b2x2 in (x1, x2, y) space, with slope b1 for variable x1 and slope b2 for variable x2.]

Multiple Regression Model
Two-variable model with a sample observation (x1i, x2i, yi): the residual is ei = yi − ŷi. The best-fit equation, ŷ = b0 + b1x1 + b2x2, is found by minimizing the sum of squared errors, Σe².

Standard Multiple Regression Assumptions
- The values xi and the error terms εi are independent.
- The error terms are random variables with mean 0 and a constant variance, σ²:

    E[εi] = 0 and E[εi²] = σ² for i = 1, ..., n

(The constant-variance property is called homoscedasticity.)

Standard Multiple Regression Assumptions (continued)
- The random error terms εi are not correlated with one another, so that

    E[εiεj] = 0 for all i ≠ j

- It is not possible to find a set of numbers c0, c1, ..., ck, not all zero, such that

    c0 + c1x1i + c2x2i + ... + ckxki = 0

(This is the property of no linear relation among the Xj's.)

Example: Independent Variables
A distributor of frozen dessert pies wants to evaluate factors thought to influence demand.
- Dependent variable: pie sales (units per week)
- Independent variables: price (in $) and advertising ($100s)
Data are collected for 15 weeks.

Pie Sales Example

Week   Pie Sales   Price ($)   Advertising ($100s)
  1       350        5.50            3.3
  2       460        7.50            3.3
  3       350        8.00            3.0
  4       430        8.00            4.5
  5       350        6.80            3.0
  6       380        7.50            4.0
  7       430        4.50            3.0
  8       470        6.40            3.7
  9       450        7.00            3.5
 10       490        5.00            4.0
 11       340        7.20            3.5
 12       300        7.90            3.2
 13       440        5.90            4.0
 14       450        5.00            3.5
 15       300        7.00            2.7

Multiple regression equation: Sales = b0 + b1(Price) + b2(Advertising)

Example: Quadratic Model (continued)
Simple regression results: ŷ = −11.283 + 5.985(Time)

             Coefficients   Standard Error    t Stat      P-value
Intercept      −11.28267       3.46805       −3.25332     0.00691
Time             5.98520       0.30966       19.32819     2.078E-10

Regression statistics: R Square 0.96888, Adjusted R Square 0.96628, Standard Error 6.15997, F 373.57904, Significance F 2.0778E-10.

The t statistic, F statistic, and R² are all high, but the residuals are not random.
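The pie-sales regression above can be reproduced directly with ordinary least squares. A minimal sketch in Python (NumPy only; the data are the 15 weeks from the table, and the variable names are mine, not the textbook's):

```python
import numpy as np

# 15 weeks of pie sales, price ($), and advertising ($100s) from the table above
sales = np.array([350, 460, 350, 430, 350, 380, 430, 470, 450,
                  490, 340, 300, 440, 450, 300], dtype=float)
price = np.array([5.50, 7.50, 8.00, 8.00, 6.80, 7.50, 4.50, 6.40,
                  7.00, 5.00, 7.20, 7.90, 5.90, 5.00, 7.00])
advert = np.array([3.3, 3.3, 3.0, 4.5, 3.0, 4.0, 3.0, 3.7, 3.5,
                   4.0, 3.5, 3.2, 4.0, 3.5, 2.7])

# Design matrix with an intercept column: Sales = b0 + b1*Price + b2*Advertising
X = np.column_stack([np.ones_like(sales), price, advert])
b, *_ = np.linalg.lstsq(X, sales, rcond=None)
b0, b1, b2 = b
print(f"Sales = {b0:.3f} + {b1:.3f}(Price) + {b2:.3f}(Advertising)")
```

As expected for demand data, the price coefficient comes out negative and the advertising coefficient positive; the computer output discussed in the chapter reports the same b0, b1, b2 along with standard errors and t statistics.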
Example: Quadratic Model (continued)
Quadratic regression results: ŷ = 1.539 + 1.565(Time) + 0.245(Time)²

               Coefficients   Standard Error    t Stat     P-value
Intercept        1.53870        2.24465        0.68550    0.50722
Time             1.56496        0.60179        2.60052    0.02467
Time-squared     0.24516        0.03258        7.52406    1.165E-05

Regression statistics: R Square 0.99494, Adjusted R Square 0.99402, Standard Error 2.59513, F 1080.7330, Significance F 2.368E-13.

The quadratic term is significant and improves the model: R² is higher, se is lower, and the residuals are now random.

The Log Transformation
The multiplicative model:
- Original multiplicative model: Y = β0 X1^β1 X2^β2 ε
- Transformed multiplicative model: log(Y) = log(β0) + β1 log(X1) + β2 log(X2) + log(ε)

Interpretation of Coefficients
For the multiplicative model, log Yi = log β0 + β1 log X1i + β2 log X2i + log εi. When both the dependent and independent variables are logged, the coefficient of the independent variable Xk can be interpreted as follows: a 1 percent change in Xk leads to an estimated bk percent change in the average value of Y. That is, bk is the elasticity of Y with respect to a change in Xk.

12.8 Dummy Variables
A dummy variable is a categorical independent variable with two levels:
- yes or no, on or off, male or female
- recorded as 0 or 1
Regression intercepts are different if the variable is significant. The model assumes equal slopes for the other variables. If a variable has more than two levels, the number of dummy variables needed is (number of levels − 1).

Dummy Variable Example
ŷ = b0 + b1x1 + b2x2
Let: y = pie sales, x1 = price, and x2 = holiday (x2 = 1 if a holiday occurred during the week; x2 = 0 if there was no holiday that week).
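The elasticity interpretation of the log transformation can be checked numerically: generate data from a hypothetical multiplicative model with known elasticities (my own illustrative values, 0.8 and −0.5, with the error term omitted), then fit the log-log regression and recover them:

```python
import numpy as np

# Hypothetical multiplicative model Y = 2.0 * X1^0.8 * X2^(-0.5), no error term,
# so the log-log least-squares fit recovers the elasticities exactly
rng = np.random.default_rng(0)
x1 = rng.uniform(1.0, 10.0, 50)
x2 = rng.uniform(1.0, 10.0, 50)
y = 2.0 * x1**0.8 * x2**-0.5

# Transformed model: log(Y) = log(b0) + b1*log(X1) + b2*log(X2)
X = np.column_stack([np.ones(50), np.log(x1), np.log(x2)])
coef, *_ = np.linalg.lstsq(X, np.log(y), rcond=None)
log_b0, b1, b2 = coef
print(b1, b2)  # elasticities: 0.8 and -0.5 (up to rounding)
```

Here b1 = 0.8 means a 1% increase in X1 raises the average value of Y by about 0.8%, and b2 = −0.5 means a 1% increase in X2 lowers it by about 0.5%, exactly the interpretation given above.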
Dummy Variable Example (continued)
Holiday week (x2 = 1): ŷ = b0 + b1x1 + b2(1) = (b0 + b2) + b1x1
Non-holiday week (x2 = 0): ŷ = b0 + b1x1 + b2(0) = b0 + b1x1
Different intercepts, same slope.
[Figure: two parallel lines in the (price, sales) plane — intercept b0 + b2 for holiday weeks (x2 = 1) and b0 for non-holiday weeks (x2 = 0), both with slope b1.]
If H0: β2 = 0 is rejected, then "holiday" has a significant effect on pie sales.

Interpreting the Dummy Variable Coefficient
Example: Sales = 300 − 30(Price) + 15(Holiday)
- Sales: number of pies sold per week
- Price: pie price in $
- Holiday: 1 if a holiday occurred during the week, 0 if no holiday occurred
b2 = 15: on average, sales were 15 pies greater in weeks with a holiday than in weeks without a holiday, given the same price.

Interaction Between Explanatory Variables
Hypothesizes interaction between pairs of x variables: the response to one x variable may vary at different levels of another x variable. The model contains two-way cross-product terms:

    ŷ = b0 + b1x1 + b2x2 + b3x3 = b0 + b1x1 + b2x2 + b3(x1x2)

Effect of Interaction
Given: Y = β0 + β2X2 + (β1 + β3X2)X1 = β0 + β1X1 + β2X2 + β3X1X2
- Without the interaction term, the effect of X1 on Y is measured by β1.
- With the interaction term, the effect of X1 on Y is measured by β1 + β3X2, so the effect changes as X2 changes.

Interaction Example
Suppose x2 is a dummy variable and the estimated regression equation is ŷ = 1 + 2x1 + 3x2 + 4x1x2.
- x2 = 1: ŷ = 1 + 2x1 + 3(1) + 4x1(1) = 4 + 6x1
- x2 = 0: ŷ = 1 + 2x1 + 3(0) + 4x1(0) = 1 + 2x1
The slopes are different if the effect of x1 on y depends on the value of x2.

Significance of Interaction Term
The coefficient b3 is an estimate of the difference in the coefficient of x1 when x2 = 1
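The two sub-equations in the interaction example above can be verified by evaluating the fitted equation directly, a quick sanity check on how the dummy shifts both intercept and slope:

```python
# Fitted interaction equation from the example: y_hat = 1 + 2*x1 + 3*x2 + 4*x1*x2
def y_hat(x1, x2):
    return 1 + 2 * x1 + 3 * x2 + 4 * x1 * x2

# Slope of each subgroup = change in y_hat as x1 goes from 0 to 1
slope0 = y_hat(1, 0) - y_hat(0, 0)   # x2 = 0 line: 1 + 2*x1, slope 2
slope1 = y_hat(1, 1) - y_hat(0, 1)   # x2 = 1 line: 4 + 6*x1, slope 6
print(slope0, slope1, slope1 - slope0)  # prints 2 6 4
```

The slope difference, 6 − 2 = 4, is exactly the interaction coefficient b3, which is why the t test on b3 is a test for different slopes between the two subgroups.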
compared to when x2 = 0. The t statistic for b3 can be used to test the hypothesis

    H0: β3 = 0 | β1 ≠ 0, β2 ≠ 0
    H1: β3 ≠ 0 | β1 ≠ 0, β2 ≠ 0

If we reject the null hypothesis, we conclude that there is a difference in the slope coefficient for the two subgroups.

12.9 Multiple Regression Assumptions
Errors (residuals) from the regression model: ei = yi − ŷi.
Assumptions:
- The errors are normally distributed.
- The errors have a constant variance.
- The model errors are independent.

Analysis of Residuals in Multiple Regression
These residual plots are used in multiple regression:
- residuals vs. ŷi
- residuals vs. x1i
- residuals vs. x2i
- residuals vs. time (if time-series data)
Use the residual plots to check for violations of the regression assumptions.

Chapter Summary
- Developed the multiple regression model
- Tested the significance of the multiple regression model
- Discussed adjusted R²
- Tested individual regression coefficients
- Tested portions of the regression model
- Used quadratic terms and log transformations in regression models
- Explained dummy variables
- Evaluated interaction effects
- Discussed using residual plots to check model assumptions
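The residual checks of Section 12.9 can be illustrated numerically with the quadratic example from earlier: a straight-line fit to curved data leaves residuals with a clear pattern, while the quadratic fit does not. A sketch (the data here are generated from the chapter's fitted quadratic with no error term, purely for illustration):

```python
import numpy as np

time = np.arange(1.0, 16.0)                       # periods 1..15
y = 1.539 + 1.565 * time + 0.245 * time**2        # chapter's quadratic, noise-free

# Straight-line fit: residuals are not random -- they track the curvature
lin = np.polyfit(time, y, 1)
resid_lin = y - np.polyval(lin, time)
pattern = np.corrcoef(resid_lin, (time - time.mean())**2)[0, 1]

# Quadratic fit: residuals vanish (a real data set would leave random scatter)
quad = np.polyfit(time, y, 2)
resid_quad = y - np.polyval(quad, time)

print(pattern, np.abs(resid_quad).max())
```

The linear-fit residuals correlate almost perfectly with the squared (centered) time variable, which is exactly the U-shaped pattern a residuals-vs-fitted plot would reveal; adding the quadratic term removes it.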