Thomas, Managerial Economics
Chapter 4: Basic Estimation Techniques
© 2020 McGraw-Hill Education. All rights reserved. Authorized only for instructor use in the classroom.
No reproduction or further distribution permitted without the prior written consent of McGraw-Hill Education.
Learning Objectives
• Set up and interpret simple linear regression equations.
• Estimate intercept and slope parameters of a regression line using the method of least squares.
• Determine statistical significance using either t-tests or p-values associated with parameter estimates.
• Evaluate the “fit” of a regression equation to the data using the R² statistic and test for statistical significance of the whole regression equation using an F-test.
• Set up and interpret multiple regression models.
• Use linear regression techniques to estimate the parameters of two common nonlinear models: quadratic and log-linear regression models.
Basic Estimation
Parameters
• The coefficients in an equation that
determine the exact mathematical relation
among the variables.
Parameter estimation
• The process of finding estimates of the
numerical values of the parameters of an
equation.
Regression Analysis
Regression analysis
• A statistical technique for estimating the
parameters of an equation and testing for
statistical significance.
Dependent variable
• Variable whose variation is to be explained.
Explanatory variables
• Variables that are thought to cause the
dependent variable to take on different values.
Simple Linear Regression
The true regression line relates the dependent variable Y to one explanatory (or independent) variable X:
Y = a + bX
where a is the intercept parameter and b is the slope parameter.
A Hypothetical Regression Model
• Regression line shows the average or
expected value of Y for each level of X
• True (or actual) underlying relation between Y
and X is unknown to the researcher but is to
be discovered by analyzing the sample data
• Random error term
• Unobservable term added to a regression model to capture the effects of all the minor, unpredictable factors that affect Y but cannot reasonably be included as explanatory variables
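One way to see what the random error term does is to simulate it. Below is a minimal sketch in Python; the intercept, slope, and advertising levels are hypothetical values chosen purely for illustration, not the figures used in the textbook's example.

import numpy as np

# Hypothetical true parameters (illustration only, not the textbook's values)
a_true, b_true = 10_000.0, 5.0

rng = np.random.default_rng(0)
advertising = np.array([2_000, 3_000, 4_000, 5_000, 6_000], dtype=float)

# Expected sales on the true regression line: E(Y) = a + bX
expected_sales = a_true + b_true * advertising

# Observed sales = expected value + random error term
random_error = rng.normal(loc=0.0, scale=1_500.0, size=advertising.size)
observed_sales = expected_sales + random_error

for x, ey, y in zip(advertising, expected_sales, observed_sales):
    print(f"X = {x:7.0f}   E(Y) = {ey:9.0f}   observed Y = {y:9.0f}")

Each observed value scatters around the true regression line; that scatter is exactly the effect the random error term is meant to capture.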
Figure 4.1 The True Regression Line: Relating Sales and Advertising Expenditures
Table 4.1 The Impact of Random Effects on January Sales
Data
Time series
• A data set in which the data for the dependent and
explanatory variables are collected over time for a
single firm.
Cross-sectional
• A data set in which the data for the dependent and
explanatory variables are collected from many different
firms or industries at a given point in time.
Scatter diagram
• A graph of the data points in a sample.
Fitting a Regression Line
The population regression line is the equation or line representing the true (or actual) underlying relation between the dependent variable and the explanatory variable(s).
The sample regression line is the line fitted to the sample data points; it serves as the estimate of the population regression line.
Figure 4.2 The Sample Regression Line: Relating Sales and Advertising Expenditures
Method of Least Squares
The method of least squares is a method of estimating the parameters of a linear regression equation by finding the line that minimizes the sum of the squared vertical distances (residuals) from each sample data point to the sample regression line.
Parameter Estimates
• Estimators are the formulas by which the estimates of the parameters are computed.
• Parameter estimates are obtained by substituting sample data into the estimators; they are the values of â and b̂ that minimize the sum of squared residuals.
• The residual is the difference between the actual and fitted values of Y: Yi − Ŷi.
• The process is equivalent to fitting a line through a scatter diagram of the sample data points (a numerical sketch follows below).
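A minimal sketch of the least-squares computation, using made-up sample data (the numbers below are hypothetical, not taken from the textbook's example):

import numpy as np

# Hypothetical sample data (illustrative only)
X = np.array([2.0, 3.0, 4.5, 5.0, 6.5, 8.0])        # explanatory variable
Y = np.array([19.0, 25.0, 34.0, 36.0, 46.0, 52.0])  # dependent variable

# Least-squares estimators:
#   b_hat = sum((Xi - Xbar)(Yi - Ybar)) / sum((Xi - Xbar)^2)
#   a_hat = Ybar - b_hat * Xbar
x_dev = X - X.mean()
y_dev = Y - Y.mean()
b_hat = (x_dev * y_dev).sum() / (x_dev ** 2).sum()
a_hat = Y.mean() - b_hat * X.mean()

# Fitted values and residuals (Yi - Yhat_i); least squares minimizes
# the sum of these squared residuals
Y_hat = a_hat + b_hat * X
residuals = Y - Y_hat

print(f"a_hat = {a_hat:.3f}, b_hat = {b_hat:.3f}")
print(f"sum of squared residuals = {(residuals ** 2).sum():.3f}")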
Table 4.3 Examples of Printouts for Regression Analysis (panel A: “Generic” style)
Table 4.3 Examples of Printouts for Regression Analysis (panel B: Microsoft Excel)
Statistical Significance
Statistical significance
• There is sufficient evidence from the
sample to indicate that the true value of
the coefficient is not zero.
Hypothesis testing
• A statistical technique for making a
probabilistic statement about the true
value of a parameter.
Figure 4.3 Relative Frequency Distribution for b̂ when b = 5
Unbiased Estimators
• The estimates â and b̂ do not generally equal the true values of a and b
• â and b̂ are random variables computed using data from a random sample
• An estimator is unbiased if it produces estimates of a parameter that are, on average, equal to the true value of the parameter.
Testing for Statistical Significance
• Must determine if there is sufficient statistical evidence to indicate that Y is truly related to X (i.e., b ≠ 0)
• Even if b = 0, it is possible that the sample will produce an estimate b̂ that is different from zero
• Test for statistical significance using t-tests or p-values
Level of Confidence and Level of Significance
• Determine the level of significance: the probability of finding a parameter estimate to be statistically different from zero when, in fact, the true parameter is zero
• A Type I error occurs when a parameter estimate is found to be statistically significant when it is not.
• The level of confidence is the probability of correctly failing to reject the true hypothesis that b = 0:
level of confidence = 1 − level of significance
For example, a 5 percent level of significance corresponds to a 95 percent level of confidence.
A t-Test
A t-test is a statistical test used to test the hypothesis that the true value of a parameter is equal to zero (b = 0).
The t-ratio is computed as t = b̂ / S_b̂, the ratio of the estimated coefficient to its standard error S_b̂.
Performing a t-Test
• Compute the t-ratio for the estimated coefficient.
• Find the critical t-value for n − k degrees of freedom (n = number of observations, k = number of parameters estimated) at the chosen level of significance.
• If the absolute value of the t-ratio is greater than the critical t-value, the parameter estimate is statistically significant; if it is less, it is not.
• Alternatively, compare the p-value reported by the regression printout with the chosen level of significance.
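A sketch of the test in Python, reusing the hypothetical sample from the least-squares sketch and assuming a 5 percent level of significance:

import numpy as np
from scipy import stats

# Hypothetical sample (same illustrative numbers as before)
X = np.array([2.0, 3.0, 4.5, 5.0, 6.5, 8.0])
Y = np.array([19.0, 25.0, 34.0, 36.0, 46.0, 52.0])
n, k = len(X), 2                      # k = number of estimated parameters (a and b)

x_dev = X - X.mean()
b_hat = (x_dev * (Y - Y.mean())).sum() / (x_dev ** 2).sum()
a_hat = Y.mean() - b_hat * X.mean()

# Standard error of b_hat from the residual variance
residuals = Y - (a_hat + b_hat * X)
s2 = (residuals ** 2).sum() / (n - k)
se_b = np.sqrt(s2 / (x_dev ** 2).sum())

# t-ratio, two-tailed p-value, and critical t-value with n - k degrees of freedom
t_ratio = b_hat / se_b
p_value = 2 * stats.t.sf(abs(t_ratio), df=n - k)
t_crit = stats.t.ppf(0.975, df=n - k)   # 5% significance level, two-tailed

print(f"t = {t_ratio:.2f}, critical t = {t_crit:.2f}, p-value = {p_value:.4f}")
print("b_hat is statistically significant" if abs(t_ratio) > t_crit
      else "b_hat is not statistically significant")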
Coefficient of Determination (R²)
• R² measures the fraction of the total variation in the dependent variable that is explained by the regression equation.
• R² ranges from 0 to 1; values near 1 indicate that Y and X are highly correlated and that the data fit the sample regression line tightly.
Figure 4.4 High and Low Correlation
F-Statistic
The F-statistic is used to test for statistical significance of the overall regression equation: if the F-statistic exceeds the critical F-value (with k − 1 and n − k degrees of freedom), the regression equation as a whole is statistically significant.
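A sketch, again with the hypothetical sample used above, showing how R² and the F-statistic could be computed and compared with the critical F-value at an assumed 5 percent significance level:

import numpy as np
from scipy import stats

# Hypothetical sample (illustrative only)
X = np.array([2.0, 3.0, 4.5, 5.0, 6.5, 8.0])
Y = np.array([19.0, 25.0, 34.0, 36.0, 46.0, 52.0])
n, k = len(X), 2

x_dev = X - X.mean()
b_hat = (x_dev * (Y - Y.mean())).sum() / (x_dev ** 2).sum()
a_hat = Y.mean() - b_hat * X.mean()
Y_hat = a_hat + b_hat * X

# R^2 = 1 - (unexplained variation / total variation)
ss_total = ((Y - Y.mean()) ** 2).sum()
ss_resid = ((Y - Y_hat) ** 2).sum()
r_squared = 1 - ss_resid / ss_total

# F = [R^2 / (k - 1)] / [(1 - R^2) / (n - k)]
f_stat = (r_squared / (k - 1)) / ((1 - r_squared) / (n - k))
f_crit = stats.f.ppf(0.95, dfn=k - 1, dfd=n - k)   # critical F at the 5% level

print(f"R^2 = {r_squared:.3f}, F = {f_stat:.1f}, critical F = {f_crit:.1f}")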
Multiple Regression
• Regression models that use more than
one explanatory variable to explain the
variation in the dependent variable.
• Coefficient for each explanatory variable
measures the change in the dependent
variable associated with a one-unit
change in that explanatory variable,
holding all else constant.
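A minimal sketch of a multiple regression with two explanatory variables, using NumPy's least-squares solver on made-up data (all numbers are hypothetical):

import numpy as np

# Hypothetical data with two explanatory variables (illustrative only)
X1 = np.array([2.0, 3.0, 4.5, 5.0, 6.5, 8.0])
X2 = np.array([1.0, 1.5, 1.0, 2.0, 2.5, 3.0])
Y = np.array([20.0, 27.0, 33.0, 39.0, 48.0, 56.0])

# Design matrix with a column of ones for the intercept
Xmat = np.column_stack([np.ones_like(X1), X1, X2])

# Least-squares estimates of a, b, and c in Y = a + bX1 + cX2
(a_hat, b_hat, c_hat), *_ = np.linalg.lstsq(Xmat, Y, rcond=None)

# b_hat is the estimated change in Y for a one-unit change in X1,
# holding X2 constant (and similarly for c_hat)
print(f"a = {a_hat:.2f}, b = {b_hat:.2f}, c = {c_hat:.2f}")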
Quadratic Regression Models
A quadratic regression model is a nonlinear regression model of the form
Y = a + bX + cX²
• Use when the curve fitting the scatter plot is U-shaped or ∩-shaped
• For a linear transformation, compute the new variable Z = X²
• Estimate Y = a + bX + cZ
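A sketch of the transformation on hypothetical U-shaped data (the numbers are invented for illustration):

import numpy as np

# Hypothetical U-shaped data (illustrative only)
X = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0, 7.0])
Y = np.array([22.0, 14.0, 10.0, 9.0, 11.0, 16.0, 24.0])

# Linear transformation: create Z = X^2, then estimate Y = a + bX + cZ
Z = X ** 2
Xmat = np.column_stack([np.ones_like(X), X, Z])
(a_hat, b_hat, c_hat), *_ = np.linalg.lstsq(Xmat, Y, rcond=None)

print(f"Y_hat = {a_hat:.2f} + {b_hat:.2f} X + {c_hat:.2f} X^2")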
Figure 4.5 A Quadratic Regression Equation
Log-Linear Regression Models
A nonlinear regression model of the multiplicative exponential form Y = aX^b Z^c
• Transform by taking natural logarithms: ln Y = (ln a) + b ln X + c ln Z, which is linear in the parameters and can be estimated by least squares
• The estimates of b and c are the elasticities of Y with respect to X and Z
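A sketch of the log-linear estimation on invented data (all values hypothetical):

import numpy as np

# Hypothetical multiplicative data (illustrative only): Y = a * X^b * Z^c
X = np.array([10.0, 20.0, 30.0, 40.0, 50.0, 60.0])
Z = np.array([5.0, 4.0, 6.0, 5.0, 7.0, 6.0])
Y = np.array([120.0, 95.0, 150.0, 110.0, 190.0, 160.0])

# Transform by taking natural logarithms: ln Y = ln a + b ln X + c ln Z
lnY, lnX, lnZ = np.log(Y), np.log(X), np.log(Z)
Xmat = np.column_stack([np.ones_like(lnX), lnX, lnZ])
(ln_a_hat, b_hat, c_hat), *_ = np.linalg.lstsq(Xmat, lnY, rcond=None)

a_hat = np.exp(ln_a_hat)   # recover a from its estimated logarithm
# b_hat and c_hat estimate the elasticities of Y with respect to X and Z
print(f"a = {a_hat:.2f}, b = {b_hat:.3f}, c = {c_hat:.3f}")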
Figure 4.6 A Log-Linear Regression Equation (panel A)
Figure 4.6 A Log-Linear Regression Equation (panel B)
Summary (1 of 2)
• A simple linear regression model relates a dependent
variable Y to a single explanatory variable X.
• The regression equation is correctly interpreted as providing the
average value (expected value) of Y for a given value of X.
• Parameter estimates are obtained by choosing values of
a and b that create the best-fitting line that passes
through the scatter diagram of the sample data points.
• If the absolute value of the t-ratio is greater (less) than the critical t-value, then b̂ is (is not) statistically significant.
• The exact level of significance associated with a t-statistic is its p-value.
• A high R² indicates Y and X are highly correlated and the data tightly fit the sample regression line.
Summary (2 of 2)
• If the F-statistic exceeds the critical F-value, the regression
equation is statistically significant.
• In multiple regression, the coefficients measure the change
in Y associated with a one-unit change in that explanatory
variable.
• Quadratic regression models are appropriate when the curve fitting the scatter plot is U-shaped or ∩-shaped (Y = a + bX + cX²).
• Log-linear regression models are appropriate when the relation is in multiplicative exponential form (Y = aX^b Z^c).
• The equation is transformed by taking natural logarithms.
End of Main Content
www.mheducation.com
© 2020 McGraw-Hill Education. All rights reserved. Authorized only for instructor use in the classroom.
No reproduction or further distribution permitted without the prior written consent of McGraw-Hill Education.