
Because learning changes everything.

Chapter 4
Basic Estimation Techniques

© 2020 McGraw-Hill Education. All rights reserved. Authorized only for instructor use in the classroom.
No reproduction or further distribution permitted without the prior written consent of McGraw-Hill Education.
Learning Objectives
• Set up and interpret simple linear regression equations.
• Estimate the intercept and slope parameters of a regression line using the method of least squares.
• Determine statistical significance using either t-tests or p-values associated with parameter estimates.
• Evaluate the "fit" of a regression equation to the data using the R² statistic, and test for statistical significance of the whole regression equation using an F-test.
• Set up and interpret multiple regression models.
• Use linear regression techniques to estimate the parameters of two common nonlinear models: quadratic and log-linear regression models.

Basic Estimation

Parameters
• The coefficients in an equation that determine the exact mathematical relation among the variables.
Parameter estimation
• The process of finding estimates of the numerical values of the parameters of an equation.
Regression Analysis
Regression analysis
• A statistical technique for estimating the parameters of an equation and testing for statistical significance.
Dependent variable
• The variable whose variation is to be explained.
Explanatory variables
• Variables that are thought to cause the dependent variable to take on different values.
Simple Linear Regression
The true regression line relates the dependent variable Y to one explanatory (or independent) variable X:

Y = a + bX

• The intercept parameter (a) gives the value of Y where the regression line crosses the Y-axis (the value of Y when X is zero).
• The slope parameter (b) gives the change in Y associated with a one-unit change in X: b = ΔY/ΔX.
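As an illustration, here is a minimal Python sketch of how the two parameters are read off the line. The values a = 100 and b = 5 are invented for illustration, not estimates from any data in this chapter:

```python
# Hypothetical fitted line: a = 100, b = 5 (made-up values for illustration).
a, b = 100.0, 5.0

def predict(x):
    """Expected value of Y at a given level of X: Y = a + bX."""
    return a + b * x

print(predict(0))                  # 100.0 -> the intercept: Y when X = 0
print(predict(11) - predict(10))   # 5.0   -> the slope: change in Y per one-unit change in X
```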
A Hypothetical Regression Model
• The regression line shows the average or expected value of Y for each level of X.
• The true (or actual) underlying relation between Y and X is unknown to the researcher but is to be discovered by analyzing the sample data.
Random error term
• An unobservable term added to a regression model to capture the effects of all the minor, unpredictable factors that affect Y but cannot reasonably be included as explanatory variables.

Figure 4.1 The True Regression Line: Relating Sales and Advertising Expenditures
Table 4.1 The Impact of Random Effects on January Sales
Data
Time series
• A data set in which the data for the dependent and explanatory variables are collected over time for a single firm.
Cross-sectional
• A data set in which the data for the dependent and explanatory variables are collected from many different firms or industries at a given point in time.
Scatter diagram
• A graph of the data points in a sample.

Fitting a Regression Line
The population regression line is the equation or line representing the true (or actual) underlying relation between the dependent variable and the explanatory variable(s).

The sample regression line is an estimate of the true (or population) regression line and represents the line that best fits the scatter of data points in the sample:

Ŷ = â + b̂X

where â and b̂ are the fitted or predicted values of the true (population) parameters a and b, and Ŷ is the fitted or predicted value of Y. The predicted value of Y is obtained by substituting a value of X into the sample regression equation.
Table 4.2 Sales and Advertising Expenditures for a Sample of Seven Travel Agencies
Figure 4.2 The Sample Regression Line: Relating Sales and Advertising Expenditures
Method of Least Squares
The method of least squares is a method of estimating the parameters of a linear regression equation by finding the line that minimizes the sum of the squared distances from each sample data point to the sample regression line.

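The slope and intercept estimators have closed-form least-squares formulas. The sketch below applies them to a small invented sample; the numbers are hypothetical, not the travel-agency data in Table 4.2:

```python
import numpy as np

# Hypothetical sample: advertising (X) and sales (Y), both in $ thousands.
X = np.array([2.0, 3.0, 4.5, 5.0, 6.0, 7.5, 9.0])
Y = np.array([30.0, 38.0, 52.0, 55.0, 61.0, 74.0, 86.0])

# Least-squares estimators: b_hat = sum((X - Xbar)(Y - Ybar)) / sum((X - Xbar)^2)
# and a_hat = Ybar - b_hat * Xbar. These minimize the sum of squared residuals.
b_hat = np.sum((X - X.mean()) * (Y - Y.mean())) / np.sum((X - X.mean()) ** 2)
a_hat = Y.mean() - b_hat * X.mean()

Y_fit = a_hat + b_hat * X      # fitted values on the sample regression line
residuals = Y - Y_fit          # Yi - Yi_hat, the distances least squares squares and minimizes
print(round(a_hat, 3), round(b_hat, 3), round(np.sum(residuals ** 2), 3))
```

The same estimates come from `np.polyfit(X, Y, 1)`, which returns the slope and intercept of the best-fitting line.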
Parameter Estimates
• Estimators are the formulas by which the estimates of the parameters are computed.
• Parameter estimates are obtained by substituting sample data into the estimators; they are the values of a and b that minimize the sum of squared residuals.
• The residual is the difference between the actual and fitted values of Y: Yᵢ − Ŷᵢ.
• The process is equivalent to fitting a line through a scatter diagram of the sample data points.
Table 4.3 Examples of Printouts for Regression Analysis (panel A: “Generic” style; panel B: Microsoft Excel)
Statistical Significance

Statistical significance
• There is sufficient evidence from the sample to indicate that the true value of the coefficient is not zero.
Hypothesis testing
• A statistical technique for making a probabilistic statement about the true value of a parameter.

Figure 4.3 Relative Frequency Distribution for b̂ when b = 5
Unbiased Estimators
• The estimates â and b̂ do not generally equal the true values of a and b.
• â and b̂ are random variables computed using data from a random sample.
• An estimator is unbiased if it produces estimates of a parameter that are, on average, equal to the true value of the parameter.

Testing for Statistical Significance
• Must determine whether there is sufficient statistical evidence to indicate that Y is truly related to X (i.e., b ≠ 0).
• Even if b = 0, it is possible that a sample will produce an estimate that is different from zero.
• Test for statistical significance using t-tests or p-values.

Level of Confidence and Level of Significance
• The level of significance is the probability of finding a parameter estimate to be statistically different from zero when, in fact, the true value is zero.
• A Type I error occurs when a parameter estimate is found to be statistically significant when it is not.
• The level of confidence is the probability of correctly failing to reject the true hypothesis that b = 0:

level of confidence = 1 − level of significance

A t-Test
A t-test is a statistical test used to test the hypothesis that the true value of a parameter is equal to zero (b = 0).

The t-ratio is computed as

t = b̂ / S_b̂

where S_b̂ is the standard error of the estimate b̂.

A t-statistic is the numerical value of the t-ratio.

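Continuing the hypothetical sample from the least-squares sketch above, here is how the t-ratio is built. The standard-error formula used is the standard one for simple regression, S_b̂² = σ̂²/Σ(X − X̄)² with σ̂² = SSE/(n − k):

```python
import numpy as np

n, k = len(X), 2                                     # 7 observations, 2 estimated parameters (a, b)
sse = np.sum((Y - (a_hat + b_hat * X)) ** 2)         # sum of squared residuals
s_b = np.sqrt((sse / (n - k)) / np.sum((X - X.mean()) ** 2))   # standard error of b_hat
t_stat = b_hat / s_b                                 # the t-ratio for testing b = 0
```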
Performing a t-Test

Use the t-table to choose the critical t-value with n − k degrees of freedom for the chosen level of significance.
• The critical value of t is the value that the t-statistic must exceed in order to reject the hypothesis that b = 0.
• Degrees of freedom equal the number of observations in the sample minus the number of parameters being estimated by the regression analysis (n − k).

If the absolute value of the t-statistic is greater than the critical t, the parameter estimate is statistically significant at the given level of significance.
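A sketch of the decision rule, using SciPy for the critical value; the 5% significance level is an assumed choice:

```python
from scipy import stats

t_crit = stats.t.ppf(1 - 0.05 / 2, df=n - k)   # two-tailed critical t at the 5% level
print(abs(t_stat) > t_crit)                    # True -> reject b = 0: the estimate is significant
```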
Using p-Values
A p-value gives the exact level of significance for a test statistic, which is the probability of finding significance when none exists.

Treat as statistically significant only those parameter estimates with p-values smaller than the maximum acceptable significance level.

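A sketch of the p-value rule applied to the t-statistic computed above; the 5% cutoff is an assumed maximum acceptable level:

```python
from scipy import stats

p_value = 2 * stats.t.sf(abs(t_stat), df=n - k)   # exact two-tailed significance level (sf = 1 - cdf)
print(p_value < 0.05)                             # significant only if p is below the chosen cutoff
```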
Coefficient of Determination (R²)

R² measures the fraction of the total variation in the dependent variable (Y) that is explained by the regression equation (that is, explained by the variation in X).
• Ranges from 0 to 1.
• A high R² indicates that Y and X are highly correlated, but it does not prove that Y and X are causally related.

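A sketch of R² for the hypothetical sample above, computed as one minus the unexplained share of total variation:

```python
import numpy as np

sst = np.sum((Y - Y.mean()) ** 2)               # total variation in Y
sse = np.sum((Y - (a_hat + b_hat * X)) ** 2)    # variation left unexplained by the line
r_squared = 1 - sse / sst                       # fraction explained; always between 0 and 1
```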
Figure 4.4 High and Low Correlation
F-Statistic
The F-statistic is used to test for significance of the overall regression equation.

Compare the F-statistic to the critical F-value from the F-table:
• Two degrees of freedom: k − 1 (numerator) and n − k (denominator).
• Specify the level of significance.
• If the F-statistic exceeds the critical F, the regression equation overall is statistically significant at the specified level of significance.

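A sketch of the overall F-test, built from the R² computed above and again assuming a 5% level:

```python
from scipy import stats

# Standard relation between F and R^2 for a regression with k parameters.
F = (r_squared / (k - 1)) / ((1 - r_squared) / (n - k))
f_crit = stats.f.ppf(1 - 0.05, dfn=k - 1, dfd=n - k)   # critical F with k-1 and n-k df
print(F > f_crit)                                      # True -> equation is significant overall
```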
Multiple Regression
• Regression models that use more than one explanatory variable to explain the variation in the dependent variable.
• The coefficient on each explanatory variable measures the change in the dependent variable associated with a one-unit change in that explanatory variable, holding all other explanatory variables constant.

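A minimal sketch of a two-regressor model, adding a second invented variable W to the hypothetical sample; each estimated slope measures the effect of its own regressor holding the other constant:

```python
import numpy as np

W = np.array([1.0, 2.0, 1.5, 3.0, 2.5, 4.0, 3.5])    # hypothetical second regressor
D = np.column_stack([np.ones_like(X), X, W])          # design matrix: intercept, X, W
(a_m, b_m, c_m), *_ = np.linalg.lstsq(D, Y, rcond=None)   # least-squares estimates
```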
Quadratic Regression Models
A quadratic regression model is a nonlinear regression model of the form

Y = a + bX + cX²

• Use when the curve fitting the scatter plot is U-shaped or ∩-shaped.
• For a linear transformation, compute the new variable Z = X².
• Estimate Y = a + bX + cZ.
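A sketch of the transformation: the model stays linear in the parameters, so ordinary least squares still applies once Z = X² is created:

```python
import numpy as np

Z = X ** 2                                        # the linear transformation: Z = X^2
D = np.column_stack([np.ones_like(X), X, Z])      # estimate Y = a + bX + cZ
(a_q, b_q, c_q), *_ = np.linalg.lstsq(D, Y, rcond=None)
```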
Figure 4.5 A Quadratic Regression Equation
Log-Linear Regression Models
A log-linear model is a nonlinear regression model of the form: Y = aXᵇZᶜ

Transform by taking natural logarithms:

ln Y = ln a + b ln X + c ln Z

b and c are elasticities.

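A sketch of the log transformation, reusing the invented W from the multiple-regression sketch as the second variable Z; after taking natural logs the model is linear, and the estimated slopes are the elasticities:

```python
import numpy as np

lnY, lnX, lnZ = np.log(Y), np.log(X), np.log(W)        # logs of Y = a * X^b * Z^c
D = np.column_stack([np.ones_like(lnX), lnX, lnZ])     # ln Y = ln a + b ln X + c ln Z
(ln_a, b_el, c_el), *_ = np.linalg.lstsq(D, lnY, rcond=None)
a_est = np.exp(ln_a)       # recover a; b_el and c_el are the elasticity estimates
```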
Figure 4.6 A Log-Linear Regression Equation (panels A and B)
Summary (1 of 2)
• A simple linear regression model relates a dependent variable Y to a single explanatory variable X.
• The regression equation is correctly interpreted as providing the average value (expected value) of Y for a given value of X.
• Parameter estimates are obtained by choosing the values of a and b that create the best-fitting line passing through the scatter diagram of the sample data points.
• If the absolute value of the t-ratio is greater (less) than the critical t-value, then b̂ is (is not) statistically significant.
• The exact level of significance associated with a t-statistic is its p-value.
• A high R² indicates that Y and X are highly correlated and that the data tightly fit the sample regression line.
Summary (2 of 2)
• If the F-statistic exceeds the critical F-value, the regression equation is statistically significant.
• In multiple regression, each coefficient measures the change in Y associated with a one-unit change in that explanatory variable.
• Quadratic regression models (Y = a + bX + cX²) are appropriate when the curve fitting the scatter plot is U-shaped or ∩-shaped.
• Log-linear regression models are appropriate when the relation takes the multiplicative exponential form Y = aXᵇZᶜ; the equation is transformed by taking natural logarithms.
End of Main Content

Because learning changes everything. ®

www.mheducation.com

© 2020 McGraw-Hill Education. All rights reserved. Authorized only for instructor use in the classroom.
No reproduction or further distribution permitted without the prior written consent of McGraw-Hill Education.
