
Multicollinearity

Multicollinearity
• An important assumption for the application of
least squares is that the explanatory variables are
not perfectly linearly correlated.
• Multicollinearity is a phenomenon inherent in
most economic relationships, owing to the nature
of economic magnitudes.
• When any two explanatory variables change in
nearly the same way, it becomes extremely
difficult to establish the separate influence of
each regressor on Y.
Plausibility of the assumption
• Strictly speaking, the assumption concerning
multicollinearity is easily met in practice,
because it is very rare for any two variables to
be exactly intercorrelated in a linear form.
• However, the least squares estimates may be
seriously affected even by a less than perfect
intercorrelation between the explanatory
variables.
Plausibility of the assumption
• Multicollinearity may arise for many reasons.
• Firstly, economic variables tend to move
together over time.
• Economic magnitudes are influenced by the
same factors; consequently, once these
determining factors become operative, the
economic variables show the same broad
pattern of behaviour over time.
Plausibility of the assumption
• For example, income, consumption, savings,
investment, prices and employment tend to rise in
the same periods of economic expansion and to
fall in the same periods of recession.
• Secondly, multicollinearity arises from the use of
lagged values of the same explanatory variables as
separate independent factors in the relationship.
• For example, in consumption functions it has
become customary to include among the
explanatory variables past as well as present
levels of income.
Consequences
• Under perfect multicollinearity, the estimates
of the coefficients are indeterminate.
• The standard errors of these estimates become
infinitely large (and remain very large when the
collinearity is high but imperfect; see the sketch
below).
• Some coefficients may be over-estimated
while some important coefficients are under-
estimated.
• An arbitrarily large R square value despite
imprecise individual estimates.
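
A minimal numpy simulation (hypothetical data, not from the slides) that makes the standard-error inflation concrete: two nearly identical regressors produce individually unreliable estimates with very large standard errors, even though the fit as a whole stays good.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100
x1 = rng.normal(size=n)
x2 = x1 + rng.normal(scale=0.05, size=n)   # x2 is nearly a copy of x1
y = 1.0 + 2.0 * x1 + 3.0 * x2 + rng.normal(size=n)

X = np.column_stack([np.ones(n), x1, x2])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)

# Classical OLS standard errors: sqrt of diag(s^2 * (X'X)^-1)
resid = y - X @ beta
s2 = resid @ resid / (n - X.shape[1])
se = np.sqrt(np.diag(s2 * np.linalg.inv(X.T @ X)))

print("coefficients:", beta)   # individual estimates are unreliable
print("std errors:  ", se)     # errors on x1 and x2 are hugely inflated
```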
Detecting Multicollinearity
1. Frisch’s confluence analysis
• A combination of the following criteria may help
to detect multicollinearity:
➢ Large standard errors of the estimates
➢ High partial correlation coefficients among the
explanatory variables
➢ High R square while the individual results are
highly imprecise and insignificant
(None of these criteria is by itself a satisfactory
indicator of multicollinearity, because each can
arise for various other reasons.)
Detecting Multicollinearity
2. Farrar-Glauber Test
It is a set of three tests. The first is a chi-square
test for detecting the existence and severity of
multicollinearity. The second is an F test for
locating which variables are multicollinear. The
third is a t test for finding the pattern of
multicollinearity.
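
A sketch of the first (chi-square) stage only, assuming the usual Farrar-Glauber statistic chi² = −[n − 1 − (2k + 5)/6]·ln|R| with k(k − 1)/2 degrees of freedom, where R is the correlation matrix of the k explanatory variables; the function name is my own.

```python
import numpy as np
from scipy import stats

def farrar_glauber_chi2(X):
    """First stage of the Farrar-Glauber test.

    X is an (n, k) array of explanatory variables (no intercept column).
    Returns the chi-square statistic, degrees of freedom and p-value;
    a small p-value indicates that multicollinearity is present.
    """
    n, k = X.shape
    R = np.corrcoef(X, rowvar=False)   # correlation matrix of the regressors
    stat = -(n - 1 - (2 * k + 5) / 6) * np.log(np.linalg.det(R))
    df = k * (k - 1) // 2
    return stat, df, stats.chi2.sf(stat, df)
```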
Detecting Multicollinearity
3. Variance Inflation factor(VIF) and Tolerance
These factors measure how much the variances of
the estimated regression coefficients are inflated as
compared to when the independent variables are
not linearly related.
If there are p − 1 explanatory variables, the VIF of
the k-th variable is
(VIF)_k = (1 − R_k²)⁻¹, k = 1, 2, …, p − 1,
where R_k² is the coefficient of multiple
determination when X_k is regressed on the other
p − 2 explanatory variables.
Detecting Multicollinearity
• If VIF is greater than 10, severe
multicollinearity is to be suspected.
• Tolerance is the reciprocal of VIF. A value less
than 0.1 is an indicator of multicollinearity
(see the sketch below).
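
A numpy-only sketch of the computation described above: each R_k² comes from regressing column k on the remaining columns (plus a constant), and tolerance is simply 1/VIF. Function and variable names are illustrative.

```python
import numpy as np

def vif_and_tolerance(X):
    """Compute (VIF)_k = 1 / (1 - R_k^2) for every column of X.

    X is an (n, k) array of explanatory variables without the
    intercept column; the auxiliary regressions include a constant.
    """
    n, k = X.shape
    vif = np.empty(k)
    for j in range(k):
        others = np.column_stack([np.ones(n), np.delete(X, j, axis=1)])
        coef, *_ = np.linalg.lstsq(others, X[:, j], rcond=None)
        resid = X[:, j] - others @ coef
        r2 = 1 - resid @ resid / np.sum((X[:, j] - X[:, j].mean()) ** 2)
        vif[j] = 1.0 / (1.0 - r2)
    return vif, 1.0 / vif   # rule of thumb: VIF > 10 (tolerance < 0.1) is severe
```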
Detecting Multicollinearity
4. Informal methods
• Scatter plot matrix of the explanatory variables
• Large changes in the estimated regression
coefficients when a variable is added or
deleted, or when an observation is altered or
deleted (as the sketch after this list illustrates)
• Non-significant results in the individual tests
for the regression coefficients of important
independent variables
Detecting Multicollinearity
• Estimated regression coefficients with an
algebraic sign opposite to that expected from
theoretical considerations
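
A small illustration of the "large changes when a variable is added" symptom, using simulated data (all values hypothetical): the coefficient on x1 swings sharply once its near-twin x2 enters the model.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 80
x1 = rng.normal(size=n)
x2 = 0.95 * x1 + rng.normal(scale=0.1, size=n)   # near-twin of x1
y = 1 + x1 + x2 + rng.normal(size=n)

def ols(X, y):
    return np.linalg.lstsq(X, y, rcond=None)[0]

short = ols(np.column_stack([np.ones(n), x1]), y)       # x2 omitted
full = ols(np.column_stack([np.ones(n), x1, x2]), y)    # x2 included

# x1 absorbs x2's effect when alone (~1.95); adding x2 shifts it sharply.
print("x1 alone:", short[1], "   x1 with x2:", full[1])
```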
Solutions
• Incorporating extraneous quantitative
information. The most important methods are:
➢ Method of restricted least squares
➢ Method of pooling cross-section and time
series data
➢ Generalized least squares
Solutions
• Increase the size of the sample. (With more
observations, the multicollinearity may be
avoided or reduced. The idea is that the
sampling variances and covariances of the
estimators are inversely related to the sample
size, so they shrink as the sample grows, as
the sketch below illustrates. But this holds
only if the multicollinearity is due to errors of
measurement.)
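
A quick simulated check of this claim (hypothetical data): with the degree of collinearity held fixed, the standard error of the x1 coefficient shrinks as the sample grows.

```python
import numpy as np

rng = np.random.default_rng(2)

def se_x1(n):
    """Standard error of the x1 coefficient for a sample of size n."""
    x1 = rng.normal(size=n)
    x2 = x1 + rng.normal(scale=0.1, size=n)   # fixed degree of collinearity
    y = 1 + 2 * x1 + 3 * x2 + rng.normal(size=n)
    X = np.column_stack([np.ones(n), x1, x2])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    s2 = resid @ resid / (n - 3)
    return np.sqrt(s2 * np.linalg.inv(X.T @ X)[1, 1])

for n in (50, 500, 5000):
    print(n, se_x1(n))   # standard error shrinks as n grows
```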
Solutions
• Introducing additional equations in the model.
(Multicollinearity may be overcome if we introduce
additional equations expressing meaningful
relations between the multicollinear variables;
simultaneous equation models can then be
adopted.)
• Application of principal components. (PCA avoids
the multicollinearity by replacing the regressors
with orthogonal components; see the sketch after
this list. But it uses less information than is
contained in the sample, and the effect of each
explanatory variable becomes difficult to
interpret.)
• Distributed lag models. (Imposing a structure,
such as a Koyck or Almon scheme, on the
coefficients of the lagged variables reduces the
number of free parameters and hence the
multicollinearity among the lags.)
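
A minimal sketch of principal components regression as the remedy (names and the choice of component count are illustrative): the regressors are standardized, rotated to orthogonal components via the SVD, and y is regressed on the leading components, which removes the collinearity at the cost of interpretability.

```python
import numpy as np

def pcr_fit(X, y, n_components):
    """Regress y on the first n_components principal components of X."""
    Z = (X - X.mean(axis=0)) / X.std(axis=0)       # standardize the regressors
    _, _, Vt = np.linalg.svd(Z, full_matrices=False)
    scores = Z @ Vt[:n_components].T                # orthogonal component scores
    D = np.column_stack([np.ones(len(y)), scores])
    gamma, *_ = np.linalg.lstsq(D, y, rcond=None)
    return gamma, Vt[:n_components]                 # coefficients on the components
```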
