
Polynomial Regression

Dr. Singara Singh Kasana


Associate Professor
Computer Science and Engineering Department
Thapar Institute of Engineering and Technology
Patiala, Punjab
Polynomial Regression

 Simple and multiple linear regression are used when the dependent (output) feature changes linearly with respect to the independent (input) feature(s).

 In some real-world applications, the dependent feature may not change linearly with changes in the independent feature(s).

 For such datasets, we cannot use simple or multiple linear regression; we need to find a regression model that fits the dataset.

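The contrast above can be seen on a small synthetic example (a sketch with made-up data, not the lecture's dataset): a straight line fitted to y = x^2 data leaves large residuals, while adding a squared term lets the same linear model fit exactly.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.preprocessing import PolynomialFeatures

# Synthetic nonlinear data: y = x^2
x = np.arange(1, 11).reshape(-1, 1)
y = (x ** 2).ravel()

# A straight-line fit leaves a systematic residual
lin = LinearRegression().fit(x, y)
lin_err = np.abs(lin.predict(x) - y).mean()

# Adding the x^2 column lets the same linear model fit (almost) exactly
x2 = PolynomialFeatures(degree=2).fit_transform(x)
poly = LinearRegression().fit(x2, y)
poly_err = np.abs(poly.predict(x2) - y).mean()

print(lin_err, poly_err)  # the linear error is large, the polynomial error is near zero
```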

Representation of Different Regression Models
Simple Linear Regression
Y = β0 + β1*x1

Multiple Linear Regression
Y = β0 + β1*x1 + β2*x2 + ... + βn*xn

Polynomial Regression
Y = β0 + β1*x1 + β2*x1^2 + ... + βn*x1^n

If the degree of the polynomial is 2, then n = 2 and the polynomial will be

Y = β0 + β1*x1 + β2*x1^2

If the degree of the polynomial is 3, then n = 3 and the polynomial will be

Y = β0 + β1*x1 + β2*x1^2 + β3*x1^3

If the degree of the polynomial is N, then n = N and the polynomial will be

Y = β0 + β1*x1 + β2*x1^2 + β3*x1^3 + ... + βN*x1^N
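A minimal sketch of how such a degree-N polynomial is evaluated from its coefficients (the coefficient values below are illustrative, not fitted values):

```python
def poly_predict(betas, x1):
    # betas = [β0, β1, ..., βN]; returns β0 + β1*x1 + β2*x1^2 + ... + βN*x1^N
    return sum(b * x1 ** i for i, b in enumerate(betas))

# Degree-2 example: 1 + 2*2 + 3*2^2
print(poly_predict([1.0, 2.0, 3.0], 2.0))  # 17.0
```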
Level   Basic Salary (in Rs)
1       18000
2       19900
3       21700
4       25500
5       29200
6       35400
7       44900
8       47600
9       53100
10      56100
11      67700
12      78800
13      118500
14      131100
15      144200
16      182200
17      205400
18      225000
19      250000
Candidate polynomial fits of increasing degree for this dataset:

Degree 2: Y = β0 + β1*x1 + β2*x1^2
Degree 3: Y = β0 + β1*x1 + β2*x1^2 + β3*x1^3
Degree 4: Y = β0 + β1*x1 + β2*x1^2 + β3*x1^3 + β4*x1^4
Implementation of Polynomial Regression
Step 1: Importing the required libraries
import numpy as np
import pandas as pd
import matplotlib.pyplot as plt
Implementation of Polynomial Regression
Step2: Load the dataset into dataframe
dataset = pd.read_csv('7thPayMatrix.csv')
print(dataset)
# Keep X two-dimensional (a DataFrame, not a Series) so that
# PolynomialFeatures.fit_transform accepts it in Step 3
X = dataset.iloc[:, 0:1]
Y = dataset.iloc[:, 1:2]
print(X)
print(Y)
plt.scatter(X,Y)
plt.xlabel('Level')
plt.ylabel('Salary')
plt.show()
Polynomial Linear Regression

Step 3: Pre-process the independent feature using a particular degree polynomial

from sklearn.preprocessing import PolynomialFeatures

Obj = PolynomialFeatures(degree = 2)

X_transformed = Obj.fit_transform(X)
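To see what the transformation produces, here is a small self-contained sketch (toy input, not the salary data): with degree = 2, each single-feature row [x] becomes [1, x, x^2], i.e. the bias, linear, and squared terms that Step 4 fits with an ordinary linear model.

```python
import numpy as np
from sklearn.preprocessing import PolynomialFeatures

X_small = np.array([[1], [2], [3]])        # a single input feature, as in Step 2
obj = PolynomialFeatures(degree=2)
X_t = obj.fit_transform(X_small)
print(X_t)                                  # each row is [1, x, x^2]
```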
Polynomial Linear Regression Fitting

Step 4: Fit a linear regression model on the transformed features

from sklearn.linear_model import LinearRegression

model = LinearRegression()

model.fit(X_transformed, Y)

Y1= model.predict(X_transformed)
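Since 7thPayMatrix.csv may not be at hand, Steps 2 to 4 can be rehearsed end to end on the Level/Salary values from the table shown earlier (a self-contained sketch that inlines the data instead of reading the CSV):

```python
import numpy as np
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression

# Level / Basic Salary values copied from the table above
levels = np.arange(1, 20).reshape(-1, 1)
salary = np.array([18000, 19900, 21700, 25500, 29200, 35400, 44900,
                   47600, 53100, 56100, 67700, 78800, 118500, 131100,
                   144200, 182200, 205400, 225000, 250000])

# Steps 3 and 4: expand to [1, x, x^2] and fit a linear model
X_transformed = PolynomialFeatures(degree=2).fit_transform(levels)
model = LinearRegression().fit(X_transformed, salary)
Y1 = model.predict(X_transformed)
print(Y1[:3])
```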
Polynomial Linear Regression Fitting

Step 5: Visualizing the fitted polynomial

plt.scatter(X,Y, color='red')

plt.plot(X,Y1, color='blue')

plt.xlabel('Level')

plt.ylabel('Salary')

plt.show()
Polynomial Linear Regression Fitting

Step 6: Analysis of the Errors

from sklearn import metrics

print(metrics.mean_absolute_error(Y, Y1))

print(metrics.mean_squared_error(Y, Y1))

print(np.sqrt(metrics.mean_squared_error(Y, Y1)))
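These three metrics can be checked on a tiny hand-computable example (toy values, not the fitted salary model): with one prediction off by 1 out of three, MAE and MSE are both 1/3, and RMSE is its square root.

```python
import numpy as np
from sklearn import metrics

y_true = np.array([1.0, 2.0, 3.0])
y_pred = np.array([1.0, 2.0, 4.0])   # one prediction off by 1

mae = metrics.mean_absolute_error(y_true, y_pred)   # (0 + 0 + 1) / 3
mse = metrics.mean_squared_error(y_true, y_pred)    # (0 + 0 + 1) / 3
rmse = np.sqrt(mse)
print(mae, mse, rmse)
```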
Thanks
