Unit 2
y = mx + c

The slope and intercept are given by the least-squares (normal-equation) solution:

Slope: m = (n·Σxy − (Σx)(Σy)) / (n·Σx² − (Σx)²)

Intercept: c = ȳ − m·x̄

The best-fit line is then y = mx + c, where m and c are the values obtained from solving the normal
equations.
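As a quick cross-check of these formulas, NumPy's built-in `np.polyfit` with degree 1 computes the same least-squares slope and intercept. A minimal sketch with hypothetical sample data (the data values are illustrative, not from these notes):

```python
import numpy as np

# hypothetical sample data (for illustration only)
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 4.2, 6.1, 8.3, 9.9])

# slope and intercept from the normal-equation formulas above
n = len(x)
m = (n * np.sum(x * y) - np.sum(x) * np.sum(y)) / (n * np.sum(x**2) - np.sum(x)**2)
c = np.mean(y) - m * np.mean(x)

# np.polyfit(x, y, 1) returns the degree-1 coefficients [slope, intercept]
m_ref, c_ref = np.polyfit(x, y, 1)
print(abs(m - m_ref) < 1e-8, abs(c - c_ref) < 1e-8)  # → True True
```

Both routes solve the same normal equations, so the results agree to floating-point precision.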
xy = x * y          # element-wise products (x, y are NumPy arrays)
xx = x * x          # element-wise squares
n = len(x)
sxy = sum(xy)
sxx = sum(xx)
sx = sum(x)
sy = sum(y)

# m = {n.sum(x.y) - sum(x).sum(y)} / {n.sum(x^2) - sum(x).sum(x)}
m = (n * sxy - sx * sy) / (n * sxx - sx * sx)
c = (sy - m * sx) / n

z = []
for i in x:
    z.append(m * i + c)

Figure 1: Linear Function
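Running the snippet above on hypothetical data that lies exactly on a line recovers the slope and intercept; a minimal self-contained sketch (the sample values are assumptions, not from the notes):

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0])
y = 2 * x + 1  # data on the exact line m = 2, c = 1

xy = x * y
n = len(x)
sxy = sum(xy)
sxx = sum(x * x)
sx = sum(x)
sy = sum(y)

m = (n * sxy - sx * sy) / (n * sxx - sx * sx)
c = (sy - m * sx) / n
print(m, c)  # → 2.0 1.0
```

Because the data is exactly linear, the least-squares fit reproduces the generating coefficients.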
• Take the natural logarithm (log_e) of both sides of the equation: log_e(y) = log_e(a·x^b).
• Apply the properties of logarithms: log_e(y) = log_e(a) + b·log_e(x)
• Now fit the curve the same as linear regression, Y = MX + C, where M and C are the
  slope and intercept:
  Y = log_e(y), X = log_e(x), M = b, C = log_e(a)
X = np.log(x)
Y = np.log(y)
X_mean = np.mean(X)
Y_mean = np.mean(Y)
n = len(x)

# least-squares slope and intercept of the transformed (log-log) data
M = np.sum((X - X_mean) * (Y - Y_mean)) / np.sum((X - X_mean)**2)
C = Y_mean - M * X_mean
b = M
a = np.exp(C)

z = []
for i in x:
    z.append(a * i**b)   # fitted values of y = a·x^b

Figure 2: Linear Function
xx = np.linspace(1, 6, 100)
yfit = a * np.power(xx, b)

print("Data points:")
for i, j in zip(x, y):
    print(f'({i},{j})', end=' ')
print()
print("Fitted points:")
for i, j in zip(x, z):
    print(f'({i},{j})')
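A round-trip check of the whole power-law procedure: generate data from an exact power law, fit in log-log space as above, and confirm that a and b are recovered. The sample values here are assumptions for illustration:

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = 3.0 * x**2.0  # exact power law: a = 3, b = 2

# linearize: log(y) = log(a) + b*log(x)
X = np.log(x)
Y = np.log(y)
X_mean = np.mean(X)
Y_mean = np.mean(Y)

# least-squares slope and intercept in log-log space
M = np.sum((X - X_mean) * (Y - Y_mean)) / np.sum((X - X_mean)**2)
C = Y_mean - M * X_mean
b = M
a = np.exp(C)
print(round(a, 6), round(b, 6))  # → 3.0 2.0
```

Since the data lies exactly on the power law, the log-transformed points are exactly collinear and the fit returns the generating parameters up to floating-point rounding.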