ML Classification
Naive Bayes is based on Bayes' theorem:

P(c|x) = P(x|c) · P(c) / P(x)

where:
• P(c|x) is the posterior probability of the class (c, target) given the predictor (x, attributes).
• P(c) is the prior probability of the class.
• P(x|c) is the likelihood, i.e. the probability of the predictor given the class.
• P(x) is the prior probability of the predictor (the evidence).
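As a quick numerical illustration of the formula (the probability values below are made up for demonstration, they are not from the slides), the posterior can be computed directly:

# Illustrative only: assumed prior, likelihood and evidence values.
p_c = 0.6            # P(c): prior probability of the class
p_x_given_c = 0.3    # P(x|c): likelihood of the predictor given the class
p_x = 0.4            # P(x): prior probability of the predictor (evidence)

# Bayes' theorem: P(c|x) = P(x|c) * P(c) / P(x)
p_c_given_x = p_x_given_c * p_c / p_x
print(p_c_given_x)   # 0.45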
How does the Naive Bayes algorithm work?
• Step 1: Convert the data set into a frequency table.
• Step 2: Create a likelihood table by finding the probabilities.
• Step 3: Use the Naive Bayes equation to calculate the posterior probability for each class.
Players will play if the weather is sunny. Is this statement correct? A sketch of the calculation is shown below.
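A minimal sketch of the three steps, assuming a small illustrative weather/play data set (the records below are made up for demonstration and are not from the slides):

from collections import Counter

# Assumed illustrative data: (weather, played?) records.
records = [('Sunny', 'Yes'), ('Sunny', 'No'), ('Overcast', 'Yes'),
           ('Rainy', 'Yes'), ('Sunny', 'Yes'), ('Rainy', 'No'),
           ('Overcast', 'Yes'), ('Sunny', 'No'), ('Rainy', 'Yes'),
           ('Sunny', 'Yes')]

# Step 1: frequency table of weather vs. class.
freq = Counter(records)                         # counts of (weather, class) pairs
class_counts = Counter(c for _, c in records)
weather_counts = Counter(w for w, _ in records)
n = len(records)

# Step 2: likelihood table entries, e.g. P(Sunny | Yes), plus prior and evidence.
p_sunny_given_yes = freq[('Sunny', 'Yes')] / class_counts['Yes']
p_yes = class_counts['Yes'] / n                 # prior P(Yes)
p_sunny = weather_counts['Sunny'] / n           # evidence P(Sunny)

# Step 3: posterior P(Yes | Sunny) from Bayes' theorem.
p_yes_given_sunny = p_sunny_given_yes * p_yes / p_sunny
print('P(Yes | Sunny) = %.2f' % p_yes_given_sunny)   # 0.60 for this assumed data

Since the posterior for "Yes" exceeds 0.5 on this assumed data, the statement would be accepted for it; with real data the same three steps apply.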
Implementation
import numpy as np
from sklearn.naive_bayes import GaussianNB
from sklearn.metrics import accuracy_score

# Toy training data: two features, two classes (1 and 2).
X = np.array([[-1, -1], [-2, -1], [-3, -2], [1, 1], [2, 1], [3, 2]])
Y = np.array([1, 1, 1, 2, 2, 2])

# Fit a Gaussian Naive Bayes classifier.
clf = GaussianNB()
clf.fit(X, Y)

# Predict the class of new samples.
print(clf.predict([[1, 1]]))    # expected: [2]
print(clf.predict([[-1, -1]]))  # expected: [1]

# Evaluate on the training data.
y = clf.predict(X)
print('Misclassified samples: %d' % (y != Y).sum())
print('Accuracy: %.2f' % accuracy_score(Y, y))
• BEYOND SYLLABUS
Example: impurity measures
Example: information gain (a sketch follows the impurity plot below)
import matplotlib.pyplot as plt
import numpy as np

# Gini impurity for a binary node with class-1 probability p.
def gini(p):
    return p * (1 - p) + (1 - p) * (1 - (1 - p))

# Entropy (in bits) for a binary node.
def entropy(p):
    return -p * np.log2(p) - (1 - p) * np.log2(1 - p)

# Misclassification error for a binary node.
def error(p):
    return 1 - np.max([p, 1 - p])

x = np.arange(0.0, 1.0, 0.01)
ent = [entropy(p) if p != 0 else None for p in x]
err = [error(i) for i in x]

# Plot the three impurity measures over p(i=1).
fig = plt.figure()
ax = plt.subplot(111)
for i, lab, ls, c in zip([ent, gini(x), err],
                         ['Entropy', 'Gini Impurity', 'Misclass. Error'],
                         ['-', '--', '-.'],
                         ['black', 'red', 'green']):
    line = ax.plot(x, i, label=lab, linestyle=ls, lw=2, color=c)
ax.legend()
ax.axhline(y=0.5, linewidth=1, color='k', linestyle='--')
ax.axhline(y=1.0, linewidth=1, color='k', linestyle='--')
plt.ylim([0, 1.1])
plt.xlabel('p(i=1)')
plt.ylabel('Impurity Index')
plt.show()
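The "Example: information gain" heading above has no accompanying code in the slides; the following is a minimal sketch (the split counts are made-up, illustrative values) that reuses the entropy function and the numpy import from the block above:

# Information gain of a split = parent impurity minus the
# weighted impurity of the child nodes.
def information_gain(left_counts, right_counts):
    # left_counts / right_counts: (class-1 count, class-0 count) in each child node.
    n_left, n_right = sum(left_counts), sum(right_counts)
    n = n_left + n_right
    parent_p = (left_counts[0] + right_counts[0]) / n      # class-1 share before the split
    children = ((n_left / n) * entropy(left_counts[0] / n_left)
                + (n_right / n) * entropy(right_counts[0] / n_right))
    return entropy(parent_p) - children

# Assumed illustrative split: child nodes with (class-1, class-0) counts of
# (30, 10) and (10, 30), i.e. a 50/50 parent node.
print('Information gain: %.2f' % information_gain((30, 10), (10, 30)))  # about 0.19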
from sklearn.tree import export_graphviz

# 'tree' is a DecisionTreeClassifier that has already been fitted on the
# two petal features (the fitting step is not shown on this slide).
export_graphviz(tree,
                out_file='tree.dot',
                feature_names=['petal length', 'petal width'])
The resulting tree.dot file can be converted to a PNG image online at https://www.coolutils.com/online/DOT-to-PNG.
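Alternatively, if Graphviz is installed locally, the same conversion can be done with the standard Graphviz command-line tool (not shown in the slides):

dot -Tpng tree.dot -o tree.png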
Thank You!!!