ML Unit 5
► Step-1
► In the first step, multiply each input value by its corresponding weight, then add the products to
determine the weighted sum. Mathematically, we can calculate the weighted sum as follows:
∑ wi*xi = x1*w1 + x2*w2 + … + xn*wn
► Add a special term called the bias 'b' to this weighted sum to improve the model's performance.
∑ wi*xi + b
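Step 1 can be sketched in a few lines of Python. The inputs, weights, and bias values below are illustrative assumptions, not values from the notes:

```python
# Step 1: weighted sum of inputs plus bias, i.e. sum(wi*xi) + b.
def weighted_sum(inputs, weights, bias):
    """Multiply each input by its weight, add the products, then add the bias."""
    return sum(w * x for w, x in zip(weights, inputs)) + bias

# Example values (illustrative assumptions)
x = [1.0, 2.0, 3.0]
w = [0.5, -0.25, 0.1]
b = 0.2
print(weighted_sum(x, w, b))
```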
► Step-2
► An activation function is then applied to the weighted sum computed above, which gives us an output
either in binary form or as a continuous value, as follows:
Y = f(∑ wi*xi + b)
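Combining both steps gives a complete perceptron. The sketch below uses a binary step function as the activation f and hand-picked weights that realize an AND gate, a classic perceptron example; all numeric values are illustrative assumptions:

```python
# Step 2: apply an activation function f to the weighted sum from Step 1.
def step(z):
    """Binary step activation: outputs 1 if z >= 0, else 0."""
    return 1 if z >= 0 else 0

def perceptron(inputs, weights, bias):
    z = sum(w * x for w, x in zip(weights, inputs)) + bias  # Step 1: weighted sum + bias
    return step(z)                                          # Step 2: activation

# AND gate with hand-picked weights (illustrative assumption)
w, b = [1.0, 1.0], -1.5
for x in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print(x, perceptron(x, w, b))
```

With these weights, only the input (1, 1) pushes the weighted sum above zero, so only that input produces output 1.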
Types of Perceptron Models
► Based on the layers, Perceptron models are divided into two types. These are as follows:
1. Single-layer Perceptron Model
2. Multi-layer Perceptron model
•“b” = bias (a term that shifts the decision boundary away from the origin, independent of the input values)
5. Travel back from the output layer to the hidden layer and adjust the
weights so that the error decreases.
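The idea of travelling back and adjusting weights to decrease the error can be sketched for a single sigmoid neuron: compute the output error, take the gradient of the squared error with respect to each weight, and nudge the weights in the opposite direction. All numeric values (inputs, initial weights, learning rate, target) are illustrative assumptions:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

x = [0.5, -1.0]   # inputs (illustrative)
w = [0.3, 0.8]    # initial weights (illustrative)
b = 0.1           # bias
target = 1.0      # desired output
lr = 0.5          # learning rate

for epoch in range(100):
    z = sum(wi * xi for wi, xi in zip(w, x)) + b
    y = sigmoid(z)                 # forward pass
    error = y - target             # output error
    grad = error * y * (1 - y)     # d(squared error)/dz via the chain rule
    # Backward step: move each weight against its gradient to reduce the error
    w = [wi - lr * grad * xi for wi, xi in zip(w, x)]
    b = b - lr * grad

# After training, the neuron's output moves toward the target.
```

Each update subtracts `lr * grad * xi` from weight `wi`, which is exactly the "adjust the weights such that the error is decreased" step applied to one neuron; backpropagation repeats this chain-rule computation layer by layer from the output back to the hidden layers.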
Why Do We Need Backpropagation?