Linear Regression With One Variable
Model representation
Regression Problem: predict a real-valued output.
Notation:
m = number of training examples
x's = "input" variable / features
y's = "output" variable / target variable
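As a concrete illustration of this notation, here is a minimal Python sketch using a made-up housing-price training set (the sizes and prices below are illustrative values, not data from the slides):

# Hypothetical training set: x = house size in square feet, y = price in $1000s.
x = [2104, 1416, 1534, 852]   # input variable / features
y = [460, 232, 315, 178]      # output variable / target variable

m = len(x)                    # m = number of training examples
print(m)                      # 4

# (x^(i), y^(i)) denotes the i-th training example
# (1-indexed in the notes, 0-indexed in Python):
i = 1
print(x[i - 1], y[i - 1])     # 2104 460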
Training Set
How do we represent h?
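For linear regression with one variable, the hypothesis h is represented as a linear function of the input, with parameters θ0 and θ1:

$$h_\theta(x) = \theta_0 + \theta_1 x$$

Different choices of θ0 and θ1 give different straight-line fits to the training data.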
Cost function
Training Set
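Given the training set and the hypothesis hθ(x) = θ0 + θ1x, the squared-error cost function measures how far the predictions are from the targets:

$$J(\theta_0, \theta_1) = \frac{1}{2m}\sum_{i=1}^{m}\bigl(h_\theta(x^{(i)}) - y^{(i)}\bigr)^2$$

Goal: choose θ0, θ1 to minimize J(θ0, θ1), i.e. $\min_{\theta_0,\theta_1} J(\theta_0, \theta_1)$.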
Simplified case: set θ0 = 0, so the hypothesis is hθ(x) = θ1x and the cost depends only on θ1.
Goal: minimize J(θ1) over θ1.
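A minimal Python sketch of the simplified case, computing J(θ1) for a few values of θ1 on a made-up three-point training set (the data is illustrative, not from the slides):

# Illustrative training set for the simplified case (theta_0 = 0).
x = [1.0, 2.0, 3.0]
y = [1.0, 2.0, 3.0]
m = len(x)

def J(theta_1):
    """Squared-error cost for h(x) = theta_1 * x."""
    return sum((theta_1 * xi - yi) ** 2 for xi, yi in zip(x, y)) / (2 * m)

for theta_1 in (0.0, 0.5, 1.0):
    print(theta_1, J(theta_1))
# theta_1 = 1.0 fits this data exactly, so J(1.0) = 0 is the minimum.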
Gradient descent
Have some function J(θ0, θ1).
Want to minimize J(θ0, θ1) over θ0, θ1.
Outline:
Start with some θ0, θ1 (for example θ0 = 0, θ1 = 0).
Keep changing θ0, θ1 to reduce J(θ0, θ1), until we hopefully end up at a minimum.
[Figure: surface plots of J(θ0, θ1) over θ0 and θ1; different starting points can lead gradient descent to different local minima.]
Gradient descent algorithm: repeat until convergence,

$$\theta_j := \theta_j - \alpha \frac{\partial}{\partial \theta_j} J(\theta_0, \theta_1) \quad (\text{simultaneously for } j = 0 \text{ and } j = 1)$$

Correct (simultaneous update): compute the updates for θ0 and θ1 from the current values of both parameters, then assign both at once.
Incorrect: assigning the new θ0 before computing the update for θ1, so the θ1 update sees a mixture of old and new values.
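A minimal Python sketch of one gradient descent step with a correct simultaneous update, contrasted with the incorrect sequential version. Here grad_theta_0 and grad_theta_1 are placeholder functions for the partial derivatives of J, assumed to be provided:

def step_simultaneous(theta_0, theta_1, alpha, grad_theta_0, grad_theta_1):
    """Correct: evaluate both gradients at the current (theta_0, theta_1),
    then assign both parameters at once."""
    temp0 = theta_0 - alpha * grad_theta_0(theta_0, theta_1)
    temp1 = theta_1 - alpha * grad_theta_1(theta_0, theta_1)
    return temp0, temp1

def step_sequential(theta_0, theta_1, alpha, grad_theta_0, grad_theta_1):
    """Incorrect: theta_0 is overwritten first, so the gradient for theta_1
    is evaluated at a mixed point (new theta_0, old theta_1)."""
    theta_0 = theta_0 - alpha * grad_theta_0(theta_0, theta_1)
    theta_1 = theta_1 - alpha * grad_theta_1(theta_0, theta_1)
    return theta_0, theta_1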
If α is too small, gradient descent can be slow. If α is too large, gradient descent can overshoot the minimum. It may fail to converge, or even diverge.
Gradient descent can converge to a local minimum, even with the learning rate α fixed.
As we approach a local minimum, the derivative term gets smaller, so gradient descent will automatically take smaller steps. So, there is no need to decrease α over time.
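A small numerical sketch of both points on a one-dimensional example; the function J(θ) = θ² (minimum at θ = 0) and the learning rates used here are illustrative, not from the slides:

def grad(theta):          # derivative of J(theta) = theta**2
    return 2 * theta

theta, alpha = 1.0, 0.1   # fixed, moderate learning rate
for _ in range(5):
    step = alpha * grad(theta)
    theta -= step
    print(f"theta = {theta:.4f}, step size = {step:.4f}")
# The derivative shrinks as theta approaches the minimum, so the steps
# shrink automatically even though alpha stays fixed.

theta, alpha = 1.0, 1.5   # too-large learning rate
for _ in range(5):
    theta -= alpha * grad(theta)
    print(f"theta = {theta:.4f}")
# |theta| grows each iteration: gradient descent overshoots and diverges.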
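Applying the update rule to the squared-error cost of linear regression, hθ(x) = θ0 + θ1x, the partial derivatives work out to:

$$\frac{\partial}{\partial \theta_0} J(\theta_0, \theta_1) = \frac{1}{m}\sum_{i=1}^{m}\bigl(h_\theta(x^{(i)}) - y^{(i)}\bigr)$$

$$\frac{\partial}{\partial \theta_1} J(\theta_0, \theta_1) = \frac{1}{m}\sum_{i=1}^{m}\bigl(h_\theta(x^{(i)}) - y^{(i)}\bigr)\, x^{(i)}$$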
[Figure: surface and contour plots of J(θ0, θ1) for linear regression; the squared-error cost is a convex, bowl-shaped function with a single global minimum.]
"Batch" gradient descent: each step of gradient descent uses all the training examples.
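A minimal sketch of batch gradient descent for univariate linear regression in Python; every update sums over all m training examples, matching the "batch" description above. The data, learning rate, and iteration count below are illustrative:

def batch_gradient_descent(x, y, alpha=0.05, num_iters=1000):
    """Fit h(x) = theta_0 + theta_1 * x by batch gradient descent."""
    m = len(x)
    theta_0, theta_1 = 0.0, 0.0
    for _ in range(num_iters):
        # Each step uses all m training examples ("batch").
        errors = [theta_0 + theta_1 * xi - yi for xi, yi in zip(x, y)]
        grad_0 = sum(errors) / m
        grad_1 = sum(e * xi for e, xi in zip(errors, x)) / m
        # Simultaneous update of both parameters.
        theta_0, theta_1 = theta_0 - alpha * grad_0, theta_1 - alpha * grad_1
    return theta_0, theta_1

# Illustrative data: y is roughly 1 + 2*x.
x = [0.0, 1.0, 2.0, 3.0, 4.0]
y = [1.0, 3.1, 4.9, 7.0, 9.0]
print(batch_gradient_descent(x, y))   # approximately (1.0, 2.0)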