Open Method Examples

This chapter discusses open methods for estimating roots of functions, including the fixed-point iteration, Newton-Raphson, and secant methods. Fixed-point iteration rewrites the equation as x = g(x) and iteratively computes new values of x. Newton-Raphson follows the tangent line at the current guess to where it crosses the x-axis. The secant method approximates the derivative with a finite difference so that derivatives need not be evaluated explicitly.


Chapter 6

Roots: Open Methods


Chapter Objectives
• Recognizing the difference between bracketing and open methods for root location.
• Understanding the fixed-point iteration method and how you can evaluate its convergence characteristics.
• Knowing how to solve a roots problem with the Newton-Raphson method and appreciating the concept of quadratic convergence.
• Knowing how to implement both the secant and the modified secant methods.
• Knowing how to use MATLAB's fzero function to estimate roots (a brief example appears at the end of this chapter).
Open Methods
Open methods differ from bracketing methods in that they require only a single starting value, or two starting values that do not necessarily bracket a root.
Open methods may diverge as the computation progresses, but when they do converge, they usually do so much faster than bracketing methods.
Graphical Comparison of Methods

a) Bracketing method
b) Diverging open method
c) Converging open method
Simple Fixed-Point Iteration
Rearrange the function f(x) = 0 so that x is on the left-hand side of the equation: x = g(x).
Use the new function g to predict a new value of x, that is, x_{i+1} = g(x_i).
The approximate error is given by:

\varepsilon_a = \left| \frac{x_{i+1} - x_i}{x_{i+1}} \right| \times 100\%
Example
Solve f(x) = e^{-x} - x.
Rewrite as x = g(x) by isolating x (for example, x = e^{-x}).
Start with an initial guess (here, x0 = 0):

i    xi        |εa| (%)    |εt| (%)    |εt|i / |εt|i-1
0    0.0000                100.000
1    1.0000    100.000      76.322     0.763
2    0.3679    171.828      35.135     0.460
3    0.6922     46.854      22.050     0.628
4    0.5005     38.309      11.755     0.533

Continue until some tolerance is reached
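
To make the procedure concrete, here is a minimal MATLAB sketch of this fixed-point iteration (not from the original slides; the variable names g, x, ea, es, and maxit are illustrative assumptions, with a stopping tolerance of 0.01%):

% Fixed-point iteration for f(x) = exp(-x) - x, rewritten as x = g(x) = exp(-x)
g = @(x) exp(-x);     % iteration function g(x)
x = 0;                % initial guess
es = 0.01;            % stopping tolerance in percent
maxit = 50;           % safety cap on the number of iterations
for i = 1:maxit
    xnew = g(x);                            % x_{i+1} = g(x_i)
    ea = abs((xnew - x) / xnew) * 100;      % approximate percent error
    x = xnew;
    if ea < es, break; end                  % stop once the tolerance is met
end
x                     % approaches the true root, about 0.56714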


Convergence
Convergence of the simple fixed-point iteration method:
a) Convergent
b) Convergent
c) Divergent
d) Divergent
In general, the iteration converges when |g'(x)| < 1 in the neighborhood of the root and diverges when |g'(x)| > 1.
Newton-Raphson Method
Based on forming the tangent line to the f(x) curve at some guess x, then following the tangent line to where it crosses the x-axis.
The slope of that tangent line is the derivative f'(x_i), so

f'(x_i) = \frac{f(x_i) - 0}{x_i - x_{i+1}}

Solving for x_{i+1} gives the Newton-Raphson formula:

x_{i+1} = x_i - \frac{f(x_i)}{f'(x_i)}
Pros and Cons
Pro: The error of the (i+1)th iteration is roughly proportional to the square of the error of the ith iteration; this is called quadratic convergence.

Con: Some functions show slow or poor convergence, or even divergence.
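
As a rough illustration of both the method and its quadratic convergence (not part of the original slides), here is a minimal MATLAB sketch for the same f(x) = e^{-x} - x, whose derivative is f'(x) = -e^{-x} - 1; the names f, df, and the stopping test are assumptions:

% Newton-Raphson iteration for f(x) = exp(-x) - x
f  = @(x) exp(-x) - x;       % function whose root is sought
df = @(x) -exp(-x) - 1;      % its analytical derivative
x = 0;                       % initial guess
for i = 1:20
    dx = f(x) / df(x);                        % Newton step f(x_i)/f'(x_i)
    x = x - dx;                               % x_{i+1} = x_i - f(x_i)/f'(x_i)
    if abs(dx / x) * 100 < 1e-6, break; end   % stop at a tiny relative change
end
x                            % about 0.56714, reached in only a few iterations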
Secant Methods, 1
A potential problem in implementing the Newton-Raphson method is the evaluation of the derivative; there are certain functions whose derivatives may be difficult or inconvenient to evaluate.
For these cases, the derivative can be approximated by a backward finite divided difference:

f'(x_i) \cong \frac{f(x_{i-1}) - f(x_i)}{x_{i-1} - x_i}
Secant Methods, 2
Substituting this approximation for the derivative into the Newton-Raphson equation gives the secant method:

x_{i+1} = x_i - \frac{f(x_i)\,(x_{i-1} - x_i)}{f(x_{i-1}) - f(x_i)}

Note: this method requires two initial estimates of x but does not require an analytical expression for the derivative.
The modified secant method instead uses a small fractional perturbation δ of the current estimate to approximate the derivative, so that only one starting value is needed:

x_{i+1} = x_i - \frac{\delta x_i\, f(x_i)}{f(x_i + \delta x_i) - f(x_i)}
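
For illustration only (not from the slides), a minimal MATLAB sketch of the secant iteration for the same f(x) = e^{-x} - x, assuming the two starting values x_{i-1} = 0 and x_i = 1:

% Secant method for f(x) = exp(-x) - x, using two starting estimates
f = @(x) exp(-x) - x;
xold = 0;                    % x_{i-1}
x    = 1;                    % x_i
for i = 1:20
    xnew = x - f(x) * (xold - x) / (f(xold) - f(x));    % secant update
    if abs((xnew - x) / xnew) * 100 < 1e-6, break; end  % approximate error test
    xold = x;                % shift the estimates for the next pass
    x = xnew;
end
xnew                         % about 0.56714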

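Finally, as noted in the chapter objectives, MATLAB's built-in fzero function can estimate the same root directly; a one-line example (the initial guess of 0 is an assumption):

% Root of f(x) = exp(-x) - x with MATLAB's fzero, starting from x = 0
r = fzero(@(x) exp(-x) - x, 0)   % returns approximately 0.5671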