
Root Finding Algorithm

Last Updated : 20 Feb, 2025

Root-finding algorithms are tools used in mathematics and computer science to locate the solutions, or "roots," of equations. These algorithms help us find solutions to equations where the function equals zero. For example, if we have an equation like f(x) = 0, a root-finding algorithm will help us determine the value of x that makes this equation true.

Common root-finding algorithms include the bisection method, the Regula-Falsi method, the Newton-Raphson method, and the secant method. These algorithms are essential in many fields of science and engineering because they solve equations that cannot be easily rearranged or solved analytically.


Types of Root Finding Algorithms

Root-finding algorithms can be broadly categorized into Bracketing Methods and Open Methods.

  • Bracketing Methods: These methods start with an interval over which the function changes sign, ensuring that a root lies within this interval. They iteratively shrink the interval to home in on the root.
  • Open Methods: These methods start with one or more initial guesses that do not necessarily bracket the root. They can converge more quickly but do not always guarantee convergence.

Bracketing Methods

A bracketing method finds the root of a function by progressively narrowing down an interval that contains the root. It uses the intermediate value theorem, which states that if a continuous function changes signs over an interval, a root exists within that interval. Starting with such an interval, the method repeatedly reduces the interval size until it is small enough to identify the root.

For polynomials, additional techniques like Descartes' rule of signs, Budan's theorem, and Sturm's theorem can determine the number of roots in an interval, ensuring all real roots are found accurately.

Bracketing methods are further classified into:

  • Bisection Method
  • False Position (Regula Falsi) Method

Bisection Method

The bisection method is one of the simplest and most reliable root-finding algorithms. It works by repeatedly narrowing down an interval that contains the root. The method proceeds as follows:

Step 1: Start with two points, a and b, such that f(a) and f(b) have opposite signs. This guarantees that there is at least one root between a and b.

Step 2: Calculate the midpoint, c, of the interval [a,b] using c = (a + b)/2.

Step 3: Determine the sign of f(c). If f(c) is close enough to zero (within a predefined tolerance), c is the root. Otherwise, replace a or b with c depending on the sign of f(c), ensuring that the new interval still brackets the root.

Step 4: Repeat the process until the interval is sufficiently small or f(c) is close enough to zero.

Here, the number of iterations N needed to achieve an ε-approximate root with the bisection method is given by:

\bold{N \approx \log_2 \left( \frac{b - a}{\varepsilon} \right)}
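The four steps above can be sketched in Python. This is a minimal illustration; the test function x² − 2, the interval [1, 2], and the tolerance are arbitrary choices, not part of the method itself.

```python
def bisection(f, a, b, tol=1e-6, max_iter=100):
    """Find a root of f in [a, b], assuming f(a) and f(b) have opposite signs."""
    if f(a) * f(b) > 0:
        raise ValueError("f(a) and f(b) must have opposite signs")
    for _ in range(max_iter):
        c = (a + b) / 2                          # Step 2: midpoint
        if abs(f(c)) < tol or (b - a) / 2 < tol:
            return c                             # Step 3: close enough to zero
        if f(a) * f(c) < 0:                      # root lies in [a, c]
            b = c
        else:                                    # root lies in [c, b]
            a = c
    return (a + b) / 2                           # Step 4: best estimate after max_iter

# Example: the positive root of x^2 - 2 on [1, 2], i.e. sqrt(2)
root = bisection(lambda x: x**2 - 2, 1.0, 2.0)
```

Each iteration halves the interval, which is exactly why the iteration count grows like log₂((b − a)/ε).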

False Position (Regula Falsi) Method

The False Position method, also known as the Regula Falsi method, is a numerical technique used to find the roots of a function, i.e., the points where the function equals zero. It is similar to the bisection method but often converges faster. The False Position method combines ideas from the bisection method and the secant method, making it both simple and efficient for solving equations.

Here’s a step-by-step explanation of how it works:

Step 1: Start with two points, a and b, such that f(a) and f(b) have opposite signs. This guarantees that there is at least one root between a and b.

Step 2: Instead of the midpoint, compute the point c where the secant line through (a, f(a)) and (b, f(b)) crosses the x-axis: c = a - [f(a)·(b - a)]/[f(b) - f(a)].

Step 3: Evaluate f(c). If f(c) is close enough to zero (within a predefined tolerance), then c is the root.

Step 4: Depending on the sign of f(c), update the interval:

  • If f(a) and f(c) have opposite signs, set b = c.
  • If f(b) and f(c) have opposite signs, set a = c.

Step 5: Repeat the process until the interval is sufficiently small or f(c) is close enough to zero.
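The steps above can be sketched in Python as follows. The test function x³ − 4x − 9 and the bracket [2, 3] are illustrative choices (this function is used again in the comparison later in the article).

```python
def false_position(f, a, b, tol=1e-9, max_iter=100):
    """Regula Falsi: the midpoint is replaced by the x-intercept of the chord
    through (a, f(a)) and (b, f(b)); the bracket is kept valid at every step."""
    if f(a) * f(b) > 0:
        raise ValueError("f(a) and f(b) must have opposite signs")
    c = a
    for _ in range(max_iter):
        c = a - f(a) * (b - a) / (f(b) - f(a))   # Step 2: secant intercept
        if abs(f(c)) < tol:
            return c                             # Step 3: close enough to zero
        if f(a) * f(c) < 0:                      # Step 4: update the bracket
            b = c
        else:
            a = c
    return c

# Example: root of x^3 - 4x - 9 on [2, 3]
root = false_position(lambda x: x**3 - 4*x - 9, 2.0, 3.0)
```

Note that, unlike bisection, one endpoint of the bracket may stay fixed for many iterations, which is the usual cause of slow convergence for this method.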

Read more about Regula Falsi method.

Open Methods

Open methods are root-finding algorithms that don't necessarily require an interval containing the root. They start with one or more initial guesses and iteratively refine them until a root is found. These methods are generally faster but may not always converge.

In this section we cover the two most common open methods:

  • Newton-Raphson Method
  • Secant Method

Newton-Raphson Method

The Newton-Raphson method is an iterative algorithm that uses the derivative of the function to find the root. It is faster than the bisection method but requires a good initial guess and the calculation of derivatives. The procedure is as follows:

Step 1: Start with an initial guess x0.

Step 2: Use the formula, x_{n+1} = x_n - \frac{f(x_n)}{f'(x_n)} to find the next approximation, where f'(xn) is the derivative of f(x) at xn.

Step 3: Repeat the iteration until the change between xn and xn+1 is smaller than a predefined tolerance.

Note: Newton-Raphson method converges quickly when the initial guess is close to the root, but it can fail if f′(x) is zero or if the function is not well-behaved near the root.
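A minimal Python sketch of the iteration, including the check against a zero derivative mentioned in the note. The function x³ − 4x − 9, its derivative 3x² − 4, and the starting guess x0 = 3 are illustrative choices.

```python
def newton(f, df, x0, tol=1e-12, max_iter=50):
    """Newton-Raphson: follow the tangent line from each guess to the x-axis."""
    x = x0
    for _ in range(max_iter):
        dfx = df(x)
        if dfx == 0:
            raise ZeroDivisionError("derivative is zero; Newton step undefined")
        x_new = x - f(x) / dfx        # Step 2: tangent-line update
        if abs(x_new - x) < tol:      # Step 3: change below tolerance
            return x_new
        x = x_new
    return x

# Example: root of x^3 - 4x - 9, starting from x0 = 3
root = newton(lambda x: x**3 - 4*x - 9,
              lambda x: 3*x**2 - 4,
              x0=3.0)
```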

Secant Method

The secant method is similar to the Newton-Raphson method but does not require the calculation of derivatives. Instead, it approximates the derivative with a secant line through the two most recent iterates. The procedure is as follows:

Step 1: Start with two initial guesses x0 and x1.

Step 2: Use the formula x_{n+1} = x_n - f(x_n) \frac{x_n - x_{n-1}}{f(x_n) - f(x_{n-1})} to find the next approximation.

Step 3: Repeat the iteration until the change between xn and xn+1 is smaller than a predefined tolerance.

Secant method can be faster than the bisection method and does not require the derivative of the function, but it can be less reliable than the Newton-Raphson method, especially if the initial points are not well chosen.
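A Python sketch of the secant iteration. The guard on f(x_n) = f(x_{n-1}) avoids a division by zero once the iterates coincide; the test function and starting guesses are illustrative choices.

```python
def secant(f, x0, x1, tol=1e-12, max_iter=100):
    """Secant method: replace f'(x) in Newton's formula with the finite
    difference through the last two iterates."""
    for _ in range(max_iter):
        f0, f1 = f(x0), f(x1)
        if f1 == f0:
            break                                 # secant line is horizontal; stop
        x2 = x1 - f1 * (x1 - x0) / (f1 - f0)      # Step 2: secant update
        if abs(x2 - x1) < tol:                    # Step 3: change below tolerance
            return x2
        x0, x1 = x1, x2
    return x1

# Example: root of x^3 - 4x - 9 with guesses 2 and 3
root = secant(lambda x: x**3 - 4*x - 9, 2.0, 3.0)
```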

Comparison of Root Finding Methods

The root-finding methods are compared below on the basis of their advantages and disadvantages.

| Method | Description | Advantage | Disadvantage |
|---|---|---|---|
| Bisection Method | Halves the bracketing interval each iteration; convergence guaranteed | Simple and guaranteed to converge | Slow (linear) convergence |
| False Position Method | Uses linear interpolation between the bracket endpoints | Maintains bracketing; usually faster than bisection | Can converge slowly when one endpoint stays fixed; sensitive to round-off errors |
| Newton's Method | Uses the function and its derivative; fast convergence | Quadratic convergence; extends to higher dimensions | May not converge if the initial guess is far from the root |
| Secant Method | Derivative-free variant of Newton's method; simpler | Requires no derivative; faster than bisection | Slower convergence than Newton's (order ≈ 1.6) |

Comparison of Root Finding Methods with Example

Consider solving the equation f(x) = x³ − 4x − 9 = 0 using the Bisection, Regula-Falsi, Newton-Raphson, and Secant methods, with 10 iterations each. The computed root approximations are displayed in the table below.

| Method | Root Approximation |
|---|---|
| Bisection Method | 2.7060546875 |
| Regula-Falsi Method | 2.7065276119801087 |
| Newton-Raphson Method | 2.7065279765747587 |
| Secant Method | 2.7065278974619447 |

Comparison of Methods:

  • Bisection Method: Converged slowly, reaching 2.706055 after 10 iterations.
  • Regula-Falsi Method: Improved on Bisection, reaching 2.706528 faster.
  • Newton-Raphson Method: Fastest convergence, achieving 2.706528.
  • Secant Method: Similar to Newton-Raphson, reaching 2.706528.

Observations:

  • Newton-Raphson and Secant Methods provided the fastest and most accurate results.
  • Bisection was the slowest, but it ensures convergence.
  • Regula-Falsi performed better than Bisection but was slower than the derivative-based methods.

This comparison highlights that if derivatives are available, Newton-Raphson is preferred. If derivatives are difficult to compute, Secant Method is a good alternative.

How to Choose a Root Finding Algorithm?

Choosing a root finding algorithm depends on several factors:

  • Function Properties: Consider whether the function is continuous, differentiable, and how well-behaved it is.
  • Initial Knowledge: Determine if you have an initial interval containing the root or just a rough estimate.
  • Accuracy Requirements: Assess how accurate the root approximation needs to be.
  • Computational Resources: Consider the computational complexity and resources available.
  • Robustness: Evaluate how robust the algorithm is against different function behaviors and initial guesses.
  • Speed: Balance between convergence speed and computational efficiency.
  • Dimensionality: For higher-dimensional problems, choose algorithms that extend well to multiple dimensions.

Applications of Root Finding Algorithms

The various applications of root-finding algorithms are:

  • Numerical Analysis: Essential for solving the nonlinear equations that commonly arise in mathematical modeling and simulation.
  • Optimization: An integral part of optimization algorithms, which minimize or maximize functions by finding their critical points.
  • Finance: Used in financial modeling and risk management for pricing options, forecasting, and analyzing financial derivatives.
  • Image Processing: Used in algorithms such as edge detection and image segmentation that require solving nonlinear equations.
