
Journal article

Beyond Newton: A New Root-Finding Fixed-Point Iteration for Nonlinear Equations

Ankush Aggarwal, Sanjay Pant

Algorithms, Volume: 13, Issue: 4, Start page: 78

Swansea University Authors: Ankush Aggarwal, Sanjay Pant

  • 53886.pdf (PDF, Version of Record, 2.43 MB). This is an open access article distributed under the Creative Commons Attribution License (CC-BY).


DOI (Published version): 10.3390/a13040078


Published in: Algorithms
ISSN: 1999-4893
Published: MDPI AG 2020

URI: https://cronfa.swan.ac.uk/Record/cronfa53886
Abstract: Finding roots of equations is at the heart of most computational science. A well-known and widely used iterative algorithm is Newton’s method. However, its convergence depends heavily on the initial guess, with poor choices often leading to slow convergence or even divergence. In this short note, we seek to enlarge the basin of attraction of the classical Newton’s method. The key idea is to develop a relatively simple multiplicative transform of the original equations, which leads to a reduction in nonlinearity, thereby alleviating the limitation of Newton’s method. Based on this idea, we derive a new class of iterative methods and rediscover Halley’s method as the limit case. We present the application of these methods to several mathematical functions (real, complex, and vector equations). Across all examples, our numerical experiments suggest that the new methods converge for a significantly wider range of initial guesses. For scalar equations, the increase in computational cost per iteration is minimal. For vector functions, more extensive analysis is needed to compare the increase in cost per iteration and the improvement in convergence of specific problems.
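The abstract contrasts the classical Newton iteration with Halley's method, which the paper recovers as a limit case of its new family. The paper's own transformed iterations are not reproduced in this record; the sketch below only illustrates the two classical baselines on a scalar example (function names and the test equation are illustrative choices, not taken from the paper).

```python
def newton(f, df, x0, tol=1e-12, max_iter=100):
    """Classical Newton's method: x <- x - f(x)/f'(x)."""
    x = x0
    for _ in range(max_iter):
        fx = f(x)
        if abs(fx) < tol:
            return x
        x -= fx / df(x)
    return x

def halley(f, df, d2f, x0, tol=1e-12, max_iter=100):
    """Halley's method: x <- x - 2 f f' / (2 f'^2 - f f'')."""
    x = x0
    for _ in range(max_iter):
        fx = f(x)
        if abs(fx) < tol:
            return x
        dfx, d2fx = df(x), d2f(x)
        x -= 2 * fx * dfx / (2 * dfx**2 - fx * d2fx)
    return x

# Illustrative scalar equation: f(x) = x^3 - 2x - 5 (root near x = 2.0946)
f = lambda x: x**3 - 2*x - 5
df = lambda x: 3*x**2 - 2
d2f = lambda x: 6*x

print(newton(f, df, 2.0))
print(halley(f, df, d2f, 2.0))
```

Halley's method uses second-derivative information, so each iteration is more expensive but typically converges in fewer steps and from a wider set of starting points, which is the trade-off the abstract discusses for the new class of methods.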
Keywords: Newton–Raphson method; Newton’s iteration; nonlinear equations; iterative solution; gradient-based methods
College: Faculty of Science and Engineering
Issue: 4
Start Page: 78