
Journal article

Beyond Newton: A New Root-Finding Fixed-Point Iteration for Nonlinear Equations

Ankush Aggarwal, Sanjay Pant

Algorithms, Volume: 13, Issue: 4, Start page: 78

Swansea University Authors: Ankush Aggarwal, Sanjay Pant

  • 53886.pdf (PDF, Version of Record, 2.43 MB)

    This is an open access article distributed under the Creative Commons Attribution License (CC-BY).


DOI (Published version): 10.3390/a13040078

Abstract

Finding roots of equations is at the heart of most computational science. A well-known and widely used iterative algorithm is Newton’s method. However, its convergence depends heavily on the initial guess, with poor choices often leading to slow convergence or even divergence. In this short note, we seek to enlarge the basin of attraction of the classical Newton’s method. The key idea is to develop a relatively simple multiplicative transform of the original equations, which leads to a reduction in nonlinearity, thereby alleviating the limitation of Newton’s method. Based on this idea, we derive a new class of iterative methods and rediscover Halley’s method as the limit case. We present the application of these methods to several mathematical functions (real, complex, and vector equations). Across all examples, our numerical experiments suggest that the new methods converge for a significantly wider range of initial guesses. For scalar equations, the increase in computational cost per iteration is minimal. For vector functions, more extensive analysis is needed to compare the increase in cost per iteration and the improvement in convergence of specific problems.
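The abstract contrasts the classical Newton iteration with Halley’s method, which the paper recovers as a limit case of its new class of iterations. The paper’s own multiplicative transform is not reproduced here; as a minimal illustrative sketch of the basin-of-attraction issue it describes, the following compares the two classical methods on a textbook scalar equation (the function f and the starting point are illustrative choices, not taken from the paper):

```python
def newton(f, df, x0, tol=1e-12, max_iter=100):
    """Classical Newton iteration: x <- x - f(x)/f'(x)."""
    x = x0
    for _ in range(max_iter):
        fx = f(x)
        if abs(fx) < tol:
            return x, True
        x -= fx / df(x)
    return x, False

def halley(f, df, d2f, x0, tol=1e-12, max_iter=100):
    """Halley's iteration: x <- x - 2 f f' / (2 f'^2 - f f'')."""
    x = x0
    for _ in range(max_iter):
        fx = f(x)
        if abs(fx) < tol:
            return x, True
        dfx, d2fx = df(x), d2f(x)
        x -= 2.0 * fx * dfx / (2.0 * dfx**2 - fx * d2fx)
    return x, False

# Illustrative problem: f(x) = x^3 - 2x + 2 has one real root near x = -1.7693.
# From x0 = 0, Newton famously falls into the 2-cycle 0 -> 1 -> 0 and never
# converges, while Halley's third-order iteration escapes and finds the root.
f   = lambda x: x**3 - 2.0*x + 2.0
df  = lambda x: 3.0*x**2 - 2.0
d2f = lambda x: 6.0*x
```

This is only a sketch of the convergence behaviour the abstract alludes to; the paper’s contribution is a transformed equation whose Newton iteration interpolates between these two methods.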


Published in: Algorithms
ISSN: 1999-4893
Published: MDPI AG, 29 March 2020
Online Access: Check full text

URI: https://cronfa.swan.ac.uk/Record/cronfa53886
Published date: 29 March 2020
Keywords: Newton–Raphson method; Newton’s iteration; nonlinear equations; iterative solution; gradient-based methods
Department: Engineering, Faculty of Science and Engineering, Swansea University
Licence: http://creativecommons.org/licenses/by/4.0/