Aberth Algorithm (Gauss-Seidel Style) For Computing All The Real Roots Of A Polynomial Of Any Order

by admin on June 12, 2019

The Aberth method (Aberth–Ehrlich method), named after Oliver Aberth[1] and Louis W. Ehrlich,[2] is a root-finding algorithm developed in 1967 for simultaneous approximation of all the roots of a univariate polynomial. This method converges cubically, an improvement over the Durand–Kerner method, another algorithm for approximating all roots at once, which converges quadratically.[1][2] (However, both algorithms converge linearly at multiple zeros.[3]) This method is used in MPSolve, which is the reference software for approximating all roots of a polynomial to an arbitrary precision.

Example of Using This Code

Input:

syms x;                       % Symbolic variable used to define the polynomial
P = 6*x^5 + 11*x^4 - 33*x^3 - 33*x^2 + 11*x + 6; % Polynomial we are interested in solving [change it for your case]
Pd = diff(P,x,1);             % First-order derivative of the polynomial
n = 5;                        % Number of roots (the degree of the polynomial) [change it for your case]
err = 0.001; Err = 1;         % err is the required accuracy; Err > err is the error-checking parameter

Output:

The Root X = [2.00984      15.2878    -0.157132    -0.206285     0.359635]
Max. Error of = 0, # Iterations = 5

Description

The Aberth iteration has been implemented in a Gauss-Seidel style, i.e., the updated components are immediately used inside the current iteration: for each k in turn, the offset wk defined below is evaluated with the components zj, j < k, already replaced by their new values from the current sweep, and zk is then updated to zk - wk.

Let p(x) = pn·x^n + p(n-1)·x^(n-1) + … + p1·x + p0 be a univariate polynomial of degree n with real or complex coefficients. Then there exist complex numbers z1*, z2*, …, zn*, the roots of p(x), that give the factorisation p(x) = pn·(x - z1*)·(x - z2*) … (x - zn*).

Although those numbers are unknown, upper and lower bounds for their absolute values are computable from the coefficients of the polynomial. Now one can pick n distinct numbers in the complex plane—randomly or evenly distributed—such that their absolute values are within the same bounds. (Also, if the zeros are symmetrical, the starting points must not be exactly symmetrical along the same axis, as this can prevent convergence.)[1] A set of such numbers is called an initial approximation of the set of roots of p(x). This approximation can be iteratively improved using the following procedure.
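To make this concrete, here is a minimal sketch of one common way to pick such starting points in MATLAB. It is an illustration only, not part of the purchased code; the variable names c, R, th, and z0 are my own, and the Cauchy bound 1 + max|pi|/|pn| is just one of several usable bounds on the moduli of the roots.

c  = [6 11 -33 -33 11 6];                % Coefficients [pn ... p0] of the example polynomial above
n  = numel(c) - 1;                       % Degree of the polynomial
R  = 1 + max(abs(c(2:end))/abs(c(1)));   % Cauchy upper bound on the moduli of all roots
th = 2*pi*(0:n-1)/n + 0.5;               % Equally spaced angles plus a small offset to break symmetry
z0 = 0.5*R*exp(1i*th);                   % n distinct complex starting points inside the bounding disc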

Let z1, …, zn be the current approximations of the zeros of p(x) in the complex plane. Then offset numbers w1, …, wn are computed as

wk = ( p(zk)/p'(zk) ) / ( 1 - ( p(zk)/p'(zk) ) · sum over j ≠ k of 1/(zk - zj) ),   k = 1, …, n,

where p'(z) is the polynomial derivative of p evaluated at the point z.

The next set of approximations of roots of p(x) is then z1 – w1, …, zn – wn. One can measure the quality of the current approximation by the values of the polynomial or by the size of the offsets.
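As an illustration of this procedure, the following is a minimal numeric sketch of a Gauss-Seidel style implementation of the update zk → zk - wk. It is not the purchased code; the function name aberth_sketch, the coefficient ordering [pn … p0] (MATLAB's polyval convention), the tolerance tol, and the iteration cap maxit are my own choices.

function z = aberth_sketch(c, z, tol, maxit)
% Aberth-Ehrlich iteration, Gauss-Seidel style: each z(k) is replaced as soon
% as its offset is available, so later k in the same sweep see the new values.
dc = polyder(c);                                        % Coefficients of p'
n  = numel(z);
for it = 1:maxit
    maxw = 0;
    for k = 1:n
        ratio = polyval(c, z(k)) / polyval(dc, z(k));   % Newton correction p(zk)/p'(zk)
        s     = sum(1 ./ (z(k) - z([1:k-1, k+1:n])));   % Repulsion term from the other approximations
        w     = ratio / (1 - ratio*s);                  % Aberth offset wk
        z(k)  = z(k) - w;                               % Immediate (Gauss-Seidel) update
        maxw  = max(maxw, abs(w));
    end
    if maxw < tol, break; end                           % Stop once all offsets are small
end
end

With the starting points z0 sketched earlier, a call such as z = aberth_sketch([6 11 -33 -33 11 6], z0, 1e-3, 100) illustrates the intended usage.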

Conceptually, this method uses an electrostatic analogy, modeling the approximated zeros as movable negative point charges, which converge toward the true zeros, represented by fixed positive point charges.[1] A direct application of Newton’s method to each approximated zero will often cause multiple starting points to incorrectly converge to the same root. The Aberth method avoids this by also modeling the repulsive effect the movable charges have on each other. In this way, when a movable charge has converged on a zero, the negative and positive charges cancel out, so that other movable charges are no longer attracted to that location, encouraging them to converge to other “unoccupied” zeros. (Stieltjes also modeled the positions of zeros of polynomials as solutions to electrostatic problems.)

Inside the formula of the Aberth method one can find elements of Newton’s method and the Durand–Kerner method. Details for an efficient implementation, especially on the choice of good initial approximations, can be found in Bini (1996).[3]

The updates of the roots may be executed as a simultaneous Jacobi-like iteration, where first all new approximations are computed from the old approximations, or as a sequential Gauss–Seidel-like iteration that uses each new approximation from the time it is computed.
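For contrast with the Gauss-Seidel sweep sketched above, a Jacobi-like sweep would compute every offset from the old approximations before replacing any of them. Again this is only an illustrative sketch with a function name of my own choosing.

function z = jacobi_sweep_sketch(c, z)
% One Jacobi-like sweep of the Aberth iteration: all offsets are computed from
% the incoming approximations and the update is applied simultaneously.
dc = polyder(c);
n  = numel(z);
w  = zeros(size(z));
for k = 1:n
    ratio = polyval(c, z(k)) / polyval(dc, z(k));
    s     = sum(1 ./ (z(k) - z([1:k-1, k+1:n])));
    w(k)  = ratio / (1 - ratio*s);
end
z = z - w;                                % Simultaneous update of all components
end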

A very similar method is the Newton-Maehly method. It computes the zeros one after another, but instead of an explicit deflation it divides by the already acquired linear factors on the fly. The Aberth method is like the Newton-Maehly method for computing the last root while pretending you have already found the other ones.[4]
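The following is a rough sketch of the Newton-Maehly idea just described: roots are computed one at a time, and the already accepted roots are suppressed inside the Newton correction instead of being removed by explicit deflation. The function name maehly_sketch and the simple stopping test are my own.

function found = maehly_sketch(c, starts, tol, maxit)
% Newton-Maehly sketch: roots are found sequentially; the correction divides
% out the linear factors of the roots found so far on the fly.
dc    = polyder(c);
found = [];
for k = 1:numel(starts)
    z = starts(k);
    for it = 1:maxit
        pz = polyval(c, z);
        s  = sum(1 ./ (z - found));           % Empty sum equals 0 while no root is accepted yet
        dz = pz / (polyval(dc, z) - pz*s);    % Maehly-corrected Newton step
        z  = z - dz;
        if abs(dz) < tol, break; end
    end
    found(end+1) = z;                         %#ok<AGROW>
end
end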

Derivation from Newton’s Method

The iteration formula for the k-th approximation is the univariate Newton iteration for the function

F(x) = p(x) / ( (x - z1) … (x - z(k-1)) · (x - z(k+1)) … (x - zn) ),

that is, p(x) divided by the linear factors built from all the other current approximations.

If the values z1, …, zn are already close to the roots of p(x), then the rational function F(x) is almost linear with a dominant root close to zk and poles at z1, …, z(k-1), z(k+1), …, zn that direct the Newton iteration away from the roots of p(x) that are close to them. That is, the corresponding basins of attraction get rather small, while the root close to zk has a wide region of attraction.

The Newton step F(x) / F'(x) in the univariate case is the reciprocal of the logarithmic derivative:

F'(x)/F(x) = d/dx ( ln|p(x)| - sum over j ≠ k of ln|x - zj| )
           = p'(x)/p(x) - sum over j ≠ k of 1/(x - zj).

Thus, the new approximation of the k-th zero is computed as

zk_new = zk - 1 / ( p'(zk)/p(zk) - sum over j ≠ k of 1/(zk - zj) ),

which is the update formula of the Aberth–Ehrlich method.
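For completeness, the equivalence of this Newton-step form with the offsets wk from the Description section follows by multiplying the numerator and denominator of the correction by p(zk)/p'(zk); written out in LaTeX notation:

z_k^{\mathrm{new}}
  = z_k - \frac{1}{\dfrac{p'(z_k)}{p(z_k)} - \sum_{j \ne k} \dfrac{1}{z_k - z_j}}
  = z_k - \frac{\dfrac{p(z_k)}{p'(z_k)}}{1 - \dfrac{p(z_k)}{p'(z_k)} \sum_{j \ne k} \dfrac{1}{z_k - z_j}}
  = z_k - w_k .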

Literature

  1. Aberth, Oliver (1973). “Iteration methods for finding all zeros of a polynomial simultaneously”. Mathematics of Computation. 27 (122): 339–344. doi:10.2307/2005621. JSTOR 2005621. “Because of the obvious analogy from electrostatics, this field may be called the field of a unit plus charge … To avoid this, we assign a unit minus charge at each sampling point. The idea here is that when a sampling point z_i is near a simple zero, the field from the minus charge at z_i should counteract that from the plus charge at the zero, preventing a second sampling point from converging to this zero.”
  2. Ehrlich, Louis W. (1967). “A modified Newton method for polynomials”. Communications of the ACM. 10 (2): 107–108. doi:10.1145/363067.363115.
  3. Bini, Dario Andrea (1996). “Numerical computation of polynomial zeros by means of Aberth’s method”. Numerical Algorithms. 13 (2): 179–200. Bibcode:1996NuAlg..13..179B. doi:10.1007/BF02207694.
  4. Bauer, F.L.; Stoer, J. (1962). “Algorithm 105: Newton Maehly”. Communications of the ACM. 5 (7): 387–388. doi:10.1145/368273.368423.

 


Release Information

  • Price: $5.99
  • Released: June 12, 2019
  • Last Updated: June 22, 2019
