Jacques Gaudin - 7 months ago
Python Question

Scipy root-finding method

I am writing a class that has a mathematical function as an attribute, say f.

f is:


  • Defined on a real segment [-w;+w]

  • Positive and bounded above by a real H

  • Even (for all x in [-w;+w], f(x) = f(-x)), with f(w) = f(-w) = 0

  • Differentiable over [-w;+w] and its derivative is positive and continuous over [-w;0]



My class looks like:

from scipy.misc import derivative
from scipy.integrate import quad
from math import cosh, sqrt

class Function(object):

    w = 1.
    PRECISION = 1e-6

    def f(self, x):
        '''This is an example but f could be
        any math function matching the requirements above.
        '''
        return 0.5 + 1.07432*(1 - cosh(x/1.07432))

    def deriv_f(self, x):
        return derivative(self.f, x, self.PRECISION)

    def x_to_arc_length(self, x):
        def func(t):
            return sqrt(1 + self.deriv_f(t)**2)
        return quad(func, -self.w, x)[0]

    def arc_length_to_x(self, L):
        bound = [-self.w, self.w]
        while bound[1] - bound[0] > self.PRECISION:
            mid = sum(bound)/2
            # the boolean picks which end of the bracket to move (True -> index 1)
            bound[(self.x_to_arc_length(mid) - L > 0)] = mid
        return sum(bound)/2


I use bisection to invert the arc-length function, but I was looking at changing this for one of the scipy.optimize root-finding methods to gain speed. I am new to scipy and must admit that my math is a bit rusty... Scipy gives me the choice between brentq, brenth, ridder, bisect and newton.

Could anyone point me to the best-suited method for a case like this? Or maybe there is a better library for this?

Answer

I'm not an expert at Python, but I know from Numerical Analysis that, among the methods you listed (Brent's method, bisection, Ridder's method and Newton-Raphson), Brent's method is usually preferred for a generic real scalar function f of a single real variable x. As you can read here, if f is continuous and the method is applied to an interval [a,b] with f(a)f(b) < 0, then Brent's method enjoys guaranteed convergence to a zero, just like bisection. For many well-behaved functions, Brent's method converges much faster than bisection, but in some unlucky cases it can require on the order of N^2 iterations, where N is the number of iterations bisection would need to achieve the same tolerance.
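If you go with Brent's method, scipy.optimize.brentq can replace your hand-rolled bisection almost directly: since x_to_arc_length is increasing, g(x) = x_to_arc_length(x) - L changes sign exactly once on [-w, +w], which is exactly the bracketing condition brentq needs. Something along these lines should work (an untested sketch that keeps the rest of your class unchanged):

from scipy.optimize import brentq

class Function(object):

    # ... w, PRECISION, f, deriv_f and x_to_arc_length as in your question ...

    def arc_length_to_x(self, L):
        # g is increasing, g(-w) = -L <= 0 and g(+w) = total_length - L >= 0,
        # so brentq is guaranteed to converge to the single root in [-w, +w].
        g = lambda x: self.x_to_arc_length(x) - L
        return brentq(g, -self.w, self.w, xtol=self.PRECISION)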

Newton's method, on the other hand, usually converges faster than Brent's when it does converge, but there are cases where it doesn't converge at all. For the same function, Newton's method may or may not converge, depending also on the distance between the starting point and the root. Thus, it's riskier to use in general-purpose code.
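For completeness, scipy.optimize.newton on the same problem would look roughly like this (untested; note that it takes only a starting guess, not a bracketing interval, and that without an fprime argument it actually falls back to the secant method):

from scipy.optimize import newton

func = Function()
L = 0.5   # some arc length between 0 and the total length
# x0 = 0, the middle of the segment, is a natural starting guess, but the
# iteration may still leave [-w, +w] or fail to converge for some values of L.
x = newton(lambda x: func.x_to_arc_length(x) - L, x0=0.0, tol=func.PRECISION)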

Regarding the choice between brentq and brenth, they should behave very similarly (the scipy docs describe brenth as a variation of Brent's method that uses hyperbolic rather than inverse quadratic extrapolation), with brentq being the classic, more heavily tested routine. Thus you could simply go for brentq, or, if you have some time, do a little benchmarking between them on your actual function.
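A quick-and-dirty comparison could look like this (a sketch, assuming the Function class from the question is defined; the target arc lengths are arbitrary):

import timeit
from scipy.optimize import brentq, brenth

func = Function()
targets = [0.2*i for i in range(1, 10)]   # arbitrary arc lengths in (0, 2)

def invert_all(solver):
    # invert every target arc length with the given bracketing solver
    return [solver(lambda x, L=L: func.x_to_arc_length(x) - L,
                   -func.w, func.w, xtol=func.PRECISION)
            for L in targets]

for solver in (brentq, brenth):
    t = timeit.timeit(lambda: invert_all(solver), number=10)
    print('%s: %.4f s' % (solver.__name__, t))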