Newton-Raphson Method
The Newton-Raphson method is a rapidly-converging method to approximate roots of a smooth function. In this post we show an implementation that uses automatic differentiation to obtain the function derivative.
Newton-Raphson Method
The Newton-Raphson method (sometimes referred to simply as Newton’s method) is a root-finding algorithm for one-dimensional functions. A number of conditions must be met to use it effectively; in particular, both the function and its first derivative must be available.
The iterative formula is derived as follows. At each point $x_n$, a linear approximation to the function is produced:

$$f(x) \approx f(x_n) + f'(x_n)\,(x - x_n)$$

Equating this approximation to zero and solving for $x$ gives the new approximation $x_{n+1}$:

$$x_{n+1} = x_n - \frac{f(x_n)}{f'(x_n)}$$
function NewtonRaphson(f, fp, x0, tol=1e-8, maxIter=1000)
    x = x0
    fx = f(x)
    iter = 0
    while abs(fx) > tol && iter < maxIter
        x = x - fx / fp(x)  # Newton-Raphson update
        fx = f(x)           # Re-evaluate f at the updated point
        iter += 1
    end
    return x
end
Example with explicit derivatives
Now, let’s implement the method to solve the following equation:
f(x) = x^x - 100
fp(x) = x^x * (log(x) + 1)
x = NewtonRaphson(f, fp, 1)
print(x)
3.597285023540418
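As a quick sanity check, the returned value should satisfy the original equation to within the solver tolerance:

```julia
# Sanity check: the residual at the returned root is below the tolerance.
f(x) = x^x - 100          # repeated here so this snippet runs on its own
x = 3.597285023540418     # value returned above
abs(f(x)) < 1e-8          # true
```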
Now, even with this simple equation, computing the derivative by hand is a time-consuming and error-prone process. We would like to avoid it. There are at least three alternatives:
- Use a different algorithm which does not require derivatives, like the secant method.
- Use a numerical approximation for the derivative.
- Make the computer compute the derivative for us, symbolically or via automatic differentiation.
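For comparison, the second alternative is easy to sketch with a central finite difference. This is a minimal sketch (the helper name `numderiv` and the step size `h` are illustrative choices, not part of the post): it needs no extra tooling, but its accuracy is limited by truncation and floating-point cancellation.

```julia
# Alternative 2 sketched: approximate f' with a central finite difference.
# The helper name and step size are illustrative, not tuned, choices.
numderiv(f; h = 1e-6) = x -> (f(x + h) - f(x - h)) / (2h)

f(x)  = x^x - 100
fp(x) = x^x * (log(x) + 1)

# Small difference from the exact derivative, limited by truncation
# and rounding error:
abs(numderiv(f)(3.0) - fp(3.0))
```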
The last alternative looks especially interesting in the context of Julia, which has excellent tools for automatic differentiation.
Example with automatic differentiation
Automatic differentiation computes exact derivatives of program code, without symbolic manipulation and without finite-difference error. Several packages in the Julia ecosystem implement this functionality; ForwardDiff is one of the most straightforward and is more than sufficient for our needs.
For example, to produce a derivative function, we can rely on the ForwardDiff.derivative routine. While this routine has been thoroughly tested, it’s considered good practice to run our own tests:
using ForwardDiff
autodiff(f) = x -> ForwardDiff.derivative(f, x)
# Check that the automatic derivative matches our hand-written one
abs(autodiff(f)(3) - fp(3))  # ≈ 0, up to floating-point rounding
Now that we can differentiate functions automatically, we can extend our NewtonRaphson function with the definition:
NewtonRaphson(f, x0, tol=1e-8, maxIter=1000) = NewtonRaphson(f, autodiff(f), x0, tol, maxIter)
Note that this time, NewtonRaphson doesn’t explicitly depend on the derivative function. Now we can do the following:
x = NewtonRaphson(f, 1.0)
3.597285023540418
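The derivative-free method works for any smooth function. As a final illustration (with the definitions from above repeated so the snippet runs on its own, and assuming ForwardDiff is installed), here is the classic equation cos(x) = x:

```julia
using ForwardDiff

# Definitions repeated from the post so this snippet is self-contained.
function NewtonRaphson(f, fp, x0, tol=1e-8, maxIter=1000)
    x = x0
    fx = f(x)
    iter = 0
    while abs(fx) > tol && iter < maxIter
        x = x - fx / fp(x)  # Newton-Raphson update
        fx = f(x)
        iter += 1
    end
    return x
end

autodiff(f) = x -> ForwardDiff.derivative(f, x)
NewtonRaphson(f, x0, tol=1e-8, maxIter=1000) = NewtonRaphson(f, autodiff(f), x0, tol, maxIter)

# Solve cos(x) = x, i.e. find the root of g(x) = cos(x) - x.
g(x) = cos(x) - x
NewtonRaphson(g, 1.0)  # ≈ 0.739085 (the Dottie number)
```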
Conclusion
The Newton-Raphson method is one of the classical root-finding algorithms. It is straightforward to implement in Julia, and in combination with automatic differentiation it provides a very useful tool for solving simple non-linear equations.
I will continue to post tutorials like this one, discussing numerical algorithms and their implementation in the Julia programming language. If you don’t want to miss an update, subscribe to the mailing list. It’s only one e-mail per month!