Newton-Raphson Method

by Martin D. Maas, Ph.D

The Newton-Raphson method is a rapidly-converging method to approximate roots of a smooth function. In this post we show an implementation that uses automatic differentiation to obtain the function derivative.

Newton-Raphson Method

The Newton-Raphson method (sometimes referred to simply as Newton’s method) is a root-finding algorithm for one-dimensional functions. A number of conditions must be met in order to use it effectively. In particular, both the function and its first derivative must be available.

The iterative formula is derived as follows. At each point $x_n$, a linear approximation to the function $f$ is produced:

\begin{equation} f(x) \approx f(x_n) + f^{\prime}(x_n)(x-x_n) \end{equation}

Equating this approximation to zero and solving for $x$ gives the new approximation $x_{n+1}$:

\begin{equation} x_{n+1} = x_n - \frac{f(x_n)}{f^{\prime}(x_n)} \end{equation}
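As a quick illustration (my own example, not from the original derivation), take $f(x) = x^2 - 2$ with starting point $x_0 = 1$; two iterations already get close to $\sqrt{2} \approx 1.4142$:

\begin{equation} x_1 = 1 - \frac{1 - 2}{2 \cdot 1} = 1.5, \qquad x_2 = 1.5 - \frac{1.5^2 - 2}{2 \cdot 1.5} \approx 1.4167 \end{equation}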
In Julia, this iteration can be implemented as follows:

function NewtonRaphson(f, fp, x0, tol=1e-8, maxIter=1000)
    x = x0
    fx = f(x)
    iter = 0
    while abs(fx) > tol && iter < maxIter
        x = x - fx/fp(x)    # Newton-Raphson iteration
        fx = f(x)           # Evaluate f at the new point
        iter += 1
    end
    return x
end

Example with explicit derivatives

Now, let’s implement the method to solve the following equation:

\begin{equation} x^x = 100 \end{equation}
f(x) = x^x - 100
fp(x) = x^x * (log(x) + 1)

x = NewtonRaphson(f, fp, 1)

Now, even with this simple equation, computing the derivative by hand is a time-consuming and error-prone process. We would like to avoid it. There are at least three alternatives to do so:

  • Use a different algorithm which does not require derivatives, like the secant method.
  • Use a numerical approximation for the derivative.
  • Make the computer compute the derivative for us, either symbolically or via automatic differentiation.
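As a side note, the first alternative is easy to sketch: the secant method replaces the derivative with a finite difference built from the last two iterates, so no derivative function is needed at all. The following is a minimal sketch (the name `Secant` and its signature are my own, not part of the original post):

```julia
# Secant method: like Newton-Raphson, but f'(x_n) is approximated by
# (f(x_n) - f(x_{n-1})) / (x_n - x_{n-1}), so no derivative is required.
# Note: this sketch does not guard against fx1 == fx0 (division by zero).
function Secant(f, x0, x1, tol=1e-8, maxIter=1000)
    fx0, fx1 = f(x0), f(x1)
    iter = 0
    while abs(fx1) > tol && iter < maxIter
        # Secant update: new iterate from the last two points
        x0, x1 = x1, x1 - fx1 * (x1 - x0) / (fx1 - fx0)
        fx0, fx1 = fx1, f(x1)
        iter += 1
    end
    return x1
end
```

For example, `Secant(x -> x^2 - 2, 1.0, 2.0)` converges to $\sqrt{2}$. The trade-off is a slightly slower (superlinear rather than quadratic) convergence rate.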

The last alternative looks interesting, especially in the context of Julia, which has great tools for automatic differentiation.

Example with automatic differentiation

Automatic differentiation is a very interesting tool. While there are several packages within the Julia ecosystem that implement this functionality, ForwardDiff is one of the most straightforward and more than sufficient for our needs.

For example, to produce a derivative function, we can rely on the ForwardDiff.derivative routine. While this routine has been thoroughly tested, it’s considered good practice to run our own tests:

using ForwardDiff
autodiff(f) = x -> ForwardDiff.derivative(f, x)

# Test with our known function: the difference should be ≈ 0,
# up to floating-point rounding
abs(autodiff(f)(3) - fp(3))

Now that we can differentiate functions automatically, we can extend our NewtonRaphson function with the definition:

NewtonRaphson(f, x0, tol=1e-8, maxIter=1000) = NewtonRaphson(f, autodiff(f), x0, tol, maxIter)

Note that this time, NewtonRaphson doesn’t explicitly depend on the derivative function. Now we can do the following:

x = NewtonRaphson(f, 1.0)
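Putting the pieces together, here is a self-contained version of the full script, with the same definitions as above collected in one place (the starting point and tolerances are the ones used throughout the post):

```julia
using ForwardDiff

# Newton-Raphson iteration with an explicit derivative function fp
function NewtonRaphson(f, fp, x0, tol=1e-8, maxIter=1000)
    x = x0
    fx = f(x)
    iter = 0
    while abs(fx) > tol && iter < maxIter
        x = x - fx/fp(x)    # Newton-Raphson iteration
        fx = f(x)
        iter += 1
    end
    return x
end

# Build the derivative via automatic differentiation
autodiff(f) = x -> ForwardDiff.derivative(f, x)

# Derivative-free front end: forwards to the method above
NewtonRaphson(f, x0, tol=1e-8, maxIter=1000) =
    NewtonRaphson(f, autodiff(f), x0, tol, maxIter)

# Solve x^x = 100
f(x) = x^x - 100
x = NewtonRaphson(f, 1.0)
```

The solution satisfies `x^x ≈ 100` to within the requested tolerance.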


The Newton-Raphson method is one of the classical root-finding algorithms. It is straightforward to implement in Julia, and in combination with automatic differentiation it provides a very useful tool for solving simple non-linear equations.

I will continue to post tutorials like this one, to discuss numerical algorithms and their implementation in the Julia programming language. If you don’t want to miss an update, subscribe to the mailing list. It’s only one e-mail per month!