Solving non-linear systems

To find zeros of a non-linear function of multiple variables, we can rely on the NLsolve package.

As a simple example, let’s consider the solution of the following system:

$$ \left\lbrace \begin{array}{c} x^2 + 2y^2 = 1 \\ 2x^2 + y^2 = 1 \end{array} \right. $$
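This particular system can also be solved by hand, which gives us a reference value to check the numerical result against. Subtracting the first equation from the second yields $x^2 - y^2 = 0$, i.e. $y^2 = x^2$; substituting back into the first equation gives $3x^2 = 1$. Since only squares appear, all four sign combinations are solutions:

$$ (x, y) = \left( \pm\tfrac{1}{\sqrt{3}},\; \pm\tfrac{1}{\sqrt{3}} \right) \approx (\pm 0.577,\; \pm 0.577) $$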

Interestingly, this package already includes automatic differentiation, so there is no need to implement gradients explicitly.

using NLsolve

function f!(F, x)
    F[1] = x[1]^2 + 2x[2]^2 - 1
    F[2] = 2x[1]^2 + x[2]^2 - 1
end

result = nlsolve(f!, [0.1; 1.2], autodiff = :forward)
result.zero   # the computed zero, which should be close to (1/√3, 1/√3) ≈ (0.577, 0.577)

Nonlinear Optimization

To find extrema of multidimensional functions, we can rely on the Optimization.jl meta-package, which wraps many optimization packages into a unified interface.

We will show the absolute basics here. See that package's basic usage documentation for an extended tutorial.

We will search for the minimum of the Rosenbrock function

$$ f(x,y) = (a-x)^2 + b(y-x^2)^2 $$

which has a global minimum at $ (a, a^2) $, where $ f(x,y) = 0 $.
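To see why the minimum sits there, note that $f$ is a sum of two squares, so $f(x,y) \geq 0$ everywhere, and $f(x,y) = 0$ exactly when both squares vanish:

$$ a - x = 0 \quad \text{and} \quad y - x^2 = 0 \;\; \Longleftrightarrow \;\; (x, y) = (a, a^2) $$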

We will test with the values $ a=1, b=100 $, so we expect the solution to be $ (1,1) $.


As an example of a gradient-free method, we might consider the Nelder–Mead algorithm (see Wikipedia), on which, for example, Matlab’s fminsearch is based.

using Optimization, OptimizationOptimJL

rosenbrock(u,p) =  (p[1] - u[1])^2 + p[2] * (u[2] - u[1]^2)^2
u0 = zeros(2)
p  = [1.0,100.0]

prob = OptimizationProblem(rosenbrock,u0,p)
sol = solve(prob,NelderMead())
sol.u   # the minimizer, which should be close to [1.0, 1.0]


To use gradient information, we can rely on, for example, the popular BFGS algorithm (see Wikipedia). The gradients themselves can be obtained via automatic differentiation, as follows:

using ForwardDiff
optf = OptimizationFunction(rosenbrock, Optimization.AutoForwardDiff())
prob = OptimizationProblem(optf, u0, p)
sol = solve(prob,BFGS())
sol.u   # the minimizer, which should again be close to [1.0, 1.0]