JuliaNLSolvers / NLsolve.jl

Julia solvers for systems of nonlinear equations and mixed complementarity problems

Change numerical diff backend to DiffEqDiffTools.jl

ChrisRackauckas opened this issue · comments

I have been talking about this with @pkofod, and now it's ready. DiffEqDiffTools.jl gives roughly a 30x-100x performance improvement over Calculus.jl, can natively handle complex numbers and abstract arrays, and has an fj! routine built in. Its interface is not "consumer-friendly" since it has a bunch of optional type controls, but it should be able to cover everything needed here.

Bringing @dextorious in on this

Okay, so we have working derivatives (over points or arrays, so this includes gradients as well) and Jacobians for real or complex callables. StridedArray inputs are decently well tested; more exotic AbstractArrays might still fail if they don't support indexing or have other unusual properties, and that part is a work in progress. Direct calculation of Hessians is not supported yet; that's my next priority.

The basic syntax is:

DiffEqDiffTools.finite_difference!(df, f, x, fdtype, funtype, Val{:Default}, fx, epsilon, returntype)

where:
- df is a preallocated array for the derivative/gradient,
- f is the callable,
- x is the point(s) where you want to differentiate,
- fdtype is Val{:forward}, Val{:central}, or Val{:complex} (the last one only for real-valued callables),
- funtype is either Val{:Real} or Val{:Complex},
- fx contains the function values at the points given by x, and
- epsilon is a real-valued array of size length(x) for temporaries.

Everything past fdtype is optional, with sensible defaults. There's also finite_difference_jacobian! with basically the same syntax, and non-mutating versions without the exclamation mark which will allocate the necessary storage on their own.
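To make that concrete, here is a minimal sketch of how the calls might look, based only on the signature quoted above (the package is brand new at this point, so exact argument handling and defaults may differ from whatever version you end up using):

```julia
using DiffEqDiffTools

# Differentiate a real callable at a vector of points with forward differences,
# relying on the defaults for everything past fdtype.
f(x) = sin.(x) .* exp.(-x)
x  = [0.0, 0.25, 0.5, 0.75, 1.0]
df = similar(x)                      # preallocated output for the derivative
DiffEqDiffTools.finite_difference!(df, f, x, Val{:forward})

# Jacobian of a vector-valued map via the non-mutating version,
# which allocates its own output.
g(x) = [x[1]^2 + x[2], 3x[2]^3 - x[1]]
J = DiffEqDiffTools.finite_difference_jacobian(g, [1.0, 2.0], Val{:central}, Val{:Real})
```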

This API should remain stable for the foreseeable future, ugly as it is, and it's easy to hide behind convenience wrappers for specific use cases. Some of the implementations will still be improved behind the scenes (others are already close to optimal), but without affecting the API, and all of them should already be considerably faster than existing alternatives such as Calculus.jl.

Do note, however, that while there is decent direct test coverage (at least for StridedArrays) and OrdinaryDiffEq.jl already uses this (thereby providing lots of indirect tests on top), the package is fairly new and I'm sure there's still a bug or two hiding somewhere. Let me know if you find any, have any questions, or want features that aren't already there.

Cool, will have a look

Is fx meant to be a variable that is mutated? I mean, is this what I'm supposed to use to pass our F(x) cache?

fj! accepts an x and a cache for F(x) and J(x). I think I get where J(x) (df) and x go, but I'm not sure about F(x).

In the current implementation fx (if passed in at all) is supposed to contain pre-evaluated function values f(x). The appropriate derivatives are then computed with a smaller number of function evaluations where possible (i.e. for forward differences). DiffEqDiffTools will not mutate the contents of fx, so you can continue using those values elsewhere.
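For the forward-difference case that might look like the following sketch (again just following the positional signature from earlier in the thread, with Val{:Default} as the placeholder argument mentioned there):

```julia
using DiffEqDiffTools

f(x) = x.^2 .+ 1.0
x  = [0.5, 1.0, 1.5]
fx = f(x)            # function values already computed elsewhere, e.g. by the solver
df = similar(x)

# Forward differences can reuse fx instead of re-evaluating f at x;
# fx itself is left untouched, so those values can keep being used afterwards.
DiffEqDiffTools.finite_difference!(df, f, x, Val{:forward}, Val{:Real}, Val{:Default}, fx)
```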

For fj!, we don't have a separate function. If you simply pass two caches (J and df) into the Jacobian function, it will give you the derivatives df in addition to the Jacobian; if you only pass in J, you just get the Jacobian.
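If I'm reading that right, the fj! case becomes a single call along these lines. This is only a hypothetical sketch: the thread doesn't spell out which positional slot the second cache occupies, so the placement below (the slot after Val{:Default} in the signature quoted earlier) is an assumption to verify against the package source.

```julia
using DiffEqDiffTools

f(x) = [x[1]^2 + x[2], 3x[2]^3 - x[1]]

x  = [1.0, 2.0]
df = similar(x)       # second cache ("df" above), filled in addition to the Jacobian
J  = zeros(2, 2)      # Jacobian cache

# One call populating both caches; passing only J gives just the Jacobian.
DiffEqDiffTools.finite_difference_jacobian!(J, f, x, Val{:forward}, Val{:Real}, Val{:Default}, df)
```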