JuliaNLSolvers / Optim.jl

Optimization functions for Julia

Hessian being called separately using only_fgh!()

acoh64 opened this issue

Hi,

I defined a function that computes f, g!, and h! together, as shown in the documentation. However, when I use it with only_fgh! to run NewtonTrustRegion(), the optimization algorithm computes the Hessian separately from f and g!. My f, g!, and h! all reuse the results of an expensive calculation, so it would greatly speed up the optimization if all three were computed at the same time.

Here is an MWE where you can see this behavior:

using Optim

# f(p) = p[1]^4, with the gradient and Hessian written in place when requested
function test_fg!h!(F, G, H, p)
    print("Computing: ")
    if H !== nothing
        print("hessian, ")
        H[1] = 12.0 * p[1]^2   # h(p) = 12 p^2
    end
    if G !== nothing
        print("gradient, ")
        G[1] = 4.0 * p[1]^3    # g(p) = 4 p^3
    end
    if F !== nothing
        println("function")
        return p[1]^4          # f(p) = p^4
    end
    return nothing
end

res = Optim.optimize(Optim.only_fgh!(test_fg!h!), [2.0], NewtonTrustRegion(),
                     Optim.Options(iterations = 100, show_every = 1, show_trace = true))

Is it possible to fix this?

Thanks!

Hm, yes, I appreciate the comment, and I am aware of the issue. The problem is that there are many different combinations of what is expensive when called with what, and so on. I'll think about an option to force it.

In many cases (not all; think quadratic objectives, or problems where a model solution is the major computational expense), adding the Hessian calculation does indeed bring a computational cost. However, in trust region methods it often happens that you reject a step, reduce the radius, and then solve the subproblem again. The Hessian does not change here, but you cannot tell whether you are in this case without evaluating the objective. You can create a cache variable that you only update when the parameters change, as in the sketch below. I think there's a tutorial somewhere. But I'll consider it.
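For reference, a minimal sketch of that caching approach, reusing the toy objective f(p) = p[1]^4 from the MWE above. make_fgh! and the cached quantity are illustrative names, not part of Optim's API; the cached value stands in for whatever expensive shared computation your real f, g!, and h! depend on.

using Optim

function make_fgh!()
    last_p = [NaN]     # parameters the cache was computed for (NaN forces a first update)
    cached = Ref(0.0)  # result of the expensive shared calculation

    function fgh!(F, G, H, p)
        # Redo the expensive work only when the parameters actually changed.
        if p != last_p
            cached[] = p[1]^2   # stand-in for the expensive shared computation
            copyto!(last_p, p)
        end
        c = cached[]
        if H !== nothing
            H[1] = 12.0 * c          # h(p) = 12 p^2, reusing the cache
        end
        if G !== nothing
            G[1] = 4.0 * c * p[1]    # g(p) = 4 p^3
        end
        if F !== nothing
            return c^2               # f(p) = p^4
        end
        return nothing
    end
    return fgh!
end

res = Optim.optimize(Optim.only_fgh!(make_fgh!()), [2.0], NewtonTrustRegion())

With this setup, even when the algorithm requests the Hessian in a separate call, only the cheap per-component assignments are repeated; the expensive part runs once per new parameter vector.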