JuliaStats / GLM.jl

Generalized linear models in Julia

Feature request: Support ForwardDiff automatic differentiation

yangkeunyun opened this issue · comments

Hi,

It seems that GLM.jl does not support ForwardDiff automatic differentiation. It would be useful if GLM.jl could be used together with ForwardDiff AD. Alternatively, is there a way to work around the problem when combining GLM and ForwardDiff AD?
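One possible workaround (a sketch of my own, not GLM.jl API): skip `glm()` for the inner fit and write the probit log-likelihood with type-generic code, so `ForwardDiff.Dual` numbers can flow through without hitting `Float64`-typed buffers. The `probit_nll` name and the toy data below are hypothetical, for illustration only:

```julia
using Distributions, ForwardDiff

# Probit negative log-likelihood, written generically: no Float64-typed
# intermediate arrays, so eltype(β) may be a ForwardDiff.Dual.
probit_nll(β, y, X) = -sum(y .* logcdf.(Normal(), X * β) .+
                           (1 .- y) .* logccdf.(Normal(), X * β))

# Toy data (hypothetical, just to show the gradient call succeeds)
y = [1.0, 0.0, 1.0]
X = [1.0 0.2; 1.0 -0.5; 1.0 1.3]

# ForwardDiff differentiates straight through the likelihood: no MethodError
g = ForwardDiff.gradient(β -> probit_nll(β, y, X), [0.0, 0.0])
```

Fitting this likelihood with `Optim.optimize(...; autodiff=:forward)` would then replace the `glm` call inside the GMM objective, though differentiating through the inner optimum itself is a separate question.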

Below is an MWE and the resulting stacktrace:

```julia
using Distributions
using CSV, DataFrames
using GLM
using Optim, ForwardDiff
using StatsBase, LinearAlgebra

# Generate random data
N = 5
y = rand(Normal(0,1), N) .> 1  # binary (zero/one) dependent variable for the regression below
X = rand(Normal(0,1), N, 2)    # independent variables
z = [y X]                      # dataset in matrix format

# Objective function
function g(z, θ)
    X̃ = z[:,2:end]*θ  # X̃ gets a Dual element type when ForwardDiff AD is active

    df = DataFrame([z[:,1] X̃], :auto)  # :auto names the columns x1, x2 (required on DataFrames ≥ 1.0)
    probit = glm(@formula(x1 ~ x2), df, Binomial(), ProbitLink())  # the error occurs here
    pred_y = predict(probit)

    Z = [z[:,2] pred_y]
    transpose(Z)
end

θ₀ = [.5, .5]
W = inv(X'X)
f(θ) = (ḡ = mean(g(z,θ), dims=2); dot(ḡ, W, ḡ))  # GMM wrapper
result = Optim.optimize(f, θ₀, LBFGS(); autodiff=:forward)  # run optimization
```
```
ERROR: MethodError: no method matching Float64(::ForwardDiff.Dual{ForwardDiff.Tag{typeof(f), Float64}, Float64, 2})
Closest candidates are:
  (::Type{T})(::Real, ::RoundingMode) where T<:AbstractFloat at C:\Users\yangk\AppData\Local\Programs\julia-1.7.3\share\julia\base\rounding.jl:200
  (::Type{T})(::T) where T<:Number at C:\Users\yangk\AppData\Local\Programs\julia-1.7.3\share\julia\base\boot.jl:770
  (::Type{T})(::AbstractChar) where T<:Union{AbstractChar, Number} at C:\Users\yangk\AppData\Local\Programs\julia-1.7.3\share\julia\base\char.jl:50
  ...
Stacktrace:
  [1] convert(#unused#::Type{Float64}, x::ForwardDiff.Dual{ForwardDiff.Tag{typeof(f), Float64}, Float64, 2})
    @ Base .\number.jl:7
  [2] setindex!(A::Matrix{Float64}, x::ForwardDiff.Dual{ForwardDiff.Tag{typeof(f), Float64}, Float64, 2}, i1::Int64)
    @ Base .\array.jl:903
  [3] _unsafe_copyto!(dest::Matrix{Float64}, doffs::Int64, src::Matrix{ForwardDiff.Dual{ForwardDiff.Tag{typeof(f), Float64}, Float64, 2}}, soffs::Int64, n::Int64)
    @ Base .\array.jl:253
  [4] unsafe_copyto!
    @ .\array.jl:307 [inlined]
  [5] _copyto_impl!
    @ .\array.jl:331 [inlined]
  [6] copyto!
    @ .\array.jl:317 [inlined]
nceDifferentiable{Float64, Vector{Float64}, Vector{Float64}}, initial_x::Vector{Float64})
    @ Optim C:\Users\yangk\.julia\packages\Optim\6Lpjy\src\multivariate\solvers\first_order\l_bfgs.jl:164
 [26] optimize
    @ C:\Users\yangk\.julia\packages\Optim\6Lpjy\src\multivariate\optimize\optimize.jl:36 [inlined]
 [27] #optimize#89
    @ C:\Users\yangk\.julia\packages\Optim\6Lpjy\src\multivariate\optimize\interface.jl:142 [inlined]
 [28] top-level scope
    @ c:\Users\yangk\Desktop\Question_JuliaLang.jl:28
```
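For reference, the stacktrace bottoms out in `convert`/`setindex!` on a `Matrix{Float64}`, which suggests the underlying issue is that GLM.jl allocates `Float64`-typed buffers internally and cannot store `Dual` numbers in them. The failure mode can be reproduced outside GLM.jl with a few lines (my own minimal sketch, not GLM code):

```julia
using ForwardDiff

A = zeros(2)                    # a Float64 array, like GLM's internal buffers
d = ForwardDiff.Dual(1.0, 1.0)  # a Dual number, as produced by autodiff=:forward

err = try
    A[1] = d                    # triggers convert(Float64, ::Dual)
    nothing
catch e
    e                           # the same kind of MethodError as in the trace above
end
```

Any fix would presumably require GLM.jl to parameterize these buffers on the element type of the input data rather than hard-coding `Float64`.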