Transformers.jl Package Installation Broken!
jackn11 opened this issue · comments
I have tried several times: deleting the Transformers package from `.julia/packages`, creating a new environment (`activate newenv`), and adding CUDA and Transformers again. But every time I run `using Transformers`, I get the following output.
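For reproduction, the steps I ran were roughly the following (from a fresh REPL; `newenv` is just the environment name I used):

```julia
using Pkg

Pkg.activate("newenv")              # create/activate a fresh environment
Pkg.add(["CUDA", "Transformers"])   # add both packages to it

using Transformers                  # fails during precompilation with the error below
```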
```julia
julia> using Transformers
[ Info: Precompiling Transformers [21ca0261-441d-5938-ace7-c90938fde4d4]
ERROR: LoadError: UndefVarError: applychain not defined
Stacktrace:
[1] include(mod::Module, _path::String)
@ Base .\Base.jl:418
[2] include(x::String)
@ Transformers.Basic C:\Users\jackn\.julia\packages\Transformers\K1F88\src\basic\Basic.jl:1
[3] top-level scope
@ C:\Users\jackn\.julia\packages\Transformers\K1F88\src\basic\Basic.jl:25
[4] include(mod::Module, _path::String)
@ Base .\Base.jl:418
[5] include(x::String)
@ Transformers C:\Users\jackn\.julia\packages\Transformers\K1F88\src\Transformers.jl:1
[6] top-level scope
@ C:\Users\jackn\.julia\packages\Transformers\K1F88\src\Transformers.jl:70
[7] include
@ .\Base.jl:418 [inlined]
[8] include_package_for_output(pkg::Base.PkgId, input::String, depot_path::Vector{String}, dl_load_path::Vector{String}, load_path::Vector{String}, concrete_deps::Vector{Pair{Base.PkgId, UInt64}}, source::Nothing)
@ Base .\loading.jl:1318
[9] top-level scope
@ none:1
[10] eval
@ .\boot.jl:373 [inlined]
[11] eval(x::Expr)
@ Base.MainInclude .\client.jl:453
[12] top-level scope
@ none:1
in expression starting at C:\Users\jackn\.julia\packages\Transformers\K1F88\src\basic\extend3d.jl:4
in expression starting at C:\Users\jackn\.julia\packages\Transformers\K1F88\src\basic\Basic.jl:1
in expression starting at C:\Users\jackn\.julia\packages\Transformers\K1F88\src\Transformers.jl:1
ERROR: Failed to precompile Transformers [21ca0261-441d-5938-ace7-c90938fde4d4] to C:\Users\jackn\.julia\compiled\v1.7\Transformers\jl_FB9C.tmp.
Stacktrace:
[1] error(s::String)
@ Base .\error.jl:33
[2] compilecache(pkg::Base.PkgId, path::String, internal_stderr::IO, internal_stdout::IO, ignore_loaded_modules::Bool)
@ Base .\loading.jl:1466
[3] compilecache(pkg::Base.PkgId, path::String)
@ Base .\loading.jl:1410
[4] _require(pkg::Base.PkgId)
@ Base .\loading.jl:1120
[5] require(uuidkey::Base.PkgId)
@ Base .\loading.jl:1013
[6] require(into::Module, mod::Symbol)
@ Base .\loading.jl:997
[7] eval
@ .\boot.jl:373 [inlined]
[8] eval
@ .\Base.jl:68 [inlined]
[9] repleval(m::Module, code::Expr, #unused#::String)
@ VSCodeServer c:\Users\jackn\.vscode\extensions\julialang.language-julia-1.6.24\scripts\packages\VSCodeServer\src\repl.jl:157
[10] (::VSCodeServer.var"#78#80"{Module, Expr, REPL.LineEditREPL, REPL.LineEdit.Prompt})()
@ VSCodeServer c:\Users\jackn\.vscode\extensions\julialang.language-julia-1.6.24\scripts\packages\VSCodeServer\src\repl.jl:123
[11] with_logstate(f::Function, logstate::Any)
@ Base.CoreLogging .\logging.jl:511
[12] with_logger
@ .\logging.jl:623 [inlined]
[13] (::VSCodeServer.var"#77#79"{Module, Expr, REPL.LineEditREPL, REPL.LineEdit.Prompt})()
@ VSCodeServer c:\Users\jackn\.vscode\extensions\julialang.language-julia-1.6.24\scripts\packages\VSCodeServer\src\repl.jl:124
[14] #invokelatest#2
@ .\essentials.jl:716 [inlined]
[15] invokelatest(::Any)
@ Base .\essentials.jl:714
[16] macro expansion
@ c:\Users\jackn\.vscode\extensions\julialang.language-julia-1.6.24\scripts\packages\VSCodeServer\src\eval.jl:34 [inlined]
[17] (::VSCodeServer.var"#60#61")()
@ VSCodeServer .\task.jl:423
```
@chengchingwen the issue may have been introduced by this commit: b1b1f9f#diff-09a5eabfb1fc24cdcb25cfa836f64c5387fb27458249ae1a2e48b9ace4a66ab4
Or perhaps that commit is the fix, and it just has not been merged to main yet?
Note: I tested that commit and it allowed me to run `using Transformers`, but it then broke while training the BERT model.
I left a comment on your commit describing the fix that should be implemented: changes are needed on lines 43 and 46 of extend3d.jl. Please see b1b1f9f.
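For context, the `applychain not defined` error appears because Flux v0.13.4 reorganized the internal helper that applies a chain of layers, so code reaching into that internal stopped resolving. A hedged sketch of a version-tolerant shim, not the exact patch from the commit (`apply_layers` is a made-up name for illustration):

```julia
using Flux

# Use the internal helper if this Flux version still defines it
# (Flux <= v0.13.3 did); otherwise fall back to the public API by
# wrapping the layers in a Chain and calling it.
const _apply = isdefined(Flux, :applychain) ?
    Flux.applychain :
    (layers, x) -> Flux.Chain(layers...)(x)

apply_layers(layers, x) = _apply(layers, x)
```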
You can also downgrade Flux to v0.13.3; the error was introduced by Flux v0.13.4.
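If anyone needs that workaround, pinning Flux in the affected environment should do it (version number as stated above):

```julia
using Pkg

# Install the last Flux release before the breaking change, then pin it
# so a later `Pkg.update()` does not pull in v0.13.4 again.
Pkg.add(name = "Flux", version = "0.13.3")
Pkg.pin("Flux")
```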
Fixed in v0.1.18.