Thermal Module Empty Collection Reduction
AlannLiu opened this issue
Hello. When I try to run GenX, the code stops at the thermal module.
This is my error message:
```
ERROR: LoadError: MethodError: reducing over an empty collection is not allowed; consider supplying `init` to the reducer
Stacktrace:
  [1] reduce_empty(op::Base.MappingRF{JuMP.Containers.var"#47#48"{typeof(identity)}, Base.BottomRF{typeof(Base.add_sum)}}, #unused#::Type{JuMP.VariableRef})
    @ Base .\reduce.jl:356
  [2] reduce_empty_iter
    @ .\reduce.jl:379 [inlined]
  [3] reduce_empty_iter
    @ .\reduce.jl:378 [inlined]
  [4] foldl_impl
    @ .\reduce.jl:49 [inlined]
  [5] mapfoldl_impl
    @ .\reduce.jl:44 [inlined]
  [6] #mapfoldl#288
    @ .\reduce.jl:170 [inlined]
  [7] mapfoldl
    @ .\reduce.jl:170 [inlined]
  [8] #mapreduce#292
    @ .\reduce.jl:302 [inlined]
  [9] mapreduce
    @ .\reduce.jl:302 [inlined]
 [10] #sum#295
    @ .\reduce.jl:530 [inlined]
 [11] sum
    @ .\reduce.jl:530 [inlined]
 [12] #sum#296
    @ .\reduce.jl:559 [inlined]
 [13] sum
    @ .\reduce.jl:559 [inlined]
 [14] #sum#46
    @ C:\Users\alanl\.julia\packages\JuMP\jZvaU\src\Containers\DenseAxisArray.jl:831 [inlined]
 [15] sum
    @ C:\Users\alanl\.julia\packages\JuMP\jZvaU\src\Containers\DenseAxisArray.jl:825 [inlined]
 [16] #sum#49
    @ C:\Users\alanl\.julia\packages\JuMP\jZvaU\src\Containers\DenseAxisArray.jl:841 [inlined]
 [17] sum(x::JuMP.Containers.DenseAxisArray{JuMP.VariableRef, 1, Tuple{Vector{Int64}}, Tuple{JuMP.Containers._AxisLookup{Dict{Int64, Int64}}}})
    @ JuMP.Containers C:\Users\alanl\.julia\packages\JuMP\jZvaU\src\Containers\DenseAxisArray.jl:840
 [18] macro expansion
    @ C:\Users\alanl\.julia\packages\MutableArithmetics\h0wjj\src\rewrite.jl:322 [inlined]
 [19] macro expansion
    @ C:\Users\alanl\.julia\packages\JuMP\jZvaU\src\macros.jl:1071 [inlined]
 [20] (::GenX.var"#687#702"{JuMP.Model, Vector{Int64}, Int64})(y::Int64, t::Int64)
    @ GenX C:\Users\alanl\.julia\packages\JuMP\jZvaU\src\Containers\macro.jl:301
 [21] #87
    @ C:\Users\alanl\.julia\packages\JuMP\jZvaU\src\Containers\container.jl:124 [inlined]
 [22] iterate
    @ .\generator.jl:47 [inlined]
 [23] collect(itr::Base.Generator{JuMP.Containers.VectorizedProductIterator{Tuple{Vector{Int64}, Base.OneTo{Int64}}}, JuMP.Containers.var"#87#89"{GenX.var"#687#702"{JuMP.Model, Vector{Int64}, Int64}}})
    @ Base .\array.jl:782
 [24] map(f::Function, A::JuMP.Containers.VectorizedProductIterator{Tuple{Vector{Int64}, Base.OneTo{Int64}}})
    @ Base .\abstractarray.jl:3289
 [25] container(f::Function, indices::JuMP.Containers.VectorizedProductIterator{Tuple{Vector{Int64}, Base.OneTo{Int64}}}, ::Type{JuMP.Containers.DenseAxisArray}, names::Vector{Any})
    @ JuMP.Containers C:\Users\alanl\.julia\packages\JuMP\jZvaU\src\Containers\container.jl:123
 [26] container(f::Function, indices::JuMP.Containers.VectorizedProductIterator{Tuple{Vector{Int64}, Base.OneTo{Int64}}}, #unused#::Type{JuMP.Containers.AutoContainerType}, names::Vector{Any})
    @ JuMP.Containers C:\Users\alanl\.julia\packages\JuMP\jZvaU\src\Containers\container.jl:75
 [27] thermal_commit!(EP::JuMP.Model, inputs::Dict{Any, Any}, setup::Dict{Any, Any})
    @ GenX .\Users\alanl\OneDrive\Desktop\GenX\src\model\resources\thermal\thermal_commit.jl:214
 [28] thermal!(EP::JuMP.Model, inputs::Dict{Any, Any}, setup::Dict{Any, Any})
    @ GenX .\Users\alanl\OneDrive\Desktop\GenX\src\model\resources\thermal\thermal.jl:19
 [29] generate_model(setup::Dict{Any, Any}, inputs::Dict{Any, Any}, OPTIMIZER::MathOptInterface.OptimizerWithAttributes)
    @ GenX .\Users\alanl\OneDrive\Desktop\GenX\src\model\generate_model.jl:173
 [30] macro expansion
    @ .\timing.jl:393 [inlined]
 [31] run_genx_case_simple!(case::String, mysetup::Dict{Any, Any})
    @ GenX .\Users\alanl\OneDrive\Desktop\GenX\src\case_runners\case_runner.jl:62
 [32] run_genx_case!(case::String)
    @ GenX .\Users\alanl\OneDrive\Desktop\GenX\src\case_runners\case_runner.jl:21
 [33] top-level scope
    @ C:\Users\alanl\OneDrive\Desktop\GenX\Example_Systems\RealSystemExample\ISONE_Singlezone\Cases\case_20\Run.jl:3
 [34] include
    @ .\client.jl:478 [inlined]
 [35] run_job_sequential(i::Int64)
    @ Main C:\Users\alanl\OneDrive\Desktop\GenX\Example_Systems\RealSystemExample\ISONE_Singlezone\caserunner.jl:42
 [36] run_job(i::Int64)
    @ Main C:\Users\alanl\OneDrive\Desktop\GenX\Example_Systems\RealSystemExample\ISONE_Singlezone\caserunner.jl:32
 [37] launch_new_case(i::Int64, df::DataFrame, files_with_keys::Vector{String})
    @ Main C:\Users\alanl\OneDrive\Desktop\GenX\Example_Systems\RealSystemExample\ISONE_Singlezone\caserunner.jl:379
 [38] launch_new_cases()
    @ Main C:\Users\alanl\OneDrive\Desktop\GenX\Example_Systems\RealSystemExample\ISONE_Singlezone\caserunner.jl:394
 [39] top-level scope
    @ C:\Users\alanl\OneDrive\Desktop\GenX\Example_Systems\RealSystemExample\ISONE_Singlezone\caserunner.jl:399
 [40] include(fname::String)
    @ Base.MainInclude .\client.jl:478
 [41] top-level scope
    @ REPL[8]:1
in expression starting at C:\Users\alanl\OneDrive\Desktop\GenX\Example_Systems\RealSystemExample\ISONE_Singlezone\Cases\case_20\Run.jl:3
in expression starting at C:\Users\alanl\OneDrive\Desktop\GenX\Example_Systems\RealSystemExample\ISONE_Singlezone\caserunner.jl:399
```
These are my packages:
```
Project GenX v0.3.5
Status C:\Users\alanl\OneDrive\Desktop\GenX\Project.toml
  [336ed68f] CSV v0.10.11
  [9961bab8] Cbc v1.1.1
  [e2554f3b] Clp v1.0.3
  [aaaa29a8] Clustering v0.15.4
  [861a8166] Combinatorics v1.0.2
  [a93c6f00] DataFrames v1.6.1
  [864edb3b] DataStructures v0.18.14
  [b4f34e82] Distances v0.10.9
  [2e9cd046] Gurobi v1.0.1
  [87dc4568] HiGHS v1.5.2
  [4076af6c] JuMP v1.13.0
  [b8f27783] MathOptInterface v1.18.0
  [731186ca] RecursiveArrayTools v2.38.7
  [2913bbd2] StatsBase v0.34.0
  [ddb6d928] YAML v0.4.9
  [d7ed1dd3] MUMPS_seq_jll v500.600.100+0
  [ade2ca70] Dates
  [37e2e46d] LinearAlgebra
  [9a3f8284] Random
  [10745b16] Statistics v1.9.0
```
I am using Julia 1.9.1.
Hi Alann,

This issue was fixed in GenX v0.3.6, and is due to changes made in JuMP v1.13; I believe we had previously been using v1.11. We'd recommend that you update to v0.3.6.

If for some reason you cannot update GenX, but you can modify your installation, make the changes as seen in the commit here: 7c85cc7

I believe this issue is caused when some thermal generators have `Up_Time` or `Down_Time` of `0`.
We were able to fix this issue with your recommendation. Thank you.
Glad to hear it.