jump-dev / MultiObjectiveAlgorithms.jl (License: Other)
This issue is used to trigger TagBot; feel free to unsubscribe.
If you haven't already, you should update your TagBot.yml
to include issue comment triggers.
Please see this post on Discourse for instructions and more details.
If you'd like for me to do this for you, comment TagBot fix
on this issue.
I'll open a PR within a few hours, please be patient!
This example fails because HiGHS doesn't support VariableIndex objectives:
using JuMP
import MultiObjectiveAlgorithms as MOA
import HiGHS
model = Model(() -> MOA.Optimizer(HiGHS.Optimizer))
set_silent(model)
set_attribute(model, MOA.Algorithm(), MOA.KirlikSayin())
@variable(model, x >= 0, Int)
@variable(model, y >= 0, Int)
@constraint(model, x + y >= 1)
@objective(model, Min, [x, y])
optimize!(model)
According to my understanding of the code, the results returned by lines 47-57 of the file EpsilonConstraint.jl are in general weakly nondominated, which is not valid for an epsilon-constraint method (either a filter has to be applied, or the generation has to handle this case and delete the unexpected points).
Having a function that filters weakly nondominated solutions from the SolutionPoint set would be useful in general.
The EpsilonConstraint algorithm uses the Hierarchical one, leading to solving the model twice at each step.
Edit: I am wrong; the optimization happens later.
https://github.com/odow/MultiObjectiveAlgorithms.jl/blob/8654901629d98715c2ff8c4fcbf9f2ff714708e7/src/algorithms/EpsilonConstraint.jl#L74-L79
In some cases, it would be preferable to (1) only have the first optimization at each step, with the drawback of including weakly nondominated solutions in the SolutionPoint set, and (2) to filter these weakly nondominated solutions at the end.
This is also the case for Kirlik and Sayin #11
Suggestion: add an option to use Hierarchical or to filter later, and add a function to filter weakly nondominated solutions.
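The suggested filter is solver-independent. Here is a minimal sketch in Python; the function name and the list-of-tuples representation are hypothetical stand-ins for MOA's SolutionPoint set, not its actual API:

```python
def filter_nondominated(points):
    """Keep only the nondominated points of a min-min biobjective set.

    A point p is dropped if some other point q satisfies q <= p
    componentwise with q != p, i.e. p is only weakly nondominated.
    Quadratic in the number of points; fine for small solution sets.
    """
    kept = []
    for p in points:
        dominated = any(
            q != p and all(qi <= pi for qi, pi in zip(q, p))
            for q in points
        )
        if not dominated:
            kept.append(p)
    return kept

# (1, 2) is weakly dominated by (1, 1), and (2, 1) by (2, 0):
print(filter_nondominated([(0, 3), (1, 1), (1, 2), (2, 0), (2, 1)]))
# -> [(0, 3), (1, 1), (2, 0)]
```

Applying such a pass at the end of EpsilonConstraint (or KirlikSayin) would allow skipping the second, hierarchical optimization at each step, as proposed above.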
@xgandibleux says:
I have prepared an exercise for my students, which is a project planning problem with 3 objectives; all the variables are discrete (PS: setting the variables t_i >= 0 is enough), see the attached document.
When I solve it with MOA.Lexicographic(all_permutations = true) I get a result (PS: here I have to check the implementation, because lines 2 and 3 in the returned solution should be exchanged; we expect to get the minimum on fct2 when fct2 is minimized, which is not the case - but it is late now, maybe I am missing the obvious).
When MOA.KirlikSayin() is called to solve this problem, a warning appears and the resolution is stopped: "objective 1 does not have a finite domain". If I am not wrong, the initial conditions for applying Kirlik and Sayin to this problem are fulfilled (the objectives have bounded values). Do you have any idea why this warning is raised and the resolution then stopped?
Reproducible example
julia> using JuMP
julia> import MultiObjectiveAlgorithms as MOA
julia> import HiGHS
julia> model = Model(() -> MOA.Optimizer(HiGHS.Optimizer))
A JuMP Model
Feasibility problem with:
Variables: 0
Model mode: AUTOMATIC
CachingOptimizer state: EMPTY_OPTIMIZER
Solver name: MOA[algorithm=MultiObjectiveAlgorithms.Lexicographic, optimizer=HiGHS]
julia> set_silent(model)
julia> set_attribute(model, MOA.Algorithm(), MOA.KirlikSayin())
julia> @variable(model, x >= 0, Int)
x
julia> @variable(model, y >= 0, Int)
y
julia> @constraint(model, x + y >= 1)
x + y ≥ 1.0
julia> @objective(model, Min, 1.0 * [x, y])
2-element Vector{AffExpr}:
x
y
julia> optimize!(model)
┌ Warning: Unable to solve problem using `KirlikSayin()` because objective 1 does not have a finite domain.
└ @ MultiObjectiveAlgorithms ~/.julia/packages/MultiObjectiveAlgorithms/RdHL3/src/algorithms/KirlikSayin.jl:126
x-ref the implementation in vOptGeneric:
https://github.com/vOptSolver/vOptGeneric.jl/blob/5cdecf89d6fe180bda99b8f20c92b576b13d1db2/src/algorithms.jl#L248-L366
x-ref the implementation in vOptGeneric:
https://github.com/vOptSolver/vOptGeneric.jl/blob/5cdecf89d6fe180bda99b8f20c92b576b13d1db2/src/algorithms.jl#L143-L246
For the following 2IP problem:
max z_1 = x_1 + x_2
min z_2 = x_1 + 3x_2
s.t.
2x_1 + 3x_2 <= 30
3x_1 + 2x_2 <= 30
x_1 - x_2 <= 5.5
x_1 , x_2 \in \mathbb{N}
The lexicographic algorithm returns only one solution; x = (0, 0), corresponding to z = (0, 0), is missing.
The code is the following:
using JuMP
import MultiObjectiveAlgorithms as MOA
using GLPK
using Printf
# Setup the model ------------------------------------------------------------
model = Model()
# Define the variables
@variable(model, x1≥0, Int)
@variable(model, x2≥0, Int)
# Define the objectives
@expression(model, fct1, x1 + x2) # to maximize
@expression(model, fct2, x1 + 3 * x2) # to minimize
@objective(model, Max, [fct1, (-1) * fct2])
# Define the constraints
@constraint(model, 2*x1 + 3*x2 ≤ 30)
@constraint(model, 3*x1 + 2*x2 ≤ 30)
@constraint(model, x1 - x2 ≤ 5.5)
# Setup the solver (lexicographic method) ------------------------------
set_optimizer(model, () -> MOA.Optimizer(GLPK.Optimizer))
set_attribute(model, MOA.Algorithm(), MOA.Lexicographic())
# Solve and display results ---------------------------------------------------
optimize!(model)
for i in 1:result_count(model)
    z1_opt = objective_value(model; result = i)[1]
    z2_opt = -1 * objective_value(model; result = i)[2]
    x1_opt = value(x1; result = i)
    x2_opt = value(x2; result = i)
    @printf("%2d : z = [%3.0f ,%3.0f] | x1 =%2.0f x2 =%2.0f\n", i, z1_opt, z2_opt, x1_opt, x2_opt)
end
returns:
1 : z = [ 12 , 24] | x1 = 6 x2 = 6
The decision space and the objective space for this 2IP are shown in the figures attached to the issue (not reproduced here).
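A solver-free brute-force check (Python sketch, enumerating the integer feasible set under the constraints as written in the JuMP code, including x1 - x2 <= 5.5) confirms that both lexicographic permutations have distinct optima, so z = (0, 0) should indeed appear in the result set:

```python
# Enumerate the integer feasible set; 3*x1 <= 30 and 2*x1 + 3*x2 <= 30
# bound both variables by 10, so a 0..10 grid covers everything.
feasible = [
    (x1, x2)
    for x1 in range(0, 11)
    for x2 in range(0, 11)
    if 2 * x1 + 3 * x2 <= 30 and 3 * x1 + 2 * x2 <= 30 and x1 - x2 <= 5.5
]

# Permutation 1: maximize z1 = x1 + x2 first, then minimize z2 = x1 + 3*x2.
p1 = min(feasible, key=lambda p: (-(p[0] + p[1]), p[0] + 3 * p[1]))
# Permutation 2: minimize z2 first, then maximize z1.
p2 = min(feasible, key=lambda p: (p[0] + 3 * p[1], -(p[0] + p[1])))

print(p1, p2)  # -> (6, 6) (0, 0)
```

The first permutation gives x = (6, 6) with z = (12, 24), matching the solver output above; the second gives the missing x = (0, 0) with z = (0, 0), which Lexicographic(all_permutations = true) would be expected to return as well.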
x-ref: https://discourse.julialang.org/t/multiobjectivealgorithms-epsilonconstraint-runs-infinitely/96178
using JuMP
import Ipopt
import MultiObjectiveAlgorithms as MOA
μ = [0.006898463772627643, -0.02972609131603086]
Q = [0.030446 0.00393731; 0.00393731 0.00713285]
Rf = 0
model = Model(() -> MOA.Optimizer(Ipopt.Optimizer))
set_silent(model)
set_optimizer_attribute(model, MOA.Algorithm(), MOA.EpsilonConstraint())
set_optimizer_attribute(model, MOA.SolutionLimit(), 25)
@variable(model, 0 <= w[1:2] <= 1)
@constraint(model, sum(w) == 1)
@expression(model, variance, w' * Q * w)
@expression(model, expected_return, w' * μ)
@variable(model, sharpe)
@NLconstraint(model, sharpe == (expected_return - Rf) / sqrt(variance))
@objective(model, Max, [expected_return, sharpe])
optimize!(model)
julia> model = Model(() -> MOA.Optimizer(Ipopt.Optimizer))
A JuMP Model
Feasibility problem with:
Variables: 0
Model mode: AUTOMATIC
CachingOptimizer state: EMPTY_OPTIMIZER
Solver name: MOA[algorithm=MultiObjectiveAlgorithms.Lexicographic, optimizer=Ipopt]
julia> set_optimizer_attribute(model, MOA.Algorithm(), MOA.EpsilonConstraint())
julia> set_optimizer_attribute(model, MOA.EpsilonConstraintStep(), 0.0001)
julia> set_silent(model)
julia> @variable(model, 0 <= w[1:size(R, 2)] <= 1)
2-element Vector{VariableRef}:
w[1]
w[2]
julia> @constraint(model, sum(w) == 1)
w[1] + w[2] = 1.0
julia> @objective(model, Min, [sum(Q * w), -μ' * w])
2-element Vector{AffExpr}:
0.001281759497122053 w[1] + 0.005981451347159874 w[2]
-0.05470748600000001 w[1] - 0.18257110599999998 w[2]
julia> optimize!(model)
ERROR: MathOptInterface.DeleteNotAllowed{MathOptInterface.ConstraintIndex{MathOptInterface.ScalarAffineFunction{Float64}, MathOptInterface.LessThan{Float64}}}: Deleting the index MathOptInterface.ConstraintIndex{MathOptInterface.ScalarAffineFunction{Float64}, MathOptInterface.LessThan{Float64}}(2) cannot be performed. You may want to use a `CachingOptimizer` in `AUTOMATIC` mode or you may need to call `reset_optimizer` before doing this operation if the `CachingOptimizer` is in `MANUAL` mode.
Stacktrace:
[1] delete(model::Ipopt.Optimizer, index::MathOptInterface.ConstraintIndex{MathOptInterface.ScalarAffineFunction{Float64}, MathOptInterface.LessThan{Float64}})
@ MathOptInterface ~/.julia/packages/MathOptInterface/NCblk/src/indextypes.jl:145
[2] delete(model::MultiObjectiveAlgorithms.Optimizer, ci::MathOptInterface.ConstraintIndex{MathOptInterface.ScalarAffineFunction{Float64}, MathOptInterface.LessThan{Float64}})
@ MultiObjectiveAlgorithms ~/.julia/packages/MultiObjectiveAlgorithms/IhQGz/src/MultiObjectiveAlgorithms.jl:427
[3] optimize_multiobjective!(algorithm::MultiObjectiveAlgorithms.Hierarchical, model::MultiObjectiveAlgorithms.Optimizer)
@ MultiObjectiveAlgorithms ~/.julia/packages/MultiObjectiveAlgorithms/IhQGz/src/algorithms/Hierarchical.jl:126
[4] optimize_multiobjective!(algorithm::MultiObjectiveAlgorithms.EpsilonConstraint, model::MultiObjectiveAlgorithms.Optimizer)
@ MultiObjectiveAlgorithms ~/.julia/packages/MultiObjectiveAlgorithms/IhQGz/src/algorithms/EpsilonConstraint.jl:78
[5] optimize!(model::MultiObjectiveAlgorithms.Optimizer)
@ MultiObjectiveAlgorithms ~/.julia/packages/MultiObjectiveAlgorithms/IhQGz/src/MultiObjectiveAlgorithms.jl:439
[6] optimize!
@ ~/.julia/packages/MathOptInterface/NCblk/src/Bridges/bridge_optimizer.jl:376 [inlined]
[7] optimize!
@ ~/.julia/packages/MathOptInterface/NCblk/src/MathOptInterface.jl:83 [inlined]
[8] optimize!(m::MathOptInterface.Utilities.CachingOptimizer{MathOptInterface.Bridges.LazyBridgeOptimizer{MultiObjectiveAlgorithms.Optimizer}, MathOptInterface.Utilities.UniversalFallback{MathOptInterface.Utilities.Model{Float64}}})
@ MathOptInterface.Utilities ~/.julia/packages/MathOptInterface/NCblk/src/Utilities/cachingoptimizer.jl:316
[9] optimize!(model::Model; ignore_optimize_hook::Bool, _differentiation_backend::MathOptInterface.Nonlinear.SparseReverseMode, kwargs::Base.Iterators.Pairs{Union{}, Union{}, Tuple{}, NamedTuple{(), Tuple{}}})
@ JuMP ~/.julia/packages/JuMP/7XtRG/src/optimizer_interface.jl:480
[10] optimize!(model::Model)
@ JuMP ~/.julia/packages/JuMP/7XtRG/src/optimizer_interface.jl:458
[11] top-level scope
@ REPL[365]:1
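For context, the iteration that an epsilon-constraint method performs can be sketched on a finite min-min biobjective set. This is an illustrative Python sketch, not MOA's implementation; MOA operates on the continuous model, where an EpsilonConstraintStep that is tiny relative to the objective range implies a very large number of iterations, which is one way such a run can appear to go on forever:

```python
def epsilon_constraint(points, step):
    """Enumerate nondominated points of a finite min-min biobjective set
    by repeatedly minimizing f1 subject to f2 <= eps, then tightening eps
    below the f2-value of the last solution (illustrative sketch only)."""
    solutions = []
    eps = max(f2 for _, f2 in points)  # start with f2 effectively unconstrained
    while True:
        feasible = [p for p in points if p[1] <= eps]
        if not feasible:
            return solutions
        # Subproblem: min f1 with min f2 as tie-break (tuple ordering),
        # mirroring the hierarchical second solve.
        f1, f2 = min(feasible)
        solutions.append((f1, f2))
        eps = f2 - step  # tighten the bound below the last solution

print(epsilon_constraint([(0, 3), (1, 1), (2, 0), (3, 2)], 1))
# -> [(0, 3), (1, 1), (2, 0)]
```

The number of loop iterations scales with (objective range) / step, so a step of 0.0001 on an objective spanning several units already means tens of thousands of subproblem solves.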
I was adding timestamps to the "Cows, Lakes, and a JuMP extension..." talk from JuliaCon 2017 and was searching for a link for MOO and MOA.jl. Now I am quite confused about which is which. Can someone explain to me how this currently works?
https://discourse.julialang.org/t/jump-non-linear-optimization/94020/7
julia> using JuMP
julia> import DataFrames
julia> import Gurobi
julia> import MultiObjectiveAlgorithms as MOA
julia> import Plots
julia> import Statistics
julia> df = DataFrames.DataFrame(
bond = [0.06276629, 0.03958098, 0.08456482,0.02759821,0.09584956,0.06363253,0.02874502,0.02707264,0.08776449,0.02950032],
stock = [0.1759782,0.20386651,0.21993588,0.3090001,0.17365969,0.10465274,0.07888138,0.13220847,0.28409742,0.14343067],
)
10×2 DataFrame
Row │ bond stock
│ Float64 Float64
─────┼──────────────────────
1 │ 0.0627663 0.175978
2 │ 0.039581 0.203867
3 │ 0.0845648 0.219936
4 │ 0.0275982 0.309
5 │ 0.0958496 0.17366
6 │ 0.0636325 0.104653
7 │ 0.028745 0.0788814
8 │ 0.0270726 0.132208
9 │ 0.0877645 0.284097
10 │ 0.0295003 0.143431
julia> R = Matrix(df)
10×2 Matrix{Float64}:
0.0627663 0.175978
0.039581 0.203867
0.0845648 0.219936
0.0275982 0.309
0.0958496 0.17366
0.0636325 0.104653
0.028745 0.0788814
0.0270726 0.132208
0.0877645 0.284097
0.0295003 0.143431
julia> μ = vec(Statistics.mean(R; dims = 1))
2-element Vector{Float64}:
0.05470748600000001
0.18257110599999998
julia> Q = Statistics.cov(R)
2×2 Matrix{Float64}:
0.00076204 0.00051972
0.00051972 0.00546173
julia> model = Model(() -> MOA.Optimizer(Gurobi.Optimizer))
A JuMP Model
Feasibility problem with:
Variables: 0
Model mode: AUTOMATIC
CachingOptimizer state: EMPTY_OPTIMIZER
Solver name: MOA[algorithm=MultiObjectiveAlgorithms.Lexicographic, optimizer=Gurobi]
julia> set_optimizer_attribute(model, MOA.Algorithm(), MOA.EpsilonConstraint())
julia> set_optimizer_attribute(model, MOA.EpsilonConstraintStep(), 0.0001)
julia> set_silent(model)
julia> @variable(model, 0 <= w[1:size(R, 2)] <= 1)
2-element Vector{VariableRef}:
w[1]
w[2]
julia> @constraint(model, sum(w) == 1)
w[1] + w[2] = 1.0
julia> @objective(model, Min, [w' * Q * w, -μ' * w])
2-element Vector{QuadExpr}:
0.0007620396103762265 w[1]² + 0.0010394397734916532 w[2]*w[1] + 0.005461731460414048 w[2]²
-0.05470748600000001 w[1] - 0.18257110599999998 w[2]
julia> optimize!(model)
ERROR: MethodError: no method matching _scalarise(::MathOptInterface.VectorQuadraticFunction{Float64}, ::Vector{Float64})
Closest candidates are:
_scalarise(::MathOptInterface.VectorOfVariables, ::Vector{Float64}) at /Users/oscar/.julia/packages/MultiObjectiveAlgorithms/IhQGz/src/MultiObjectiveAlgorithms.jl:54
_scalarise(::MathOptInterface.VectorAffineFunction, ::Vector{Float64}) at /Users/oscar/.julia/packages/MultiObjectiveAlgorithms/IhQGz/src/MultiObjectiveAlgorithms.jl:62
Stacktrace:
[1] optimize_multiobjective!(algorithm::MultiObjectiveAlgorithms.Hierarchical, model::MultiObjectiveAlgorithms.Optimizer)
@ MultiObjectiveAlgorithms ~/.julia/packages/MultiObjectiveAlgorithms/IhQGz/src/algorithms/Hierarchical.jl:100
[2] optimize_multiobjective!(algorithm::MultiObjectiveAlgorithms.EpsilonConstraint, model::MultiObjectiveAlgorithms.Optimizer)
@ MultiObjectiveAlgorithms ~/.julia/packages/MultiObjectiveAlgorithms/IhQGz/src/algorithms/EpsilonConstraint.jl:78
[3] optimize!(model::MultiObjectiveAlgorithms.Optimizer)
@ MultiObjectiveAlgorithms ~/.julia/packages/MultiObjectiveAlgorithms/IhQGz/src/MultiObjectiveAlgorithms.jl:439
[4] optimize!
@ ~/.julia/packages/MathOptInterface/NCblk/src/Bridges/bridge_optimizer.jl:376 [inlined]
[5] optimize!
@ ~/.julia/packages/MathOptInterface/NCblk/src/MathOptInterface.jl:83 [inlined]
[6] optimize!(m::MathOptInterface.Utilities.CachingOptimizer{MathOptInterface.Bridges.LazyBridgeOptimizer{MultiObjectiveAlgorithms.Optimizer}, MathOptInterface.Utilities.UniversalFallback{MathOptInterface.Utilities.Model{Float64}}})
@ MathOptInterface.Utilities ~/.julia/packages/MathOptInterface/NCblk/src/Utilities/cachingoptimizer.jl:316
[7] optimize!(model::Model; ignore_optimize_hook::Bool, _differentiation_backend::MathOptInterface.Nonlinear.SparseReverseMode, kwargs::Base.Iterators.Pairs{Union{}, Union{}, Tuple{}, NamedTuple{(), Tuple{}}})
@ JuMP ~/.julia/packages/JuMP/7XtRG/src/optimizer_interface.jl:480
[8] optimize!(model::Model)
@ JuMP ~/.julia/packages/JuMP/7XtRG/src/optimizer_interface.jl:458
[9] top-level scope
@ REPL[342]:1
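The MethodError above shows that _scalarise has methods only for VectorOfVariables and VectorAffineFunction, so a VectorQuadraticFunction objective is rejected. Conceptually, scalarisation is just a weighted sum of the objective rows; a generic Python sketch, where dicts of variable name to coefficient stand in for MOI functions (affine terms only, purely illustrative):

```python
def scalarise(rows, weights):
    """Collapse a vector objective (one coefficient dict per row) into a
    single scalar objective as the weighted sum of its rows.

    Illustrative stand-in for MOA's _scalarise; supporting quadratic
    objectives would additionally require merging quadratic terms.
    """
    out = {}
    for w, row in zip(weights, rows):
        for var, coef in row.items():
            out[var] = out.get(var, 0.0) + w * coef
    return out

# Two objective rows over variables x and y, weighted 1.0 and 0.5:
print(scalarise([{"x": 1.0}, {"x": 2.0, "y": 1.0}], [1.0, 0.5]))
# -> {'x': 2.0, 'y': 0.5}
```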
We're going to tag a release of MOI with multi-objective support soon (jump-dev/MathOptInterface.jl#2090), and this repo contains a collection of solvers similar in purpose to https://github.com/vOptSolver/vOptGeneric.jl.
There's no point duplicating work, so should we pick one of:
- jump-dev/MOO.jl
- jump-dev/MultiObjectiveAlgorithms.jl (renaming MOO.jl; name to be bikeshedded)
- vOpt.jl
Thoughts @mlubin @blegat @joaquimg @xgandibleux?
See MultiObjectiveAlgorithms.jl/test/algorithms/DominguezRios.jl, lines 586 to 601 at commit 948ca31.