Comments (13)
```julia
@constraint(model, [x, F(x), b - A * x] in MOI.VariationalInequality(2length(x), MOI.Nonnegatives(size(A, 1))))
```

I was thinking something like this, but it's pretty clunky to build. I think people would want to build up the polyhedron with `@constraint`, instead of with `b` and `A`.
from mathoptinterface.jl.
What about

```julia
@constraint(model, [x, F(x), b - A * x] in MOI.VariationalInequality(2length(x), MOI.Nonnegatives(size(A, 1))))
```

It generalizes

```julia
@constraint(model, [x, F(x)] in MOI.Complement(2length(x)))
```

as `MOI.Complement(n)` is then equivalent to `MOI.VariationalInequality(n, MOI.Nonnegatives(0))`.
The advantage of this approach is that you could write a bridge that gets the polyhedron description, computes the double description, and then does some fancy reformulation.
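For intuition, here is a toy version of the H-to-V conversion such a bridge would need, restricted to a bounded 2-D polyhedron. The pairwise-intersection approach and the `vertices_2d` helper are made up for illustration; a real implementation would use a proper double-description method.

```julia
using LinearAlgebra

# Brute-force V-representation of a bounded 2-D polyhedron {x : A * x <= b}:
# intersect every pair of constraint lines and keep the feasible points.
function vertices_2d(A, b; atol = 1e-9)
    V = Vector{Vector{Float64}}()
    m = size(A, 1)
    for i in 1:m-1, j in i+1:m
        Aij = A[[i, j], :]
        abs(det(Aij)) < atol && continue          # skip parallel constraints
        x = Aij \ b[[i, j]]
        all(A * x .<= b .+ atol) && push!(V, x)   # keep feasible intersections
    end
    return V
end

# Unit square: x >= 0, y >= 0, x <= 1, y <= 1.
A = [-1.0 0.0; 0.0 -1.0; 1.0 0.0; 0.0 1.0]
b = [0.0, 0.0, 1.0, 1.0]
vertices_2d(A, b)   # the four corners of the square
```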
Another approach is using constraint attributes, but that's a bit orthogonal to the current design of the other constraints, where we prefer a nice self-contained function-in-set representation (which also plays better with MathOptSetDistances, Dualization, and the rest of our abstraction) over having the info spread out.
@xhub do you have some simple pedagogical examples of polyhedral VIs that people would want to solve? (In the vein of https://jump.dev/JuMP.jl/stable/tutorials/nonlinear/complementarity/)
Pick whatever syntax you like for now.
Many interesting AVIs come from solving Nash equilibria. Those are not the simplest examples.
I do have one example of solving a friction contact problem. The next 2 weeks are quite busy; I'll try to provide something afterwards.
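Not a contact problem, but for a runnable flavor of the simplest case, here is a tiny box-constrained (rectangular) affine VI solved with the extragradient method in plain Julia. The data `M`, `q`, and the bounds are made up for illustration; this is a sketch, not a proposed API.

```julia
using LinearAlgebra

project(x, lb, ub) = clamp.(x, lb, ub)

# Extragradient method for the affine VI: find x in C = [lb, ub] with
# (M * x + q)' * (y - x) >= 0 for all y in C.
function extragradient(M, q, lb, ub; tau = 0.1, iters = 5_000)
    x = project(zeros(length(q)), lb, ub)
    F(z) = M * z + q
    for _ in 1:iters
        y = project(x - tau * F(x), lb, ub)   # predictor step
        x = project(x - tau * F(y), lb, ub)   # corrector step
    end
    return x
end

M = [2.0 1.0; 1.0 2.0]   # symmetric positive definite, so F is strongly monotone
q = [-1.0, -1.0]
x = extragradient(M, q, zeros(2), fill(10.0, 2))
# The unconstrained stationary point M \ -q = [1/3, 1/3] lies inside the box,
# so it is also the VI solution.
```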
Okay, so we could make some nice syntax:
```julia
model = Model()
@variable(model, x[1:2] >= 0)
F(x) = [1 - x[1], 1 - x[2]]
C = [@build_constraint(model, sum(x) == 1)]
@constraint(model, F(x) ⟂ x, C)
```
which lowers to:
```julia
model = MOI.Utilities.Model{Float64}()
x = MOI.add_variables(model, 2)
MOI.add_constraint.(model, x, MOI.GreaterThan(0.0))
F(x) = [1 - x[1], 1 - x[2]]
C_f = [sum(x) - 1]
C_s = MOI.Zeros(1)
f = MOI.Utilities.operate(vcat, Float64, F(x), x, C_f)
s = MOI.VariationalInequality(2 * length(x), C_s)
MOI.add_constraint(model, f, s)
```
The downside is that you'd have to provide the VI in a single call. It wouldn't necessarily make sense to do something like:
```julia
model = Model()
@variable(model, x[1:2] >= 0)
C = [@build_constraint(model, sum(x) == 1)]
@constraint(model, 1 - x[1] ⟂ x[1], C)
@constraint(model, 1 - x[2] ⟂ x[2])
```
I must say I always found it a bit weird that the complementarity constraint implicitly uses the variable bounds in:
```julia
model = Model(PATHSolver.Optimizer)
set_silent(model)
@variable(model, 0 <= x[1:4] <= 10, start = 0)
@constraint(model, M * x + q ⟂ x)
```
Maybe this is a good opportunity to improve the syntax (of course, we'd need to keep the old one so as not to break code).
The standard complementarity could be something like:
```julia
model = Model(PATHSolver.Optimizer)
set_silent(model)
@variable(model, x[1:4], start = 0)
@constraint(model, M * x + q ⟂ x, begin
    [i=1:4], 0 <= x[i] <= 10
end)
```
And the new VI could be:
```julia
model = Model(PATHSolver.Optimizer)
set_silent(model)
@variable(model, x[1:2])
@constraint(model, [1 - x[1], 1 - x[2]] ⟂ x, begin
    sum(x) == 1
end)
```
`@build_constraint` feels unnatural for the base use case.
I could understand it being used as secondary syntax:
```julia
model = Model()
@variable(model, x)
@variable(model, y)
V = [x, y]
F = [@expression(model, 1 - x), @expression(model, 1 - y)]
C = [@build_constraint(model, x + y == 1)]
@constraint(model, F ⟂ V, C)
```
The latter would be equivalent to @blegat's:

```julia
@constraint(model, [V, F, x + y] in MOI.VariationalInequality(2length(V), [MOI.EqualTo(1)]))
```
> I must say I always found it a bit weird that the complementarity constraint implicitly uses the variable bounds
Okaaaay. So. The backstory for this is:
1. It's the format that PATH supports, which requires `F(x)`, the Jacobian `J(x)`, a vector of lower bounds `lb`, a vector of upper bounds `ub`, and a starting point `z`: https://github.com/chkwon/PATHSolver.jl/blob/7d6de732efd4db2f9b9f70e6d315d95f8e200f30/src/C_API.jl#L171-L175
2. Other packages like Pyomo, GAMS, AMPL, Complementarity.jl, etc. let you write `0 <= F(x) ⟂ x >= 0` or `F(x) ⟂ a <= x <= b`, or in some cases, `0 <= F(x) ⟂ a <= x <= b`. This can easily lead to situations that are not valid MCPs.

For (2), see, for example:
- https://or.stackexchange.com/questions/9511/convert-mcp-model-from-gams-to-pyomo
- or, in the GAMS case, https://www.gams.com/latest/docs/S_PATH.html
This design decision in GAMS is VeryBad™ because it is a silent cause of bugs.
Michael and I had some very long discussions, and decided that the one true way to write an MCP was what we currently have. Users cannot give bounds on `F(x)` or `x` in the complementarity condition, because experience has shown that they will get them wrong.
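Since the MCP semantics hinge on exactly which bounds apply, a compact way to state the condition PATH solves is the natural-map residual, which is zero exactly at solutions of `F(x) ⟂ lb <= x <= ub`. The `mcp_residual` helper below is hypothetical, for illustration only; it is not part of MOI or PATHSolver.jl.

```julia
# x solves the box MCP  F(x) ⟂ lb <= x <= ub  iff  x == clamp.(x .- F(x), lb, ub):
# at an interior point F(x) must vanish; at a bound, F(x) may only point inward.
mcp_residual(F, x, lb, ub) = maximum(abs.(x .- clamp.(x .- F(x), lb, ub)))

F(x) = [1 - x[1], 1 - x[2]]          # the same F used earlier in the thread
lb, ub = zeros(2), fill(10.0, 2)
mcp_residual(F, [1.0, 1.0], lb, ub)  # 0.0: interior point with F(x) = 0
mcp_residual(F, [0.5, 0.5], lb, ub)  # 0.5: interior point but F(x) != 0
```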
If we have AVIs, then I do see some reason to support:
```julia
model = Model(PATHSolver.Optimizer)
@variable(model, x[1:4], start = 0)
@constraint(model, F(x) ⟂ {a <= x <= b})
```

that is, `[F(x), x]`-in-`MOI.VariationalInequality(2length(x), Interval.(a, b))`.
But if we have MPECs, then it's a bit weird, because you could have both variable bounds and a rectangular VI???
What would I do in this situation?
```julia
model = Model()
@variable(model, -1 <= x <= 1, start = 0)
@objective(model, Max, x)
@constraint(model, 1 - x ⟂ {x >= 0})
```
Are the bounds of `x` `[-1, 1]`, or `[0, 1]`?
> But if we have MPECs, then it's a bit weird, because you could have both variable bounds and a rectangular VI???
> What would I do in this situation?
Hard to follow your train of thought on that one without an example. I agree that VI constraints in an optimization problem usually require further clarification of the semantics.
> Are the bounds of x [-1, 1], or [0, 1]?
Throw an error and enlighten the modeler of their mistake.
Why not support both and throw an error if bounds on the variables are given twice (in `@variable` and `@constraint`)?
> Hard to follow your train of thought on that one without an example
I meant the example immediately following.
> Why not support both and throw an error if bounds on the variables are given twice
We could do this.
How about:
```julia
model = Model()
@variable(model, x, start = 0)
@objective(model, Max, x)
@constraint(model, 1 - x ⟂ {x >= 0})
lower_bound(x)        # 0? Or an error?
set_lower_bound(x, 1) # Modifies the VI? Or adds a bound (and therefore throws an error)?
```
Interestingly, for me it is very clear what should happen.
If we have:
```julia
model = Model()
@variable(model, x, start = 0)
@objective(model, Max, x)
@constraint(model, 1 - x ⟂ {x >= 0})
```
Then `has_lower_bound(x) == false`, and `lower_bound(x)` errors! Because `x >= 0` is PART of the constraint, and there is no bound on the variable, in the same way that `@constraint(model, x >= 0)` would not affect the value returned by `lower_bound(x)`.

And `set_lower_bound(x, 1)` modifies the variable bounds and NOT the VI, in the same way that `set_lower_bound(x, 1)` would not affect a `@constraint(model, x >= 0)`.
There would be no way to modify that part of the complementarity constraint as of today. However, we could add a new specific function for that, or simply force users to delete and re-add. Complementarity solvers probably won't have in-place modifications and warm starts anyway, so the new function could be created if it is needed in the future.
My understanding probably comes from the fact that I am more of an MPEC than an MCP person.
About:
```julia
model = Model()
@variable(model, -1 <= x <= 1, start = 0)
@objective(model, Max, x)
@constraint(model, 1 - x ⟂ {x >= 0})
```
MCP solvers should throw an error, like @xhub said. They should probably even check for `@constraint(model, x >= 0)` kinds of errors as well. But it is a perfectly valid MPEC.

In particular, I suffer from this in BilevelJuMP, as the implicit bounds on the variables mess things up. BilevelJuMP would greatly appreciate it if complementarity/VI-like constraints were defined in a single self-sufficient piece. This is because BilevelJuMP accepts MPECs; hence, something like the above invalid MCP would be valid in BilevelJuMP.
Also, KNITRO only accepts `0 <= x ⟂ y >= 0`: no other bounds, only variables.
https://www.artelys.com/app/docs/knitro/2_userGuide/complementarity.html
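That restricted form is the pure NCP `0 <= x ⟂ y >= 0`: both vectors nonnegative with elementwise products zero. A quick feasibility check might look like the following (the `is_ncp_pair` helper is hypothetical, for illustration only):

```julia
using LinearAlgebra

# 0 <= x ⟂ y >= 0 holds iff x >= 0, y >= 0, and dot(x, y) == 0
# (given nonnegativity, the inner product is zero iff every x[i] * y[i] is).
is_ncp_pair(x, y; atol = 1e-8) =
    all(x .>= -atol) && all(y .>= -atol) && abs(dot(x, y)) <= atol

is_ncp_pair([0.0, 2.0], [3.0, 0.0])  # true
is_ncp_pair([1.0, 0.0], [1.0, 0.0]) # false: x[1] * y[1] = 1
```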
So KNITRO would be `VectorOfVariables`-in-`VariationalInequality{Nonnegatives}`.