Comments (19)
Here's what I was describing earlier, at least:
```julia
function abs(v::Variable)
    @defVar(v.m, aux >= 0)
    @addConstraint(v.m, aux >= v)
    @addConstraint(v.m, aux >= -v)
    return aux
end
```
from jump.jl.
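The abs helper above is the classic epigraph reformulation of the absolute value: when aux is being pushed down by the objective, the two constraints pin it to |v|. A minimal numeric sketch of the same trick, using Python's scipy.optimize.linprog purely for illustration (the variable layout here is mine, not from the thread):

```python
import numpy as np
from scipy.optimize import linprog

# Variables z = [x, t]. Minimize t subject to t >= x, t >= -x, x >= 1.
# In linprog's A_ub @ z <= b_ub form:
#    x - t <= 0
#   -x - t <= 0
c = [0.0, 1.0]                      # minimize t
A_ub = [[1.0, -1.0], [-1.0, -1.0]]
b_ub = [0.0, 0.0]
bounds = [(1.0, None), (0.0, None)]  # x >= 1, t >= 0
res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds)
print(res.x)  # x = 1, t = |x| = 1
```

Because the objective pushes t down, both constraints are tight at the optimum, so t really equals |x| there.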
I think we could have a library of helper functions, separate from the core of JuMP, that wrap the logic of generating particular formulations. This could include the abs function above (although with a better name), and more complex IP models like piecewise linear.
I can do this in #156 as well (why not)
Will the SDP code keep this as an LP?
I'll need to restructure the code a bit so it doesn't trigger a call to solveSDP, but it will be passed to the solver as an LP.
Can we bring this back since that branch never got merged?
@iamed2, what about using @IainNZ's example definition of abs above?
Gets quite ugly when you are using arrays, and doesn't work when you are using abs with different variables:
```julia
using JuMP
import JuMP: GenericAffExpr

function abs{V<:GenericAffExpr}(v::Array{V})
    m = first(first(v).vars).m
    @defVar(m, aux[1:length(v)] >= 0)
    @addConstraint(m, aux .>= v)
    @addConstraint(m, aux .>= -v)
    return aux
end

m = Model()
@defVar(m, a[1:10] >= 0)
@defVar(m, b[1:10] <= 0)
@defVar(m, c[1:10] >= 10)
@defVar(m, d[1:10] <= -10)

julia> sum(abs(a + b))
aux[1] + aux[2] + aux[3] + aux[4] + aux[5] + aux[6] + aux[7] + aux[8] + aux[9] + aux[10]

julia> sum(abs(c + d))
aux[1] + aux[2] + aux[3] + aux[4] + aux[5] + aux[6] + aux[7] + aux[8] + aux[9] + aux[10]
```
What doesn't work? Don't worry about the names of the variables overlapping, they're distinct variables.
According to the documentation, @defVar "redefines variables without warning". Isn't aux being overwritten by the second abs call?
@defVar(m, x) is equivalent to something like x = Variable(m). It redefines Julia variables in the local scope, not JuMP variables.
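The scoping point above can be illustrated outside JuMP. A hedged Python analogy (the Variable class here is a made-up stand-in, not JuMP's): each call rebinds the local name, but the earlier objects, still referenced by the model, are untouched.

```python
# Analogy only: a "model" that records every variable created against it.
class Variable:
    def __init__(self, model):
        self.model = model
        model.append(self)  # the model keeps its own reference

model = []

def make_aux():
    aux = Variable(model)  # a fresh, distinct variable on every call
    return aux

v1 = make_aux()
v2 = make_aux()
print(v1 is v2, len(model))  # False 2
```

The local name aux is reused, but the two objects are distinct, which is exactly why the two abs calls in the thread do not clobber each other.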
Interesting. Maybe I'm missing something, but why isn't the following model respecting the given constraint? Both a and b contain all zeros.
```julia
using JuMP
import JuMP: GenericAffExpr

function abs{V<:GenericAffExpr}(v::Array{V})
    m = first(first(v).vars).m
    @defVar(m, aux[1:length(v)] >= 0)
    @addConstraint(m, aux .>= v)
    @addConstraint(m, aux .>= -v)
    return aux
end

m = Model()
@defVar(m, a[1:10] >= 0)
@defVar(m, b[1:10] <= 0)
@addConstraint(m, sum(abs(a + b)) >= 100)
solve(m)
getValue(a)
getValue(b)
```
sum(abs(a + b)) >= 100 is not a convex constraint; if you gave it to Convex.jl it would be rejected. The above trick is a poor man's DCP (disciplined convex programming), so to make it work you have to follow the same rules. If you really want to model that constraint, you'd have to introduce integer variables, since it's a nonconvex problem.
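The failure mode described above is easy to reproduce numerically. A sketch in Python with scipy.optimize.linprog (illustrative, not the thread's JuMP model): the auxiliary variable only upper-bounds |x|, so a ">= 100" constraint on it is satisfiable even when x is zero.

```python
import numpy as np
from scipy.optimize import linprog

# "abs(x) >= 100" naively encoded with the epigraph trick:
#   t >= x, t >= -x, t >= 100.
# Fix x = 0 via its bounds: the LP is still feasible (t = 100), so the
# trick does NOT enforce |x| >= 100 -- t is only an upper bound on |x|.
c = [0.0, 1.0]                      # minimize t over [x, t]
A_ub = [[1.0, -1.0], [-1.0, -1.0]]  # x - t <= 0, -x - t <= 0
b_ub = [0.0, 0.0]
bounds = [(0.0, 0.0), (100.0, None)]  # x fixed at 0, t >= 100
res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds)
print(res.status, res.x)  # 0 (feasible), x = 0, t = 100
```

This is exactly why a and b come back all zeros in the model above: the solver is free to let the aux variables float up to satisfy the sum constraint.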
This gets to the reticence about adding this type of problem transformation to JuMP: it's surprisingly subtle, and will require a fair amount of thought (and code) to get working robustly. It will also likely end up reimplementing a decent amount of Convex.jl on top of JuMP. Since the JuMP internals aren't really designed for handling this type of transformation, it probably makes sense to put this functionality in an external package, rather than JuMP itself (at least initially).
s/reimplementing a decent amount of Convex.jl/reimplementing all of Convex.jl/g
Probably not, if you're only interested in adding L_1/L_\infty.
@mlubin What we wanted was actually sum(abs(a + b)) <= 100, which does work. But the fact that the solver gives back a solution it says is optimal when it clearly violates constraints seems wrong.
We're really only going through this in order to implement norm1. It looks like the abs solution does not work in our actual problem (for some reason unbeknownst to us; we're not optimization experts by any means). We really want to use JuMP so we can use JuMPeR and such, but we can't seem to find a way to get it to work. Switching to Convex.jl will present new problems down the line when we want more features, but might just be necessary at this point.
@iamed2, that's exactly why we don't blindly incorporate these transformations into JuMP. Ensuring validity of the transformation requires DCP which is the reason why systems like CVX and Convex.jl exist.
What is it exactly that's not working at this point? If you're imposing a norm1 constraint then the abs transformation is valid. JuMPeR itself does support some form of norm1 constraints, but I'll let @IainNZ discuss.
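For contrast, the valid (convex) direction of the transformation. A sketch, again in Python/scipy purely for illustration, showing that t >= x, t >= -x together with t <= 100 really does cap |x|, even when the objective pulls x as far as it can go:

```python
import numpy as np
from scipy.optimize import linprog

# "|x| <= 100" via the epigraph variable: t >= x, t >= -x, t <= 100.
# Maximize x under these constraints: x is correctly capped at 100.
c = [-1.0, 0.0]                     # maximize x over [x, t]
A_ub = [[1.0, -1.0], [-1.0, -1.0]]  # x - t <= 0, -x - t <= 0
b_ub = [0.0, 0.0]
bounds = [(None, None), (0.0, 100.0)]  # x free, 0 <= t <= 100
res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds)
print(res.x)  # x = 100, t = 100
```

In the <= direction the aux variable being an upper bound on |x| is exactly what makes the reformulation sound, which is why the norm1(u) <= G case is fine.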
JuMPeR only supports norm1 constraints for the uncertainty set, and only for "norm ball" type constraints of the form norm1(u) <= G, which are easy for JuMPeR to handle.
@iamed2, the solver isn't lying, and it doesn't violate the constraints - it's just that your abs is actually abs_with_caveats - there is nothing JuMP nor the solver can do about that, unfortunately.
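As noted earlier in the thread, enforcing the nonconvex constraint |x| >= c genuinely needs integer variables. A hedged big-M sketch using scipy.optimize.milp (the constant M = 1000 and the bounds on x are illustrative assumptions, not from the thread):

```python
import numpy as np
from scipy.optimize import milp, LinearConstraint, Bounds

# Nonconvex "|x| >= 100" via a binary z and big-M:
#    x >= 100 - M*(1 - z)  ->   x - M*z >= 100 - M   (z = 1: the x >= 100 branch)
#   -x >= 100 - M*z        ->  -x + M*z >= 100       (z = 0: the x <= -100 branch)
M = 1000.0
c = np.array([1.0, 0.0])  # minimize x over variables [x, z]
A = np.array([[1.0, -M], [-1.0, M]])
lb = np.array([100.0 - M, 100.0])
con = LinearConstraint(A, lb, np.inf)
res = milp(c, constraints=con, integrality=np.array([0, 1]),
           bounds=Bounds([-150.0, 0.0], [150.0, 1.0]))
print(res.x)  # x = -150 (the z = 0 branch: x <= -100, pushed to its lower bound)
```

The binary variable selects which side of the disjunction x >= 100 or x <= -100 is active, which is the standard way to model the union of two disjoint intervals in a MILP.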