Comments (19)

IainNZ commented on May 22, 2024

Here's what I was describing earlier, at least:

function abs(v::Variable)
  @defVar(v.m, aux >= 0)
  @addConstraint(v.m, aux >= v)
  @addConstraint(v.m, aux >= -v)
  return aux
end

from jump.jl.
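The epigraph trick above is language-agnostic: introduce aux >= v and aux >= -v, so aux >= |v|, and let minimization pull aux down onto |v|. A minimal sketch of the same idea in Python with scipy.optimize.linprog, on a hypothetical toy problem (minimize |x - 3| over a free scalar x); the variable names are illustrative, not from JuMP:

```python
from scipy.optimize import linprog

# Decision vector z = [x, aux]. Minimize aux subject to
#   aux >= x - 3   and   aux >= -(x - 3),
# so aux >= |x - 3|, and the objective drives aux onto |x - 3|.
c = [0.0, 1.0]            # objective: minimize aux
A_ub = [[1.0, -1.0],      #  x - aux <= 3   (aux >= x - 3)
        [-1.0, -1.0]]     # -x - aux <= -3  (aux >= 3 - x)
b_ub = [3.0, -3.0]
res = linprog(c, A_ub=A_ub, b_ub=b_ub,
              bounds=[(None, None), (0, None)])  # x free, aux >= 0
print(res.x)   # optimum at x = 3, aux = 0
```

The key point, which comes up later in this thread, is that aux only equals |v| when something in the model pushes aux downward; on its own the LP merely guarantees aux >= |v|.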

mlubin commented on May 22, 2024

I think we could have a library of helper functions separate from the core of JuMP that wrap the logic of generating particular formulations. This could include the abs function above (although with a better name), and more complex IP models like piecewise linear.

joehuchette commented on May 22, 2024

I can do this in #156 as well (why not)

mlubin commented on May 22, 2024

Will the SDP code keep this as an LP?

joehuchette commented on May 22, 2024

I'll need to restructure the code a bit so it doesn't trigger a call to solveSDP, but it will be passed to the solver as an LP.

iamed2 commented on May 22, 2024

Can we bring this back since that branch never got merged?

mlubin commented on May 22, 2024

@iamed2, what about using @IainNZ's example definition of abs above?

omus commented on May 22, 2024

This gets quite ugly when you're using arrays, and it doesn't work when you use abs with different variables:

using JuMP
import JuMP: GenericAffExpr

function abs{V<:GenericAffExpr}(v::Array{V})
    m = first(first(v).vars).m
    @defVar(m, aux[1:length(v)] >= 0)
    @addConstraint(m, aux .>= v)
    @addConstraint(m, aux .>= -v)
    return aux
end

m = Model()
@defVar(m, a[1:10] >= 0)
@defVar(m, b[1:10] <= 0)
@defVar(m, c[1:10] >= 10)
@defVar(m, d[1:10] <= -10)
julia> sum(abs(a + b))
aux[1] + aux[2] + aux[3] + aux[4] + aux[5] + aux[6] + aux[7] + aux[8] + aux[9] + aux[10]

julia> sum(abs(c + d))
aux[1] + aux[2] + aux[3] + aux[4] + aux[5] + aux[6] + aux[7] + aux[8] + aux[9] + aux[10]

mlubin commented on May 22, 2024

What doesn't work? Don't worry about the names of the variables overlapping; they're distinct variables.

omus commented on May 22, 2024

According to the documentation, @defVar "redefines variables without warning". Isn't aux being overwritten by the second abs call?

mlubin commented on May 22, 2024

@defVar(m,x) is equivalent to something like x = Variable(m). It redefines Julia variables in the local scope, not JuMP variables.

omus commented on May 22, 2024

Interesting. Maybe I'm missing something, but why isn't the following model respecting the given constraint? Both a and b contain all zeros.

using JuMP
import JuMP: GenericAffExpr

function abs{V<:GenericAffExpr}(v::Array{V})
    m = first(first(v).vars).m
    @defVar(m, aux[1:length(v)] >= 0)
    @addConstraint(m, aux .>= v)
    @addConstraint(m, aux .>= -v)
    return aux
end

m = Model()
@defVar(m, a[1:10] >= 0)
@defVar(m, b[1:10] <= 0)
@addConstraint(m, sum(abs(a + b)) >= 100)

solve(m)
getValue(a)
getValue(b)

mlubin commented on May 22, 2024

sum(abs(a + b)) >= 100 is not a convex constraint, if you gave it to Convex.jl it would be rejected. The above trick is a poor man's DCP (disciplined convex programming), so to make it work you have to follow the same rules. If you really want to model that constraint, you'd have to introduce integer variables since it's a nonconvex problem.

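To see concretely why the solver's answer is legitimate: the auxiliary variables only satisfy aux >= |v|, never aux == |v|, so a >= constraint on sum(aux) says nothing about sum(|v|). A quick feasibility check (plain NumPy, mirroring the 10-variable model above; no solver needed) shows that a = b = 0 together with aux_i = 10 satisfies every constraint of the reformulated LP:

```python
import numpy as np

v = np.zeros(10)           # a + b with a = b = 0
aux = np.full(10, 10.0)    # the solver is free to set aux_i = 10

# Every constraint of the reformulated LP holds at this point...
feasible = bool(np.all(aux >= v) and np.all(aux >= -v)
                and np.all(aux >= 0) and aux.sum() >= 100)
print(feasible)            # True

# ...yet the quantity the user intended to bound is zero:
print(np.abs(v).sum())     # 0.0
```

So the solver correctly reports an optimal, feasible point for the LP it was actually given; the mismatch is between the LP and the intended nonconvex constraint.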
joehuchette commented on May 22, 2024

This gets at the reluctance to add this type of problem transformation to JuMP: it's surprisingly subtle, and it will require a fair amount of thought (and code) to get working robustly. It would also likely end up reimplementing a decent amount of Convex.jl on top of JuMP. Since the JuMP internals aren't really designed for handling this type of transformation, it probably makes sense to put this functionality in an external package rather than JuMP itself (at least initially).

mlubin commented on May 22, 2024

s/reimplementing a decent amount of Convex.jl/reimplementing all of Convex.jl/g

joehuchette commented on May 22, 2024

Probably not if you're only interested in adding L_1/L_\infty.

iamed2 commented on May 22, 2024

@mlubin What we wanted was actually sum(abs(a + b)) <= 100 which does work. But the fact that the solver gives back a solution it says is optimal when it clearly violates constraints seems wrong.

We're really only going through this in order to implement norm1. It looks like the abs solution does not work in our actual problem (for some reason unbeknownst to us; we're not optimization experts by any means). We really want to use JuMP so we can use JuMPeR and such but we can't seem to find a way to get it to work. Switching to Convex.jl will present new problems down the line when we want more features but might just be necessary at this point.

mlubin commented on May 22, 2024

@iamed2, that's exactly why we don't blindly incorporate these transformations into JuMP. Ensuring validity of the transformation requires DCP which is the reason why systems like CVX and Convex.jl exist.

What is it exactly that's not working at this point? If you're imposing a norm1 constraint then the abs transformation is valid. JuMPeR itself does support some form of norm1 constraints, but I'll let @IainNZ discuss.

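For reference, the direction that is valid: a norm1(x) <= G constraint stays exact under the abs reformulation, because bounding sum(aux) from above also bounds sum(|x|), and any feasible x remains feasible with aux_i = |x_i|. A hedged SciPy sketch of this valid direction, on a hypothetical toy problem (minimize x_1 subject to sum(|x|) <= 5 over three free variables):

```python
import numpy as np
from scipy.optimize import linprog

n = 3
I = np.eye(n)
# Decision vector z = [x_1..x_n, aux_1..aux_n].
A_ub = np.block([
    [I, -I],                             #  x_i - aux_i <= 0  (aux_i >=  x_i)
    [-I, -I],                            # -x_i - aux_i <= 0  (aux_i >= -x_i)
    [np.zeros((1, n)), np.ones((1, n))], # sum(aux) <= 5      (norm1 budget)
])
b_ub = np.concatenate([np.zeros(2 * n), [5.0]])
c = np.concatenate([[1.0], np.zeros(2 * n - 1)])   # minimize x_1
bounds = [(None, None)] * n + [(0, None)] * n      # x free, aux >= 0
res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds)
print(res.fun)   # -5.0: x_1 = -5 exhausts the whole norm budget
```

Here sum(aux) <= 5 forces |x_1| <= 5, so the reported optimum of -5 is exactly the optimum of the original norm-constrained problem; this is the DCP-compliant direction, unlike the >= constraint earlier in the thread.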
IainNZ commented on May 22, 2024

JuMPeR only supports norm1 constraints for the uncertainty set, and only for "norm ball" type constraints of the form norm1(u) <= G, which are easy for JuMPeR to handle.

@iamed2 the solver isn't lying, and it doesn't violate the constraints: it's just that your abs is actually abs_with_caveats. There is nothing JuMP or the solver can do about that, unfortunately.
