dualdecomposition.jl's People

Contributors

github-actions[bot], hideakiv, kibaekkim, michel2323, wzhangw, youngdae

dualdecomposition.jl's Issues

Applications

Regarding the technical manuscript, I have created two branches:

  • apps/all: a branch to combine all applications later
  • apps/uc: stochastic unit commitment application

Can you guys also create a branch like that (e.g., apps/time, apps/network, apps/multi) to work on your own application?

Nonconvex termination status

The package does not consider termination statuses for block models other than MOI.OPTIMAL and MOI.LOCALLY_SOLVED. For non-convex block models, other termination statuses need to be handled as well.
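One way to sketch this is a configurable set of acceptable statuses instead of the two hard-coded ones. The symbols below mirror MOI.TerminationStatusCode names, but the set and the helper are hypothetical illustrations, not the package's API:

```julia
# Hypothetical sketch: accept a configurable set of termination statuses,
# rather than hard-coding OPTIMAL and LOCALLY_SOLVED. The symbols mirror
# MOI.TerminationStatusCode names; ALMOST_* statuses, for example, may
# still yield a usable point for a non-convex block model.
const ACCEPTABLE_STATUSES = Set([
    :OPTIMAL,
    :LOCALLY_SOLVED,
    :ALMOST_OPTIMAL,
    :ALMOST_LOCALLY_SOLVED,
])

is_acceptable(status::Symbol) = status in ACCEPTABLE_STATUSES
```

The acceptable set could then be a user-supplied option of the block-model solve loop.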

Generalizing dual search method

The current implementation uses BundleMethod.jl, but this can be generalized to use other methods (e.g., projected subgradient method).
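For reference, a minimal projected subgradient ascent loop looks like the following. This is a pure-Julia sketch with hypothetical names, not the BundleMethod.jl interface: `grad` returns a subgradient of the dual function at the current multipliers, and `project` maps the trial point back onto the feasible set.

```julia
# Minimal projected subgradient sketch (hypothetical; not the package API).
function projected_subgradient(grad, project, λ0; iters = 100, step = k -> 1.0 / k)
    λ = copy(λ0)
    for k in 1:iters
        # Ascent step along a subgradient, then projection onto the feasible set.
        λ = project(λ .+ step(k) .* grad(λ))
    end
    return λ
end

# Toy usage: maximize f(λ) = -sum(λ.^2) over λ ≥ 0 (optimum at λ = 0).
g(λ) = -2 .* λ
proj(λ) = max.(λ, 0.0)
λ_star = projected_subgradient(g, proj, [1.0, 2.0])
```

Generalizing the dual search would then amount to an abstract interface that both the bundle method and loops like this one implement.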

Cannot use JuDD.jl - MPI and StructJuMPSolverInterface.jl not loading

I have Julia version 1.2.0 and JuMP version 0.19.2. I understand that JuDD.jl is not compatible with this version of JuMP. I tried the following (but it still does not work):

I went to the Pkg REPL:
Pkg> activate tutorial

Then the following happened:
(tutorial) pkg> add https://github.com/kibaekkim/BundleMethod.jl.git
Updating git-repo https://github.com/kibaekkim/BundleMethod.jl.git
[ Info: Assigning UUID 03bf20df-d8a0-5746-8978-d623d54f279b to BundleMethod
Updating git-repo https://github.com/kibaekkim/BundleMethod.jl.git
Resolving package versions...
Updating C:\Users\...\tutorial\Project.toml
[03bf20df] + BundleMethod v0.0.0 #master (https://github.com/kibaekkim/BundleMethod.jl.git)
Updating C:\Users\...\tutorial\Manifest.toml
[03bf20df] + BundleMethod v0.0.0 #master (https://github.com/kibaekkim/BundleMethod.jl.git)

(tutorial) pkg> add "https://github.com/kibaekkim/JuDD.jl", rev="mpi"
Cloning git-repo https://github.com/kibaekkim/JuDD.jl
Updating git-repo https://github.com/kibaekkim/JuDD.jl
Updating git-repo https://github.com/kibaekkim/JuDD.jl
Cloning git-repo ,
ERROR: failed to clone from ,, error: GitError(Code:ERROR, Class:Net, unsupported URL protocol)

(tutorial) pkg> add https://github.com/kibaekkim/JuDD.jl
Updating git-repo https://github.com/kibaekkim/JuDD.jl
Updating git-repo https://github.com/kibaekkim/JuDD.jl
Resolving package versions...
ERROR: Unsatisfiable requirements detected for package StructJuMPSolverInterface [7a00d66c]:
StructJuMPSolverInterface [7a00d66c] log:
├─StructJuMPSolverInterface [7a00d66c] has no known versions!
└─restricted to versions * by JuDD [23e7f524] — no versions left
└─JuDD [23e7f524] log:
├─possible versions are: 0.0.0 or uninstalled
└─JuDD [23e7f524] is fixed to version 0.0.0

(tutorial) pkg> add https://github.com/StructJuMP/StructJuMPSolverInterface.jl.git
Updating git-repo https://github.com/StructJuMP/StructJuMPSolverInterface.jl.git
[ Info: Assigning UUID 7a00d66c-3a07-54c1-b1e3-917bd47595ee to StructJuMPSolverInterface
Updating git-repo https://github.com/StructJuMP/StructJuMPSolverInterface.jl.git
Resolving package versions...
Installed MPI ────────────── v0.10.0
Installed MathOptInterface ─ v0.9.1
Updating C:\Users\...\tutorial\Project.toml
[7a00d66c] + StructJuMPSolverInterface v0.0.0 #master (https://github.com/StructJuMP/StructJuMPSolverInterface.jl.git)
Updating C:\Users\...\tutorial\Manifest.toml
[6e4b80f9] + BenchmarkTools v0.4.2
[b6b21f68] + Ipopt v0.6.0
[682c06a0] + JSON v0.21.0
[da04e1cc] + MPI v0.10.0
[b8f27783] + MathOptInterface v0.9.1
[69de0a69] + Parsers v0.3.6
[ae029012] + Requires v0.5.2
[7a00d66c] + StructJuMPSolverInterface v0.0.0 #master (https://github.com/StructJuMP/StructJuMPSolverInterface.jl.git)
Building MPI → C:\Users\...\.julia\packages\MPI\RzbIV\deps\build.log
┌ Error: Error building MPI:
│ ERROR: LoadError: No MPI library found.
│ Ensure an MPI implementation is loaded, or set the JULIA_MPI_PATH variable.
│ Stacktrace:
│ [1] error(::String) at .\error.jl:33
│ [2] top-level scope at C:\Users....julia\packages\MPI\RzbIV\deps\build.jl:42
│ [3] include at .\boot.jl:328 [inlined]
│ [4] include_relative(::Module, ::String) at .\loading.jl:1094
│ [5] include(::Module, ::String) at .\Base.jl:31
│ [6] include(::String) at .\client.jl:431
│ [7] top-level scope at none:5
│ in expression starting at C:\Users....julia\packages\MPI\RzbIV\deps\build.jl:41
└ @ Pkg.Operations C:\cygwin\home\Administrator\buildbot\worker\package_win64\build\usr\share\julia\stdlib\v1.2\Pkg\src\backwards_compatible_isolation.jl:647

(tutorial) pkg> add https://github.com/JuliaParallel/MPI.jl.git
Cloning git-repo https://github.com/JuliaParallel/MPI.jl.git
Updating git-repo https://github.com/JuliaParallel/MPI.jl.git
Updating git-repo https://github.com/JuliaParallel/MPI.jl.git
Resolving package versions...
Updating C:\Users\...\tutorial\Project.toml
[da04e1cc] + MPI v0.10.0 #master (https://github.com/JuliaParallel/MPI.jl.git)
Updating C:\Users\...\tutorial\Manifest.toml
[da04e1cc] ~ MPI v0.10.0 ⇒ v0.10.0 #master (https://github.com/JuliaParallel/MPI.jl.git)
Building MPI → C:\Users\...\.julia\packages\MPI\Dw1uR\deps\build.log
┌ Error: Error building MPI:
│ ERROR: LoadError: No MPI library found.
│ Ensure an MPI implementation is loaded, or set the JULIA_MPI_PATH variable.
│ Stacktrace:
│ [1] error(::String) at .\error.jl:33
│ [2] top-level scope at C:\Users....julia\packages\MPI\Dw1uR\deps\build.jl:42
│ [3] include at .\boot.jl:328 [inlined]
│ [4] include_relative(::Module, ::String) at .\loading.jl:1094
│ [5] include(::Module, ::String) at .\Base.jl:31
│ [6] include(::String) at .\client.jl:431
│ [7] top-level scope at none:5
│ in expression starting at C:\Users....julia\packages\MPI\Dw1uR\deps\build.jl:41
└ @ Pkg.Operations C:\cygwin\home\Administrator\buildbot\worker\package_win64\build\usr\share\julia\stdlib\v1.2\Pkg\src\backwards_compatible_isolation.jl:647

Finally, MPI.jl and StructJuMPSolverInterface.jl are not loading.
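As the build log itself suggests, the usual fix is to install an MPI implementation (on Windows, Microsoft MPI is a common choice), point MPI.jl at it via JULIA_MPI_PATH, and rebuild. The path below is illustrative only:

```julia
# Set the variable the build script asks for, then rebuild MPI.jl.
# The path is illustrative -- adjust it to your MPI installation.
ENV["JULIA_MPI_PATH"] = raw"C:\Program Files\Microsoft MPI"
using Pkg
Pkg.build("MPI")
```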

TagBot trigger issue

This issue is used to trigger TagBot; feel free to unsubscribe.

If you haven't already, you should update your TagBot.yml to include issue comment triggers.
Please see this post on Discourse for instructions and more details.

Query the solution from each sub-problem

Hi,

I was just wondering what the easiest way is to query the solution of the sub-models (i.e., the values of the variables at the optimal point), given that the DD instance has been solved.
When I gave it a try, I got the following status: OPTIMIZE_NOT_CALLED::TerminationStatusCode = 0

Thanks in advance!

How to Print out value of optimization variables?

Hi authors,

I tried to print out the results of my model after optimization completed. I used the standard form of JuMP; for example, I have variable z in the first stage and variable x in the second stage (z[1:J] binary, x[1:I] >= 0). The standard syntax in JuMP should be:

x = model[2, :x]
for i in 1:I
    println(value(x[i]))
end

z = model[1, :z]
for j in 1:J
    println(value(z[j]))
end

However, it ran into the following errors:

┌ Warning: The model has been modified since the last call to `optimize!` (or `optimize!` has not been called yet). If you are iteratively querying solution information and modifying a model, query all the results first, then modify the model.
└ @ JuMP C:\Users\ango1\.julia\packages\JuMP\027Gt\src\optimizer_interface.jl:695
ERROR: OptimizeNotCalled()
Stacktrace:
 [1] get(model::Model, attr::MathOptInterface.VariablePrimal, v::VariableRef)
   @ JuMP C:\Users\ango1\.julia\packages\JuMP\027Gt\src\optimizer_interface.jl:701
 [2] value(v::VariableRef; result::Int64)
   @ JuMP C:\Users\ango1\.julia\packages\JuMP\027Gt\src\variables.jl:1703
 [3] value(v::VariableRef)
   @ JuMP C:\Users\ango1\.julia\packages\JuMP\027Gt\src\variables.jl:1702
 [4] top-level scope
   @ c:\Users\ango1\MyDual\Testing09.jl:160

It would be great if you could show the way to print out the value of decision variables. Thank you!

Attempting to use MPI for non-MPI code

The following message sometimes shows up when running code that does not initialize MPI, and the program is killed:

"Attempting to use an MPI routine after finalizing MPICH"
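A defensive guard can avoid touching MPI routines unless MPI is actually live. This is only a sketch of the idea (assuming MPI.jl is loaded; the package's actual guard logic may differ):

```julia
using MPI

# Only call MPI routines when MPI has been initialized and not yet
# finalized; skip them entirely for non-MPI runs.
mpi_active() = MPI.Initialized() && !MPI.Finalized()

if mpi_active()
    rank = MPI.Comm_rank(MPI.COMM_WORLD)
end
```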

merge branches

@michel2323 @youngdae Can we merge branches master and julia0.4.7, and merge branches mpi and mpi-julia0.4.7? How about creating a legacy directory for the julia0.4.7 code, checking VERSION, and redirecting to that directory whenever needed?

mpi branch not working

With the mpi branch, we cannot run with more than one process, and when running with a single process, the dual decomposition does not converge.

Deprecated functions in MPI.jl#v0.16.0

These need to be addressed:

┌ Warning: `Allgatherv(sendbuf, counts::Vector{Cint}, comm::Comm)` is deprecated, use `Allgatherv!(sendbuf, VBuffer(similar(sendbuf, sum(counts)), counts), comm)` instead.
│   caller = allcollect(::Array{DualDecomposition.CouplingVariableKey,1}) at parallel.jl:75
└ @ DualDecomposition.parallel ~/REPOS/DualDecomposition.jl/src/parallel.jl:75

┌ Warning: `Allgatherv(sendbuf, counts::Vector{Cint}, comm::Comm)` is deprecated, use `Allgatherv!(sendbuf, VBuffer(similar(sendbuf, sum(counts)), counts), comm)` instead.
│   caller = allcollect(::Array{DualDecomposition.CouplingVariableKey,1}) at parallel.jl:76
└ @ DualDecomposition.parallel ~/REPOS/DualDecomposition.jl/src/parallel.jl:76

┌ Warning: `Allgatherv(sendbuf, counts::Vector{Cint}, comm::Comm)` is deprecated, use `Allgatherv!(sendbuf, VBuffer(similar(sendbuf, sum(counts)), counts), comm)` instead.
│   caller = combine_dict(::Dict{Int64,Float64}) at parallel.jl:104
└ @ DualDecomposition.parallel ~/REPOS/DualDecomposition.jl/src/parallel.jl:104

┌ Warning: `Gatherv(sendbuf::AbstractArray, counts::Vector{Cint}, root::Integer, comm::Comm)` is deprecated, use `Gatherv!(view(sendbuf, 1:counts[MPI.Comm_rank(comm) + 1]), if Comm_rank(comm) == root
│         VBuffer(similar(sendbuf, sum(counts)), counts)
│     else
│         nothing
│     end, root, comm)` instead.
│   caller = combine_dict(::Dict{Int64,Float64}) at parallel.jl:105
└ @ DualDecomposition.parallel ~/REPOS/DualDecomposition.jl/src/parallel.jl:105

┌ Warning: `Gatherv(sendbuf::AbstractArray, counts::Vector{Cint}, root::Integer, comm::Comm)` is deprecated, use `Gatherv!(view(sendbuf, 1:counts[MPI.Comm_rank(comm) + 1]), if Comm_rank(comm) == root
│         VBuffer(similar(sendbuf, sum(counts)), counts)
│     else
│         nothing
│     end, root, comm)` instead.
│   caller = combine_dict(::Dict{Int64,Float64}) at parallel.jl:106
└ @ DualDecomposition.parallel ~/REPOS/DualDecomposition.jl/src/parallel.jl:106

┌ Warning: `Allgatherv(sendbuf, counts::Vector{Cint}, comm::Comm)` is deprecated, use `Allgatherv!(sendbuf, VBuffer(similar(sendbuf, sum(counts)), counts), comm)` instead.
│   caller = combine_dict(::Dict{Int64,SparseArrays.SparseVector{Float64,Ti} where Ti<:Integer}) at parallel.jl:124
└ @ DualDecomposition.parallel ~/REPOS/DualDecomposition.jl/src/parallel.jl:124

┌ Warning: `Gatherv(sendbuf::AbstractArray, counts::Vector{Cint}, root::Integer, comm::Comm)` is deprecated, use `Gatherv!(view(sendbuf, 1:counts[MPI.Comm_rank(comm) + 1]), if Comm_rank(comm) == root
│         VBuffer(similar(sendbuf, sum(counts)), counts)
│     else
│         nothing
│     end, root, comm)` instead.
│   caller = combine_dict(::Dict{Int64,SparseArrays.SparseVector{Float64,Ti} where Ti<:Integer}) at parallel.jl:125
└ @ DualDecomposition.parallel ~/REPOS/DualDecomposition.jl/src/parallel.jl:125

┌ Warning: `Allgatherv(sendbuf, counts::Vector{Cint}, comm::Comm)` is deprecated, use `Allgatherv!(sendbuf, VBuffer(similar(sendbuf, sum(counts)), counts), comm)` instead.
│   caller = collect(::Array{SparseArrays.SparseVector{Float64,Ti} where Ti<:Integer,1}) at parallel.jl:86
└ @ DualDecomposition.parallel ~/REPOS/DualDecomposition.jl/src/parallel.jl:86

┌ Warning: `Gatherv(sendbuf::AbstractArray, counts::Vector{Cint}, root::Integer, comm::Comm)` is deprecated, use `Gatherv!(view(sendbuf, 1:counts[MPI.Comm_rank(comm) + 1]), if Comm_rank(comm) == root
│         VBuffer(similar(sendbuf, sum(counts)), counts)
│     else
│         nothing
│     end, root, comm)` instead.
│   caller = collect(::Array{SparseArrays.SparseVector{Float64,Ti} where Ti<:Integer,1}) at parallel.jl:87
└ @ DualDecomposition.parallel ~/REPOS/DualDecomposition.jl/src/parallel.jl:87
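The replacement that the warnings spell out can be applied mechanically at each call site in src/parallel.jl. A self-contained single-rank example of the new Allgatherv! form (MPI.jl ≥ v0.16; the data values are illustrative):

```julia
using MPI

MPI.Init()
comm = MPI.COMM_WORLD

# counts[i] = number of elements contributed by rank i (single rank here).
counts = Cint[2]
sendbuf = [1.0, 2.0]

# New API: the caller supplies a VBuffer wrapping the receive storage,
# replacing the deprecated Allgatherv(sendbuf, counts, comm).
recvbuf = similar(sendbuf, sum(counts))
MPI.Allgatherv!(sendbuf, MPI.VBuffer(recvbuf, counts), comm)

MPI.Finalize()
```

The deprecated Gatherv calls follow the same pattern, with the VBuffer constructed only on the root rank, exactly as shown in the warning text.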

Fix ADMM

While updating the package, I have moved the ADMM part to a separate branch. @youngdae please feel free to update the method and make a PR.

CouplingVariableRef error

Hi author,

My model is quite similar to the example "sslp.jl" in this repository (which has only an "x" variable in the first stage), but my model has two variables in the first stage:

@variable(model, x[1:J], Bin)
@variable(model, y >= 0)  # only one variable

I tried `push!(coupling_variables, DD.CouplingVariableRef(s, model[:y]))`, but it ran into the error below:

ERROR: MethodError: no method matching DualDecomposition.CouplingVariableRef(::Int64, ::VariableRef)

Closest candidates are:
DualDecomposition.CouplingVariableRef(::Int64, ::Any, ::VariableRef)
@ DualDecomposition C:\Users\ango1.julia\packages\DualDecomposition\rqrRm\src\BlockModel.jl:25
DualDecomposition.CouplingVariableRef(::DualDecomposition.CouplingVariableKey, ::VariableRef)
@ DualDecomposition C:\Users\ango1.julia\packages\DualDecomposition\rqrRm\src\BlockModel.jl:20

Stacktrace:
[1] top-level scope
@ c:\Users\ango1\MyDSPOpt\Testing05.jl:129


I understand there should be at least two variables to push, but if I don't use push!, how does the model determine that variable "y" belongs to the first stage? Hope to hear from you. Thanks.
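Going by the "closest candidates" listed in the error itself, the constructor takes a block id, a coupling-variable id, and the variable reference. A sketch of the likely fix, where the string id "y" is illustrative and the surrounding context (s, coupling_variables, model) is taken from the question:

```julia
# Three-argument form from the error's "closest candidates":
# (block id, coupling id, variable reference).
push!(coupling_variables, DD.CouplingVariableRef(s, "y", model[:y]))
```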

Potential to incorporate BilevelJuMP instances

Hi,
Currently, I am trying to solve a consensus-finding problem among a set of bi-level optimization problems (implemented via https://github.com/joaquimg/BilevelJuMP.jl). For this, the use of the DualDecomposition package would be particularly interesting as not many distributed algorithms can deal with the non-convex NLP subproblems.
However, if the subproblems are formulated with BilevelJuMP, the package currently raises an error:

ERROR: MethodError: Cannot `convert` an object of type BilevelModel to an object of type Model

Do you think the package could be seamlessly extended to such model instances?

Thanks in advance for the response!
