
Comments (9)

rafaqz commented on September 22, 2024

> You should use SAMIN (https://julianlsolvers.github.io/Optim.jl/stable/algo/samin/) if your objective is not differentiable

FYI SAMIN doesn't support box constraints. Oops, that's SimulatedAnnealing(); SAMIN() seems to be working!
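
For reference, the SAMIN route through Optimization.jl would look roughly like this (a minimal sketch, assuming the OptimizationOptimJL wrapper; the maxiters budget is illustrative):

using Optimization, OptimizationOptimJL

rosenbrock(x, p) = (p[1] - x[1])^2 + p[2] * (x[2] - x[1]^2)^2
# SAMIN reads the box from the problem's lb/ub
prob = OptimizationProblem(rosenbrock, zeros(2), [1.0, 100.0];
                           lb = [-1.0, -1.0], ub = [200.0, 200.0])
sol = solve(prob, SAMIN(); maxiters = 10_000)  # derivative-free simulated annealing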

rafaqz commented on September 22, 2024

Ok, here's an MWE with Rosenbrock:

julia> rosenbrock(x, p) = (p[1] - x[1])^2 + p[2] * (x[2] - x[1]^2)^2
rosenbrock (generic function with 2 methods)

julia> x0 = zeros(2)
2-element Vector{Float64}:
 0.0
 0.0

julia> lb = fill(-1.0, 2)
2-element Vector{Float64}:
 -1.0
 -1.0

julia> ub = fill(200.0, 2)
2-element Vector{Float64}:
 200.0
 200.0

julia> _p = [1.0, 100.0]
2-element Vector{Float64}:
   1.0
 100.0

julia> l1 = rosenbrock(x0, _p)
1.0

julia> prob = OptimizationProblem(rosenbrock, x0, _p; lb, ub)
OptimizationProblem. In-place: true
u0: 2-element Vector{Float64}:
 0.0
 0.0

julia> solve(prob, NelderMead())
ERROR: MethodError: objects of type Nothing are not callable
Stacktrace:
  [1] (::OptimizationOptimJL.var"#20#24"{OptimizationCache{}})(G::Vector{Float64}, θ::Vector{Float64})
    @ OptimizationOptimJL ~/.julia/packages/OptimizationOptimJL/yMF3E/src/OptimizationOptimJL.jl:291
  [2] gradient!!(obj::OnceDifferentiable{Float64, Vector{Float64}, Vector{Float64}}, x::Vector{Float64})
    @ NLSolversBase ~/.julia/packages/NLSolversBase/kavn7/src/interface.jl:63
  [3] gradient!
    @ ~/.julia/packages/NLSolversBase/kavn7/src/interface.jl:51 [inlined]
  [4] gradient!(obj::Optim.BarrierWrapper{OnceDifferentiable{Float64, Vector{Float64}, Vector{Float64}}, Optim.BoxBarrier{Vector{Float64}, Vector{Float64}}, Float64, Float64, Vector{Float64}}, x::Vector{Float64})
    @ Optim ~/.julia/packages/Optim/EJwLF/src/multivariate/solvers/constrained/fminbox.jl:118
  [5] optimize(df::OnceDifferentiable{…}, l::Vector{…}, u::Vector{…}, initial_x::Vector{…}, F::Fminbox{…}, options::Optim.Options{…})
    @ Optim ~/.julia/packages/Optim/EJwLF/src/multivariate/solvers/constrained/fminbox.jl:327
  [6] __solve(cache::OptimizationCache{…})
    @ OptimizationOptimJL ~/.julia/packages/OptimizationOptimJL/yMF3E/src/OptimizationOptimJL.jl:306
  [7] solve!
    @ ~/.julia/packages/SciMLBase/wVDwN/src/solve.jl:179 [inlined]
  [8] #solve#596
    @ ~/.julia/packages/SciMLBase/wVDwN/src/solve.jl:96 [inlined]
  [9] solve(::OptimizationProblem{…}, ::NelderMead{…})
    @ SciMLBase ~/.julia/packages/SciMLBase/wVDwN/src/solve.jl:93
 [10] top-level scope
    @ REPL[104]:1
Some type information was truncated. Use `show(err)` to see complete types.

Vaibhavdixit02 commented on September 22, 2024

This is from #558. The fallback for handling box constraints when a solver doesn't support them directly is to wrap it in Fminbox, which requires gradients. You can work around this by passing an OptimizationFunction with an AD backend for now; we should update the wrapping as mentioned in the linked issue.
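
For instance, with the MWE above, that workaround would look something like this (a minimal sketch; AutoForwardDiff is one valid choice from the AD backends listed in the docs, and recent versions also expect the AD package itself to be loaded):

using Optimization, OptimizationOptimJL, ForwardDiff

# Giving the OptimizationFunction an AD backend lets the Fminbox
# fallback construct the gradient it was missing.
optf = OptimizationFunction(rosenbrock, Optimization.AutoForwardDiff())
prob = OptimizationProblem(optf, x0, _p; lb, ub)
sol = solve(prob, NelderMead())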

rafaqz commented on September 22, 2024

A fix to the MWE above might help make this clearer?

I'm not sure what you mean by "passing an OptimizationFunction with AD backend".

Edit: I guess you mean something like:

using SciMLBase, Optimization
prob = OptimizationProblem(OptimizationFunction(rosenbrock, SciMLBase.NoAD()), x0, _p; lb, ub)

No... that doesn't work either! Surely NoAD() should mean don't call f.grad?

My actual problems are not differentiable, so NoAD() is really what I need.

rafaqz commented on September 22, 2024

This works fine just using Optim directly:

using Optim
rosenbrock(x) = (1.0 - x[1])^2 + 100.0 * (x[2] - x[1]^2)^2
lower, upper = [-100, -100], [100, 100]
result = optimize(rosenbrock, lower, upper, zeros(2), NelderMead())

Also works with Fminbox:

result = optimize(rosenbrock, lower, upper, zeros(2), Fminbox(NelderMead()))

Vaibhavdixit02 commented on September 22, 2024

> No... that doesn't work either! Surely NoAD() should mean don't call f.grad?

I can see how the name might have confused you, but it is intended (and is not part of the public API, only used in the internals) to mean that derivatives are provided manually and AD is not used to generate them.
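
So, on that reading, NoAD() fits the case where you pass the derivatives in yourself, roughly like this (a sketch using the grad keyword of OptimizationFunction; the hand-written gradient below is specific to the Rosenbrock MWE):

# In-place gradient of (p[1] - x[1])^2 + p[2] * (x[2] - x[1]^2)^2
function rosenbrock_grad!(G, x, p)
    G[1] = -2 * (p[1] - x[1]) - 4 * p[2] * x[1] * (x[2] - x[1]^2)
    G[2] = 2 * p[2] * (x[2] - x[1]^2)
end

optf = OptimizationFunction(rosenbrock, SciMLBase.NoAD(); grad = rosenbrock_grad!)
prob = OptimizationProblem(optf, x0, _p; lb, ub)
sol = solve(prob, NelderMead())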

> I'm not sure what you mean by "passing an OptimizationFunction with AD backend"

I meant a valid AD backend from https://docs.sciml.ai/Optimization/stable/API/ad/

The reason for this error is still #558.

You should use SAMIN (https://julianlsolvers.github.io/Optim.jl/stable/algo/samin/) if your objective is not differentiable.

rafaqz commented on September 22, 2024

Thanks. I also found that it works with the NLopt Nelder-Mead.

But a fix for this would be good... it does work fine in Optim.jl, even if you use Fminbox(). So when switching from Optim to Optimization, this is just the first thing I would try to test a derivative-free optimisation.
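
For the record, the NLopt route would look roughly like this (a sketch, assuming the OptimizationNLopt wrapper; LN_NELDERMEAD is NLopt's derivative-free Nelder-Mead and handles the box constraints natively, so no Fminbox wrapping is involved):

using Optimization, OptimizationNLopt

prob = OptimizationProblem(rosenbrock, x0, _p; lb, ub)
sol = solve(prob, NLopt.LN_NELDERMEAD())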

Vaibhavdixit02 commented on September 22, 2024

The point is not that it doesn't work with Optim. In Optim, the gradients are automatically calculated if the method needs them (https://github.com/JuliaNLSolvers/Optim.jl/blob/01b9391383c562078f3ada4d3c5aeee76af6ca8a/src/multivariate/solvers/constrained/fminbox.jl#L83), always with ForwardDiff. In Optimization.jl you need to specify the AD library you want to use, so the right way to use it is to provide a valid AD choice there. The issue is that the error message should be better; there's work in progress on that already by @ParasPuneetSingh, and hopefully we can finish it up pretty soon and do a release.
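
Tying that back to the non-differentiable objectives mentioned above: a finite-difference backend is also among the documented AD choices, so the Fminbox fallback can still obtain an approximate gradient with no analytic derivative (a sketch; whether finite differences behave well depends on how non-smooth the objective really is):

using Optimization, OptimizationOptimJL, FiniteDiff

optf = OptimizationFunction(rosenbrock, Optimization.AutoFiniteDiff())
prob = OptimizationProblem(optf, x0, _p; lb, ub)
sol = solve(prob, NelderMead())  # gradient approximated by finite differences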
