Comments (7)
Seems like it is working! Thanks.
from optimization.jl.
Fixed for me as well. Thanks!
MOI can use symbolic information, and we should use it when we can. The best solution here is probably to try/catch then warn, and if it warns, have it omit the symbolic information. That will be slower but will make more code work (with an appropriate performance warning).
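The try/catch-then-warn fallback suggested above could look roughly like this. This is only a sketch: `trace_symbolically` and `numeric_fallback` are hypothetical helper names, not the actual OptimizationMOI internals.

```julia
# Hypothetical stand-ins for the two code paths; the real OptimizationMOI
# internals differ. Here the symbolic path always fails, to exercise the
# fallback.
trace_symbolically(f) = error("objective cannot be traced symbolically")
numeric_fallback(f) = (f, :no_symbolic_info)

# Proposed behavior: try the symbolic path first; on failure, warn about
# the performance hit and fall back to the numeric path.
function build_cache(f)
    try
        return trace_symbolically(f)   # fast path: symbolic information
    catch err
        @warn "Symbolic tracing failed; falling back to the slower numeric path." exception = err
        return numeric_fallback(f)     # slow path: no symbolic information
    end
end
```

With these stand-ins, `build_cache(identity)` emits the warning and returns the numeric fallback instead of erroring out.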
In a similar spirit,
Describe the example
OptimizationMOI from 0.1.16 to 0.2.0 introduces the use of Symbolics. Symbolics.Num is not compatible with Distributions.jl argument check.
Minimal Reproducible Example 👇
using Distributions
using Ipopt
using Optimization, OptimizationMOI
d = 1
T = 400
y = sin.(2π / T * (1:T)) + cos.(2π / T * (1:T)) .^ 2 .* rand(Gamma(), T)  # parenthesize the range so 2π/T scales 1:T
f(θ) = Gamma(θ[1], θ[2])
f(t, θ) = f([cos(2π / T * t * θ[1])^2, cos(2π / T * t * θ[2])^2])
θσ10 = zeros(2d + 1)
θσ20 = zeros(2d + 1)
θ0 = hcat(θσ10, θσ20)
ℓ(θ, x) = -sum(logpdf(f(t / T, θ), x[t]) for t in eachindex(x)) # = -loglikelihood
OptFunc = OptimizationFunction(ℓ, Optimization.AutoForwardDiff())
prob = OptimizationProblem(OptFunc, vec(θ0), y)
sol = solve(prob, Ipopt.Optimizer())
Error & Stacktrace
ERROR: TypeError: non-boolean (Symbolics.Num) used in boolean context
Stacktrace:
[1] #271
@ Distributions C:\Users\metivier\.julia\packages\Distributions\UaWBm\src\univariate\continuous\gamma.jl:34 [inlined]
[2] check_args
@ Distributions C:\Users\metivier\.julia\packages\Distributions\UaWBm\src\utils.jl:89 [inlined]
[3] #Gamma#270
@ Distributions C:\Users\metivier\.julia\packages\Distributions\UaWBm\src\univariate\continuous\gamma.jl:34 [inlined]
[4] Gamma
@ Distributions C:\Users\metivier\.julia\packages\Distributions\UaWBm\src\univariate\continuous\gamma.jl:33 [inlined]
[5] f(θ::Vector{Symbolics.Num})
@ Main .\Untitled-1:10
[6] f(t::Float64, θ::Vector{Symbolics.Num})
@ Main .\Untitled-1:11
[7] (::var"#5#6"{Vector{Symbolics.Num}, Vector{Symbolics.Num}})(t::Int64)
@ Main .\none:0
[8] (::Base.MappingRF)(acc::Any, x::Any)
@ Base .\reduce.jl:100 [inlined]
[9] _foldl_impl(op::Base.MappingRF{var"#5#6"{…}, Base.BottomRF{…}}, init::Base._InitialValue, itr::Base.OneTo{Int64})
@ Base .\reduce.jl:58
[10] foldl_impl
@ .\reduce.jl:48 [inlined]
[11] mapfoldl_impl
@ .\reduce.jl:44 [inlined]
[12] mapfoldl
@ .\reduce.jl:175 [inlined]
[13] mapreduce
@ .\reduce.jl:307 [inlined]
[14] sum
@ .\reduce.jl:535 [inlined]
[15] sum
@ .\reduce.jl:564 [inlined]
[16] ℓ(θ::Vector{Symbolics.Num}, x::Vector{Symbolics.Num})
@ Main .\Untitled-1:18
[17] OptimizationFunction
@ C:\Users\metivier\.julia\packages\SciMLBase\dpafx\src\scimlfunctions.jl:3811 [inlined]
[18] modelingtoolkitize(prob::OptimizationProblem{…}; kwargs::@Kwargs{})
@ ModelingToolkit C:\Users\metivier\.julia\packages\ModelingToolkit\Gpzyo\src\systems\optimization\modelingtoolkitize.jl:19
[19] modelingtoolkitize
@ C:\Users\metivier\.julia\packages\ModelingToolkit\Gpzyo\src\systems\optimization\modelingtoolkitize.jl:6 [inlined]
[20] OptimizationMOI.MOIOptimizationNLPCache(prob::OptimizationProblem{…}, opt::Ipopt.Optimizer; callback::Nothing, kwargs::@Kwargs{…})
@ OptimizationMOI C:\Users\metivier\.julia\packages\OptimizationMOI\HC25i\src\nlp.jl:143
[21] MOIOptimizationNLPCache
@ C:\Users\metivier\.julia\packages\OptimizationMOI\HC25i\src\nlp.jl:108 [inlined]
[22] #__init#38
@ C:\Users\metivier\.julia\packages\OptimizationMOI\HC25i\src\OptimizationMOI.jl:278 [inlined]
[23] __init
@ C:\Users\metivier\.julia\packages\OptimizationMOI\HC25i\src\OptimizationMOI.jl:270 [inlined]
[24] #init#598
@ C:\Users\metivier\.julia\packages\SciMLBase\dpafx\src\solve.jl:165 [inlined]
[25] init
@ C:\Users\metivier\.julia\packages\SciMLBase\dpafx\src\solve.jl:163 [inlined]
[26] solve(::OptimizationProblem{…}, ::Ipopt.Optimizer; kwargs::@Kwargs{})
@ SciMLBase C:\Users\metivier\.julia\packages\SciMLBase\dpafx\src\solve.jl:96
[27] solve(::OptimizationProblem{…}, ::Ipopt.Optimizer)
@ SciMLBase C:\Users\metivier\.julia\packages\SciMLBase\dpafx\src\solve.jl:93
[28] top-level scope
@ Untitled-1:22
Some type information was truncated. Use `show(err)` to see complete types.
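The root cause can be reproduced in isolation: comparing a `Symbolics.Num` yields another `Num`, not a `Bool`, so code that branches on `α > 0` (as Distributions' `check_args` does) hits the boolean-context error. A minimal sketch, requiring only Symbolics:

```julia
using Symbolics

@variables α          # a symbolic variable
cond = α > 0          # this is a symbolic Num, not a Bool
println(typeof(cond))

# Branching on it throws the same error as in the stacktrace above.
try
    cond ? :ok : :bad
catch err
    println(err)      # err isa TypeError
end
```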
Not Working Environment:
[7f7a1694] Optimization v3.21.1
⌃ [fd9f6733] OptimizationMOI v0.3.1
Working Environment:
[7f7a1694] Optimization v3.21.1
⌃ [fd9f6733] OptimizationMOI v0.1.16
Actually, this MWE runs with these versions, but the solver fails (it does not error); maybe the MWE is ill-posed. In my full problem, it worked.
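As a user-side workaround (assuming the failure comes only from the constructor's argument check, as the stacktrace suggests), the check can be disabled via Distributions.jl's `check_args` keyword:

```julia
using Distributions

# Same objective as in the MWE, but skipping the positivity check that
# rejects Symbolics.Num. Use with care: invalid parameters are no longer
# caught at construction time.
f(θ) = Gamma(θ[1], θ[2]; check_args = false)

f([2.0, 3.0])  # constructs the Gamma distribution without the check
```

This lets the symbolic tracing pass through the constructor, at the cost of losing the domain validation for numeric inputs.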
It should be fixed in the latest release, which I created yesterday. Try it out and let me know if it doesn't work!
I'm not sure about having it on by default without a time cap. There are many cases that would otherwise work but would just stall.
It's off by default now.
Related Issues (20)
- Downstream Compat bumps
- Error with LBFGS() with using adjoint on 3D arrays
- Error when `store_trace=true` with OptimizationEvolutionary.jl
- Issue in running OptimizationFunction
- Latest SciMLBase + Optimization breaks precompile
- TypeError: in keyword argument linesearch, expected Function
- Optimiztion.jl does not precompile
- Add trait for checking if OptimizationFunction is used for derivative based optimizers
- `MethodError: objects of type Nothing are not callable` when `lb` and `ub` are used with `NelderMead`
- PolyesterForwardDiff not loading
- No documentation for latest release
- PRIMA lib errors with `AutoForwardDiff`
- Support LBFGSB.jl
- Include `searchdirection` in `OptimizationState`
- The `callback` appears to be called for linesearch iterations
- `PolyOpt` only accept functions without any extra inputs
- US spelling preferred?
- Augmented Lagrangian
- Multithreading support for Optimizers like BBO
- Is there currently a feasible way to use NamedTuple or ComponentArray as x0 for Optimization.jl