
optimizationbase.jl's Introduction

OptimizationBase

The base package for Optimization.jl, containing the structs and basic functions used there, specifically those related to AD, caching, etc.


optimizationbase.jl's People

Contributors

vaibhavdixit02, chrisrackauckas, dependabot[bot]


optimizationbase.jl's Issues

Accessing objective's gradient for MOI (and others) in the solution type

While thinking about SciML/Optimization.jl#8 and SciML/Optimization.jl#148 in order to finalize the output structure: it is often very useful to know the gradient of the objective (and the constraint Jacobian) at the solution. I think this means that you would want to have the gradients in the solution type (which I think is defined in https://github.com/SciML/SciMLBase.jl/blob/master/src/solutions/optimization_solutions.jl ?)

In MOI this is done with the

MOI.eval_objective_gradient

call, I believe, which you could use to fill it in. You can see https://github.com/jump-dev/Ipopt.jl/blob/master/src/MOI_wrapper.jl for an example of its implementation, which shows how it is called. Things like JuMP then call these functions as required to fill in their own problem structures.
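
A minimal sketch of how that could be filled in on the MOI side (evaluator stands for the backend's MOI.AbstractNLPEvaluator and sol.u for the minimizer; both names are placeholders):

using MathOptInterface
const MOI = MathOptInterface

# Evaluate the objective gradient at the solution point so it can be stored
# in the solution structure (the storage field itself is hypothetical).
g = zeros(length(sol.u))
MOI.eval_objective_gradient(evaluator, g, sol.u)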

An aside: for the AD rules here, all of those gradients and Lagrange multipliers are essential to keep around in one form or another - so having them in the structure might make the ChainRules definitions a little bit easier to implement. An example is implementing the envelope condition to provide the AD rule for constrained optimization.


I think this request is part of a more general issue that there may be all sorts of things in the return type from optimizers which you want to standardize access to over time.

One approach is to have lots of fields which may be nothing, and potentially start subtyping (e.g. OptimizationSolution, ConstrainedOptimizationSolution, ComplementarityConstrainedOptimizationSolution, etc. as required). For what it is worth, the MOI crew decided not to have a specific structure for this and instead implement functions and use dispatch. But the goals of MOI are very different (i.e. its main focus is providing the modeling-language features to support things like JuMP). I am not sure how well that approach would work for AD rules, so I am not suggesting it here.
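
A minimal sketch of the fields-that-may-be-nothing approach (the type name and fields are hypothetical, just to make the idea concrete):

# Hypothetical solution type carrying optional derivative information
struct ConstrainedOptimizationSolution{U, O, G, J, D}
    u::U          # minimizer
    objective::O  # objective value at u
    grad::G       # objective gradient at u, or nothing
    cons_jac::J   # constraint Jacobian at u, or nothing
    duals::D      # Lagrange multipliers, or nothing
end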

Anyways, not suggesting any particular design, just that these are all going to come up - especially as you add in more on constrained optimization. All of the numbers start mattering, and if the values are lost then people might have to go back to using the raw optimizers without GalacticOptim.

The duals also become important, as the values of the Lagrange multipliers have an important interpretation in many cases.

add PolyesterForwardDiff as an AD backend

Please add PolyesterForwardDiff as an AD backend for Optimization. I'm looking to use Optimization.AutoPolyesterForwardDiff() as a drop-in replacement for Optimization.AutoForwardDiff() when solving, say, an OptimizationProblem with the AD backend specified as an argument in the OptimizationFunction call.
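
A sketch of the requested usage (AutoPolyesterForwardDiff does not exist in Optimization yet, which is the point of this request; the solver choice is just illustrative):

using Optimization, OptimizationOptimJL, PolyesterForwardDiff

rosenbrock(x, p) = (1 - x[1])^2 + 100 * (x[2] - x[1]^2)^2
# Requested drop-in replacement for Optimization.AutoForwardDiff()
optf = OptimizationFunction(rosenbrock, Optimization.AutoPolyesterForwardDiff())
prob = OptimizationProblem(optf, zeros(2))
sol = solve(prob, LBFGS())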

TagBot trigger issue

This issue is used to trigger TagBot; feel free to unsubscribe.

If you haven't already, you should update your TagBot.yml to include issue comment triggers.
Please see this post on Discourse for instructions and more details.

If you'd like for me to do this for you, comment TagBot fix on this issue.
I'll open a PR within a few hours, please be patient!

`specialize` does not seem to be supported in OptimizationFunction

The docs for OptimizationFunction say there is an option specialize: https://github.com/SciML/SciMLBase.jl/blob/e9caa29e90bf46d8a00dfec03d78158eb0762566/src/scimlfunctions.jl#L2085-L2087

## specialize: Controlling Compilation and Specialization

For more details on this argument, see the ODEFunction documentation.

However, the definitions of OptimizationFunction and AbstractOptimizationFunction don't include this field:
https://github.com/SciML/SciMLBase.jl/blob/e9caa29e90bf46d8a00dfec03d78158eb0762566/src/SciMLBase.jl#L590

abstract type AbstractOptimizationFunction{iip} <: AbstractSciMLFunction{iip} end

https://github.com/SciML/SciMLBase.jl/blob/e9caa29e90bf46d8a00dfec03d78158eb0762566/src/scimlfunctions.jl#L2093-L2096

struct OptimizationFunction{iip, AD, F, G, H, HV, C, CJ, CH, LH, HP, CJP, CHP, LHP, S, S2,
                            O, HCV,
                            CJCV,
                            CHCV, LHCV, EX, CEX, SYS} <: AbstractOptimizationFunction{iip}

instantiate_function should not populate hessians etc. unless needed

The following code hangs for me:

using Optimization;
using ModelingToolkit

n = 100
u = zeros(2n)

obj(u, _) = sum(u.^2)

function cons(u, _)
    vcat(u[1:n] - u[n+1:2n], u[1:n] + u[n+1:2n])
end

func = OptimizationFunction{false}(obj, Optimization.AutoModelingToolkit(); cons=cons)

Optimization.instantiate_function(func, u, Optimization.AutoModelingToolkit(), SciMLBase.NullParameters())

It's because ModelingToolkit is trying to compute the Hessian of the constraints function, and does so whenever the field is not set, even when instantiate_function is called by a first-order method.

This is fixed by setting the kwarg cons_h = false in func, to indicate that we do not want the Hessian, but I thought it's still worth an issue as this could be troublesome default behaviour and may merit some documentation. It could be nice, for example, for first-order methods to tell the instantiate_function method that they do not need the Hessian.
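
For reference, the workaround described above looks like this (a sketch, assuming cons_h = false is accepted to mean "do not build the constraint Hessian"):

# Opt out of constraint-Hessian generation so instantiate_function no longer hangs
func = OptimizationFunction{false}(obj, Optimization.AutoModelingToolkit();
                                   cons = cons, cons_h = false)
Optimization.instantiate_function(func, u, Optimization.AutoModelingToolkit(),
                                  SciMLBase.NullParameters())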

Does Optimization.jl avoid multiple forward pass evaluations?

Hi,

Optim.jl offers the fg! mechanism to save some work by avoiding multiple forward pass evaluations:

    function fg!(F, G, rec)
        # Zygote calculates both derivative and loss, therefore do everything in one step
        if G !== nothing
            y, grad = Zygote.withgradient(loss, rec)
            G .= grad[1]
            if F !== nothing
                return y
            end
        end
        if F !== nothing
            return loss(rec)
        end
    end
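
For context, this is the pattern Optim.jl consumes through its only_fg! wrapper (a sketch; loss and rec0 are placeholders for the real objective and starting point):

using Optim, Zygote

loss(rec) = sum(abs2, rec)   # placeholder objective
rec0 = randn(10)             # placeholder starting point

# Optim evaluates value and gradient through the single fg! callback above
res = Optim.optimize(Optim.only_fg!(fg!), rec0, Optim.LBFGS())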

I couldn't find the relevant code here; does Optimization.jl do something similar?

Best,

Felix

Interface to add linear and complementarity constraints

@ChrisRackauckas and @mohamed82008

To summarize the conversation (just to verify here, in case I can get an undergrad to help out): are you saying that https://github.com/SciML/SciMLBase.jl/blob/5796b966ac13a6950dd956d7ff9ce5941ee5fc77/src/problems/basic_problems.jl#L96-L105 and the problem are unchanged, and that OptimizationFunction.cons and OptimizationProblem.lcons etc. are more complicated composite types in that case?

For example

struct SemiLinearFunction{F1,F2,CF}
    f_nonlinear::F1
    f_linear::F2
    complementarity_flags_nonlinear::CF
    complementarity_flags_linear::CF
end

which the backends then need to check for (i.e. some is_semi_linear(func.cons), sketched below); if so, they use OptimizationFunction.cons.f_nonlinear where previously they just used OptimizationFunction.cons, and they can use .f_linear to pass that matrix to optimizers that support linear constraints (e.g. anything MOI, Knitro, etc.).
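
A minimal sketch of that backend-side check (is_semi_linear is the hypothetical trait mentioned above):

# Hypothetical trait used by backends to branch on the constraint type
is_semi_linear(::SemiLinearFunction) = true
is_semi_linear(::Any) = false

# Backend-side branching, roughly:
# cons_nl = is_semi_linear(f.cons) ? f.cons.f_nonlinear : f.cons
# A       = is_semi_linear(f.cons) ? f.cons.f_linear    : nothing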

I added complementarity_flags here because I think you want all stuff disconnected from values in the OptimizationFunction interface - and complementarity conditions shouldn't change.

Then it is necessary to separate out the lcons and ucons into composite types as well. Maybe

struct SemiLinearConstraint{C1, C2}
    c_nonlinear::C1
    c_linear::C2
end

And in the OptimizationProblem constructor, users would pass a SemiLinearConstraint object for lcons and ucons if they used a SemiLinearFunction when constructing the OptimizationFunction.

Then the user would do the following:

rosenbrock(x, p) = (1 - x[1])^2 + 100 * (x[2] - x[1]^2)^2
nonlinear_cons = (x, p) -> [x[1]^2 + x[2]^2]
linear_cons = A  # e.g. some 3x2 matrix. Could add complementarity stuff, but will avoid for now.
optprob = OptimizationFunction(rosenbrock, GalacticOptim.AutoForwardDiff(); cons = SemiLinearFunction(nonlinear_cons, linear_cons))
lcons = SemiLinearConstraint([-Inf], [1.0, -Inf, 0.1])   # nonlinear then linear
ucons = SemiLinearConstraint([500.0], [1.0, Inf, 10.0])
lb = [-500.0, -500.0]
ub = [50.0, 50.0]
x0 = [0.0, 0.0]  # initial guess
prob = OptimizationProblem(optprob, x0; lcons, ucons, lb, ub)

Or whatever... I might have conformance wrong. Do I have that right?

It seems like there might be a lot of changes within the actual code that goes through the OptimizationFunction, because it will sometimes need to work with cons directly and sometimes with cons.f_nonlinear if specified.

Anyways, lots of changes all over the place.


Just to confirm: the alternative might be far easier to write for users (and for backend implementations), and is a little more consistent with standard interfaces. But that doesn't mean it is a good idea.

But to summarize that alternative: add the new stuff to OptimizationFunctions (sorry, that was a typo, but I do think OptimizationFunctions makes more sense than OptimizationFunction):

struct OptimizationFunction{iip,AD,F,G,H,HV,C,CJ,CH,LC,CF,CFL} <: AbstractOptimizationFunction{iip}
    f::F
    adtype::AD
    grad::G
    hess::H
    hv::HV
    cons::C
    cons_j::CJ
    cons_h::CH
    linear_constraint::LC  
    nonlinear_complementarity_flags::CF
    linear_complementarity_flags::CFL
end

Then the OptimizationProblem adds the left and right vectors for the linear constraints, to be consistent with the lcons and ucons values, e.g.

struct OptimizationProblem{iip,F,uType,P,B,LC,UC,LCL,UCL,S,K} <: AbstractOptimizationProblem{iip}
    f::F
    u0::uType
    p::P
    lb::B
    ub::B
    lcons::LC
    ucons::UC
    lcons_linear::LCL
    ucons_linear::UCL
    sense::S
    kwargs::K
    @add_kwonly function OptimizationProblem{iip}(f::OptimizationFunction{iip}, u0, p = NullParameters();
                                                  lb = nothing, ub = nothing,
                                                  lcons = nothing, ucons = nothing,
                                                  lcons_linear = nothing, ucons_linear = nothing,
                                                  sense = nothing, kwargs...) where iip
        new{iip, typeof(f), typeof(u0), typeof(p),
            typeof(lb), typeof(lcons), typeof(ucons),
            typeof(lcons_linear), typeof(ucons_linear),
            typeof(sense), typeof(kwargs)}(f, u0, p, lb, ub, lcons, ucons,
                                           lcons_linear, ucons_linear, sense, kwargs)
    end
end

With that approach, all the backends need to do is access the underlying fields and maybe throw an error if those things are not supported, e.g. not every optimizer supports linear constraints directly, and not all of them support nonlinear_complementarity_flags != nothing.

For example, in the case of MOI, nothing changes except that it can look to see whether lcons_linear and OptimizationFunction.linear_constraint are != nothing. If they are, then it adds those linear constraints.
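
A rough sketch of what that could look like in the MOI backend (assuming A holds the linear rows, lcons_linear/ucons_linear their bounds, opt the MOI optimizer, and vars the MOI variable indices; all of these names are placeholders):

using MathOptInterface
const MOI = MathOptInterface

# Add each row of the linear constraint matrix as a ScalarAffineFunction
# bounded by the corresponding entries of lcons_linear and ucons_linear.
for i in axes(A, 1)
    terms = [MOI.ScalarAffineTerm(A[i, j], vars[j]) for j in axes(A, 2)]
    MOI.add_constraint(opt, MOI.ScalarAffineFunction(terms, 0.0),
                       MOI.Interval(lcons_linear[i], ucons_linear[i]))
end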

For things like Knitro, this is easy (i.e. call https://github.com/jump-dev/KNITRO.jl/blob/master/src/kn_constraints.jl#L223-L239 I believe).

There are many other useful constraint types which optimizers can exploit, such as cone constraints, but I think that linear ones take care of a lot of the main uses. However, once you do more elaborate model reduction within ModelingToolkit etc., you may identify cone constraints and want to pass those directly to the optimizers.

Out of place not supported

The docs mention an out-of-place option, but it seems like user-provided gradients, etc. are all assumed in-place at the moment. For example,

using Optimization
using NLopt
using OptimizationNLopt

obj(x, p) = (x[1] - 2)^2
grad(x, p) = [2 * (x[1] - 2)]
func = OptimizationFunction{false}(obj; grad = grad)
prob = OptimizationProblem(func, [3.0])
solve(prob, NLopt.LD_CCSAQ(); maxiters=100)

gives

solve(prob, NLopt.LD_CCSAQ(); maxiters=100)
u: 1-element Vector{Float64}:
 3.0

when the correct answer should be 2.
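
For comparison, the in-place gradient form does work here (a sketch, assuming the documented in-place signature grad(G, x, p)):

# In-place gradient: writes into G instead of returning a value
grad!(G, x, p) = (G[1] = 2 * (x[1] - 2); nothing)
func_ip = OptimizationFunction(obj; grad = grad!)
prob_ip = OptimizationProblem(func_ip, [3.0])
solve(prob_ip, NLopt.LD_CCSAQ(); maxiters = 100)  # expected to converge to ≈ 2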

Out-of-place forms of functions

The docs https://docs.sciml.ai/Optimization/stable/API/optimization_function/ mention that the provided forms for grad, hess, hv, cons, cons_j, cons_h, and lag_h can be provided as an in-place form e.g. cons(res, x, p), or an out-of-place form e.g. cons(x, p). This doesn't seem to be the case, e.g.

using Optimization
using OptimizationOptimJL
rosenbrock(x, p) = (p[1] - x[1])^2 + p[2] * (x[2] - x[1]^2)^2
x0 = zeros(2)
_p = [1.0, 100.0]
cons = (x, p) -> [x[1]^2 + x[2]^2]
optf = OptimizationFunction(rosenbrock, Optimization.AutoForwardDiff(); cons = cons)
prob = OptimizationProblem(optf, x0, _p, lcons = [-5.0], ucons = [10.0])
sol = solve(prob, IPNewton()) 
ERROR: MethodError: no method matching (::var"#11#12")(::Vector{Float64}, ::Vector{Float64}, ::Vector{Float64})

Some of the types have been truncated in the stacktrace for improved reading. To emit complete information
in the stack trace, evaluate `TruncatedStacktraces.VERBOSE[] = true` and re-run the code.

Closest candidates are:
  (::var"#11#12")(::Any, ::Any) at Untitled-1:7
Stacktrace:
  [1] (::Optimization.var"#97#114"{OptimizationFunction{true,Optimization.AutoForwardDiff{nothing},}, Vector{Float64}})(res::Vector{Float64}, θ::Vector{Float64})
    @ Optimization C:\Users\User\.julia\packages\Optimization\XjqVZ\src\function\forwarddiff.jl:78
  [2] initial_state(method::IPNewton{typeof(Optim.backtrack_constrained_grad), Symbol}, options::Optim.Options{Float64, OptimizationOptimJL.var"#_cb#40"{OptimizationOptimJL.var"#38#47", Base.Iterators.Cycle{Tuple{Optimization.NullData}}}}, d::TwiceDifferentiable{Float64, Vector{Float64}, Matrix{Float64}, Vector{Float64}}, constraints::TwiceDifferentiableConstraints{Optimization.var"#97#114"{OptimizationFunction{true,Optimization.AutoForwardDiff{nothing},}, Vector{Float64}}, Optimization.var"#99#116"{ForwardDiff.JacobianConfig{ForwardDiff.Tag{Optimization.var"#98#115"{Int64}, Float64}, Float64, 2, Vector{ForwardDiff.Dual{ForwardDiff.Tag{Optimization.var"#98#115"{Int64}, Float64}, Float64, 2}}}}, OptimizationOptimJL.var"#36#45"{OptimizationFunction{true,Optimization.AutoForwardDiff{nothing},}}, Float64}, initial_x::Vector{Float64})
    @ Optim C:\Users\User\.julia\packages\Optim\tP8PJ\src\multivariate\solvers\constrained\ipnewton\ipnewton.jl:114
  [3] optimize(d::TwiceDifferentiable{Float64, Vector{Float64}, Matrix{Float64}, Vector{Float64}}, constraints::TwiceDifferentiableConstraints{Optimization.var"#97#114"{OptimizationFunction{true,Optimization.AutoForwardDiff{nothing},}, Vector{Float64}}, Optimization.var"#99#116"{ForwardDiff.JacobianConfig{ForwardDiff.Tag{Optimization.var"#98#115"{Int64}, Float64}, Float64, 2, Vector{ForwardDiff.Dual{ForwardDiff.Tag{Optimization.var"#98#115"{Int64}, Float64}, Float64, 2}}}}, OptimizationOptimJL.var"#36#45"{OptimizationFunction{true,Optimization.AutoForwardDiff{nothing},}}, Float64}, initial_x::Vector{Float64}, method::IPNewton{typeof(Optim.backtrack_constrained_grad), Symbol}, options::Optim.Options{Float64, OptimizationOptimJL.var"#_cb#40"{OptimizationOptimJL.var"#38#47", Base.Iterators.Cycle{Tuple{Optimization.NullData}}}})
    @ Optim C:\Users\User\.julia\packages\Optim\tP8PJ\src\multivariate\solvers\constrained\ipnewton\interior.jl:229
  [4] ___solve(prob::OptimizationProblem{true, OptimizationFunction{true,Optimization.AutoForwardDiff{nothing},}, Vector{Float64}, Vector{Float64}, Nothing, Nothing, Nothing, Vector{Float64}, Vector{Float64}, Nothing, Base.Pairs{Symbol, Union{}, Tuple{}, NamedTuple{(), Tuple{}}}}, opt::IPNewton{typeof(Optim.backtrack_constrained_grad), Symbol}, data::Base.Iterators.Cycle{Tuple{Optimization.NullData}}; callback::Function, maxiters::Nothing, maxtime::Nothing, abstol::Nothing, reltol::Nothing, progress::Bool, kwargs::Base.Pairs{Symbol, Union{}, Tuple{}, NamedTuple{(), Tuple{}}})
    @ OptimizationOptimJL C:\Users\User\.julia\packages\OptimizationOptimJL\WqQOV\src\OptimizationOptimJL.jl:346
  [5] ___solve
    @ C:\Users\User\.julia\packages\OptimizationOptimJL\WqQOV\src\OptimizationOptimJL.jl:252 [inlined]
  [6] #__solve#2
    @ C:\Users\User\.julia\packages\OptimizationOptimJL\WqQOV\src\OptimizationOptimJL.jl:67 [inlined]
  [7] __solve (repeats 2 times)
    @ C:\Users\User\.julia\packages\OptimizationOptimJL\WqQOV\src\OptimizationOptimJL.jl:50 [inlined]
  [8] #solve#552
    @ C:\Users\User\.julia\packages\SciMLBase\gTrkJ\src\solve.jl:85 [inlined]
  [9] solve(::OptimizationProblem{true, OptimizationFunction{true,Optimization.AutoForwardDiff{nothing},}, Vector{Float64}, Vector{Float64}, Nothing, Nothing, Nothing, Vector{Float64}, Vector{Float64}, Nothing, Base.Pairs{Symbol, Union{}, Tuple{}, NamedTuple{(), Tuple{}}}}, ::IPNewton{typeof(Optim.backtrack_constrained_grad), Symbol})
    @ SciMLBase C:\Users\User\.julia\packages\SciMLBase\gTrkJ\src\solve.jl:79
 [10] top-level scope
    @ Untitled-1:10

Looking at the instantiate_function code, it doesn't seem like this is meant to be supported, e.g. here in https://github.com/SciML/Optimization.jl/blob/master/src/function/finitediff.jl:

https://github.com/SciML/Optimization.jl/blob/master/src/function/finitediff.jl#L87-L91

No check is made for this. I don't think it would be too hard to correct this, e.g. the code above could, I think, be something like

if f.cons === nothing
    cons = nothing
else
    if SciMLBase.isinplace(f.cons, 3)
        cons = (res, θ) -> f.cons(res, θ, p)
    else
        cons = (res, θ) -> res .= f.cons(θ, p)
    end
end

Similar problems e.g. in https://github.com/SciML/Optimization.jl/blob/master/src/function/forwarddiff.jl.

Static arrays + autodiff

I tried using static vectors for parameters, which works nicely, but not when combined with autodiff:

julia> using Optimization, OptimizationOptimJL, StaticArrays, ForwardDiff

# don't specify inplace/outofplace:

julia> of = OptimizationFunction((x, p) -> sum(x), Optimization.AutoForwardDiff())
julia> prob = OptimizationProblem(of, SVector(0., 0.), nothing)
julia> solve(prob, Optim.GradientDescent())
ERROR: setindex!(::SVector{2, Float64}, value, ::Int) is not defined.

# specify out of place:

julia> of = OptimizationFunction{false}((x, p) -> sum(x), Optimization.AutoForwardDiff())
julia> prob = OptimizationProblem(of, SVector(0., 0.), nothing)
julia> solve(prob, Optim.GradientDescent())
ERROR: Use OptimizationFunction to pass the derivatives or automatically generate them with one of the autodiff backends

Is this expected?
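
A possible workaround (an assumption on my part, not verified): use a plain, mutable Vector for the initial point so the in-place AD path can write into it:

of = OptimizationFunction((x, p) -> sum(x), Optimization.AutoForwardDiff())
prob = OptimizationProblem(of, [0.0, 0.0], nothing)  # mutable Vector instead of SVector
solve(prob, Optim.GradientDescent())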

Add `sense` support for all algorithms

Currently sense is only supported by a subset of algorithms. This breaks the common interface, since some solvers then minimize while others maximize. Furthermore, since sense is a kwarg for OptimizationProblem and not solve, the interface and docs give the impression that it is supported by all algorithms.

using Optimization, OptimizationBBO

J(x, p) = x

F = OptimizationFunction(J)
prob = Optimization.OptimizationProblem(F, 0.0; lb = -10, ub = 10, sense = MaxSense)
solve(prob, BBO_adaptive_de_rand_1_bin_radiuslimited()).objective # -10.0
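
A minimal sketch of how a backend that only minimizes could honor sense (assuming the usual objective-negation trick; the names here are illustrative, and prob.f.f is the raw user objective):

# Flip the objective for maximization when the wrapped solver only minimizes
_obj = prob.sense === Optimization.MaxSense ?
    ((u, p) -> -prob.f.f(u, p)) :
    prob.f.f
# ...hand _obj to the wrapped optimizer, then negate the returned objective
# value again when building the solution.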

OptimizationMTKExt.jl gets re-precompiled

Describe the bug 🐞

OptimizationMTKExt.jl gets precompiled repeatedly when starting a new Julia session and loading OptimizationOptimJL and ModelingToolkit.

Discussed in https://discourse.julialang.org/t/failing-to-precompile-optimizationbase/114926/4 .

Expected behavior

Should precompile OptimizationMTKExt.jl only once.

Minimal Reproducible Example 👇

using OptimizationOptimJL
using ModelingToolkit

Error & Stacktrace ⚠️

Precompiling OptimizationMTKExt
        Info Given OptimizationMTKExt was explicitly requested, output will be shown live
WARNING: Method definition AutoModelingToolkit() in module ADTypes at deprecated.jl:103 overwritten in module OptimizationMTKExt at C:\Users\jaakkor2\MyTemp\mtkopt\packages\OptimizationBase\QZlI6\ext\OptimizationMTKExt.jl:9.
ERROR: Method overwriting is not permitted during Module precompilation. Use `__precompile__(false)` to opt-out of precompilation.
  ? OptimizationBase  OptimizationMTKExt
[ Info: Precompiling OptimizationMTKExt [ead85033-3460-5ce4-9d4b-429d76e53be9]
WARNING: Method definition AutoModelingToolkit() in module ADTypes at deprecated.jl:103 overwritten in module OptimizationMTKExt at C:\Users\jaakkor2\MyTemp\mtkopt\packages\OptimizationBase\QZlI6\ext\OptimizationMTKExt.jl:9.
ERROR: Method overwriting is not permitted during Module precompilation. Use `__precompile__(false)` to opt-out of precompilation.
[ Info: Skipping precompilation since __precompile__(false). Importing OptimizationMTKExt [ead85033-3460-5ce4-9d4b-429d76e53be9].

Environment (please complete the following information):

  • Output of using Pkg; Pkg.status()
Status `C:\Users\jaakkor2\MyTemp\mtkopt\environments\v1.10\Project.toml`
  [961ee093] ModelingToolkit v9.16.0
  [36348300] OptimizationOptimJL v0.3.2
  • Output of using Pkg; Pkg.status(; mode = PKGMODE_MANIFEST)
Status `C:\Users\jaakkor2\MyTemp\mtkopt\environments\v1.10\Manifest.toml`
  [47edcb42] ADTypes v1.2.1
  [1520ce14] AbstractTrees v0.4.5
  [7d9f7c33] Accessors v0.1.36
  [79e6a3ab] Adapt v4.0.4
  [66dad0bd] AliasTables v1.1.3
  [ec485272] ArnoldiMethod v0.4.0
  [4fba245c] ArrayInterface v7.11.0
  [4c555306] ArrayLayouts v1.9.3
  [e2ed5e7c] Bijections v0.1.6
  [62783981] BitTwiddlingConvenienceFunctions v0.1.5
  [2a0fbf3d] CPUSummary v0.2.5
  [00ebfdb7] CSTParser v3.4.3
  [49dc2e85] Calculus v0.5.1
  [d360d2e6] ChainRulesCore v1.24.0
  [fb6a15b2] CloseOpenIntervals v0.1.12
  [861a8166] Combinatorics v1.0.2
  [a80b9123] CommonMark v0.8.12
  [38540f10] CommonSolve v0.2.4
  [bbf7d656] CommonSubexpressions v0.3.0
  [34da2185] Compat v4.15.0
  [b152e2b5] CompositeTypes v0.1.4
  [a33af91c] CompositionsBase v0.1.2
  [2569d6c7] ConcreteStructs v0.2.3
  [88cd18e8] ConsoleProgressMonitor v0.1.2
  [187b0558] ConstructionBase v1.5.5
  [adafc99b] CpuId v0.3.1
  [a8cc5b0e] Crayons v4.1.1
  [9a962f9c] DataAPI v1.16.0
  [864edb3b] DataStructures v0.18.20
  [e2d170a0] DataValueInterfaces v1.0.0
  [2b5f629d] DiffEqBase v6.151.2
  [459566f4] DiffEqCallbacks v3.6.2
  [163ba53b] DiffResults v1.1.0
  [b552c78f] DiffRules v1.15.1
⌅ [a0c0ee7d] DifferentiationInterface v0.4.2
  [31c24e10] Distributions v0.25.109
  [ffbed154] DocStringExtensions v0.9.3
  [5b8099bc] DomainSets v0.7.14
  [fa6b7ba4] DualNumbers v0.6.8
  [7c1d4256] DynamicPolynomials v0.5.7
⌅ [06fc5a27] DynamicQuantities v0.13.2
  [4e289a0a] EnumX v1.0.4
  [f151be2c] EnzymeCore v0.7.3
  [d4d017d3] ExponentialUtilities v1.26.1
  [e2ba6199] ExprTools v0.1.10
  [7034ab61] FastBroadcast v0.3.2
  [9aa1b823] FastClosures v0.3.2
  [29a986be] FastLapackInterface v2.0.4
  [1a297f60] FillArrays v1.11.0
  [64ca27bc] FindFirstFunctions v1.2.0
  [6a86dc24] FiniteDiff v2.23.1
  [1fa38f19] Format v1.3.7
  [f6369f11] ForwardDiff v0.10.36
  [069b7b12] FunctionWrappers v1.1.3
  [77dc65aa] FunctionWrappersWrappers v0.1.3
  [d9f16b24] Functors v0.4.11
  [46192b85] GPUArraysCore v0.1.6
  [c145ed77] GenericSchur v0.5.4
  [c27321d9] Glob v1.3.1
  [86223c79] Graphs v1.11.0
  [3e5b6fbb] HostCPUFeatures v0.1.16
  [34004b35] HypergeometricFunctions v0.3.23
  [615f187c] IfElse v0.1.1
  [d25df0c9] Inflate v0.1.5
  [8197267c] IntervalSets v0.7.10
  [3587e190] InverseFunctions v0.1.14
  [92d709cd] IrrationalConstants v0.2.2
  [82899510] IteratorInterfaceExtensions v1.0.0
  [692b3bcd] JLLWrappers v1.5.0
  [682c06a0] JSON v0.21.4
  [98e50ef6] JuliaFormatter v1.0.56
  [ccbc3e58] JumpProcesses v9.11.1
  [ef3ab10e] KLU v0.6.0
  [ba0b0d4f] Krylov v0.9.6
  [5be7bae1] LBFGSB v0.4.1
  [b964fa9f] LaTeXStrings v1.3.1
  [2ee39098] LabelledArrays v1.16.0
  [984bce1d] LambertW v0.4.6
  [23fbe1c1] Latexify v0.16.3
  [10f19ff3] LayoutPointers v0.1.15
  [5078a376] LazyArrays v2.0.4
  [1d6d02ad] LeftChildRightSiblingTrees v0.2.0
  [d3d80556] LineSearches v7.2.0
  [7ed4a6bd] LinearSolve v2.30.1
  [2ab3a3ac] LogExpFunctions v0.3.28
  [e6f89c97] LoggingExtras v1.0.3
  [bdcacae8] LoopVectorization v0.12.170
  [d8e11817] MLStyle v0.4.17
  [1914dd2f] MacroTools v0.5.13
  [d125e4d3] ManualMemory v0.1.8
  [bb5d69b7] MaybeInplace v0.1.3
  [e1d29d7a] Missings v1.2.0
  [961ee093] ModelingToolkit v9.16.0
  [46d2c3a1] MuladdMacro v0.2.4
  [102ac46a] MultivariatePolynomials v0.5.5
  [d8a4904e] MutableArithmetics v1.4.4
  [d41bc354] NLSolversBase v7.8.3
  [77ba4419] NaNMath v1.0.2
  [8913a72c] NonlinearSolve v3.12.4
  [6fe1bfb0] OffsetArrays v1.14.0
  [429524aa] Optim v1.9.4
  [7f7a1694] Optimization v3.25.1
  [bca83a33] OptimizationBase v1.0.2
  [36348300] OptimizationOptimJL v0.3.2
  [bac558e1] OrderedCollections v1.6.3
  [1dea7af3] OrdinaryDiffEq v6.80.1
  [90014a1f] PDMats v0.11.31
  [65ce6f38] PackageExtensionCompat v1.0.2
  [d96e819e] Parameters v0.12.3
  [69de0a69] Parsers v2.8.1
  [e409e4f3] PoissonRandom v0.4.4
  [f517fe37] Polyester v0.7.14
  [1d0040c9] PolyesterWeave v0.2.1
  [85a6dd25] PositiveFactorizations v0.2.4
  [d236fae5] PreallocationTools v0.4.22
  [aea7be01] PrecompileTools v1.2.1
  [21216c6a] Preferences v1.4.3
  [33c8b6b6] ProgressLogging v0.1.4
  [92933f4c] ProgressMeter v1.10.0
  [43287f4e] PtrArrays v1.2.0
  [1fd47b50] QuadGK v2.9.4
  [e6cf234a] RandomNumbers v1.5.3
  [3cdcf5f2] RecipesBase v1.3.4
  [731186ca] RecursiveArrayTools v3.22.0
  [f2c3362d] RecursiveFactorization v0.2.23
  [189a3867] Reexport v1.2.2
  [ae029012] Requires v1.3.0
  [79098fc4] Rmath v0.7.1
  [7e49a35a] RuntimeGeneratedFunctions v0.5.13
  [94e857df] SIMDTypes v0.1.0
  [476501e8] SLEEFPirates v0.6.42
  [0bca4576] SciMLBase v2.39.0
  [c0aeaf25] SciMLOperators v0.3.8
  [53ae85a6] SciMLStructures v1.2.0
  [efcf1570] Setfield v1.1.1
  [727e6d20] SimpleNonlinearSolve v1.9.0
  [699a6c99] SimpleTraits v0.9.4
  [ce78b400] SimpleUnPack v1.1.0
  [a2af1166] SortingAlgorithms v1.2.1
  [47a9eef4] SparseDiffTools v2.19.0
  [0a514795] SparseMatrixColorings v0.3.2
  [e56a9233] Sparspak v0.3.9
  [276daf66] SpecialFunctions v2.4.0
  [aedffcd0] Static v0.8.10
  [0d7ed370] StaticArrayInterface v1.5.0
  [90137ffa] StaticArrays v1.9.4
  [1e83bf80] StaticArraysCore v1.4.2
  [82ae8749] StatsAPI v1.7.0
  [2913bbd2] StatsBase v0.34.3
  [4c63d2b9] StatsFuns v1.3.1
  [7792a7ef] StrideArraysCore v0.5.6
  [2efcf032] SymbolicIndexingInterface v0.3.22
  [19f23fe9] SymbolicLimits v0.2.1
  [d1185830] SymbolicUtils v2.0.2
  [0c5d862f] Symbolics v5.30.1
  [3783bdb8] TableTraits v1.0.1
  [bd369af6] Tables v1.11.1
⌅ [8ea1fca8] TermInterface v0.4.1
  [5d786b92] TerminalLoggers v0.1.7
  [8290d209] ThreadingUtilities v0.5.2
  [a759f4b9] TimerOutputs v0.5.24
  [0796e94c] Tokenize v0.5.29
  [d5829a12] TriangularSolve v0.2.0
  [410a4b4d] Tricks v0.1.8
  [781d530d] TruncatedStacktraces v1.4.0
  [5c2747f8] URIs v1.5.1
  [3a884ed6] UnPack v1.0.2
  [1986cc42] Unitful v1.20.0
  [a7c27f48] Unityper v0.1.6
  [3d5dd08c] VectorizationBase v0.21.68
  [19fa3120] VertexSafeGraphs v0.2.0
  [1d5cc7b8] IntelOpenMP_jll v2024.1.0+0
  [81d17ec3] L_BFGS_B_jll v3.0.1+0
  [856f044c] MKL_jll v2024.1.0+0
  [efe28fd5] OpenSpecFun_jll v0.5.5+0
  [f50d1b31] Rmath_jll v0.4.2+0
  [1317d2d5] oneTBB_jll v2021.12.0+0
  [0dad84c5] ArgTools v1.1.1
  [56f22d72] Artifacts
  [2a0f44e3] Base64
  [ade2ca70] Dates
  [8ba89e20] Distributed
  [f43a241f] Downloads v1.6.0
  [7b1f6079] FileWatching
  [9fa8497b] Future
  [b77e0a4c] InteractiveUtils
  [4af54fe1] LazyArtifacts
  [b27032c2] LibCURL v0.6.4
  [76f85450] LibGit2
  [8f399da3] Libdl
  [37e2e46d] LinearAlgebra
  [56ddb016] Logging
  [d6f4376e] Markdown
  [a63ad114] Mmap
  [ca575930] NetworkOptions v1.2.0
  [44cfe95a] Pkg v1.10.0
  [de0858da] Printf
  [3fa0cd96] REPL
  [9a3f8284] Random
  [ea8e919c] SHA v0.7.0
  [9e88b42a] Serialization
  [1a1011a3] SharedArrays
  [6462fe0b] Sockets
  [2f01184e] SparseArrays v1.10.0
  [10745b16] Statistics v1.10.0
  [4607b0f0] SuiteSparse
  [fa267f1f] TOML v1.0.3
  [a4e569a6] Tar v1.10.0
  [8dfed614] Test
  [cf7118a7] UUIDs
  [4ec0a83e] Unicode
  [e66e0078] CompilerSupportLibraries_jll v1.1.1+0
  [deac9b47] LibCURL_jll v8.4.0+0
  [e37daf67] LibGit2_jll v1.6.4+0
  [29816b5a] LibSSH2_jll v1.11.0+1
  [c8ffd9c3] MbedTLS_jll v2.28.2+1
  [14a3606d] MozillaCACerts_jll v2023.1.10
  [4536629a] OpenBLAS_jll v0.3.23+4
  [05823500] OpenLibm_jll v0.8.1+2
  [bea87d4a] SuiteSparse_jll v7.2.1+1
  [83775a58] Zlib_jll v1.2.13+1
  [8e850b90] libblastrampoline_jll v5.8.0+1
  [8e850ede] nghttp2_jll v1.52.0+1
  [3f19e933] p7zip_jll v17.4.0+2
Info Packages marked with ⌅ have new versions available but compatibility constraints restrict them from upgrading. To see why use `status --outdated -m`
  • Output of versioninfo()
Julia Version 1.10.4
Commit 48d4fd4843 (2024-06-04 10:41 UTC)
Build Info:
  Official https://julialang.org/ release
Platform Info:
  OS: Windows (x86_64-w64-mingw32)
  CPU: 20 × 13th Gen Intel(R) Core(TM) i7-1370P
  WORD_SIZE: 64
  LIBM: libopenlibm
  LLVM: libLLVM-15.0.7 (ORCJIT, goldmont)
Threads: 1 default, 0 interactive, 1 GC (on 20 virtual cores)
Environment:
  JULIA_DEPOT_PATH = C:\Users\jaakkor2\MyTemp\mtkopt

Real sparsity support in Enzyme backend

Is your feature request related to a problem? Please describe.

The current Enzyme implementation doesn't effectively leverage sparsity.

Describe the solution you’d like

Something that looks like

if f.hess_prototype === nothing
    vdθ = Tuple((similar(r) for r in eachrow(I(length(θ)) * 1.0)))
    bθ = zeros(length(θ))
    @show bθ
    @show typeof(bθ)
    vdbθ = Tuple(zeros(length(θ)) for i in eachindex(θ))
    @show vdbθ
    @show typeof(vdbθ)
else
    θ = SparseArrays.sparse(θ)
    @show θ
    vdθ = Tuple((similar(SparseArrays.sparse(r)) for r in eachrow(I(length(θ)) * 1.0)))
    @show vdθ
    @show typeof(vdθ)
    bθ = SparseArrays.similar(θ)
    @show bθ
    @show typeof(bθ)
    vdbθ = Tuple(similar(i) for i in eachrow(f.hess_prototype))
    @show vdbθ
    @show typeof(vdbθ)
end

will need to be done

Describe alternatives you’ve considered

Maybe an automated sparsity-detection version can exist as well, but given recent benchmarking it is evident that this needs a rethinking of the way we are doing it.

Additional context

The primary benchmark would be the cnlbeam one, which already has a working Enzyme implementation. OPF might be worth revisiting with this as well.

cc: @wmoses I guess you'd be interested
