Comments (1)
It's relatively new, only 2 months old (not 2 years; the older date only applies to the LICENCE file, so I don't think it counts).
Nope it is 2 years old haha
But it seems like the worst of both worlds to have a mix of US and British spelling as you do, so could it be renamed to OptimizationOptimizers, or are we stuck with the "s"? Or could it even be renamed to Sophia.jl, if that's the main or only optimizer there?
More generally, about renaming (sub)packages: is it worth it? Should we have a policy of registering only US spellings from now on, or of renaming existing packages, or should we even allow both spellings?
The point is that it's a wrapper over the Optimisers.jl package you linked above; we follow the convention of naming wrapper packages Optimization*, with the wrapped package's name as the *.
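To illustrate the wrapper convention, here is a minimal sketch of using an Optimisers.jl optimizer through the OptimizationOptimisers wrapper, based on the documented Optimization.jl interface (the Rosenbrock objective and the Adam step size here are just illustrative choices):

```julia
# Sketch: solving a small problem with an Optimisers.jl rule via Optimization.jl.
# Assumes Optimization, OptimizationOptimisers, and Zygote are installed.
using Optimization, OptimizationOptimisers

# Classic Rosenbrock test function; p carries the two parameters.
rosenbrock(u, p) = (p[1] - u[1])^2 + p[2] * (u[2] - u[1]^2)^2

# Wrap the objective with an AD backend so gradients are available.
optf = OptimizationFunction(rosenbrock, Optimization.AutoZygote())
prob = OptimizationProblem(optf, zeros(2), [1.0, 100.0])

# Optimisers.Adam is re-exported by OptimizationOptimisers.
sol = solve(prob, Optimisers.Adam(0.05); maxiters = 1000)
```

The same pattern applies to the other Optimization* subpackages: the wrapper translates the shared `OptimizationProblem`/`solve` interface onto the wrapped package's own solvers.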
I recently learned of Sophia and it seems like a great future direction for everyone, at least until something even better appears, which I can't rule out. Do you know if this is a good/fast Julia implementation, or is there some reason to prefer e.g. the Python one?
It isn't the best possible implementation, but it shouldn't be terrible either; performance depends a lot on the AD backend. The Hessian-vector product support here has some limitations, but it might improve in the near future.
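For context on why the Hessian-vector product matters here: Sophia-style optimizers need `H*v` without materializing the Hessian. A cheap (if less accurate) way to get one is a finite difference of the gradient; better AD backends compute it via forward-over-reverse mode instead. This is a self-contained sketch of the finite-difference idea, not the package's actual implementation:

```julia
# Finite-difference Hessian-vector product: Hv ≈ (∇f(x + εv) − ∇f(x)) / ε.
# Illustrative only; AD-based HVPs (forward-over-reverse) are more accurate,
# which is one reason HVP quality varies with the AD backend.
function hvp_fd(grad, x, v; eps = 1e-6)
    (grad(x .+ eps .* v) .- grad(x)) ./ eps
end

# Example: f(x) = xᵀx has Hessian 2I, so Hv should be 2v.
grad(x) = 2 .* x
hv = hvp_fd(grad, [1.0, 2.0], [0.5, -1.0])   # ≈ [1.0, -2.0]
```

The finite-difference version costs only one extra gradient evaluation but inherits its accuracy from the step size `eps`, which is the trade-off an AD-based HVP avoids.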
from optimization.jl.
Related Issues (20)
- Support LBFGSB.jl
- Include `searchdirection` in `OptimizationState`
- The `callback` appears to be called for linesearch iterations
- `PolyOpt` only accept functions without any extra inputs
- Augmented Lagrangian
- Multithreading support for Optimizers like BBO
- Is there currently a feasible way to use NamedTuple or ComponentArray as x0 for Optimization.jl
- How to get latest minimizer (u) instead of the best one for optimization?
- Callback signature doesn't match with the docs for PRIMA
- Optimization.LBFGS() can not compute the gradient
- BBO always returns retcode Failure
- ERROR: UndefVarError: `NoAD` not defined
- callback-generated stopping criteria no longer work in OptimizationNLopt
- BFGS fails with user-supplied derivative and bounds (regression in v3.25.0, v3.25.1)
- OptimizationOptimJL.Optim.BFGS() missing tests for user-supplied derivatives (grad), upper/lower bounds, and the combination
- Reopen issue 755 : OptimizationOptimJL.Optim.BFGS() missing tests for user-supplied derivatives (grad), upper/lower bounds, and the combination
- FORCED_STOP from OptimizationNLopt callback returns best point PRIOR to callback, making it unuseful for custom stopping criteria. Is that intended?
- Update `OptimizationPRIMA` `[compat]` for `PRIMA` to `0.2.0`
- OptimizationOptimJL fails to precompile