Comments (13)
Is adding Symbolics.jl also planned?

In my field (control theory), symbolic differentiation is used almost exclusively: it is fast when derivatives need to be calculated many times, since it avoids the logic overhead and allocations that ADs incur in the forward pass. The same holds for machine learning with a constant graph, which could also benefit once common subexpression elimination (CSE) from Symbolics.jl is fully functional.
Calculating the derivative would happen by symbolically tracing the function, generating the derivative/gradient/Jacobian function, and then passing the inputs to that generated function. This becomes useful once caches are added to this package: for Symbolics.jl, the cache would simply be the generated derivative function, so there would be no overhead in calculating the derivative.
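As a hedged sketch of that workflow (the function `f` below is just a toy example; `Symbolics.gradient` and `build_function` are the relevant Symbolics.jl entry points):

```julia
using Symbolics

# Toy example: trace a function symbolically and compile its gradient.
@variables x y
f = x^2 * sin(y) + y^2

# Symbolic gradient: a vector of symbolic expressions.
grad = Symbolics.gradient(f, [x, y])

# Compile to a plain Julia function taking a vector input. This compiled
# function is what the proposed cache would store.
grad_fn = build_function(grad, [x, y]; expression = Val(false))[1]

grad_fn([1.0, 0.0])  # ≈ [0.0, 1.0] (= [2x*sin(y), x^2*cos(y) + 2y] at (1, 0))
```

Subsequent derivative calls then go straight through `grad_fn` with no tracing or AD logic on the hot path.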
from abstractdifferentiation.jl.
Yota is ChainRules-compatible, so it should be covered with the others.
Will do! Should I start with the public API? @frankschae said you had mentioned we might want to use some internal functions (he pointed me to https://github.com/wsmoses/Enzyme.jl/blob/2ce81ffa8f56c5bf44a4d85234c2110fa9d6eb0a/src/compiler.jl#L1745)
Can you add Yota also?
Make sure to add both Enzyme forward and reverse modes!
I would probably not go quite that low-level, to save yourself some common LLVM setup, but instead use the thunk level (https://github.com/wsmoses/Enzyme.jl/blob/2ce81ffa8f56c5bf44a4d85234c2110fa9d6eb0a/src/compiler.jl#L2700), which has options for a "combined" augmented forward pass + gradient, an augmented forward pass (storing values from the original function that need to be preserved), a standalone gradient (just running the reverse pass, using the values stored by an augmented forward pass), and forward-mode AD.

This is used, for example, to generate the high-level autodiff/fwddiff routines (https://github.com/wsmoses/Enzyme.jl/blob/2ce81ffa8f56c5bf44a4d85234c2110fa9d6eb0a/src/Enzyme.jl#L173) and is currently the highest-level point that exposes "split mode" (i.e. the split augmented forward pass and standalone gradient).
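For reference, a minimal sketch of those high-level entry points built on the thunks; the exact `autodiff` signatures and return shapes have changed across Enzyme.jl versions, so treat this as illustrative rather than definitive:

```julia
using Enzyme

square(x) = x^2

# Reverse mode: Active marks the scalar argument we differentiate with
# respect to; the gradient comes back in a tuple of derivative tuples.
dx = autodiff(Reverse, square, Active, Active(3.0))[1][1]   # 6.0

# Forward mode: Duplicated pairs the primal value with a tangent seed.
# (The shape of the returned tuple differs slightly between versions.)
res = autodiff(Forward, square, Duplicated(3.0, 1.0))
```

Split mode is then the same computation with the augmented forward pass and the standalone gradient invoked as separate thunks instead of the combined call above.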
I would like to add a "batch" version of Zygote as a backend which falls back on Zygote except for `jacobian`, where the pullback is called with all the basis vectors simultaneously (i.e. `pb(I)` where `I` is the identity matrix). This can be useful to preserve the sparsity of Jacobians if all the rules are written in a way that preserves sparsity.
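A minimal sketch of the idea, under the assumption that the rule author writes the pullback so it happens to accept a matrix of seeds (the function `myscale` and its rrule below are hypothetical; nothing here is official Zygote batching support):

```julia
using Zygote, LinearAlgebra, SparseArrays
import ChainRulesCore

# Hypothetical element-wise function whose hand-written rrule has a pullback
# that works for a single seed vector *or* a whole matrix of seeds.
myscale(x) = 3.0 .* x

function ChainRulesCore.rrule(::typeof(myscale), x)
    pullback_myscale(ȳ) = (ChainRulesCore.NoTangent(), 3.0 .* ȳ)
    return myscale(x), pullback_myscale
end

x = [1.0, 2.0]
y, pb = Zygote.pullback(myscale, x)

# Usual approach: one pullback call per output basis vector.
row1 = pb([1.0, 0.0])[1]          # [3.0, 0.0]

# "Batch" idea: one pullback call with the identity matrix; the transposed
# Jacobian comes back in one pass, with its diagonal sparsity intact.
Jt = pb(sparse(1.0I, 2, 2))[1]    # sparse 2×2 diagonal with 3.0 entries
```

Whether the batched call succeeds depends entirely on every rule along the way accepting matrix-shaped seeds, which is why it only "sometimes works".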
And a SparseDiffTools backend to optimise for sparsity structure.
> I would like to add a "batch" version of Zygote as a backend which falls back on Zygote except for `jacobian` where the pullback is called with all the bases simultaneously (i.e. `pb(I)` where `I` is the identity matrix). This can be useful to preserve sparsity of Jacobians if all the rules are written in a way that preserves sparsity.

Is this a feature Zygote actually supports, or just something that sometimes works?
It requires that the function being differentiated has independent actions on each column. For example, a neural network satisfies this.
> or just something that sometimes works?

Something that sometimes works. The goal is to make it easy to define a sparse Jacobian in an rrule and then get it back when calling `Zygote.jacobian`.
I think it would be good to support this. As you say, this would require support for caching. See #41.
Via GitHub advanced search, I found some other AD packages as well:
- gdalle/ImplicitDifferentiation.jl
- avigliotti/AD4SM.jl
- JuliaDiff/TaylorDiff.jl
- abap34/JITrench.jl
- sshin23/MadDiff.jl
Related Issues (20)
- Use multiple arguments instead of a tuple for pushforward and pullback function?
- `AD.jacobian` much slower than `Zygote.jacobian`
- JET.jl reports possible errors on `AD.gradient` that do not appear with `Zygote.gradient`
- Code inside function with rrule should not run
- Using AbstractDifferentiation and Zygote as package dependencies errors: `UndefVarError: ZygoteBackend not defined`
- Jacobians for functions beyond vector-to-vector?
- Whats the reason for the derivative / gradient difference?
- AD failure where Zygote succeeds
- ForwardDiff is broken for differently sized inputs
- API for user code to detect if it's being differentiated
- Feature request: option to turn off ForwardDiff tagging
- Zygote context cache incorrectly(?) persists between AD calls
- Missing `value_derivative_and_second_derivative`?
- Replace Requires.jl with conditional dependencies for Julia 1.9
- LoadError: UndefVarError: StaticArrays not defined
- add `ForwardRuleConfigBackend` (to match `ReverseRuleConfigBackend`)
- Lazy Jacobian multiplication
- ERROR: MethodError: no method matching ZygoteBackend()
- `lazy_jacobian` is less generic than `pullback_function`
- Enzyme support