Comments (13)

JTaets commented on July 18, 2024

Is adding Symbolics.jl also planned?

In my field (control theory), symbolic differentiation is used almost exclusively, since it is fast when derivatives must be evaluated many times: the generated functions avoid the control-flow overhead and allocations of an AD tool re-running the forward pass. Machine learning with a constant computation graph is a similar case and could also benefit, once common subexpression elimination (CSE) in Symbolics.jl is fully functional.

The derivative would be computed by symbolically tracing the function and generating the derivative/gradient/Jacobian function, then passing the inputs to that generated function.

This becomes useful once caches are added to this package: for Symbolics.jl the cache would simply be the generated derivative function, so evaluating the derivative carries no extra overhead.
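A minimal sketch of the workflow described above, using the public Symbolics.jl API. The function `f` and variable names here are purely illustrative:

```julia
using Symbolics

# Hypothetical target function, purely illustrative.
f(x) = [x[1]^2 + sin(x[2]), x[1] * x[2]]

# Trace f symbolically and generate a compiled Jacobian function once.
@variables x[1:2]
xs = collect(x)
J_sym = Symbolics.jacobian(f(xs), xs)

# build_function returns (out-of-place, in-place) code; eval the first.
J_fn = eval(Symbolics.build_function(J_sym, xs)[1])

# Repeated evaluations now pay no tracing or AD overhead; the "cache"
# mentioned above is just J_fn itself.
J_fn([1.0, 2.0])  # 2×2 Jacobian evaluated at (1.0, 2.0)
```

The generated `J_fn` is an ordinary compiled Julia function, which is exactly why caching it amortizes the (one-time) cost of symbolic tracing and code generation.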

from abstractdifferentiation.jl.

sethaxen commented on July 18, 2024

Yota is ChainRules-compatible, so it should be covered with the others.

sethaxen commented on July 18, 2024

Will do! Should I start with the public API? @frankschae said you had mentioned we might want to use some internal functions (he pointed me to https://github.com/wsmoses/Enzyme.jl/blob/2ce81ffa8f56c5bf44a4d85234c2110fa9d6eb0a/src/compiler.jl#L1745)

AriMKatz commented on July 18, 2024

Can you add Yota as well?

wsmoses commented on July 18, 2024

Make sure to add both Enzyme forward and reverse modes!
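For reference, both modes are available through Enzyme's public `autodiff` entry point. A minimal sketch on a toy scalar function (Enzyme's exact signatures have shifted across versions, so treat this as illustrative):

```julia
using Enzyme

g(x) = x^2 + 3x   # toy scalar function; its derivative is 2x + 3

# Reverse mode: mark the input Active and request an Active return;
# the gradient with respect to the input is returned.
autodiff(Reverse, g, Active, Active(2.0))

# Forward mode: seed the input with a tangent of 1.0 via Duplicated;
# the directional derivative is returned.
autodiff(Forward, g, Duplicated(2.0, 1.0))
```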

wsmoses commented on July 18, 2024

I might not go quite that low-level, to save yourself some common LLVM setup; I would probably use the thunk level (https://github.com/wsmoses/Enzyme.jl/blob/2ce81ffa8f56c5bf44a4d85234c2110fa9d6eb0a/src/compiler.jl#L2700), which has options for a "combined" augmented forward pass + gradient, an augmented forward pass (storing the values from the original function that need to be preserved), a standalone gradient (running just the reverse pass, using the values stored by an augmented forward pass), and forward-mode AD.

This is used, for example, to generate the high-level autodiff/fwddiff routines (https://github.com/wsmoses/Enzyme.jl/blob/2ce81ffa8f56c5bf44a4d85234c2110fa9d6eb0a/src/Enzyme.jl#L173) and is currently the highest-level point that exposes "split mode" (i.e. the separate augmented forward pass and standalone gradient).

mohamed82008 commented on July 18, 2024

I would like to add a "batch" version of Zygote as a backend which falls back on Zygote except for jacobian where the pullback is called with all the bases simultaneously (i.e. pb(I) where I is the identity matrix). This can be useful to preserve sparsity of Jacobians if all the rules are written in a way that preserves sparsity.
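The idea can be sketched with Zygote's public `pullback` API. The toy function below is an assumption for illustration; as discussed later in the thread, this only works when every rule along the way accepts a matrix cotangent:

```julia
using Zygote, LinearAlgebra

f(x) = 2 .* x   # toy function whose pullback happens to accept a matrix cotangent

x = rand(3)
y, pb = Zygote.pullback(f, x)

# Zygote.jacobian calls pb once per basis vector; the "batch" variant
# instead passes all bases at once as the identity matrix, so the
# Jacobian comes back from a single pullback call.
(J,) = pb(Matrix{Float64}(I, length(y), length(y)))
```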

mohamed82008 commented on July 18, 2024

And a SparseDiffTools backend to optimise for the sparsity structure.

sethaxen commented on July 18, 2024

I would like to add a "batch" version of Zygote as a backend which falls back on Zygote except for jacobian where the pullback is called with all the bases simultaneously (i.e. pb(I) where I is the identity matrix). This can be useful to preserve sparsity of Jacobians if all the rules are written in a way that preserves sparsity.

Is this a feature Zygote actually supports, or just something that sometimes works?

ChrisRackauckas commented on July 18, 2024

It requires that the function being differentiated has independent actions on each column. For example, a neural network satisfies this.

mohamed82008 commented on July 18, 2024

or just something that sometimes works?

Something that sometimes works. The goal is to make it easy to define a sparse Jacobian in an rrule and then get it back when calling Zygote.jacobian.
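A sketch of the pattern being described, with a hypothetical elementwise function `g` whose Jacobian is diagonal. Everything here is illustrative, using the public ChainRulesCore and Zygote APIs:

```julia
using ChainRulesCore, SparseArrays, Zygote, LinearAlgebra

# Hypothetical elementwise map whose Jacobian is diagonal, hence sparse.
g(x) = x .^ 2

function ChainRulesCore.rrule(::typeof(g), x)
    J = spdiagm(0 => 2 .* x)                  # sparse Jacobian, built once
    g_pullback(ȳ) = (NoTangent(), J' * ȳ)     # sparsity-preserving vector-Jacobian product
    return g(x), g_pullback
end

x = [1.0, 2.0, 3.0]
y, pb = Zygote.pullback(g, x)

# Seeding with all basis vectors at once (a sparse identity) keeps the
# result sparse, instead of densifying it column by column.
(J,) = pb(sparse(1.0I, 3, 3))
```

Because the rule multiplies a sparse `J'` by a sparse cotangent, the batched pullback returns a sparse matrix, which is exactly the sparsity-preservation behaviour discussed above.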

sethaxen commented on July 18, 2024

I think it would be good to support this. As you say, this would require support for caching. See #41.

prbzrg commented on July 18, 2024

Via GitHub advanced search, I found some other AD packages as well:

  • gdalle/ImplicitDifferentiation.jl
  • avigliotti/AD4SM.jl
  • JuliaDiff/TaylorDiff.jl
  • abap34/JITrench.jl
  • sshin23/MadDiff.jl

https://github.com/search?l=&o=desc&q=Automatic+Differentiation+stars%3A%3E10+pushed%3A%3E2022-01-01+language%3AJulia&s=stars&type=Repositories
