Can DifferentiationInterface be useful for Turing? #2187
Comments
Turing's current interface to autodiff backends is based on …
Looks great! Is there a summary of how this is different from AbstractDifferentiation.jl somewhere? :)
I updated the summary in this issue: JuliaDiff/AbstractDifferentiation.jl#131
Hi, I think …
Sounds reasonable, I have opened this issue to keep track: …
Closing this in favour of tpapp/LogDensityProblemsAD.jl#29; Turing will automatically use DI when tpapp/LogDensityProblemsAD.jl#29 is merged.
The PR above (tpapp/LogDensityProblemsAD.jl#29) was closed, so we still aren't using DI. #2354, though, will introduce a dependency on DI through OptimizationBase.
For the record, I tried again with tpapp/LogDensityProblemsAD.jl#39, now that DI has all the necessary features, but @tpapp said he wants to redesign his library first.
If Turing maintainers need this, I am happy to merge the above PR as is, and we can address those changes later.
Assuming no known performance penalty exists, I think unifying the interfaces to the various autodiff backends through DI is beneficial. DI also allows some newer autodiff libraries (e.g. Enzyme, Mooncake) to focus more on core functionality and less on supporting multiple interfaces.
For Enzyme, as stated in DI's README, there can be some performance penalties or even correctness issues that come from DI not allowing multiple arguments. Apart from that and some StaticArrays shenanigans, I believe that most performance penalties between native backends and DI should either not exist or be easily fixable. Please open an issue with an MWE whenever you spot one.
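To illustrate the single-argument restriction mentioned above: any extra arguments (e.g. observed data) must be captured in a closure rather than passed explicitly, and some backends differentiate closures less efficiently than functions whose non-differentiated arguments are passed directly. A minimal hypothetical sketch (the log-density and backend choice are illustrative, not code from this thread):

```julia
using DifferentiationInterface
import Zygote  # illustrative backend choice; any DI-supported backend works

# A two-argument function: DI's f(x) = y convention cannot take `data` directly.
logdensity(x, data) = -sum(abs2, x .- data) / 2

data = randn(3)
f = x -> logdensity(x, data)  # `data` is captured by the closure, not passed as an argument

# The backend differentiates the closure; some backends (e.g. Enzyme) could be
# faster if `data` were passed explicitly and marked as non-differentiated.
grad = gradient(f, AutoZygote(), randn(3))
```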
I'd suggest merging the DI PR on the understanding that @gdalle will help fix emerging (minor) performance issues, if any.
Note that the PR mentioned above (tpapp/LogDensityProblemsAD.jl#39) only implements the use of DI for backends that were not already supported by LogDensityProblemsAD. It will be up to that package to gradually remove its existing extensions (if its maintainers want) and let DI take over for them.
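For context, here is a minimal sketch of the LogDensityProblemsAD entry point under discussion. The Gaussian problem is a made-up example; `ADgradient` and the `LogDensityProblems` methods are the packages' documented interface:

```julia
using LogDensityProblems, LogDensityProblemsAD
import ForwardDiff  # backend loaded so its extension is available

# A made-up log-density implementing the LogDensityProblems interface.
struct Gaussian end
LogDensityProblems.logdensity(::Gaussian, x) = -sum(abs2, x) / 2
LogDensityProblems.dimension(::Gaussian) = 2
LogDensityProblems.capabilities(::Type{Gaussian}) =
    LogDensityProblems.LogDensityOrder{0}()

# ADgradient wraps the problem with a gradient from the chosen backend;
# the PR above adds a DI-powered path for backends without a native extension.
∇ℓ = ADgradient(:ForwardDiff, Gaussian())
LogDensityProblems.logdensity_and_gradient(∇ℓ, [1.0, 2.0])  # (-2.5, [-1.0, -2.0])
```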
Hi there!
@adrhill and I recently started https://github.com/gdalle/DifferentiationInterface.jl to provide a common interface for automatic differentiation in Julia. We're currently chatting with Lux.jl, Flux.jl and Optimization.jl to see how they can benefit from it, and so my mind went to Turing.jl as another AD power user :)
DifferentiationInterface.jl only guarantees support for functions of the type `f(x) = y` or `f!(y, x)`, with standard numbers or arrays in and out. Within these restrictions, we are compatible with 13 different AD backends, including the cool kids like Enzyme.jl and even the hipsters like Tapir.jl. Do you think it could come in handy? Ping @yebai @willtebbutt
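To make that calling convention concrete, here is a minimal sketch of computing a gradient with DI (the function and backend choice are illustrative):

```julia
using DifferentiationInterface
import ForwardDiff  # illustrative backend; the others follow the same pattern

f(x) = sum(abs2, x)          # out-of-place convention: f(x) = y
backend = AutoForwardDiff()  # backends are selected with ADTypes.jl objects

x = [1.0, 2.0, 3.0]
y, grad = value_and_gradient(f, backend, x)  # (14.0, [2.0, 4.0, 6.0])
```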