Active return values with automatic pullback (differential return value) deduction only supported for floating-like values and not type EnzymeCore.Active{Float64} #1929
So it says that the return type of the function you're differentiating is not a float but an `Active{Float}`. Only float-like values are supported as return types for reverse mode.
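For concreteness, a minimal Julia sketch of what "float-like return type" means in practice; `f`, `g`, and the vectors below are made-up illustrations, and the failing call is left commented out:

```julia
using Enzyme

# Scalar-valued function: the return is a plain Float64, the "float-like" case,
# so reverse mode can seed the pullback (with 1.0) automatically.
f(x) = sum(abs2, x)

x  = [1.0, 2.0, 3.0]
dx = zeros(3)

# Works: return type is Float64, differential return value deduced automatically.
Enzyme.autodiff(Reverse, f, Active, Duplicated(x, dx))
# dx now holds the gradient 2 .* x

# Contrived counterexample: the callable returns an activity wrapper instead of a float,
# so Enzyme cannot deduce the seed and raises the error in the issue title.
g(x) = Active(sum(abs2, x))   # returns Active{Float64}, not a float
# Enzyme.autodiff(Reverse, g, Active, Duplicated(x, dx))   # -> error shown above
```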
Is this a green light for using DI on everything in …
No, this was just helping @gdalle understand what an error message means.
Ah, I should have paid more attention. My question should have been: "Can we use DI on everything once DI calls `autodiff`?"
I'm not sure; it depends on what and how DI implements things. In general I'm somewhat skeptical that this will be the case. In losing the fine-grained control of a direct autodiff call in user code, you're probably going to have to trade off compatibility, performance, or both. For codes that want to quickly adopt or test out a bunch of AD backends, DI is the right tool for the job. However, when either performance or supporting as many codes as possible is critical, I'd recommend calling Enzyme directly. Especially for places that already have direct Enzyme support, I don't think it's worth the potential loss to get rid of it at this time.
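To make the trade-off concrete, here is a rough sketch of the two call styles under discussion; the `loss`, `w`, and `x` names are invented for illustration:

```julia
using DifferentiationInterface
using ADTypes: AutoEnzyme
using Enzyme

loss(w, x) = sum(abs2, w .- x)   # made-up objective with parameters w and fixed data x

w, x = [1.0, 2.0], [0.5, 0.5]

# Via DI: one generic call; swapping AD backends is a one-line change.
grad_di = DifferentiationInterface.gradient(w_ -> loss(w_, x), AutoEnzyme(), w)

# Direct Enzyme: more verbose, but you choose the activities yourself
# (Const for data you do not differentiate, a preallocated shadow for the gradient).
dw = zero(w)
Enzyme.autodiff(Reverse, loss, Active, Duplicated(w, dw), Const(x))
```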
I'm sorry, I still don't understand what this means in concrete terms. Would you know what I need to change in the code below to make it work for higher-order differentiation like …
Hi there! I'm trying to optimize `DifferentiationInterface.gradient` by using `autodiff` everywhere, as discussed with @wsmoses in TuringLang/AdvancedVI.jl#98 (comment). In the PR JuliaDiff/DifferentiationInterface.jl#515, going from `Enzyme.gradient` to `Enzyme.autodiff` in DI caused new bugs for second-order differentiation (both forward-over-reverse and reverse-over-reverse).

Error message:

```
Active return values with automatic pullback (differential return value) deduction only supported for floating-like values and not type EnzymeCore.Active{Float64}
```

Stacktrace: (omitted)

Can you tell us what this message means? I'm assuming there are tricks in `Enzyme.gradient` that give it better compatibility with nested autodiff? How can I reproduce them?

Related issues:

- Use `Enzyme.autodiff` directly instead of `Enzyme.gradient`: JuliaDiff/DifferentiationInterface.jl#512
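For reference, the kind of nesting the question is about can be sketched on a scalar toy function. This is only an illustrative forward-over-reverse pattern, not the DI code in question; on older Enzyme versions the inner call may need to be `Enzyme.autodiff_deferred`:

```julia
using Enzyme

h(x) = x^3

# First derivative via reverse mode, wrapped as an ordinary Julia function.
# (Older Enzyme versions required autodiff_deferred here when nesting.)
dh(x) = Enzyme.autodiff(Reverse, h, Active, Active(x))[1][1]

# Forward over reverse: differentiate dh itself in the direction 1.0
# to get the second derivative.
d2h(x) = Enzyme.autodiff(Forward, dh, Duplicated(x, 1.0))[1]

dh(2.0)    # 3 * 2^2 = 12.0
d2h(2.0)   # 6 * 2   = 12.0
```

Forward-over-reverse is generally the recommended way to obtain Hessian-type quantities, since the outer forward pass avoids a second reverse sweep; reverse-over-reverse nests another `Reverse` call the same way but is typically more fragile.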