
ReverseDiff support #144

Open · wants to merge 1 commit into base: main
Conversation

mohamed82008 (Member):
This PR implements ReverseDiff support, which uncovered a spooky bug. Here is a reproducer that gives a wrong gradient; I will explain the spooky part, and how to get the correct gradient, in a comment below.

```julia
using Revise, SimpleChains, ForwardDiff, ReverseDiff

x = [0.7, -0.5]
p = [-0.3, 0.6]
sc = SimpleChain(static(2), TurboDense{false}(identity, 1))
f(p) = SimpleChains.call_chain(sc, x, p)[1]
f(p)
f(p)

ForwardDiff.gradient(f, p)
ReverseDiff.gradient(f, p)
```

```julia
v, pb = _rrule(sc, argv, paramsv, _returns_scalar(sc))
return v, Δ -> begin
    _Δ = Base.tail(pb(collect(Δ)))
    _Δ = Base.tail(pb(collect(Δ)))
```
mohamed82008 (Member, Author):

If you comment this line out, you get the correct gradient. `deepcopy`ing everything didn't fix it, so I am not sure what's going on.

mohamed82008 (Member, Author):

So call it once, it works. Call it twice, it breaks.
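For illustration, here is a minimal, hypothetical sketch (not SimpleChains' actual internals) of how a pullback that advances an internal offset without ever resetting it would produce exactly this once-works-twice-breaks behavior:

```julia
# Hypothetical sketch: a pullback that reads from a buffer at a running
# offset and advances it on every call -- the bug is that it never rewinds,
# so the second call reads the wrong region of the buffer.
mutable struct StatefulPullback
    buf::Vector{Float64}
    offset::Int
end

function (pb::StatefulPullback)(Δ)
    n = length(Δ)
    out = pb.buf[pb.offset+1 : pb.offset+n] .* Δ
    pb.offset += n          # state leaks across calls
    return out
end

pb = StatefulPullback([1.0, 2.0, 3.0, 4.0], 0)
first_call  = pb([1.0, 1.0])   # reads buf[1:2] -> [1.0, 2.0]
second_call = pb([1.0, 1.0])   # reads buf[3:4] -> [3.0, 4.0], wrong answer
```

If the real pullback keeps a raw pointer into a preallocated workspace, `deepcopy` of the visible arrays would indeed not help, which matches the observations above.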

Contributor:

`deepcopy` isn't likely to help on pointers.

mohamed82008 (Member, Author):

Perhaps the `_rrule` function should be a struct which stores the initial pointer address and starts from there every time it's called.
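A rough sketch of that idea, with hypothetical names (the real version would wrap whatever pointer/offset state the pullback from `_rrule` carries):

```julia
# Hypothetical: a callable pullback struct that remembers its starting
# offset and rewinds to it at the top of every call, making the pullback
# safe to invoke more than once.
mutable struct ResettingPullback
    buf::Vector{Float64}
    start::Int     # initial offset, captured at construction
    offset::Int
end

ResettingPullback(buf, start) = ResettingPullback(buf, start, start)

function (pb::ResettingPullback)(Δ)
    pb.offset = pb.start       # rewind before every call
    n = length(Δ)
    out = pb.buf[pb.offset+1 : pb.offset+n] .* Δ
    pb.offset += n
    return out
end

pb = ResettingPullback([1.0, 2.0, 3.0, 4.0], 0)
pb([1.0, 1.0]) == pb([1.0, 1.0])   # true: repeated calls now agree
```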

@chriselrod (Contributor) commented May 26, 2023:

I would suggest taking this approach:
https://github.com/JuliaSIMD/LoopVectorization.jl/blob/8fe27b9014f16b356e09d0b108968402253fde90/src/LoopVectorization.jl#L265C1-L268

That is, in Julia versions without package extensions, LoopVectorization.jl loads the extension files.
In versions with package extensions, they'll instead be loaded lazily as an extension (while still benefiting from precompilation).

I think it'd make sense to give StaticArrays and ForwardDiff that same treatment, but that is obviously unrelated to this PR.
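For reference, the linked LoopVectorization pattern amounts to eagerly `include`-ing the extension files on Julia versions that predate package extensions (pre-1.9), while newer versions load them lazily via `[extensions]` in Project.toml. A sketch, with a hypothetical placeholder file path:

```julia
# Inside the package's top-level module: on Julia versions without
# package-extension support, load the extension code eagerly.
if !isdefined(Base, :get_extension)
    include("../ext/SimpleChainsReverseDiffExt.jl")  # hypothetical path
end
```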

@mohamed82008 (Member, Author):

Yes, I will do that as soon as it works well.

```julia
offset += 1
else
    l = length(y)
    v[offset + 1 : offset + l] = vec(y)
```
Contributor:

Note that `vec` allocates if `!isa(y, AbstractVector)`.

This looks performance sensitive. Could it use an `@inbounds`?

But also, why? This seems internal. We should probably only be `rrule`-ing relatively high-level SimpleChains calls, which hide things like `params` getting called.
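One way to avoid the temporary from `vec` on that hot path is `copyto!` with offsets, which copies the matrix's elements in column-major order straight into the destination vector; a small standalone sketch with toy values:

```julia
# Toy illustration (hypothetical values): writing a matrix's elements
# into a flat vector without building an intermediate vec(y).
y = [1.0 3.0; 2.0 4.0]            # column-major element order: 1, 2, 3, 4
v = zeros(6)
offset = 1
l = length(y)
copyto!(v, offset + 1, y, 1, l)   # dest, dest-offset, src, src-offset, count
v                                  # -> [0.0, 1.0, 2.0, 3.0, 4.0, 0.0]
```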

mohamed82008 (Member, Author):

It was called in the log prior calculation.
