Directly call ForwardDiff to compute the hessian #1270

Open · wants to merge 3 commits into base: master
7 changes: 3 additions & 4 deletions src/lib/grad.jl
@@ -73,8 +73,7 @@ julia> hessian(sin, pi/2)
"""
hessian(f, x) = hessian_dual(f, x)

- hessian_dual(f, x::AbstractArray) = forward_jacobian(x -> gradient(f, x)[1], x)[2]
+ hessian_dual(f, x::AbstractArray) = ForwardDiff.jacobian(x -> gradient(f, x)[1], x)

Member:

The example you test relies on the gradient rule for `ForwardDiff.jacobian`, which assumes its `f` is pure. That is true here.

It would be worth looking at some examples where this isn't true, i.e. where `f` closes over some parameters (either arrays or numbers). In that case their gradients are not traced; I believe you should get a warning from `ForwardDiff.jacobian`'s rule.

Are there any further surprises? Anything else which could be tested?

Author:

Did you mean NN layers from Flux? I will dig into it a bit tomorrow.

The scalar indexing from the comment above can be added to the tests as well.
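
To make the case above concrete, here is a rough sketch (not part of this PR; the function names and the exact warning behaviour are assumptions) of a function that closes over an array parameter, which is the situation the review comment describes:

```julia
# Rough sketch of the closure case discussed above (not from the PR).
using Zygote, ForwardDiff

a = [1.0, 2.0, 3.0]                 # parameter captured by the closure below
loss(x) = sum(abs2, a .* x)         # not pure: `loss` closes over `a`

# Second derivatives with respect to `x` alone should be fine:
Zygote.hessian(loss, rand(3))

# Differentiating through the Hessian with respect to the captured `a` is the
# suspect case: per the comment above, the rule for ForwardDiff.jacobian is
# expected to warn that gradients w.r.t. the closure (here `a`) are not tracked,
# so this result should not be trusted.
Zygote.gradient(a) do a
    sum(Zygote.hessian(x -> sum(abs2, a .* x), rand(3)))
end
```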

hessian_dual(f, x::Number) = ForwardDiff.derivative(x -> gradient(f, x)[1], x)

"""
@@ -234,11 +233,11 @@ end
diaghessian(f, args...) -> Tuple

Diagonal part of the Hessian. Returns a tuple containing, for each argument `x`,
`h` of the same shape with `h[i] = Hᵢᵢ = ∂²y/∂x[i]∂x[i]`.
The original evaluation `y = f(args...)` must give a real number `y`.

For one vector argument `x`, this is equivalent to `(diag(hessian(f,x)),)`.
Like [`hessian`](@ref) it uses ForwardDiff over Zygote.

!!! warning
For arguments of any type except `Number` & `AbstractArray`, the result is `nothing`.
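As a quick check of the equivalence stated in the docstring (a sketch, not part of this diff; it assumes `LinearAlgebra` is loaded for `diag`):

```julia
using Zygote, LinearAlgebra

xs = [1.0, 2.0, 3.0]
g(x) = sum(x .^ 3)

# For a single vector argument, the diagonal Hessian should match diag(hessian):
Zygote.diaghessian(g, xs)[1] ≈ diag(Zygote.hessian(g, xs))   # both ≈ 6 .* xs
```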
8 changes: 7 additions & 1 deletion test/utils.jl
@@ -3,13 +3,19 @@ using ForwardDiff
using Zygote: hessian_dual, hessian_reverse

@testset "hessian: $hess" for hess in [hessian_dual, hessian_reverse]
function f1(x, bias)
h = hess(x -> sum(x.^3), x)
return h * x .+ bias
end

if hess == hessian_dual
@test hess(x -> x[1]*x[2], randn(2)) ≈ [0 1; 1 0]
@test hess(((x,y),) -> x*y, randn(2)) ≈ [0 1; 1 0] # original docstring version
@test gradient(b->sum(f1(rand(3),b)),rand(3))[1] ≈ [1, 1, 1]
else
@test_broken hess(x -> x[1]*x[2], randn(2)) ≈ [0 1; 1 0] # can't differentiate ∇getindex
@test_broken hess(((x,y),) -> x*y, randn(2)) ≈ [0 1; 1 0]
@test_broken gradient(b->sum(f1(rand(3),b)),rand(3))[1] ≈ [1, 1, 1] # jacobian is not differentiable
end
@test hess(x -> sum(x.^3), [1 2; 3 4]) ≈ Diagonal([6, 18, 12, 24])
@test hess(sin, pi/2) ≈ -1
@@ -133,7 +139,7 @@ using ForwardDiff
g3(x) = sum(abs2,ForwardDiff.jacobian(f,x))
out,back = Zygote.pullback(g3,[2.0,3.2])
@test back(1.0)[1] == ForwardDiff.gradient(g3,[2.0,3.2])

# From https://github.com/FluxML/Zygote.jl/issues/1218
f1218(x::AbstractVector,y::AbstractVector) = sum(x)*sum(y)
gradf1218(x,y) = ForwardDiff.gradient(x->f1218(x,y), x)[1]