remaining autodiff functors #84
The doc for the first of these spells out what it requires of the functor, which the `model_base` satisfies. For the others, they don't document their requirements nearly as nicely, but as far as I can tell they only require a functor with a templated `operator()`. I'm not sure how nicely optional symbols play with the various language interfaces. We currently don't have any functions which exist conditionally, so I'd be curious how that plays out.
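For context, the only hard requirement is that templated `operator()`. A rough sketch of the pattern (the names `model_functor` and `M` are illustrative, not BridgeStan's actual code; `M` stands in for the concrete generated model class, which exposes the templated `log_prob`):

```cpp
#include <ostream>

#include <Eigen/Dense>

// Illustrative sketch: the stan::math autodiff functors only need a
// callable with a templated operator(), so one wrapper can be
// instantiated at double, var, fvar<var>, etc. M is the concrete
// generated model class; the base class does not expose the template.
template <class M>
struct model_functor {
  const M& model;
  std::ostream* msgs;

  template <typename T>
  T operator()(const Eigen::Matrix<T, Eigen::Dynamic, 1>& theta) const {
    auto theta_copy = theta;  // log_prob takes a non-const reference
    return model.template log_prob<true, true>(theta_copy, msgs);
  }
};
```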
There are two versions of Hessian: (1) the exact one in `stan/math/mix`, computed with nested autodiff, and (2) a version based on finite differences of gradients. We want to use approach (1). They implement the same interface, so it's just a matter of changing the include.
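For reference, the shared calling convention looks roughly like this (a sketch; `f` is any functor of the kind sketched above, and the finite-difference variant's exact header and name may differ from what I remember):

```cpp
#include <Eigen/Dense>
#include <stan/math/mix.hpp>

// Both Hessian implementations follow this pattern: given a functor f
// and a point x, fill in the value, gradient, and Hessian. This call is
// the exact (nested autodiff) version from stan/math/mix; the
// finite-difference variant is intended as a drop-in swap.
template <class F>
void compute_hessian(const F& f, const Eigen::VectorXd& x) {
  double fx;
  Eigen::VectorXd grad;
  Eigen::MatrixXd hess;
  stan::math::hessian(f, x, fx, grad, hess);
  // fx, grad, hess now hold f(x), its gradient, and its Hessian
}
```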
It seems to me like third-order autodiff should require that extra level of nesting? But I may be imagining this incorrectly.
Is there any reason it's not calling one of the predefined versions? The finite diff version has the same functional pattern. It should be able to take the same model functor we create for other uses.
Third-order autodiff in Stan requires the doubly nested scalar type `fvar<fvar<var>>`.
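If I have the name and signature right, the third-order functor is called like this (sketch):

```cpp
#include <vector>

#include <Eigen/Dense>
#include <stan/math/mix.hpp>

// Sketch of the third-order functor (stan::math::grad_hessian, if I
// have the name right): it instantiates f at fvar<fvar<var>> and
// returns the value, the Hessian, and the gradient of each Hessian
// entry as a vector of matrices, one matrix per input dimension.
template <class F>
void compute_grad_hessian(const F& f, const Eigen::VectorXd& x) {
  double fx;
  Eigen::MatrixXd hess;
  std::vector<Eigen::MatrixXd> grad_hess;
  stan::math::grad_hessian(f, x, fx, hess, grad_hess);
}
```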
We must be talking past each other.
Yes, I misread "grad Hessian" as "Hessian". Maybe I'm misunderstanding what you mean by "upstream", but the templated version is already defined on every model.
The issue isn't that fewer models would work or anything, it's that at the moment we can only call overloads which exist in the `model_base` class.
You'd think I'd remember this given that I coded it the first time around. The templated version is defined on the actual run-time class, but I keep forgetting that the class instance is assigned to a variable typed as the base class, which only knows the virtual functions (hence no templates). It'd be easy enough to add these the same way as last time. But you want to be careful because each layer is going to greatly extend the compile time, so we only want to turn these features on when they're going to be used.
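The C++ constraint in miniature (a toy example, not Stan's actual class hierarchy): member templates can't be virtual, so the templated method on the concrete model is invisible through a base-class pointer:

```cpp
#include <iostream>

// Toy illustration (not Stan's actual classes): member templates can't
// be virtual, so a derived class's templated method is invisible
// through a base-class pointer. Every scalar type we want to support
// has to be pinned down as its own virtual overload.
struct base {
  virtual double log_density(double theta) const = 0;
  virtual ~base() = default;
};

struct concrete_model : base {
  template <typename T>
  T log_density_impl(T theta) const {
    return -0.5 * theta * theta;  // stand-in log density
  }
  double log_density(double theta) const override {
    return log_density_impl(theta);  // pin the template to double
  }
};

int main() {
  concrete_model m;
  const base* b = &m;
  std::cout << b->log_density(1.0) << "\n";  // OK: virtual overload
  // b->log_density_impl(1.0);  // error: not visible in the base type
}
```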
There are a bunch of autodiff functors that are implemented in Stan but not exposed yet in BridgeStan. The two most basic are already done. Most of them other than directional derivatives require forward-mode autodiff. Please feel free to add more requests to the list.
- `grad_Hessian` (in `stan/math/mix`; requires forward mode)
- `gradient_dot_vector` (in `stan/math/mix`; most efficient in forward, can do backward)
- `hessian_times_vector` (in `stan/math/mix`; requires forward mode)
- `grad_tr_mat_times_hessian` (in `stan/math/mix`; requires forward mode)

There's no inverse Hessian vector product in Stan. I'm not sure the best way to implement that---I think there are a lot of approaches because the direct way is so expensive (`vector / Hessian` in Stan).
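For what it's worth, one standard approach outside Stan is to never form the Hessian at all and instead solve H x = v iteratively with conjugate gradient, using only Hessian-vector products. A sketch (a hypothetical helper, assuming the Hessian is symmetric positive definite):

```cpp
#include <Eigen/Dense>

// Sketch: inverse-Hessian-vector product via conjugate gradient, using
// only Hessian-vector products (e.g. from hessian_times_vector), so the
// full Hessian is never formed. Assumes the Hessian is symmetric
// positive definite; for indefinite Hessians CG is not appropriate.
template <class HVP>  // HVP: callable mapping v to H * v
Eigen::VectorXd inv_hessian_vector_product(const HVP& hvp,
                                           const Eigen::VectorXd& v,
                                           int max_iter = 100,
                                           double tol = 1e-10) {
  Eigen::VectorXd x = Eigen::VectorXd::Zero(v.size());
  Eigen::VectorXd r = v;  // residual v - H x, with x = 0
  Eigen::VectorXd p = r;  // search direction
  double rs_old = r.squaredNorm();
  for (int i = 0; i < max_iter && rs_old > tol * tol; ++i) {
    Eigen::VectorXd hp = hvp(p);
    double alpha = rs_old / p.dot(hp);
    x += alpha * p;
    r -= alpha * hp;
    double rs_new = r.squaredNorm();
    p = r + (rs_new / rs_old) * p;
    rs_old = rs_new;
  }
  return x;  // approximately H^{-1} v
}
```

Each iteration costs one Hessian-vector product (roughly a constant factor times a gradient evaluation), so for well-conditioned problems this is far cheaper than building and factoring the full Hessian.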