# Add Laplace approximation as algorithm (#16)
- Feature Name: Allow CmdStan to optionally print out draws from a Laplace approximation of the posterior when optimizing
- Start Date: 2020-03-08
- RFC PR: 16
- Stan Issue:

# Summary
[summary]: #summary

When computing a MAP estimate, the Hessian of the log density can be used to construct a normal approximation to the posterior.
# Motivation
[motivation]: #motivation

I have a Stan model that is too slow to practically sample all the time. Because optimization seems to give reasonable results, it would be nice to have the normal approximation to the posterior to give some sense of the uncertainty in the problem as well.

It is standard to compute a normal approximation to the posterior whose covariance is the inverse of the Hessian of the negative log density.

If the MAP estimate is `u`, and the Hessian of the unnormalized log density at `u` is `H`, then a posterior draw on the constrained scale is:
```
unconstrained_sample = multivariate_normal_rng(mean = u, cov = -inverse(H))
constrained_sample = constrain(unconstrained_sample)
```
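To make the two steps above concrete, here is a minimal NumPy sketch of the same computation. The values of `u` and `H` and the `constrain` function are made-up placeholders for what the optimizer and the model's transforms would actually provide, not part of the proposal.

```
import numpy as np

# Hypothetical MAP estimate on the unconstrained scale and Hessian of the
# unnormalized log density at that point (made-up values for illustration).
u = np.array([7.66, 5.33])
H = np.array([[-4.0, 0.5],
              [0.5, -3.0]])

# The covariance of the Laplace approximation is the negative inverse Hessian.
cov = -np.linalg.inv(H)

rng = np.random.default_rng(0)
unconstrained_draws = rng.multivariate_normal(mean=u, cov=cov, size=100)

def constrain(theta):
    # Stand-in for the model's unconstrained-to-constrained transform
    # (e.g. exp() for a lower-bounded parameter); identity here.
    return theta

constrained_draws = np.array([constrain(d) for d in unconstrained_draws])
```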
We can output the unnormalized log densities, in the unconstrained space, of the true posterior and the approximate posterior to compute importance sampling diagnostics and estimates.

RStan already supports this via the `hessian` argument to `optimizing`.
# Guide-level explanation
[guide-level-explanation]: #guide-level-explanation

This adds two arguments to the CmdStan interface.

`laplace_draws` - The number of draws to take from the posterior approximation. By default, this is zero and no Laplace approximation is done.

`laplace_add_diag` - A value to add to the diagonal of the Hessian approximation to fix small near-singularities (defaulting to zero).

The output is printed after the optimum.

A model can be called by:
```
./model optimize laplace_draws=100 data file=data.dat
```

or with the diagonal:
```
./model optimize laplace_draws=100 laplace_add_diag=1e-10 data file=data.dat
```
Optimizing output currently looks like:
```
# stan_version_major = 2
...
# refresh = 100 (Default)
lp__,b.1,b.2
3427.64,7.66366,5.33466
```

The new output would look like:
```
# stan_version_major = 2
...
# refresh = 100 (Default)
lp__,b.1,b.2
3427.64,7.66366,5.33466
# Draws from Laplace approximation:
lp__, log_p, log_g, b.1,b.2
0, -1, -2, 7.66364,5.33463
0, -2, -3, 7.66367,5.33462
```
The `lp__`, `log_p`, `log_g` formatting is intended to mirror the ADVI output.
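As an illustration of how the two extra columns could be used downstream, here is a small NumPy sketch of self-normalized importance weighting over the approximate draws; the `log_p` and `log_g` values are the made-up numbers from the example output above, and the effective sample size is just one possible diagnostic.

```
import numpy as np

# log_p: unnormalized log density of the true posterior at each draw
# log_g: log density of the Laplace approximation at each draw
# (made-up values copied from the example output above)
log_p = np.array([-1.0, -2.0])
log_g = np.array([-2.0, -3.0])

# Self-normalized importance weights; subtract the max before exponentiating
# for numerical stability.
log_w = log_p - log_g
w = np.exp(log_w - np.max(log_w))
w /= np.sum(w)

# Importance sampling estimate of the posterior mean of b.1.
b1 = np.array([7.66364, 7.66367])
b1_mean = np.sum(w * b1)

# A crude diagnostic: effective sample size of the weighted draws.
ess = 1.0 / np.sum(w ** 2)
```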
# Reference-level explanation
[reference-level-explanation]: #reference-level-explanation

As far as computing the Hessian goes, because higher order autodiff doesn't work with the ODE solvers, the 1D integrator, and such, I think we should compute the Hessian with finite differences, using the same finite difference implementation that the test framework does (https://github.com/stan-dev/math/blob/develop/stan/math/prim/functor/finite_diff_hessian_auto.hpp).

Ben Goodrich points out it would be better to get this Hessian using finite differences of first order gradients. He is correct, but the fully finite differenced Hessian is what is implemented in Stan math currently, and so that is what I'm rolling with.
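For illustration only, a fully finite differenced Hessian amounts to something like the sketch below; the actual proposal is to reuse the existing Stan math implementation linked above, and the `log_density` function and step size here are hypothetical.

```
import numpy as np

def finite_diff_hessian(log_density, x, eps=1e-4):
    # Central finite differences of the log density values themselves
    # (the "fully finite differenced" approach described above).
    n = len(x)
    H = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            ei = np.zeros(n); ei[i] = eps
            ej = np.zeros(n); ej[j] = eps
            H[i, j] = (log_density(x + ei + ej) - log_density(x + ei - ej)
                       - log_density(x - ei + ej) + log_density(x - ei - ej)) / (4 * eps * eps)
    return H

# Example: a standard normal log density, whose exact Hessian is -I.
H = finite_diff_hessian(lambda x: -0.5 * np.dot(x, x), np.array([0.1, -0.2]))
```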
# Drawbacks
[drawbacks]: #drawbacks

Providing draws instead of the Laplace approximation itself is rather inefficient, but it is the easiest thing to code.

We also have to deal with possibly singular Hessians. This is why I added the `laplace_add_diag` argument to overcome them. They'll probably be quite common, especially with Hessians computed with finite differences.
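As a sketch of how the diagonal term could enter the computation, assuming it is added to the negative Hessian before inversion (exactly where the regularization is applied is an implementation detail this proposal leaves open):

```
import numpy as np

def laplace_covariance(H, laplace_add_diag=0.0):
    # H is the (possibly nearly singular) Hessian of the log density at the
    # optimum; adding laplace_add_diag to the diagonal of -H nudges it
    # towards being safely positive definite before inversion.
    neg_H = -H + laplace_add_diag * np.eye(H.shape[0])
    return np.linalg.inv(neg_H)

# A nearly singular Hessian that is ill-conditioned without regularization.
H = np.array([[-1.0, -1.0],
              [-1.0, -1.0 - 1e-14]])
cov = laplace_covariance(H, laplace_add_diag=1e-10)
```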
# Rationale and alternatives
[rationale-and-alternatives]: #rationale-and-alternatives

Another design would be to print the Hessian on the unconstrained space and let users handle the sampling and the parameter transformation. The issue here is that there is no good way for users to do these parameter transformations outside of certain interfaces (at least RStan, maybe PyStan).

Another design would be to print a Hessian on the constrained space and let users handle the sampling. In this case users would also be expected to handle the constraints, and I don't know how that would work practically (rejection sampling maybe?).
# Prior art
[prior-art]: #prior-art

RStan does a version of this already.