ELBO function #9
We ended up using the
Yes, it helps. Thank you for replying. Just so you know, I am analyzing your code. I converted it to run in Anaconda3 (Python 3.6 + TensorFlow 1.1 or 1.2; I do not recall which). The conversion was very straightforward. One additional question and a comment: thanks again. Gordon.
Regarding the multiple encoders, we were trying to measure the performance difference between using a simple encoder and three types of normalizing flows ('residual', householder, inverse autoregressive), while keeping everything else constant. Introducing the flows only affects the encoder, so we just keep the decoder the same for all four cases. Regarding tanh, we followed the settings found in version 1 of the Householder flows paper since it was fairly comprehensive (they used tanh there). Since then the paper has been updated, but we weren't able to reproduce the results reported in version 1.
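As context for readers, the Householder flow mentioned above transforms a latent sample by a reflection, z' = z - 2 (v·z / v·v) v, for a learned vector v. Because a Householder reflection is an orthogonal map, it is volume-preserving (log-det-Jacobian of zero), which is what makes it cheap to use in the ELBO. A minimal sketch of one step in plain Python (variable and function names are illustrative, not taken from the repository):

```python
# Sketch of a single Householder flow step. The parameter vector v
# would be produced by the encoder in a VAE; here it is just a list.

def householder_step(z, v):
    """Reflect z across the hyperplane orthogonal to v:
        z' = z - 2 * (v . z / v . v) * v
    The Jacobian of this map is orthogonal, so its log-determinant
    contribution to the ELBO is exactly zero."""
    vv = sum(vi * vi for vi in v)            # v . v
    vz = sum(vi * zi for vi, zi in zip(v, z))  # v . z
    scale = 2.0 * vz / vv
    return [zi - scale * vi for zi, vi in zip(z, v)]

z = [1.0, 2.0, 3.0]
v = [0.5, -1.0, 2.0]
z2 = householder_step(z, v)
```

Two quick sanity checks follow from the math: the Euclidean norm of z is preserved, and applying the same reflection twice returns the original z (a reflection is its own inverse).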
Thanks for the update. It is very sad that so many papers do not provide
enough information for the reproduction of their results. I am interested in your
code because we are also working to compare these different methods (IAF
and NF), but with a view to applying them to the prior as opposed to the
posterior.
I noticed that you are not using conv_net. Is this module debugged?
--
Gordon Erlebacher
Chair, Department of Scientific Computing
I see, thanks. Right, we didn't use the conv net since we were just testing on MNIST, but in general it could be good to use. However, I don't think we tested the conv_net code, so it might need some minor changes.
Hi,
The Elbo function returns monitor_functions, a dictionary of elements to monitor.
In train.py and evaluation.py, you call elbo_loss, which returns monitor_functions. So far so good. In train.py, you call optimizer.minimize(loss_op), where loss_op is the return value of the elbo function
(line 259 in train.py). minimize() should take the quantity to be minimized as its argument, not a dictionary.
Perhaps there is a better explanation for how the code is written, since it is unlikely you could get the code to work if this were an error.
I just realized that the code calls train() and not train_simple(). The issue I mention above is in train_simple(). I assume it is an error?
Thank you.
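For readers who hit the same confusion: in TensorFlow 1.x, Optimizer.minimize expects a scalar loss tensor, so when a function returns a dictionary of monitored quantities, the loss must be selected out of the dictionary before being passed to the optimizer. A hedged sketch of that pattern in plain Python (the function, its arguments, and the dictionary keys are my own illustrations, not the repository's actual API), using the closed-form Gaussian KL term of the ELBO:

```python
# Illustrative sketch: an elbo-style function that returns a dictionary
# of quantities to monitor, from which the scalar loss is extracted
# before it would be handed to an optimizer's minimize().

import math

def elbo_terms(mu, log_var, recon_log_prob):
    """ELBO pieces for a diagonal-Gaussian posterior N(mu, sigma^2)
    against a standard-normal prior, where the KL has the closed form
        KL = 0.5 * sum(mu^2 + sigma^2 - log sigma^2 - 1)."""
    kl = 0.5 * sum(m * m + math.exp(lv) - lv - 1.0
                   for m, lv in zip(mu, log_var))
    elbo = recon_log_prob - kl
    # Dictionary of monitored values, as described in the issue.
    return {"elbo": elbo, "kl": kl, "loss": -elbo}

monitor_functions = elbo_terms(mu=[0.0, 0.0], log_var=[0.0, 0.0],
                               recon_log_prob=-1.5)
# Passing the whole dictionary to minimize() would fail; the scalar
# loss must be selected first:
loss_op = monitor_functions["loss"]
```

With mu = 0 and log sigma^2 = 0 the posterior equals the prior, so the KL term is exactly zero and the loss reduces to the negative reconstruction log-probability.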