
Implement ML tasks #21

Open
Justinezgh opened this issue Feb 16, 2022 · 13 comments
Justinezgh commented Feb 16, 2022

Tasks (from https://arxiv.org/pdf/2101.04653.pdf):

  • Gaussian 2D (✔️)

image

  • Gaussian mixture(✔️)

image

  • Two Moons (✔️, but not 100% sure)

image

  • Lotka-Volterra (⌛)

image
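For reference, the Lotka-Volterra task simulates a predator-prey ODE with parameters theta = (alpha, beta, gamma, delta). A minimal SciPy sketch (the initial conditions and time grid here are illustrative defaults, not the benchmark's settings, and the benchmark's log-normal observation noise is omitted):

```python
import numpy as np
from scipy.integrate import odeint

def lotka_volterra(theta, z0=(10.0, 5.0), t_max=20.0, n_steps=50):
    """Integrate the predator-prey ODE for theta = (alpha, beta, gamma, delta)."""
    alpha, beta, gamma, delta = theta

    def dz_dt(z, t):
        prey, pred = z
        return [alpha * prey - beta * prey * pred,
                -gamma * pred + delta * prey * pred]

    t = np.linspace(0.0, t_max, n_steps)
    return odeint(dz_dt, z0, t)  # shape (n_steps, 2): prey and predator counts
```

The benchmark version observes the trajectory through log-normal noise and places priors on theta; this sketch only integrates the deterministic system.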

Justinezgh (Contributor, Author):

notebook : here

Justinezgh (Contributor, Author):

For the Two Moons task, should I use dist.TransformedDistribution(distribution, transform) from numpyro to get the simulated x?
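For what it's worth, the Two Moons simulator from the benchmark paper can also be written directly, without TransformedDistribution. A NumPy sketch (constants taken from the benchmark's description; worth double-checking against the reference implementation):

```python
import numpy as np

def two_moons_simulator(theta, rng=None):
    """Two Moons simulator (Lueckmann et al. 2021, arXiv:2101.04653).

    theta: array-like of shape (2,).
    """
    rng = np.random.default_rng() if rng is None else rng
    a = rng.uniform(-np.pi / 2, np.pi / 2)   # angle along the crescent
    r = rng.normal(0.1, 0.01)                # radial scatter
    p = np.array([r * np.cos(a) + 0.25, r * np.sin(a)])
    # shift/rotate the crescent as a function of theta
    return p + np.array([-np.abs(theta[0] + theta[1]),
                         -theta[0] + theta[1]]) / np.sqrt(2)
```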

EiffL (Contributor) commented Feb 18, 2022:

Hmm, you don't necessarily need numpyro for anything here; you can do everything in TFP.

Justinezgh (Contributor, Author):

I'm not sure I understand. Can we get p(theta|x,z) with TFP?

EiffL (Contributor) commented Feb 18, 2022:

Hmm, I'm not sure I understand either ^^ numpyro doesn't give you p(theta|x,z) either?

Justinezgh (Contributor, Author):

ok I'm lost ^^

Justinezgh (Contributor, Author):

Lotka-Volterra: notebook

I still have a problem with vmap() or Independent(): when I increase the number of time steps or the batch_size, the score increases too. But the observations seem OK:
image
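One possible explanation for the score growing with the number of time steps: if the trajectory likelihood is built from a base distribution without wrapping it in tfd.Independent (with reinterpreted_batch_ndims covering the time axis), log_prob returns one value per time step instead of one scalar per trajectory, and anything that sums or broadcasts those values will scale with T. A pure-NumPy illustration of the intended reduction (the helper function below is just for illustration):

```python
import numpy as np

def normal_logpdf(x, loc=0.0, scale=1.0):
    """Element-wise log-density of a Gaussian."""
    return -0.5 * ((x - loc) / scale) ** 2 - np.log(scale * np.sqrt(2.0 * np.pi))

T = 50
x = np.zeros(T)  # one simulated trajectory with T time steps

per_step = normal_logpdf(x)  # shape (T,): a log-prob per time step
joint = per_step.sum()       # scalar: log-prob of the whole trajectory,
                             # which is what tfd.Independent(base,
                             # reinterpreted_batch_ndims=1).log_prob(x) returns
```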

EiffL (Contributor) commented Feb 24, 2022:

Cool cool cool :-D

Justinezgh (Contributor, Author):

image

from: https://arxiv.org/abs/1905.07488

Justinezgh (Contributor, Author):

I learned the Lotka-Volterra posterior with two different compressors: one trained with MSE and one with VMIM.
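For context, the two training objectives can be sketched as follows: the MSE compressor regresses theta directly from the summary, while VMIM (variational mutual information maximization) fits a conditional density q(theta | t(x)) and minimizes its negative log-likelihood. A NumPy sketch with a diagonal-Gaussian q (the networks producing t(x), mu, and log_sigma are omitted; the function names here are illustrative):

```python
import numpy as np

def mse_loss(t_x, theta):
    """Regression compressor: train the summary t(x) to predict theta."""
    return np.mean((t_x - theta) ** 2)

def vmim_loss(theta, mu, log_sigma):
    """VMIM objective with a diagonal-Gaussian variational posterior:
    minimize -E[log q(theta | t(x))], where (mu, log_sigma) are produced
    by a network conditioned on the summary t(x)."""
    return np.mean(0.5 * ((theta - mu) / np.exp(log_sigma)) ** 2
                   + log_sigma + 0.5 * np.log(2.0 * np.pi))
```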

EiffL (Contributor) commented Mar 25, 2022:

That's awesome! Very nice! So the regression compressor doesn't get any better than that, no matter what you do?

EiffL (Contributor) commented Mar 25, 2022:

I'll just ping @dlanzieri about this, because this is exactly the sort of thing we are interested in regarding this compression question.

Justinezgh (Contributor, Author):

You can probably do better ^^ but the regression doesn't seem that bad (orange points are where jnp.abs(reg[:, j] - p[:, j]) > 0.05, with p the true parameters and reg the predictions):
image
image
image
image
