diff --git a/examples/Computing log determinants.md b/examples/Computing log determinants.md
new file mode 100644
index 0000000..17e236c
--- /dev/null
+++ b/examples/Computing log determinants.md
@@ -0,0 +1,37 @@
+# Computing the log determinant of the Jacobian
+
+We show how to compute and retrieve the log determinant of the Jacobian of a bijective transformation.
+We use Real NVP as an example, but you can replace it with any other bijection from `normalizing_flows.bijections`.
+The code is as follows:
+
+```python
+import torch
+from normalizing_flows import Flow
+from normalizing_flows.bijections import RealNVP
+
+torch.manual_seed(0)
+
+batch_shape = (5, 7)
+event_shape = (2, 3)
+x = torch.randn(size=(*batch_shape, *event_shape))
+z = torch.randn(size=(*batch_shape, *event_shape))
+
+bijection = RealNVP(event_shape=event_shape)
+flow = Flow(bijection)
+
+_, log_det_forward = flow.bijection.forward(x)
+# log_det_forward.shape == batch_shape
+
+_, log_det_inverse = flow.bijection.inverse(z)
+# log_det_inverse.shape == batch_shape
+```
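+
+Since `inverse` undoes `forward`, the log determinant of the inverse at `forward(x)` equals the negative of the forward log determinant at `x`.
+A minimal sanity check, assuming the inverse is numerically exact (as it is for Real NVP's coupling layers):
+
+```python
+# Map x forward, then map the result back; the two log determinants should cancel
+z_fwd, log_det_forward = flow.bijection.forward(x)
+x_rec, log_det_inverse = flow.bijection.inverse(z_fwd)
+assert torch.allclose(log_det_forward, -log_det_inverse, atol=1e-4)
+```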
diff --git a/examples/Modifying architectures.md b/examples/Modifying architectures.md
new file mode 100644
index 0000000..aec3d68
--- /dev/null
+++ b/examples/Modifying architectures.md
@@ -0,0 +1,37 @@
+# Creating and modifying bijection architectures
+
+We show how to modify a bijection's architecture.
+We use the Masked Autoregressive Flow (MAF) as an example.
+We can manually set the number of invertible layers as follows:
+```python
+from normalizing_flows.bijections import MAF
+
+event_shape = (10,)
+bijection = MAF(event_shape=event_shape, n_layers=5)
+```
+
+For more fine-grained changes, we can create individual invertible layers and combine them into a bijection.
+MAF uses affine masked autoregressive layers with permutations in between.
+We can import these layers and set their parameters as desired.
+For example, to change the number of layers and the hidden layer sizes in each MAF conditioner, we proceed as follows:
+```python
+from normalizing_flows import Flow
+from normalizing_flows.bijections import BijectiveComposition
+from normalizing_flows.bijections.finite.autoregressive.layers import AffineForwardMaskedAutoregressive
+from normalizing_flows.bijections.finite.linear import ReversePermutation
+
+event_shape = (10,)
+bijection = BijectiveComposition(
+    event_shape=event_shape,
+    layers=[
+        AffineForwardMaskedAutoregressive(event_shape=event_shape, n_layers=4, n_hidden=20),
+        ReversePermutation(event_shape=event_shape),
+        AffineForwardMaskedAutoregressive(event_shape=event_shape, n_layers=3, n_hidden=7),
+        ReversePermutation(event_shape=event_shape),
+        AffineForwardMaskedAutoregressive(event_shape=event_shape, n_layers=5, n_hidden=13)
+    ]
+)
+
+# The composed bijection can be wrapped into a trainable Flow like any built-in bijection
+flow = Flow(bijection)
+```
diff --git a/examples/README.md b/examples/README.md
deleted file mode 100644
index 9cadc02..0000000
--- a/examples/README.md
+++ /dev/null
@@ -1,46 +0,0 @@
-# Examples
-
-We provide minimal working examples on how to perform various common tasks with normalizing flows.
-We use Real NVP as an example, but you can replace it with any other bijection from `normalizing_flows.bijections`.
-
-## Training a normalizing flow on a fixed dataset
-```python
-import torch
-from normalizing_flows import Flow
-from normalizing_flows.bijections import RealNVP
-
-torch.manual_seed(0)
-
-# We support arbitrary event and batch shapes
-event_shape = (2, 3)
-batch_shape = (5, 7)
-x_train = torch.randn(size=(*batch_shape, *event_shape))
-
-bijection = RealNVP(event_shape=event_shape)
-flow = Flow(bijection)
-
-flow.fit(x_train, show_progress=True)
-```
-
-## Computing the log determinant of the Jacobian transformation given a Flow
-```python
-import torch
-from normalizing_flows import Flow
-from normalizing_flows.bijections import RealNVP
-
-torch.manual_seed(0)
-
-batch_shape = (5, 7)
-event_shape = (2, 3)
-x = torch.randn(size=(*batch_shape, *event_shape))
-z = torch.randn(size=(*batch_shape, *event_shape))
-
-bijection = RealNVP(event_shape=event_shape)
-flow = Flow(bijection)
-
-_, log_det_forward = flow.bijection.forward(x)
-# log_det_forward.shape == batch_shape
-
-_, log_det_inverse = flow.bijection.inverse(z)
-# log_det_inverse.shape == batch_shape
-```
\ No newline at end of file
diff --git a/examples/Training a normalizing flow.md b/examples/Training a normalizing flow.md
new file mode 100644
index 0000000..1078bfc
--- /dev/null
+++ b/examples/Training a normalizing flow.md
@@ -0,0 +1,41 @@
+# Training normalizing flow models on a dataset
+
+We demonstrate how to train a normalizing flow on a dataset.
+We use Real NVP as an example, but you can replace it with any other bijection from `normalizing_flows.bijections`.
+The code is as follows:
+
+```python
+import torch
+from normalizing_flows import Flow
+from normalizing_flows.bijections import RealNVP
+
+torch.manual_seed(0)
+
+# We support arbitrary event and batch shapes
+event_shape = (2, 3)
+batch_shape = (5, 7)
+x_train = torch.randn(size=(*batch_shape, *event_shape))
+
+bijection = RealNVP(event_shape=event_shape)
+flow = Flow(bijection)
+
+flow.fit(x_train, show_progress=True)
+```
+
+To modify the learning rate, use the `lr` keyword argument in `flow.fit(...)`:
+
+```python
+flow.fit(x_train, show_progress=True, lr=0.001)
+```
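+
+Once trained, the flow can generate new data by drawing base samples and mapping them through the inverse bijection.
+A minimal sketch, assuming the base distribution is a standard normal:
+
+```python
+# Draw 100 samples from the (assumed) standard normal base distribution
+z = torch.randn(size=(100, *event_shape))
+
+# Map base samples through the inverse bijection into data space
+x_new, _ = flow.bijection.inverse(z)
+# x_new.shape == (100, 2, 3)
+```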