Widening the scope of the package and dropping support for batching #214

Merged: 108 commits, Feb 1, 2023

Commits (108)
1d9a1a3
renamed rv to result in forward
torfjelde Jun 5, 2021
0717e3e
added abstrac type Transform and removed dimensionality from Bijector
torfjelde Jun 5, 2021
81a2ed6
updated Composed to new interface
torfjelde Jun 5, 2021
251ab9c
updated Exp and Log to new interface
torfjelde Jun 5, 2021
f1ef968
updated Logit to new interface
torfjelde Jun 5, 2021
45ff364
removed something that shouldnt be there
torfjelde Jun 5, 2021
eb94e00
removed false statement in docstring of Transform
torfjelde Jun 5, 2021
0d8783f
fixed a typo in implementation of logabsdetjac_batch
torfjelde Jun 5, 2021
8f9988e
added types for representing batches
torfjelde Jun 5, 2021
9fa37d1
make it possible to use broadcasting for working with batches
torfjelde Jun 5, 2021
168dd43
updated SimplexBijector to new interface, I think
torfjelde Jun 5, 2021
d44cf42
updated PDBijector to new interface
torfjelde Jun 5, 2021
c719b07
use transform_batch rather than broadcasting
torfjelde Jun 5, 2021
0962f06
Merge branch 'master' into tor/rewrite
torfjelde Jun 5, 2021
0a62e96
added default implementations for batches
torfjelde Jun 5, 2021
21a66ab
updated ADBijector to new interface
torfjelde Jun 5, 2021
0b04d14
updated CorrBijector to new interface
torfjelde Jun 5, 2021
2cfd24b
updated Coupling to new interface
torfjelde Jun 5, 2021
c272cd4
updated LeakyReLU to new interface
torfjelde Jun 5, 2021
5e2a585
updated NamedBijector to new interface
torfjelde Jun 5, 2021
8e41a50
updated BatchNormalisation to new interface
torfjelde Jun 5, 2021
9f45b16
updated Permute to new interface
torfjelde Jun 5, 2021
f793dad
updated PlanarLayer to new interface
torfjelde Jun 5, 2021
19d1ef1
updated RadialLayer to new interface
torfjelde Jun 5, 2021
5015551
updated RationalQuadraticSpline to new interface
torfjelde Jun 5, 2021
3b75526
updated Scale to new interface
torfjelde Jun 5, 2021
195a107
updated Shift to new interface
torfjelde Jun 5, 2021
f56ed6a
updated Stacked to new interface
torfjelde Jun 5, 2021
72d68b8
updated TruncatedBijector to new interface
torfjelde Jun 5, 2021
214aa92
added ConstructionBase as dependency
torfjelde Jun 5, 2021
836e152
fixed a bunch of small typos and errors from previous commits
torfjelde Jun 5, 2021
ff6b756
forgot to wrap some in Batch
torfjelde Jun 5, 2021
989aaa8
allow inverses of non-bijectors
torfjelde Jun 6, 2021
4d23882
relax definition of VectorBatch so Vector{<:Real} is covered
torfjelde Jun 6, 2021
0f9d334
just perform invertibility check in Inverse rather than inv
torfjelde Jun 6, 2021
af0b24b
moved some code arround
torfjelde Jun 6, 2021
0777fab
added docstrings and default impls for mutating batched methods
torfjelde Jun 6, 2021
42839c3
add elementype to VectorBatch
torfjelde Jun 6, 2021
2ce74d4
simplify Shift bijector
torfjelde Jun 6, 2021
926ef27
added rrules for logabsdetjac_shift
torfjelde Jun 6, 2021
c3745d7
use type-stable implementation of eachslice
torfjelde Jun 6, 2021
1600986
initial work on adding proper testing
torfjelde Jun 6, 2021
2f4d328
make Batch compatible with Zygote
torfjelde Jun 6, 2021
6da2498
Merge branch 'master' into tor/rewrite
torfjelde Aug 1, 2021
e5439f5
updated OrderedBijector
torfjelde Aug 1, 2021
dbf06d9
Merge branch 'master' into tor/rewrite
torfjelde Dec 25, 2021
5681358
temporary stuff
torfjelde Jan 24, 2022
306aa66
added docs
torfjelde Jan 24, 2022
8cd371a
removed all batch related functionality
torfjelde Feb 9, 2022
190f1a5
move bijectors over to with_logabsdet_jacobian and drop official batc…
torfjelde Feb 11, 2022
52c8ed7
updated compat
torfjelde Feb 11, 2022
99a421e
updated tests
torfjelde Feb 11, 2022
a28c9b1
updated docs
torfjelde Feb 11, 2022
b553b08
removed reundndat dep
torfjelde Feb 11, 2022
77fbdb6
remove batch
torfjelde Feb 11, 2022
3e8d65d
remove redundant defs of transform
torfjelde Feb 11, 2022
fa469b8
removed unnecessary impls of with_logabsdet_jacobian
torfjelde Feb 11, 2022
0a5d55e
remove usage of Exp and Log in tests
torfjelde Feb 11, 2022
b4703c5
Merge branch 'master' into tor/write-without-batch
torfjelde Jul 18, 2022
d63c07e
fixed docs
torfjelde Jul 19, 2022
c00b9f2
added bijectors with docs to docs
torfjelde Jul 19, 2022
0a0858d
small change to docs
torfjelde Jul 20, 2022
a53f971
fixed bug in computation of logabsdetjac of truncated
torfjelde Jul 20, 2022
99765f3
bump minor version
torfjelde Jul 20, 2022
21d21fc
run GH actions on Julia 1.6, which is the new LTS, instead of 1.3
torfjelde Aug 18, 2022
34bb350
added Github actions for making docs, etc.
torfjelde Aug 18, 2022
c5046f5
removed left-overs from batch impls
torfjelde Aug 18, 2022
7956d05
removed redundant comment
torfjelde Aug 18, 2022
82f8ba8
dont return NamedTuple from with_logabsdet_jacobian
torfjelde Aug 18, 2022
313d533
Merge branch 'tor/update-github-actions' into tor/write-without-batch
torfjelde Aug 19, 2022
d5d2274
remove unnused methods
torfjelde Aug 19, 2022
0c2d829
Merge branch 'master' into tor/write-without-batch
torfjelde Aug 24, 2022
39746db
remove old deprecation warnings
torfjelde Aug 24, 2022
859a6ba
fix exports
torfjelde Aug 24, 2022
0f5f9f1
updated tests for deprecations
torfjelde Aug 24, 2022
6717172
completed some random TODOs
torfjelde Aug 24, 2022
d444a3a
Merge branch 'master' into tor/write-without-batch
torfjelde Oct 5, 2022
fe2b5e9
fix SimplexBijector tests
torfjelde Oct 6, 2022
a9be9c9
removed whitespace
torfjelde Oct 6, 2022
a30d56d
made some docstrings into doctests
torfjelde Oct 7, 2022
c0f11f4
removed unnused method
torfjelde Oct 7, 2022
5397d33
improved show for scale and shift
torfjelde Oct 7, 2022
901a6ef
converted example for Coupling into doctest
torfjelde Oct 7, 2022
557f826
added reference to Coupling bijector for NamedCoupling
torfjelde Oct 7, 2022
92651e8
fixed docstring
torfjelde Oct 7, 2022
3719f33
fixed documentation setup
torfjelde Oct 7, 2022
f26d5a6
nvm, now I fixed documentation setup
torfjelde Oct 7, 2022
ad3ecc9
removed references to dimensionality in code
torfjelde Oct 7, 2022
95ff4b6
fixed typo
torfjelde Oct 7, 2022
c8c3bdc
add impl of invertible for Elementwise
torfjelde Oct 7, 2022
43d204f
added transforms and distributions as separate pages in docs
torfjelde Oct 9, 2022
29a0b59
removed all the unnecessary stuff in README
torfjelde Oct 10, 2022
871874e
added examples to docs
torfjelde Oct 10, 2022
84d6863
added some show methods for certain bijectors
torfjelde Oct 10, 2022
80bea94
added compat entries to docs
torfjelde Oct 10, 2022
e0e1792
updated docstring for RationalQuadraticSpline
torfjelde Oct 10, 2022
fc47633
removed commented code
torfjelde Oct 10, 2022
f0bc08b
Merge branch 'master' into tor/write-without-batch
torfjelde Oct 10, 2022
161b6f1
remove reference to logpdf_forward
torfjelde Oct 10, 2022
ecb54d3
remove enforcement of type of input and output being the same in tests
torfjelde Oct 11, 2022
7450eeb
Merge branch 'master' into tor/write-without-batch
yebai Jan 7, 2023
269f11a
make logpdf_with_trans compatible with logpdf when it comes to
torfjelde Jan 31, 2023
0e66720
Merge branch 'tor/write-without-batch' of github.com:TuringLang/Bijec…
torfjelde Jan 31, 2023
09fe97d
Apply suggestions from code review
torfjelde Feb 1, 2023
de849ee
remove usage of invertible, etc. and use InverseFunctions.NoInverse i…
torfjelde Feb 1, 2023
2ed35b3
specialze transform on Function
torfjelde Feb 1, 2023
6ba5a0b
removed unnecessary show and deprecation warnings
torfjelde Feb 1, 2023
fb979a7
remove references to Log and Exp
torfjelde Feb 1, 2023
26 changes: 26 additions & 0 deletions .github/workflows/DocsPreviewCleanup.yml
@@ -0,0 +1,26 @@
name: DocsPreviewCleanup

on:
  pull_request:
    types: [closed]

jobs:
  cleanup:
    runs-on: ubuntu-latest
    steps:
      - name: Checkout gh-pages branch
        uses: actions/checkout@v2
        with:
          ref: gh-pages
      - name: Delete preview and history + push changes
        run: |
          if [ -d "previews/PR$PRNUM" ]; then
              git config user.name "Documenter.jl"
              git config user.email "[email protected]"
              git rm -rf "previews/PR$PRNUM"
              git commit -m "delete preview"
              git branch gh-pages-new $(echo "delete history" | git commit-tree HEAD^{tree})
              git push --force origin gh-pages-new:gh-pages
          fi
        env:
          PRNUM: ${{ github.event.number }}
2 changes: 1 addition & 1 deletion Project.toml
@@ -1,6 +1,6 @@
name = "Bijectors"
uuid = "76274a88-744f-5084-9051-94815aaf08c4"
version = "0.10.6"
version = "0.11.0"

[deps]
ArgCheck = "dce04be8-c92d-5529-be00-80e4d2c0e197"
252 changes: 1 addition & 251 deletions README.md
@@ -1,5 +1,6 @@
# Bijectors.jl

[![Stable](https://img.shields.io/badge/docs-stable-blue.svg)](https://turinglang.github.io/Bijectors.jl/stable)
[![Interface tests](https://github.com/TuringLang/Bijectors.jl/workflows/Interface%20tests/badge.svg?branch=master)](https://github.com/TuringLang/Bijectors.jl/actions?query=workflow%3A%22Interface+tests%22+branch%3Amaster)
[![AD tests](https://github.com/TuringLang/Bijectors.jl/workflows/AD%20tests/badge.svg?branch=master)](https://github.com/TuringLang/Bijectors.jl/actions?query=workflow%3A%22AD+tests%22+branch%3Amaster)

@@ -135,19 +136,6 @@

Pretty neat, huh? `Inverse{Logit}` is also a `Bijector` where we've defined `(ib::Inverse{<:Logit})(y)` as the inverse transformation of `(b::Logit)(x)`. Note that it's not always the case that `inverse(b) isa Inverse`, e.g. the inverse of `Exp` is simply `Log` so `inverse(Exp()) isa Log` is true.
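
A quick round-trip check of the property above (a hedged illustration; `Logit` is qualified since it may not be exported):

```julia
using Bijectors

b = Bijectors.Logit(0.0, 1.0)
ib = inverse(b)

ib isa Bijectors.Inverse  # true: the inverse is represented by the `Inverse` wrapper
ib(b(0.3)) ≈ 0.3          # true: applying the inverse undoes the transform
```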

#### Dimensionality
One more thing. See the `0` in `Inverse{Logit{Float64}, 0}`? It represents the *dimensionality* of the bijector, in the same sense as for an `AbstractArray`, with the exception that `0` means the bijector expects 0-dim input and output, i.e. `<:Real`. This can also be accessed through `dimension(b)`:

```julia
julia> Bijectors.dimension(b)
0

julia> Bijectors.dimension(Exp{1}())
1
```

In most cases specification of the dimensionality is unnecessary as a `Bijector{N}` is usually only defined for a particular value of `N`, e.g. `Logit isa Bijector{0}` since it only makes sense to apply `Logit` to a real number (or a vector of reals if you're doing batch-computation). As a user, you'll rarely have to deal with this dimensionality specification. Unfortunately there are exceptions, e.g. `Exp`, which can be applied to both real numbers and a vector of real numbers, in both cases treating it as a single input. This means that when `Exp` receives a vector `x` as input, it's ambiguous whether to treat `x` as a *batch* of 0-dim inputs or as a single 1-dim input. As a result, to support batch-computation it is necessary to know the expected dimensionality of the input and output. Notice that we assume the dimensionality of the input and output to be the *same*. This is a reasonable assumption considering we're working with *bijections*.
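
A sketch of this ambiguity (hedged, using the old dimensioned interface that this PR removes):

```julia
using Bijectors

x = rand(3)

# With `Exp{0}`, `x` is interpreted as a batch of three 0-dim inputs,
# so we get one log-jacobian term per element.
logabsdetjac(Exp{0}(), x)

# With `Exp{1}`, `x` is interpreted as a single 1-dim input,
# so we get a single scalar log-jacobian.
logabsdetjac(Exp{1}(), x)
```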

#### Composition
Also, we can _compose_ bijectors:

@@ -491,244 +479,6 @@ julia> x, y, logjac, logpdf_y = forward(flow) # sample + transform and returns a
This method is useful, for example, when computing quantities such as the _evidence lower bound (ELBO)_ between this transformed distribution and some other joint density. If no analytical expression is available, we have to approximate the ELBO by a Monte Carlo estimate. But one term in the ELBO is the entropy of the base density, which we _do_ know analytically in this case. Using the analytical expression for the entropy and a Monte Carlo estimate for the rest of the terms in the ELBO gives an estimate with lower variance than if we used a Monte Carlo estimate for the entire expectation.
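
A rough sketch of such an estimator (hedged; assumes `flow isa TransformedDistribution` with a multivariate base distribution, and `logjoint` is a user-supplied log-density of the target):

```julia
using Bijectors
using Statistics: mean
using Distributions: entropy

function elbo_estimate(flow, logjoint; nsamples = 10)
    b, dist = flow.transform, flow.dist
    xs = rand(dist, nsamples)                      # samples from the base distribution (one per column)
    mc_terms = map(eachcol(xs)) do x
        y, logjac = with_logabsdet_jacobian(b, x)  # transformed sample and log |det J|
        logjoint(y) + logjac                       # Monte Carlo part of the ELBO
    end
    return mean(mc_terms) + entropy(dist)          # add the analytically known base entropy
end
```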


### Normalizing flows with bounded support


## Implementing your own `Bijector`
There are mainly two ways you can implement your own `Bijector`, and which way you choose depends on the following question: are you bothered enough to manually implement `logabsdetjac`? If the answer is "Yup!", then you subtype from `Bijector`; if "Naaaah", then you subtype `ADBijector`.

### `<:Bijector`
Here's a simple example taken from the source code, the `Identity`:

```julia
import Bijectors: logabsdetjac

struct Identity{N} <: Bijector{N} end
(::Identity)(x) = x # transform itself, "forward"
(::Inverse{<: Identity})(y) = y # inverse transform, "backward"

# see the proper implementation for `logabsdetjac` in general
logabsdetjac(::Identity{0}, y::Real) = zero(eltype(y)) # ∂ₓid(x) = ∂ₓ x = 1 → log(abs(1)) = log(1) = 0
```

A slightly more complex example is `Logit`:

```julia
using LogExpFunctions: logit, logistic

struct Logit{T<:Real} <: Bijector{0}
a::T
b::T
end

(b::Logit)(x::Real) = logit((x - b.a) / (b.b - b.a))
(b::Logit)(x) = map(b, x)
# `orig` contains the `Bijector` which was inverted
(ib::Inverse{<:Logit})(y::Real) = (ib.orig.b - ib.orig.a) * logistic(y) + ib.orig.a
(ib::Inverse{<:Logit})(y) = map(ib, y)

logabsdetjac(b::Logit, x::Real) = - log((x - b.a) * (b.b - x) / (b.b - b.a))
logabsdetjac(b::Logit, x) = map(Base.Fix1(logabsdetjac, b), x)
```

(Batch computation is not yet fully supported by all bijectors (see issue #35), but is actively being worked on. In the particular case of `Logit`, only one interpretation makes sense: elementwise application. Therefore the array methods above simply `map` over the input, so this works for any `AbstractArray{<:Real}`.)

Then

```julia
julia> b = Logit(0.0, 1.0)
Logit{Float64}(0.0, 1.0)

julia> y = b(0.6)
0.4054651081081642

julia> inverse(b)(y)
0.6

julia> logabsdetjac(b, 0.6)
1.4271163556401458

julia> logabsdetjac(inverse(b), y) # defaults to `- logabsdetjac(b, inverse(b)(y))`
-1.4271163556401458

julia> with_logabsdet_jacobian(b, 0.6) # defaults to `(b(x), logabsdetjac(b, x))`
(0.4054651081081642, 1.4271163556401458)
```

For further efficiency, one could manually implement `with_logabsdet_jacobian(b::Logit, x)`:

```julia
julia> using Bijectors: Logit

julia> import Bijectors: with_logabsdet_jacobian

julia> function with_logabsdet_jacobian(b::Logit{<:Real}, x)
totally_worth_saving = @. (x - b.a) / (b.b - b.a) # spoiler: it's probably not
y = logit.(totally_worth_saving)
logjac = @. - log((b.b - x) * totally_worth_saving)
return (y, logjac)
end
with_logabsdet_jacobian (generic function with 16 methods)

julia> with_logabsdet_jacobian(b, 0.6)
(0.4054651081081642, 1.4271163556401458)

julia> @which with_logabsdet_jacobian(b, 0.6)
with_logabsdet_jacobian(b::Logit{#s4} where #s4<:Real, x) in Main at REPL[43]:2
```

As you can see it's a very contrived example, but you get the idea.

### `<:ADBijector`

We could also have implemented `Logit` as an `ADBijector`:

```julia
using LogExpFunctions: logit, logistic
using Bijectors: ADBackend

struct ADLogit{T, AD} <: ADBijector{AD, 0}
a::T
b::T
end

# ADBackend() returns ForwardDiffAD, which means we use ForwardDiff.jl for AD
ADLogit(a::T, b::T) where {T<:Real} = ADLogit{T, ADBackend()}(a, b)

(b::ADLogit)(x) = @. logit((x - b.a) / (b.b - b.a))
(ib::Inverse{<:ADLogit{<:Real}})(y) = @. (ib.orig.b - ib.orig.a) * logistic(y) + ib.orig.a
```

No implementation of `logabsdetjac`, but:

```julia
julia> b_ad = ADLogit(0.0, 1.0)
ADLogit{Float64,Bijectors.ForwardDiffAD}(0.0, 1.0)

julia> logabsdetjac(b_ad, 0.6)
1.4271163556401458

julia> y = b_ad(0.6)
0.4054651081081642

julia> inverse(b_ad)(y)
0.6

julia> logabsdetjac(inverse(b_ad), y)
-1.4271163556401458
```

Neat! And just to verify that everything works:

```julia
julia> b = Logit(0.0, 1.0)
Logit{Float64}(0.0, 1.0)

julia> logabsdetjac(b, 0.6)
1.4271163556401458

julia> logabsdetjac(b_ad, 0.6) ≈ logabsdetjac(b, 0.6)
true
```

We can also use Tracker.jl for the AD, rather than ForwardDiff.jl:

```julia
julia> Bijectors.setadbackend(:tracker)
:tracker

julia> b_ad = ADLogit(0.0, 1.0)
ADLogit{Float64,Bijectors.TrackerAD}(0.0, 1.0)

julia> logabsdetjac(b_ad, 0.6)
1.4271163556401458
```


### Reference
Most of the methods and types mentioned below have docstrings with more elaborate explanations and examples, e.g.
```julia
help?> Bijectors.Composed
Composed(ts::A)

∘(b1::Bijector{N}, b2::Bijector{N})::Composed{<:Tuple}
composel(ts::Bijector{N}...)::Composed{<:Tuple}
composer(ts::Bijector{N}...)::Composed{<:Tuple}

where A refers to either

• Tuple{Vararg{<:Bijector{N}}}: a tuple of bijectors of dimensionality N

• AbstractArray{<:Bijector{N}}: an array of bijectors of dimensionality N

A Bijector representing composition of bijectors. composel and composer results in a Composed for which application occurs from left-to-right and right-to-left, respectively.

Note that all the alternative ways of constructing a Composed returns a Tuple of bijectors. This ensures type-stability of implementations of all relating methods, e.g. inverse.

If you want to use an Array as the container instead you can do

Composed([b1, b2, ...])

In general this is not advised since you lose type-stability, but there might be cases where this is desired, e.g. if you have a insanely large number of bijectors to compose.

Examples
≡≡≡≡≡≡≡≡≡≡

It's important to note that ∘ does what is expected mathematically, which means that the bijectors are applied to the input right-to-left, e.g. first applying b2 and then b1:

(b1 ∘ b2)(x) == b1(b2(x)) # => true

But in the Composed struct itself, we store the bijectors left-to-right, so that

cb1 = b1 ∘ b2 # => Composed.ts == (b2, b1)
cb2 = composel(b2, b1) # => Composed.ts == (b2, b1)
cb1(x) == cb2(x) == b1(b2(x)) # => true
```
If anything is lacking or not clear in docstrings, feel free to open an issue or PR.

#### Types
The following are the bijectors available:
- Abstract:
- `Bijector`: super-type of all bijectors.
- `ADBijector{AD} <: Bijector`: subtypes of this only require the user to implement `(b::UserBijector)(x)` and `(ib::Inverse{<:UserBijector})(y)`. Automatic differentiation will be used to compute the `jacobian(b, x)` and thus `logabsdetjac(b, x)`.
- Concrete:
- `Composed`: represents a composition of bijectors.
- `Stacked`: stacks univariate and multivariate bijectors
- `Identity`: does what it says, i.e. nothing.
- `Logit`
- `Exp`
- `Log`
- `Scale`: scaling by scalar value, though at the moment only well-defined `logabsdetjac` for univariate.
- `Shift`: shifts by a scalar value.
- `Permute`: permutes the input array using matrix multiplication
- `SimplexBijector`: mostly used as the constrained-to-unconstrained bijector for `SimplexDistribution`, e.g. `Dirichlet`.
- `PlanarLayer`: §4.1 Eq. (10) in [1]
- `RadialLayer`: §4.1 Eq. (14) in [1]

The distribution interface consists of:
- `TransformedDistribution <: Distribution`: implements the `Distribution` interface from Distributions.jl. This means `rand` and `logpdf` are provided at the moment.

#### Methods
The following methods are implemented by all subtypes of `Bijector`, including composed bijectors such as `Composed`; a short usage sketch follows the list.
- `(b::Bijector)(x)`: implements the transform of the `Bijector`
- `inverse(b::Bijector)`: returns the inverse of `b`, i.e. `ib::Bijector` s.t. `(ib ∘ b)(x) ≈ x`. In most cases this is `Inverse{<:Bijector}`.
- `logabsdetjac(b::Bijector, x)`: computes log(abs(det(jacobian(b, x)))).
- `with_logabsdet_jacobian(b::Bijector, x)`: returns the tuple `(b(x), logabsdetjac(b, x))` in the most efficient manner.
- `∘`, `composel`, `composer`: convenient and type-safe constructors for `Composed`. `composel(bs...)` composes s.t. the resulting composition is evaluated left-to-right, while `composer(bs...)` is evaluated right-to-left. `∘` is right-to-left, as expected from standard mathematical notation.
- `jacobian(b::Bijector, x)` [OPTIONAL]: returns the Jacobian of the transformation. In some cases the analytical Jacobian has been implemented for efficiency.
- `dimension(b::Bijector)`: returns the dimensionality of `b`.
- `isclosedform(b::Bijector)`: returns `true` or `false` depending on whether or not `b(x)` has a closed-form implementation.
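
A minimal sketch exercising these methods (hedged; it uses `Logit` and `Shift` as in the earlier examples, with names qualified since not all of them may be exported):

```julia
using Bijectors

b = Bijectors.Logit(0.0, 1.0) ∘ Bijectors.Shift(0.1)  # applied right-to-left: `Shift` runs first
x = 0.4
y = b(x)

inverse(b)(y) ≈ x              # the inverse recovers x
logabsdetjac(b, x)             # log(abs(det(jacobian(b, x)))) of the composition
with_logabsdet_jacobian(b, x)  # (b(x), logabsdetjac(b, x)) computed together
```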

For `TransformedDistribution`, together with default implementations for `Distribution`, we have the following methods (a short sketch follows the list):
- `bijector(d::Distribution)`: returns the default constrained-to-unconstrained bijector for `d`
- `transformed(d::Distribution)`, `transformed(d::Distribution, b::Bijector)`: constructs a `TransformedDistribution` from `d` and `b`.
- `logpdf_forward(d::Distribution, x)`, `logpdf_forward(d::Distribution, x, logjac)`: computes the `logpdf(td, td.transform(x))` using the forward pass, which is potentially faster depending on the transform at hand.
- `forward(d::Distribution)`: returns `(x = rand(dist), y = b(x), logabsdetjac = logabsdetjac(b, x), logpdf = logpdf_forward(td, x))` where `b = td.transform`. This combines sampling from base distribution and transforming into one function. The intention is that this entire process should be performed in the most efficient manner, e.g. the `logabsdetjac(b, x)` call might instead be implemented as `- logabsdetjac(inverse(b), b(x))` depending on which is most efficient.
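
A short sketch of this distribution interface (hedged):

```julia
using Bijectors, Distributions

d = Beta(2.0, 2.0)
b = bijector(d)          # default constrained-to-unconstrained bijector for `d`
td = transformed(d, b)   # `TransformedDistribution` over the unconstrained space
y = rand(td)             # sample lives in unconstrained space
logpdf(td, y)            # log-density of the transformed distribution at `y`
```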

# Bibliography
1. Rezende, D. J., & Mohamed, S. (2015). Variational Inference With Normalizing Flows. [arXiv:1505.05770](https://arxiv.org/abs/1505.05770v6).
2. Kucukelbir, A., Tran, D., Ranganath, R., Gelman, A., & Blei, D. M. (2016). Automatic Differentiation Variational Inference. [arXiv:1603.00788](https://arxiv.org/abs/1603.00788v1).
2 changes: 1 addition & 1 deletion docs/make.jl
@@ -8,7 +8,7 @@ makedocs(
sitename = "Bijectors",
format = Documenter.HTML(),
modules = [Bijectors],
pages = ["Home" => "index.md", "Distributions.jl integration" => "distributions.md", "Examples" => "examples.md"],
pages = ["Home" => "index.md", "Transforms" => "transforms.md", "Distributions.jl integration" => "distributions.md", "Examples" => "examples.md"],
strict=false,
checkdocs=:exports,
)