Commit

CompatHelper: bump compat for AbstractPPL to 0.6 for package test, (keep existing compat) (#469)

* Fixed a typo in tutorial (#451)

* CompatHelper: bump compat for Turing to 0.24 for package turing, (keep existing compat) (#450)

This pull request changes the compat entry for the `Turing` package from `0.21` to `0.21, 0.24` for package turing.
This keeps the compat entries for earlier versions.
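For illustration, the resulting entry would look roughly like this (a minimal sketch of Julia's `[compat]` syntax; each comma-separated specifier is an allowed semver range):

```toml
# Sketch of a [compat] section after such a bump. Each comma-separated
# entry denotes an allowed version series, so this accepts both the
# 0.21.x and 0.24.x releases of Turing.
[compat]
Turing = "0.21, 0.24"
```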



Note: I have not tested your package with this new compat entry.
It is your responsibility to make sure that your package tests pass before you merge this pull request.

Co-authored-by: Hong Ge <[email protected]>

* Some minor utility improvements (#452)

This PR does the following:
- Moves `varname_leaves` from `TestUtils` to the main module.
  - It can be very useful in Turing.jl for constructing `Chains` and the like, so I think it's a good idea to make it part of the main module rather than keeping it "hidden" there.
- Makes the default `context` in the constructor of `LogDensityFunction` be `model.context` rather than a new `DynamicPPL.DefaultContext`.
  - The `context` passed to `evaluate!!` overrides the leaf context in `model.context`, so the old default constructor always used `DefaultContext` as the leaf context, even if the `Model` had been `contextualize`d with some other leaf context, e.g. `PriorContext`. This PR fixes that.
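The new behavior can be sketched along the lines of the jldoctest added in this PR (assuming DynamicPPL's `contextualize` and `PriorContext`, both part of the package):

```julia
# Sketch mirroring the docstring example added in this PR: after the
# change, LogDensityFunction defaults to the model's own context.
using Distributions
using DynamicPPL
using DynamicPPL: LogDensityFunction, contextualize
using LogDensityProblems

@model demo() = m ~ Normal()

# Wrap the model so its leaf context is PriorContext.
model = contextualize(demo(), DynamicPPL.PriorContext())

# Previously this silently evaluated under DefaultContext; now it
# respects model.context.
f = LogDensityFunction(model)

# The prior log density at m = 0 matches logpdf(Normal(), 0.0).
LogDensityProblems.logdensity(f, [0.0]) == logpdf(Normal(), 0.0)  # true
```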

* Always run CI  (#453)

I find the current `bors` workflow a bit tedious. Most of the time, I summon `bors` just to see the CI results (see e.g. #438). Given that most CI tests are quick (under 10 minutes), we can simply always run them by default.

The most time-consuming job, `IntegrationTests`, is still run only by `bors` to avoid excessive CI usage.
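Concretely, this amounts to adding a `pull_request` trigger alongside the existing `push` trigger in the workflow file (a sketch using the standard GitHub Actions schema; branch names taken from the visible diff context):

```yaml
# Sketch of the resulting CI triggers: keep pushes to the bors branch
# `trying` and to `master`, and additionally run on every pull request
# targeting `master`.
on:
  push:
    branches:
      - trying
      - master
  pull_request:
    branches:
      - master
```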

* Compat with new Bijectors.jl (#454)

This PR makes DPPL compatible with the changes to come in TuringLang/Bijectors.jl#214.

Tests are passing locally.

Closes #455. Closes #456.

* Another Bijectors.jl compat bound bump (#457)

* CompatHelper: bump compat for MCMCChains to 6 for package test, (keep existing compat) (#467)

This pull request changes the compat entry for the `MCMCChains` package from `4.0.4, 5` to `4.0.4, 5, 6` for package test.
This keeps the compat entries for earlier versions.



Note: I have not tested your package with this new compat entry.
It is your responsibility to make sure that your package tests pass before you merge this pull request.

Co-authored-by: Hong Ge <[email protected]>

* CompatHelper: bump compat for AbstractPPL to 0.6 for package test, (keep existing compat)

---------

Co-authored-by: Hong Ge <[email protected]>
Co-authored-by: github-actions[bot] <[email protected]>
Co-authored-by: Tor Erlend Fjelde <[email protected]>
4 people authored Mar 2, 2023
1 parent 212f9f5 commit ee923e4
Showing 13 changed files with 81 additions and 45 deletions.
3 changes: 3 additions & 0 deletions .github/workflows/CI.yml
@@ -9,6 +9,9 @@ on:
- trying
# Build the master branch.
- master
pull_request:
branches:
- master

jobs:
test:
4 changes: 2 additions & 2 deletions Project.toml
@@ -1,6 +1,6 @@
name = "DynamicPPL"
uuid = "366bfd00-2699-11ea-058f-f148b4cae6d8"
version = "0.21.5"
version = "0.22.1"

[deps]
AbstractMCMC = "80f14c24-f653-4e6a-9b94-39d6b0f70001"
@@ -24,7 +24,7 @@ ZygoteRules = "700de1a5-db45-46bc-99cf-38207098b444"
AbstractMCMC = "2, 3.0, 4"
AbstractPPL = "0.5.3, 0.6"
BangBang = "0.3"
Bijectors = "0.5.2, 0.6, 0.7, 0.8, 0.9, 0.10"
Bijectors = "0.11, 0.12"
ChainRulesCore = "0.9.7, 0.10, 1"
ConstructionBase = "1"
Distributions = "0.23.8, 0.24, 0.25"
4 changes: 2 additions & 2 deletions docs/src/tutorials/prob-interface.md
@@ -20,7 +20,7 @@ end
nothing # hide
```

We generate some data using `μ = 0` and `σ = 1`:
We generate some data using `μ = 0`:

```@example probinterface
Random.seed!(1776)
@@ -35,7 +35,7 @@ Conditioning takes a variable and fixes its value as known.
We do this by passing a model and a collection of conditioned variables to [`|`](@ref) or its alias [`condition`](@ref):

```@example probinterface
model = gdemo(length(dataset)) | (x=dataset, μ=0, σ=1)
model = gdemo(length(dataset)) | (x=dataset, μ=0)
nothing # hide
```

9 changes: 4 additions & 5 deletions src/abstract_varinfo.jl
@@ -405,7 +405,7 @@ end

# Vector-based ones.
function link!!(
t::StaticTransformation{<:Bijectors.Bijector{1}},
t::StaticTransformation{<:Bijectors.Transform},
vi::AbstractVarInfo,
spl::AbstractSampler,
model::Model,
@@ -420,7 +420,7 @@ function link!!(
end

function invlink!!(
t::StaticTransformation{<:Bijectors.Bijector{1}},
t::StaticTransformation{<:Bijectors.Transform},
vi::AbstractVarInfo,
spl::AbstractSampler,
model::Model,
@@ -452,9 +452,8 @@ julia> using DynamicPPL, Distributions, Bijectors
julia> @model demo() = x ~ Normal()
demo (generic function with 2 methods)
julia> # By subtyping `Bijector{1}`, we inherit the `(inv)link!!` defined for
# bijectors which acts on 1-dimensional arrays, i.e. vectors.
struct MyBijector <: Bijectors.Bijector{1} end
julia> # By subtyping `Transform`, we inherit the `(inv)link!!`.
struct MyBijector <: Bijectors.Transform end
julia> # Define some dummy `inverse` which will be used in the `link!!` call.
Bijectors.inverse(f::MyBijector) = identity
10 changes: 8 additions & 2 deletions src/logdensityfunction.jl
@@ -10,7 +10,7 @@ $(FIELDS)
```jldoctest
julia> using Distributions
julia> using DynamicPPL: LogDensityFunction
julia> using DynamicPPL: LogDensityFunction, contextualize
julia> @model function demo(x)
m ~ Normal()
@@ -36,6 +36,12 @@ julia> # By default it uses `VarInfo` under the hood, but this is not necessary.
julia> LogDensityProblems.logdensity(f, [0.0])
-2.3378770664093453
julia> # This also respects the context in `model`.
f_prior = LogDensityFunction(contextualize(model, DynamicPPL.PriorContext()), VarInfo(model));
julia> LogDensityProblems.logdensity(f_prior, [0.0]) == logpdf(Normal(), 0.0)
true
```
"""
struct LogDensityFunction{V,M,C}
@@ -60,7 +66,7 @@ end
function LogDensityFunction(
model::Model,
varinfo::AbstractVarInfo=VarInfo(model),
context::AbstractContext=DefaultContext(),
context::AbstractContext=model.context,
)
return LogDensityFunction(varinfo, model, context)
end
4 changes: 2 additions & 2 deletions src/simple_varinfo.jl
@@ -648,7 +648,7 @@ Distributions.loglikelihood(model::Model, θ) = loglikelihood(model, SimpleVarIn

# Allow usage of `NamedBijector` too.
function link!!(
t::StaticTransformation{<:Bijectors.NamedBijector},
t::StaticTransformation{<:Bijectors.NamedTransform},
vi::SimpleVarInfo{<:NamedTuple},
spl::AbstractSampler,
model::Model,
@@ -663,7 +663,7 @@ function link!!(
end

function invlink!!(
t::StaticTransformation{<:Bijectors.NamedBijector},
t::StaticTransformation{<:Bijectors.NamedTransform},
vi::SimpleVarInfo{<:NamedTuple},
spl::AbstractSampler,
model::Model,
24 changes: 3 additions & 21 deletions src/test_utils.jl
@@ -10,26 +10,8 @@ using Random: Random
using Bijectors: Bijectors
using Setfield: Setfield

"""
varname_leaves(vn::VarName, val)
Return iterator over all varnames that are represented by `vn` on `val`,
e.g. `varname_leaves(@varname(x), rand(2))` results in an iterator over `[@varname(x[1]), @varname(x[2])]`.
"""
varname_leaves(vn::VarName, val::Real) = [vn]
function varname_leaves(vn::VarName, val::AbstractArray{<:Union{Real,Missing}})
return (
VarName(vn, DynamicPPL.getlens(vn) ∘ Setfield.IndexLens(Tuple(I))) for
I in CartesianIndices(val)
)
end
function varname_leaves(vn::VarName, val::AbstractArray)
return Iterators.flatten(
varname_leaves(
VarName(vn, DynamicPPL.getlens(vn) ∘ Setfield.IndexLens(Tuple(I))), val[I]
) for I in CartesianIndices(val)
)
end
# For backwards compat.
using DynamicPPL: varname_leaves

"""
update_values!!(vi::AbstractVarInfo, vals::NamedTuple, vns)
@@ -704,7 +686,7 @@ Simple model for which [`default_transformation`](@ref) returns a [`StaticTransf
end

function DynamicPPL.default_transformation(::Model{typeof(demo_static_transformation)})
b = Bijectors.stack(Bijectors.Exp{0}(), Bijectors.Identity{0}())
b = Bijectors.stack(Bijectors.elementwise(exp), identity)
return DynamicPPL.StaticTransformation(b)
end

46 changes: 46 additions & 0 deletions src/utils.jl
@@ -740,3 +740,49 @@ infer_nested_eltype(::Type{<:AbstractDict{<:Any,ET}}) where {ET} = infer_nested_

# No need + causes issues for some AD backends, e.g. Zygote.
ChainRulesCore.@non_differentiable infer_nested_eltype(x)

"""
varname_leaves(vn::VarName, val)
Return an iterator over all varnames that are represented by `vn` on `val`.
# Examples
```jldoctest
julia> using DynamicPPL: varname_leaves
julia> foreach(println, varname_leaves(@varname(x), rand(2)))
x[1]
x[2]
julia> foreach(println, varname_leaves(@varname(x[1:2]), rand(2)))
x[1:2][1]
x[1:2][2]
julia> x = (y = 1, z = [[2.0], [3.0]]);
julia> foreach(println, varname_leaves(@varname(x), x))
x.y
x.z[1][1]
x.z[2][1]
```
"""
varname_leaves(vn::VarName, ::Real) = [vn]
function varname_leaves(vn::VarName, val::AbstractArray{<:Union{Real,Missing}})
return (
VarName(vn, getlens(vn) ∘ Setfield.IndexLens(Tuple(I))) for
I in CartesianIndices(val)
)
end
function varname_leaves(vn::VarName, val::AbstractArray)
return Iterators.flatten(
varname_leaves(VarName(vn, getlens(vn) ∘ Setfield.IndexLens(Tuple(I))), val[I]) for
I in CartesianIndices(val)
)
end
function varname_leaves(vn::DynamicPPL.VarName, val::NamedTuple)
iter = Iterators.map(keys(val)) do sym
lens = Setfield.PropertyLens{sym}()
varname_leaves(vn ∘ lens, get(val, lens))
end
return Iterators.flatten(iter)
end
8 changes: 4 additions & 4 deletions test/Project.toml
@@ -22,17 +22,17 @@ Zygote = "e88e6eb3-aa80-5325-afca-941959d7151f"

[compat]
AbstractMCMC = "2.1, 3.0, 4"
AbstractPPL = "0.5.1, 0.6"
Bijectors = "0.9.5, 0.10"
AbstractPPL = "0.5, 0.6"
Bijectors = "0.11, 0.12"
Distributions = "0.25"
DistributionsAD = "0.6.3"
Documenter = "0.26.1, 0.27"
ForwardDiff = "0.10.12"
LogDensityProblems = "2"
MCMCChains = "4.0.4, 5"
MCMCChains = "4.0.4, 5, 6"
MacroTools = "0.5.5"
Setfield = "0.7.1, 0.8, 1"
StableRNGs = "1"
Tracker = "0.2.11"
Tracker = "0.2.23"
Zygote = "0.5.4, 0.6"
julia = "1.6"
3 changes: 3 additions & 0 deletions test/simple_varinfo.jl
@@ -64,6 +64,7 @@
@testset "$(typeof(vi))" for vi in (
SimpleVarInfo(Dict()), SimpleVarInfo(values_constrained), VarInfo(model)
)
vi = SimpleVarInfo(values_constrained)
for vn in DynamicPPL.TestUtils.varnames(model)
vi = DynamicPPL.setindex!!(vi, get(values_constrained, vn), vn)
end
@@ -108,6 +109,8 @@

@testset "SimpleVarInfo on $(nameof(model))" for model in
DynamicPPL.TestUtils.DEMO_MODELS
model = DynamicPPL.TestUtils.demo_dot_assume_matrix_dot_observe_matrix()

# We might need to pre-allocate for the variable `m`, so we need
# to see whether this is the case.
svi_nt = SimpleVarInfo(rand(NamedTuple, model))
7 changes: 2 additions & 5 deletions test/test_util.jl
@@ -13,11 +13,8 @@ function test_model_ad(model, logp_manual)
x = DynamicPPL.getall(vi)

# Log probabilities using the model.
function logp_model(x)
new_vi = VarInfo(vi, SampleFromPrior(), x)
model(new_vi)
return getlogp(new_vi)
end
ℓ = DynamicPPL.LogDensityFunction(model, vi)
logp_model = Base.Fix1(LogDensityProblems.logdensity, ℓ)

# Check that both functions return the same values.
lp = logp_manual(x)
2 changes: 1 addition & 1 deletion test/turing/Project.toml
@@ -6,5 +6,5 @@ Turing = "fce5fe82-541a-59a6-adf8-730c64b5f9a0"

[compat]
DynamicPPL = "0.20, 0.21"
Turing = "0.21"
Turing = "0.21, 0.22, 0.23, 0.24"
julia = "1.6"
2 changes: 1 addition & 1 deletion test/turing/compiler.jl
@@ -70,7 +70,7 @@
x = Float64[1 2]

@model function gauss(x)
priors = TArray{Float64}(2)
priors = Array{Float64}(undef, 2)
priors[1] ~ InverseGamma(2, 3) # s
priors[2] ~ Normal(0, sqrt(priors[1])) # m
for i in 1:length(x)
