TransformVariables-like example? #197
```julia
# t = TV.as((a = as(d1), b = as(d2)))
t = NamedBijector((a = bijector(d1), b = bijector(d2)))
# x = randn(TV.dimension(t))
x = (a = rand(transformed(d1)), b = rand(transformed(d2)))
```

We're not assuming … The rest is the same. But there are redundancies in Bijectors.jl, e.g. …

EDIT: I'm actually a bit uncertain how the …
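The named-tuple pattern in the snippet above (one transform assembled from per-component transforms, consuming a single flat vector) can be sketched generically. The following Python sketch uses hypothetical names (`named_transform`, `(dim, fn)` pairs) and is not the Bijectors.jl or TransformVariables.jl API; it only illustrates the flatten/split bookkeeping:

```python
def named_transform(transforms):
    """Combine per-component (dim, fn) pairs into one transform.

    `transforms` maps a name to (dim, fn), where fn maps a list of
    `dim` unconstrained reals to the constrained value. The combined
    transform consumes one flat vector, slicing off `dim` entries per
    component, analogous to TV.dimension / TV.transform above.
    """
    total_dim = sum(d for d, _ in transforms.values())

    def apply(x):
        assert len(x) == total_dim, "flat input must have the combined dimension"
        out, index = {}, 0
        for name, (dim, fn) in transforms.items():
            out[name] = fn(x[index:index + dim])
            index += dim
        return out

    return total_dim, apply
```

Usage: `named_transform({"a": (1, f_a), "b": (2, f_b)})` returns the combined dimension (3) and a function that splits a flat 3-vector into the `a` and `b` pieces.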
Thanks @torfjelde. For performance, I think it's important for the transformation to work in terms of a local variable, like an iteration. For example, here's TransformVariables on a simplex (https://github.com/tpapp/TransformVariables.jl/blob/master/src/special_arrays.jl#L100):

```julia
function transform_with(flag::LogJacFlag, t::UnitSimplex, x::AbstractVector, index)
    @unpack n = t
    T = extended_eltype(x)
    ℓ = logjac_zero(flag, T)
    stick = one(T)
    y = Vector{T}(undef, n)
    @inbounds for i in 1:n-1
        xi = x[index]
        index += 1
        z = logistic(xi - log(n-i))
        y[i] = z * stick
        if !(flag isa NoLogJac)
            ℓ += log(stick) - logit_logjac(z)
        end
        stick *= 1 - z
    end
    y[end] = stick
    y, ℓ, index
end
```

In this way, … If you have a composition of bijectors ending with a …
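The stick-breaking arithmetic above is easy to check numerically. Here is a hedged, language-agnostic sketch in Python (not the TransformVariables implementation itself) of the same forward transform, accumulating the log-Jacobian in the same loop; the expansion of `-logit_logjac(z)` as `log(z) + log(1 - z)` is my reading of the code above:

```python
import math

def logistic(x):
    return 1.0 / (1.0 + math.exp(-x))

def simplex_transform(x):
    """Stick-breaking map from R^(n-1) to the unit n-simplex.

    Mirrors the transform_with sketch above: each unconstrained
    coordinate is shifted by -log(n - i), squashed by the logistic,
    and scaled by the remaining 'stick' length. Returns (y, logjac).
    """
    n = len(x) + 1
    y = [0.0] * n
    stick = 1.0
    logjac = 0.0
    for i, xi in enumerate(x):
        # 0-based i here corresponds to 1-based i+1 in the Julia loop
        z = logistic(xi - math.log(n - i - 1))
        y[i] = z * stick
        # log|dz/dxi| = log(z) + log(1 - z); plus log(stick) from scaling
        logjac += math.log(stick) + math.log(z) + math.log(1.0 - z)
        stick *= 1.0 - z
    y[-1] = stick
    return y, logjac
```

With an all-zeros input, each stick is split so that the result is the uniform point on the simplex, which makes a convenient sanity check.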
I think this is an orthogonal concern; TransformVariables works great with Distributions. For example,

```julia
julia> t = as(Dirichlet(Fill(0.3,3)))
TransformVariables.UnitSimplex(3)

julia> x = TV.transform(t, randn(2))
3-element Vector{Float64}:
 0.18136540387191707
 0.08052913397901988
 0.7381054621490631

julia> logdensity(Dists.Dirichlet(Fill(0.3,3)), x)
-0.04998534448868064
```

In general, a lot of this is about manifold embeddings. If I'm embedding …

EDIT: It's partly a "want", but maybe more a need. It seems very confusing and error-prone to me otherwise.
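The dimension mismatch in the REPL session above (a 3-element simplex point produced from 2 unconstrained reals) is the embedding issue under discussion: the unit n-simplex is an (n-1)-dimensional manifold sitting inside R^n, so the last coordinate carries no free information. A minimal Python check of that redundancy, with hypothetical helper names (`embed`, `project`) used purely for illustration:

```python
def embed(free):
    """Lift n-1 free simplex coordinates into R^n by appending the slack."""
    assert all(v >= 0 for v in free) and sum(free) <= 1.0
    return list(free) + [1.0 - sum(free)]

def project(point):
    """Drop the redundant last coordinate; it is recoverable from the rest."""
    return point[:-1]
```

Round-tripping `embed(project(p))` reproduces any simplex point `p`, which is why a transform can work with n-1 inputs even though the distribution lives in R^n.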
Yeah, because we could just make vector versions of the ones that really work in a lower-dimensional space, and then make the current implementations those composed with a reshape or whatever.
Sorry, compatibility was the wrong word: I meant "consistency". I didn't implement these, btw 😅 All I know is that we did it because Distributions.jl includes the last element.
Again, 100% agree, and it's unfortunate that it's not the way we're doing it atm. Hence #183 :)
Great! And I think we had already talked about having bijectors write into preallocated memory, so the transformation can be allocation-free. Oh, and one more thing: I think all of the dimensions will usually be known statically, in which case it makes sense to have some generated functions unrolling the loops, using LoopVectorization, etc. I can add those if you're not planning it already; I'm doing things like that in MultivariateMeasures anyway, and I feel like I'm starting to get the hang of it :) So... I guess I should wait for #183?
Yeah, waiting for #183 is probably a good idea :) Hopefully soon™!
Sounds good :)
I'm looking into transitioning MeasureTheory to use Bijectors instead of TransformVariables, but there are some differences between the packages that are really throwing me. First, some questions I had asked on Slack:
Can someone give some more detail on this?
Maybe what would help the most is a more detailed example. Here's something I can currently do pretty easily in MeasureTheory, using TransformVariables. What would this look like with Bijectors?