Reparameterization of extrinsic parameter for better sampling efficiency #131
Conversation
@kazewong @xuyuon Actually, we may need to add some checking for the additional parameters needed in a transform. For example, the sampling-spin to Cartesian-spin transform takes the masses as input, but we don't currently check whether the masses are in the parameters. The same problem exists for the transformations implemented in this PR.
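As a rough illustration of the check being proposed (a hedged sketch only — the class name, the `required_params` attribute, and the mass parameter names are assumptions, not the actual Jim API):

```python
# Hypothetical sketch: a transform declares the extra parameters it
# needs, e.g. the masses for the spin -> Cartesian-spin mapping, and
# validates that they are present before transforming. None of these
# names are taken from the actual Jim codebase.
class CheckedTransform:
    required_params = ["M_c", "q"]  # assumed mass parameter names

    def check_inputs(self, params: dict) -> None:
        missing = [p for p in self.required_params if p not in params]
        if missing:
            raise KeyError(
                f"transform requires {missing}, got {sorted(params)}"
            )

    def forward(self, params: dict) -> dict:
        self.check_inputs(params)
        # ...the actual spin -> Cartesian-spin mapping would go here...
        return params
```

With a check like this, a missing mass raises a clear error at transform time instead of failing deeper inside the mapping.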
@thomasckng this PR is still a work in progress, so you do not need to approve it yet. I will ping you all once everything, e.g. the tests, is ready.
Regarding the folding approach for the phasing and angles, I think it would be better to introduce discrete MCMC rather than the numerical marginalization approach.
I would like to add that it might be better to give the user the freedom to choose the ordering of the forward and inverse transforms. That would be very useful for the SNR-weighted distance transform.
I think this is related to the discussion in the last development meeting and also to #130. Adding a function to switch the direction of a transform can be implemented; I can submit a PR for that tomorrow.
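A minimal sketch of what such a direction switch could look like (all names here are hypothetical, not the actual Jim API): a bijective transform exposes a method that returns a copy with its forward and inverse maps swapped.

```python
# Hypothetical sketch of a transform whose direction the user can flip,
# e.g. for the SNR-weighted distance transform. Names are illustrative.
import math


class Bijection:
    def __init__(self, forward_fn, inverse_fn):
        self.forward_fn = forward_fn
        self.inverse_fn = inverse_fn

    def forward(self, x):
        return self.forward_fn(x)

    def inverse(self, x):
        return self.inverse_fn(x)

    def flipped(self):
        # return a new transform with forward and inverse swapped
        return Bijection(self.inverse_fn, self.forward_fn)


log_transform = Bijection(math.log, math.exp)
exp_transform = log_transform.flipped()  # same map, opposite direction
```

Returning a new object rather than mutating in place keeps both directions usable side by side in a transform chain.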
This is more of a styling question that I want to raise for discussion. @kazewong @thomasckng @xuyuon Since quite a few transformations in this extrinsic re-parameterization are conditioned on other parameters, e.g. t_det <-> t_c also needs ra and dec, the current implementation in Jim needs to be fixed.
Therefore, we would need either to pass everything from start to finish, or to add a condition list. The conditional list might also be helpful for the transform ordering that @thomasckng is working on.
To clarify, the situation here is that we want to define a bijective transform with conditional parameters, in the sense that only the N named parameters will be transformed according to the name mapping, while some extra parameters are needed as inputs but should not be transformed, right? I am wondering whether we want to roll out something like the following:

```python
class ConditionalBijective(Bijective):
    conditional_params: list[str]

    def __init__(self,
                 named_params,
                 conditional_params):
        ...
```

This sounds like it is exactly what we need?
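Fleshing that skeleton out a little (still a hedged sketch — the constructor signature, the `name_mapping` tuple, and the toy t_c example are assumptions, not the real implementation), the conditional parameters would be required as inputs but passed through untransformed:

```python
# Sketch of a conditional bijective transform: only the parameters in
# name_mapping are mapped; conditional parameters feed the map but pass
# through unchanged. All names here are illustrative.
class ConditionalBijective:
    def __init__(self, name_mapping, conditional_params,
                 forward_fn, inverse_fn):
        self.name_mapping = name_mapping        # (input names, output names)
        self.conditional_params = conditional_params
        self.forward_fn = forward_fn            # fn(inputs, conditionals) -> dict
        self.inverse_fn = inverse_fn

    def _apply(self, params, names_in, fn):
        inputs = {k: params[k] for k in names_in}
        cond = {k: params[k] for k in self.conditional_params}  # KeyError if absent
        rest = {k: v for k, v in params.items() if k not in inputs}
        rest.update(fn(inputs, cond))           # conditionals stay untransformed
        return rest

    def forward(self, params):
        return self._apply(params, self.name_mapping[0], self.forward_fn)

    def inverse(self, params):
        return self._apply(params, self.name_mapping[1], self.inverse_fn)


# Toy stand-in for t_c <-> t_det conditioned on ra (NOT the real formula,
# which also involves dec and the detector location):
t_shift = ConditionalBijective(
    (["t_c"], ["t_det"]),
    ["ra"],
    lambda ins, cond: {"t_det": ins["t_c"] + cond["ra"]},
    lambda ins, cond: {"t_c": ins["t_det"] - cond["ra"]},
)
```

Note that `ra` survives both directions unchanged, while `t_c` is replaced by `t_det` and back.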
Yes, that's indeed what we need. We can either create that new class or just add the
I think subclassing the bijective transform is the better option here, since a normal transform is not necessarily conditional. I can get to this today and try merging it into this working branch.
At that line, the Jacobian calculation also includes the conditional parameters, but they should be excluded.
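A minimal illustration of the fix (assuming JAX; the function names are hypothetical): differentiate only with respect to the transformed parameters and close over the conditionals, so they never enter the Jacobian.

```python
# Sketch: compute log|det J| over the transformed parameters only; the
# conditional parameters are held fixed as a closed-over constant and
# are therefore excluded from the Jacobian. Names are illustrative.
import jax
import jax.numpy as jnp


def log_det_jacobian(forward_fn, x, conditionals):
    # differentiate w.r.t. x only, not w.r.t. the conditionals
    jac = jax.jacfwd(lambda y: forward_fn(y, conditionals))(x)
    return jnp.log(jnp.abs(jnp.linalg.det(jac)))


# toy bijection: scale each transformed component by a conditional factor
def scale(x, c):
    return x * c
```

For the toy `scale` map on two parameters with factor c, the Jacobian is diag(c, c), so the log-determinant is 2 log c regardless of how the conditional itself was produced.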
Do you want to just change those lines? I think I am happy with the API, and it doesn't seem to be a complicated change.
I have mentioned the iota problem in the meeting, but I don't think I have presented it well, so let me explain it clearly here. The issue is related to the two transformations: Let's focus on the following parameters: a1, a2, phi12, theta1, theta2, phi_jl, theta_jn, phase_c. These are the parameters on which I have defined priors, and I'll be focusing on the To get The forward transform path will look like this:
This solves the problem for the forward transform. However, for the inverse transform, it gets a bit more complicated. We start with the parameters: The key question therefore is: Is it possible to perform the inverse transform of Here is a list of the parameters available at that time:
…c_parameter_sampling_improvement
This reverts commit 762b7e0.
@tsunhopang I am sorry. I just realised that when I was trying to pull the new features from your branch to mine for testing a few days ago, I accidentally pushed everything in my branch into your branch. My branch is far from ready. I am very sorry that I messed up your branch. Please feel free to revert all the changes I made.
Closing this PR in favor of #161 |
This pull request makes use of the new priors and transforms system introduced in #108 to achieve better sampling efficiency and to aid the normalizing-flow (NF) learning in flowMC.
See https://arxiv.org/abs/2207.03508 for reference.
This PR now includes:
More to come.