
Expose the exact normalizations required to achieve reproduction #83

Open

msuozzo (Member) opened this issue Aug 21, 2024 · 0 comments

msuozzo commented Aug 21, 2024

We currently apply all relevant normalizations to all artifacts, meaning we neither know nor convey to the user the minimal set of normalizations that was actually necessary to produce an identical artifact.

Determining the minimal set of normalizations could be tricky, but I think the best strategy would be to decompose our monolithic normalizer into a set of "passes" and, as we normalize each artifact, evaluate the similarity of the artifacts after each step. When a pass 'does nothing', the similarity should remain unchanged after it is applied.
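A minimal sketch of what that pass decomposition could look like, assuming a per-pass API; the names here (`Pass`, `Similarity`, `MinimalPasses`, and the example pass names) are hypothetical rather than the existing normalizer's interface. Each pass is applied to both artifacts, similarity is re-measured, and a pass is reported as necessary only if it changed the score:

```go
package main

import "fmt"

// Artifact stands in for whatever byte-level representation we normalize.
type Artifact []byte

// Pass is a single, self-contained normalization step (hypothetical type).
type Pass struct {
	Name  string
	Apply func(Artifact) Artifact
}

// Similarity is any metric in [0, 1] comparing the rebuilt artifact to the
// upstream one, where 1 means byte-for-byte identical.
func Similarity(a, b Artifact) float64 {
	if string(a) == string(b) {
		return 1
	}
	return 0 // a real metric would measure partial similarity of the contents
}

// MinimalPasses applies each pass to both artifacts in order and records only
// those passes whose application changed the similarity, i.e. the ones that
// were actually needed to reach an identical artifact.
func MinimalPasses(rebuilt, upstream Artifact, passes []Pass) []string {
	var needed []string
	prev := Similarity(rebuilt, upstream)
	for _, p := range passes {
		rebuilt, upstream = p.Apply(rebuilt), p.Apply(upstream)
		cur := Similarity(rebuilt, upstream)
		if cur != prev {
			needed = append(needed, p.Name)
		}
		prev = cur
	}
	return needed
}

func main() {
	// Example pass names are illustrative only; the no-op bodies stand in for
	// real normalization logic.
	passes := []Pass{
		{Name: "strip-timestamps", Apply: func(a Artifact) Artifact { return a }},
		{Name: "sort-archive-entries", Apply: func(a Artifact) Artifact { return a }},
	}
	fmt.Println(MinimalPasses(Artifact("rebuilt"), Artifact("upstream"), passes))
}
```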

I don't believe this approach would work if the passes were applied in a random order, but I do think we could order them such that the similarity increases monotonically.
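If the passes are ordered so that similarity can only increase, the `cur != prev` check in the sketch above simplifies to `cur > prev`, and any decrease would signal a buggy or misordered pass. The resulting list of needed passes could then be surfaced in the reproduction report to cover the "convey to the user" half of this issue.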

Credit to @hboutemy for the idea 👍
