[ML-DSA] Use all of commitment hash to sample verifiers challenge #600
This addresses #497 by passing in all of the commitment hash, instead of just its first 32 bytes, when sampling the verifier's challenge.
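To illustrate what changes, here is a minimal Python sketch of the challenge sampler (`SampleInBall`, Algorithm 29 in FIPS 204). The point of the fix is that the whole commitment hash `c_tilde` (λ/4 bytes: 32, 48, or 64 depending on the parameter set) is absorbed into the XOF, where the draft-based code only used its first 32 bytes. This is a sketch against the standard's pseudocode, not our implementation; `tau` defaults to 39 (the ML-DSA-44 value) purely for illustration.

```python
from hashlib import shake_256

def sample_in_ball(c_tilde: bytes, tau: int = 39) -> list[int]:
    """Sketch of FIPS 204 SampleInBall: derive a 256-coefficient
    polynomial with exactly `tau` coefficients in {-1, +1} from the
    *full* commitment hash `c_tilde`."""
    # Lazily squeeze the SHAKE256 XOF. Re-digesting with a larger length
    # in hashlib yields a consistent prefix, so growing the buffer on
    # demand emulates incremental squeezing.
    buf, pos = shake_256(c_tilde).digest(136), 0

    def squeeze(n: int) -> bytes:
        nonlocal buf, pos
        while pos + n > len(buf):
            buf = shake_256(c_tilde).digest(2 * len(buf))
        out = buf[pos:pos + n]
        pos += n
        return out

    signs = int.from_bytes(squeeze(8), "little")  # first 64 bits: signs
    c = [0] * 256
    for i in range(256 - tau, 256):
        # Rejection-sample an index j uniform in [0, i].
        j = squeeze(1)[0]
        while j > i:
            j = squeeze(1)[0]
        # Fisher-Yates-style insertion: each iteration adds exactly one
        # nonzero coefficient, so the result has weight tau.
        c[i] = c[j]
        c[j] = 1 - 2 * (signs & 1)  # (-1)^sign_bit
        signs >>= 1
    return c
```

Because the full hash is absorbed, two commitment hashes sharing the same first 32 bytes now yield different challenges, which is exactly the behavior the draft-based code missed.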
I've also increased the bound on the rejection-sampling loop in signing to the value prescribed by the standard. We don't bound the other rejection-sampling loops described in the standard, so this change also addresses #572.
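The bounded-loop change follows this general pattern: instead of retrying indefinitely, signing aborts with an error once the prescribed number of attempts is exhausted. A minimal Python sketch, where `attempt_sign` is a hypothetical callable returning a signature or `None` on rejection, and the `max_attempts` default is a placeholder (the prescribed bound is the one in FIPS 204, not this value):

```python
class SigningError(Exception):
    """Raised when the rejection-sampling loop bound is exceeded."""

def sign_with_bounded_loop(attempt_sign, max_attempts: int = 814):
    """Run the signing rejection loop at most `max_attempts` times.

    `attempt_sign` stands in for one iteration of the signing loop:
    it returns a signature, or None when the candidate is rejected.
    """
    for _ in range(max_attempts):
        sig = attempt_sign()
        if sig is not None:
            return sig
    # Hitting this bound is astronomically unlikely for honest inputs;
    # bounding the loop mainly guarantees termination.
    raise SigningError("rejection-sampling loop bound exceeded")
```

The design choice here is to surface a hard error rather than loop forever, which is what bounding the iteration count buys us.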
I wonder about our test vectors with these changes: we can update our self-generated set (generated via the Python spec) incrementally alongside our implementation as we incorporate the changes. For external sources of test vectors, like the Wycheproof ones, we currently have vectors for the draft version, and there appear to be vectors for the final standard in the same PR where we got the draft vectors (C2SP/wycheproof#112). Should we disable these tests and make the changes incrementally, or do all changes in one go and update the test vectors to their final version at the same time?
Update: I'm going to be addressing the changes in very short order, so I've disabled the old Wycheproof tests and filed an issue to re-enable them (updated to the standard's vectors) once all changes are in: #607.