
Update and sync up the recursive verifier #1575

Open
wants to merge 5 commits into base: main

Conversation

Al-Kindi-0 (Collaborator)

As the title says.
We now use the rcomb_base instruction, which leads to faster testing times. Also updated is how rcomb_base is defined, so that the accumulator for the (x - z) quotient sits deeper in the stack than the accumulator for the (x - gz) quotient.
Other than that, there are some minor cleanups and a re-organization of the testing code.
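
For readers less familiar with the DEEP composition step, below is a minimal, self-contained sketch of the two accumulators the description refers to. Plain i64 values stand in for the QuadFelt extension-field elements used in the real code, and the concrete numbers are placeholders.

// Minimal sketch: i64 stands in for the QuadFelt extension-field type used in
// the actual verifier code.
fn main() {
    // tx: trace value at the query point x; tz, tgz: OOD evaluations at z and g*z.
    let (tx, tz, tgz): (i64, i64, i64) = (5, 2, 3);
    let alpha: i64 = 7;
    let (mut r, mut p): (i64, i64) = (0, 0);

    // r accumulates terms for the (x - z) quotient and p for the (x - gz) quotient,
    // so under the updated rcomb_base definition r sits deeper in the stack than p.
    r += alpha * (tx - tz);
    p += alpha * (tx - tgz);

    println!("r = {r}, p = {p}");
}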

fix: cleanup and restore security parameters

chore: lower num of fri queries and small edits

chore: cleanup

chore: cleanup
@bobbinth (Contributor)

Is this ready for review? Or still a draft for now?

@Al-Kindi-0 (Collaborator, Author)

Yes, it is ready for review

@plafer (Contributor) left a comment


(Not a full review yet - still need to review the masm part)

@@ -75,7 +75,7 @@ where
// --- compute the updated accumulator values ---------------------------------------------
let v0 = self.stack.get(7);
let tx = QuadFelt::new(v0, ZERO);
- let [p_new, r_new] = [p + alpha * (tx - tz), r + alpha * (tx - tgz)];
+ let [r_new, p_new] = [r + alpha * (tx - tz), p + alpha * (tx - tgz)];

nit: can we swap the order to [p_new, r_new] so that it's consistent with line 73 above?

I think the ideal would be to put r before p on the stack (so that the order is always consistent), but just this small swap is probably fine

Comment on lines 359 to 362
r.to_base_elements()[0],
r.to_base_elements()[1],
p.to_base_elements()[0],
p.to_base_elements()[1],
r.to_base_elements()[0],
r.to_base_elements()[1],

Lines 334 and 335 are still incorrect (i.e. p should use tgz and r should use tz). Then I think the diff here should be reverted (since the stack is being built from bottom to top, so r comes before p)
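
For illustration, a self-contained sketch of the ordering described above; E is a stand-in for QuadFelt, and the Vec-based stack is an assumption, not the test's actual data structure.

// Stand-in for QuadFelt: to_base_elements() returns its two base-field coefficients.
#[derive(Clone, Copy)]
struct E(u64, u64);

impl E {
    fn to_base_elements(self) -> [u64; 2] {
        [self.0, self.1]
    }
}

fn main() {
    let (r, p) = (E(1, 2), E(3, 4)); // placeholder accumulator values
    let mut stack: Vec<u64> = Vec::new();

    // The stack is built from bottom to top, so r is pushed first and ends up
    // deeper in the stack than p.
    stack.extend_from_slice(&r.to_base_elements());
    stack.extend_from_slice(&p.to_base_elements());

    assert_eq!(stack, vec![1, 2, 3, 4]);
}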

Comment on lines 32 to 36
let proof_context = proof.context.to_elements();
let trace_len: Felt = proof_context[1];
let num_queries = proof_context[7];
let blowup = proof_context[6];
let grinding_factor = proof_context[5];

Why do we call to_elements() and then access the serialized buffer? Could we not access the Context fields directly? This is a bit painful to read, as I have to go understand how to_elements() works to make sure that this is correct. And if to_elements() changes, then this test will start to fail.
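
For concreteness, a sketch of what the suggested direct access might look like. The accessor names used here (trace_info(), length(), options(), num_queries(), blowup_factor(), grinding_factor()) are assumptions about the Context and ProofOptions API and would need to be checked against the actual types.

// Hedged sketch: read the parameters through typed accessors instead of indexing
// the buffer returned by to_elements(). Accessor names are assumptions.
let trace_len = proof.context.trace_info().length();
let num_queries = proof.context.options().num_queries();
let blowup = proof.context.options().blowup_factor();
let grinding_factor = proof.context.options().grinding_factor();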

(trace_len.as_int() as usize).ilog2() as u64,
];

// build a seed for the public coin; the initial seed is the hash of public inputs and proof
// context, but as the protocol progresses, the coin will be reseeded with the info received
// from the prover
let mut tape = vec![];

nit: can we rename this to advice_stack? (also in VerifierData)
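
As a side note, here is a toy, self-contained sketch of the reseeding flow that the comment above the advice tape describes. The DefaultHasher-based coin below is only a stand-in for the verifier's actual hash-based random coin.

use std::collections::hash_map::DefaultHasher;
use std::hash::{Hash, Hasher};

// Toy stand-in for a hash-based public coin; not the coin the verifier actually uses.
fn reseed(seed: u64, data: u64) -> u64 {
    let mut h = DefaultHasher::new();
    (seed, data).hash(&mut h);
    h.finish()
}

fn main() {
    // Initial seed: hash of the public inputs and the proof context (placeholder digests).
    let mut seed = reseed(/* public inputs digest */ 1, /* proof context digest */ 2);

    // As the protocol progresses, the coin is reseeded with each commitment received
    // from the prover, and challenges are drawn from the current state.
    for commitment in [10u64, 20, 30] {
        seed = reseed(seed, commitment);
        println!("challenge after reseed: {seed}");
    }
}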

const.OOD_TRACE_PTR=4294900000
const.OOD_CONSTRAINT_EVALS_PTR=4294900081
const.OOD_CONSTRAINT_EVALS_PTR=4294900077

Is there an assumption in the code that OOD_TRACE_PTR is right next to OOD_CONSTRAINT_EVALS_PTR? If not, we could put them far enough apart that we don't need to come back and fix this value every time we make a change to the shape of the trace.

Similar comment applies to all constants in this file

@plafer (Contributor) left a comment


LGTM, great work!

Left a few nits (did not go into detail for the MASM part)

drop
#=> [trace_length, num_queries, blowup, grinding]

# Construct the proof context

##trace layout info
push.1208027408

push.1174472464

Can we use constants for these magic numbers, and a bit of docs? I'm guessing that we're pushing a value related to Context.trace_info, but I don't understand how 1174472464 encodes the entire TraceInfo?

export.generate_constraint_composition_coefficients

push.236
push.224

Similarly, can we use a constant for this?


push.92
# note that 88 is the next number after 85 divisible by 4
push.88

Same as above. It makes it easier to go back and adjust these parameters later (instead of finding them in the code)
