
Spillings change? #217

Open · OlivierBBB opened this issue Jun 26, 2024 · 5 comments

@OlivierBBB (Collaborator) commented Jun 26, 2024

I updated the spillings.toml file and noticed that quite a number of spilling values changed, typically by 1, even for older, (mostly?) stable modules, while some changed drastically (e.g. HUB and TXN_DATA).

| module key          | old spilling | new spilling | down by 1 |
| ------------------- | ------------ | ------------ | --------- |
| ADD                 | 2            | 1            | true      |
| BIN                 | 16           | 15           | true      |
| BIN_REFERENCE_TABLE | 0            | 0            |           |
| BLAKE_MODEXP_DATA   |              | 1            |           |
| BLOCK_DATA          | 6            | 6            |           |
| BLOCK_HASH          | 1            | 1            |           |
| EC_DATA             | 12           | 11           | true      |
| EUC                 | 1            | 1            |           |
| EXP                 | 5            | 5            |           |
| EXT                 | 8            | 7            | true      |
| GAS                 |              | 1            |           |
| HUB                 | 2            | 12           |           |
| INSTRUCTION_DECODER | 0            | 0            |           |
| LOG_DATA            | 1            | 1            |           |
| LOG_INFO            | 5            | 5            |           |
| MMIO                |              | 0            |           |
| MMU                 | 10           | 10           |           |
| MOD                 | 8            | 7            | true      |
| MUL                 | 9            | 8            | true      |
| MXP                 | 4            | 3            | true      |
| OOB                 | 5            | 5            |           |
| RLP_ADDR            | 7            | 7            |           |
| RLP_TXN             | 7            | 7            |           |
| RLP_TXN_RCPT        | 7            | 7            |           |
| ROM                 | 2            | 1            | true      |
| ROM_LEX             | 1            | 1            |           |
| SHAKIRA_DATA        |              | 1            |           |
| SHF                 | 16           | 15           | true      |
| SHF_REFERENCE_TABLE | 0            | 0            | true      |
| STP                 | 4            | 4            |           |
| TRM                 | 7            | 7            |           |
| TXN_DATA            | 15           | 8            |           |
| WCP                 | 16           | 15           | true      |
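
For anyone wanting to reproduce this comparison, here is a minimal sketch, assuming spillings.toml is a flat module-name → integer map as the table above suggests (the file names below are hypothetical stand-ins for the two versions):

```python
# Minimal sketch, assuming spillings.toml is a flat MODULE = <int> map.
# The file names below are hypothetical stand-ins for the two versions.
import tomllib  # Python 3.11+

def load_spillings(path: str) -> dict[str, int]:
    with open(path, "rb") as f:
        return tomllib.load(f)

old = load_spillings("spillings.old.toml")
new = load_spillings("spillings.new.toml")

# Report every module whose spilling value differs between the two files.
for module in sorted(old.keys() | new.keys()):
    before, after = old.get(module), new.get(module)
    if before != after:
        print(f"{module}: {before} -> {after}")
```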

Question. Did something change in the spillings computation to explain the above?

@DavePearce (Collaborator) commented Jun 26, 2024

> Did something change in the spillings computation to explain the above?

Well, that is a question. Certainly there have been changes which could affect spillage. For example, using range constraints to enforce byte@prove, instead of how it was done before, is something that could affect spillage. However, I cannot easily tell whether that explains what is happening in this case.

Still, I can investigate this and come back with a proper answer.
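
For intuition, here is a conceptual sketch of how such a change could move spilling values by exactly 1. It assumes spilling is derived from the deepest backward row reference among a module's constraints; that is an assumption for illustration, not a description of corset's actual code:

```python
# Conceptual sketch only: ASSUMES spilling = deepest backward row reference
# among a module's constraints. Not corset's actual implementation.

def spilling(row_shifts: list[int]) -> int:
    """Padding rows needed if constraints may look back by the given
    (possibly negative) row shifts."""
    return max((-s for s in row_shifts if s < 0), default=0)

# Re-encoding a constraint so it looks back one row less (e.g. a range
# check replacing a shift-based byte decomposition) drops spilling by 1:
print(spilling([0, -1, -2]))  # 2
print(spilling([0, -1]))      # 1
```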

@OlivierBBB (Author, Collaborator) commented Jun 27, 2024

I don't think this warrants an "investigation" per se, unless the spillings fail (which AFAIK we will only know once we connect to the prover). I'm mostly curious whether this is a change for which we have an explanation 🙂

@DavePearce (Collaborator) commented

@OlivierBBB We just need to be sure it's not the cause of any failures. To be fair, I think the ones we are currently investigating are not related, as these do not require padding to trigger.

@OlivierBBB (Author, Collaborator) commented Jul 1, 2024

@DavePearce It's currently unclear whether this is the source of errors. To give you context: I applied the above, and @powerslider suddenly has failing tests. A (previously successful) test expects the EXT module to contain 9 rows; it now fails because the trace contains 8 rows. 8 is the value I would have expected, since the arithmetization does not incorporate a padding row; padding is added later.

Note. The EXT module lost 1 spilling row ☠️ ...

That is: we are unsure whether the spillings could affect the line count.
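
A sketch of the suspected interaction, with illustrative numbers only (the helper and the padding behaviour below are assumptions, not the actual test code):

```python
# Hypothetical reconstruction of the failing expectation; the names and
# the padding behaviour are assumptions, not the actual test suite.

RAW_EXT_ROWS = 8  # rows emitted by the EXT arithmetization (no padding row)

def rows_seen_by_test(raw_rows: int, spilling_rows: int) -> int:
    # Assumption: spilling-dependent rows used to be counted in the trace.
    return raw_rows + spilling_rows

# If one spilling row used to be included, the old expectation of 9 rows
# and the new observation of 8 rows are both explained:
assert rows_seen_by_test(RAW_EXT_ROWS, spilling_rows=1) == 9
assert rows_seen_by_test(RAW_EXT_ROWS, spilling_rows=0) == 8
```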

@DavePearce (Collaborator) commented Jul 1, 2024

> It's currently unclear whether this is the source of errors.

@OlivierBBB Well, the issues that came out of Leo's test run could all be recreated using `corset check`. Since that doesn't use padding, those issues don't seem related to this.

However, what I understand from what you are saying above is that there is at least one issue which might be.
