
feat: plume with poseidon2 hashes #8

Draft · wants to merge 3 commits into base: main

Conversation

signorecello

As discussed with @yush_g, we wanted to try PLUME with Poseidon hashes; this is the PR. Still pending:

  • Noir bigcurve has a pending PR to update bignum to 0.4.2
  • I guess changes to the ERC itself, but we can merge this into a "poseidon" branch in the meantime

@signorecello signorecello marked this pull request as draft November 26, 2024 15:48
@signorecello
Author

I also have no idea how to update the SAGE scripts to use Poseidon; I suppose it's too much to ask for a SAGE implementation of it :D @NikitaMasych, any idea how we could get around this?

@NikitaMasych
Member

Hey, nice to see this PR; I'll take an extensive look a bit later, but at first glance it looks legit 🙂

@NikitaMasych
Member

NikitaMasych commented Nov 26, 2024

> I also have no idea how to update the SAGE scripts to use poseidon, I suppose it's too much to ask for a SAGE implementation of it :D @NikitaMasych any idea how could we get around this?

I'll see what I can do; sage is merely exemplary in this repository, so this shouldn't be a burden to delivering the feature of this PR.

@@ -109,7 +110,7 @@ fn sha256_12_coordinates(
         }
     }

-    sha256(res)
+    hash([to_field(res)], res.len())
Member

This looks like we are missing preimage data when converting to a field: res, being 198 bytes of information, is narrowed down to the prime order of the underlying field (approx. 8 bytes for bb) prior to hashing. Additionally, only one field element is passed in the array here, but res.len() will give 198.

Author

I see; so in this case, do you suggest I break it into limbs and pass it as an array, maybe?
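A limb-based approach could be sketched roughly as below. This is only an illustration under assumptions: the 31-byte chunk size (chosen so each limb stays below bb's ~254-bit field modulus), the function name, and the use of `Poseidon2::hash` from Noir's stdlib are all hypothetical, not the PR's actual code.

```rust
// Sketch only: split a byte array into 31-byte little-endian limbs so each
// limb fits below the field modulus, then Poseidon2-hash all limbs together,
// so no preimage bytes are dropped. Names and constants are illustrative.
use std::hash::poseidon2::Poseidon2;

fn hash_bytes_as_limbs<let N: u32, let LIMBS: u32>(bytes: [u8; N]) -> Field {
    let mut limbs: [Field; LIMBS] = [0; LIMBS];
    for i in 0..LIMBS {
        let mut limb: Field = 0;
        let mut offset: Field = 1;
        for j in 0..31 {
            let idx = i * 31 + j;
            if idx < N {
                // Little-endian packing, same pattern as the PR's accumulator.
                limb += (bytes[idx] as Field) * offset;
                offset *= 256;
            }
        }
        limbs[i] = limb;
    }
    // Hash every limb, preserving the full 198 bytes of preimage data.
    Poseidon2::hash(limbs, LIMBS)
}
```

Whether this packing conforms to the hash-to-curve spec is a separate question, as discussed below.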

@@ -52,7 +53,7 @@ fn msg_prime<let N: u32>(msg: [u8; N]) -> [u8; 32] {
         preimage[64 + N + 2 + 1 + i] = DST_PRIME[i];
     }

-    sha256(preimage)
+    hash([to_field(preimage)], preimage.len()).to_le_bytes()
Member

Since this is not quite part of PLUME but rather of the hash-to-curve algorithm, it might not conform to the requirements given in the spec. Though I am not particularly sure.

Author

I think if I hash the whole thing by breaking it into limbs and hashing the limbs, it would conform to the spec. But I am not a cryptographer, so I may be wrong.

@@ -80,7 +81,7 @@ fn hash_b(b_idx: u8, b: [u8; 32]) -> [u8; 32] {
         preimage[32 + 1 + i] = DST_PRIME[i];
     }

-    sha256(preimage)
+    hash([to_field(preimage)], 32 + 1 + 50).to_le_bytes()
Member

Same spec-conformance concern as in msg_prime.

@@ -3,7 +3,7 @@ use noir_bigcurve::BigCurve;
 use noir_bigcurve::curves::secp256k1::{Secp256k1, Secp256k1Fr, Secp256k1Scalar};
 use noir_bigcurve::scalar_field::ScalarField;

-use plume::plume_v1;
+use plume::{plume_v1, to_field};
Member

I guess the import here should be `use plume::utils::to_field` to resolve the `is private and not visible from the current module` warning.
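The suggested fix would look like the following, assuming `to_field` is exported from a public `utils` module of the plume crate (the module path is the reviewer's guess, not confirmed in this thread):

```rust
// Qualified import per the review comment above; assumes a public
// `utils` module re-exporting `to_field`.
use plume::plume_v1;
use plume::utils::to_field;
```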

@@ -109,7 +110,7 @@ fn sha256_12_coordinates(
         }
     }

-    sha256(res)
+    hash([to_field(res)], res.len())
 }

#[test]
Member

Regarding all the commented-out tests here: they need to be updated and passing nargo test if uncommented. Currently, there are a few data and compilation issues. Sorry for having to deal with commented code; if Noir ever supports something akin to Rust's #[ignore] macro, I'll update these.

Author

No problem, feel free to update this branch too. I'm just an Aztec team member on my time off learning a bit about PLUME stuff 👍

Member

Sure, can you grant me write access to your fork?

as_field += (bytes[index] as Field) * offset;
offset *= 256;
}
std::as_witness(as_field);
Member

Just curious: why is this line needed?

Author

This little guy shaves 200k constraints off the whole thing 😄 A team member found a bug in the compiler that was causing a huge blowout. Using as_witness forces the compiler to assign intermediate values, which makes it behave as it should.

Member

wow, nice trick
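For context, the pattern discussed in this thread can be sketched as follows. This is a minimal reconstruction built around the snippet quoted above; the surrounding signature is assumed, not the PR's exact code.

```rust
// Minimal sketch of a little-endian bytes-to-Field accumulator. The
// `std::as_witness` call pins the accumulated sum as a single witness so
// the compiler does not re-expand the whole expression at every use site,
// which is the ~200k constraint saving described above.
fn to_field<let N: u32>(bytes: [u8; N]) -> Field {
    let mut as_field: Field = 0;
    let mut offset: Field = 1;
    for index in 0..N {
        as_field += (bytes[index] as Field) * offset;
        offset *= 256;
    }
    std::as_witness(as_field);
    as_field
}
```

Note this only sound-packs up to 31 bytes losslessly; for larger N the value wraps modulo the field, which is the preimage-loss concern raised earlier in this review.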

@signorecello
Author

Thanks @NikitaMasych. The reason I'm picking this up is mostly to learn: I'm playing around with Stealthdrop on the phone and felt the need to make it less costly. Stealthdrop is also one of the Noir examples, and the existing version is very outdated (and unsafe), so I'd like to update that too.
