New Transcript API (and modified commitment scheme) #111
Conversation
Highly recommend reviewing with whitespace changes hidden.
Codecov Report

@@            Coverage Diff             @@
##             main     #111      +/-   ##
==========================================
+ Coverage   83.44%   84.46%   +1.02%
==========================================
  Files          34       36       +2
  Lines        3738     3862     +124
==========================================
+ Hits         3119     3262     +143
+ Misses        619      600      -19

Continue to review full report at Codecov.
It would be great for this PR to also add a design page (probably …)
utACK. I have not checked the protocol changes.
Previously, `ChallengeScalar` could use the operator traits defined on the `F: Field` type it wrapped, due to its `impl Deref<Target = F>`. This was technically ambiguous, and Rust 1.49.0 makes that ambiguity an error. We could fix this by adding operator impls with `ChallengeScalar` on the RHS, but that would conflict with #111. Instead we manually dereference every challenge scalar when used in an arithmetic operation.
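A minimal sketch of the ambiguity and the manual-dereference fix described above (toy `Scalar` and `ChallengeScalar` types standing in for the real `F: Field` wrapper; names are illustrative only):

```rust
use std::ops::{Deref, Mul};

// Toy stand-in for a field element type `F: Field`.
#[derive(Clone, Copy, Debug, PartialEq)]
struct Scalar(u64);

impl Mul for Scalar {
    type Output = Scalar;
    fn mul(self, rhs: Scalar) -> Scalar {
        Scalar(self.0.wrapping_mul(rhs.0))
    }
}

// Wrapper that derefs to the inner field element.
struct ChallengeScalar(Scalar);

impl Deref for ChallengeScalar {
    type Target = Scalar;
    fn deref(&self) -> &Scalar {
        &self.0
    }
}

fn main() {
    let x = ChallengeScalar(Scalar(3));
    let a = Scalar(5);
    // Before Rust 1.49, an expression mixing `a` and `x` could lean on
    // auto-deref ambiguously; now the challenge is dereferenced explicitly
    // at each arithmetic use:
    let prod = a * *x;
    assert_eq!(prod, Scalar(15));
}
```

The alternative mentioned above (operator impls with `ChallengeScalar` on the RHS) would add `impl Mul<ChallengeScalar> for Scalar`, at the cost of conflicting with this PR's changes.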
Force-pushed from dd2652c to c8dedf2 (transcript API changes).
This modifies the scheme to be almost identical to the construction outlined in Appendix A.2 of "Proof-Carrying Data from Accumulation Schemes" (https://eprint.iacr.org/2020/499). The only remaining difference is that we do not compute [v] U but instead subtract [v] G_0 from the commitment before opening.
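For contrast, the two ways of folding the claimed evaluation v into the opened commitment can be written side by side (a sketch in the notation above; the full P' in this PR also picks up a random-polynomial term, covered later in the thread):

```latex
\text{Halo paper:}\qquad P' = P + [v]\,U \qquad (\text{variable-base: } P \text{ varies per proof})
\text{This PR:}\qquad\;\; P' = P - [v]\,G_0 \qquad (\text{fixed-base: } G_0 \text{ is a fixed generator})
```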
Avoid square challenges in inner product argument
src/poly/commitment/verifier.rs
Outdated
// = [a] G + [-a] H + [abz] U + [h] H
// = [a] G + [abz] U + [h - a] H
// but subtracting to get the desired equality
// ... + [-a] G + [-abz] U + [a - h] H = 0
What's the `...` here?
It's the left-hand side of the equality, msm - [v] G_0 + random_poly_commitment * iota + \sum(L_i * u_i^2) + \sum(R_i * u_i^-2).
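Assembling the pieces of this thread, the full identity the verifier checks appears to be the following (a sketch only, combining the code comment with the reply above; `R_poly` renames `random_poly_commitment` to avoid clashing with the round commitments `R_i`):

```latex
\mathrm{msm} - [v]\,G_0 + [\iota]\,R_{\mathrm{poly}}
  + \sum_i [u_i^{2}]\,L_i + \sum_i [u_i^{-2}]\,R_i
  + [-a]\,G + [-abz]\,U + [a - h]\,H = 0
```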
src/poly/commitment/verifier.rs
Outdated
// = [a] (G + [b * z] U) + [h] H
// except that we wish for the prover to supply G as Commit(g(X); 1) so
// we must substitute to get
// = [a] ((G - H) + [b * z] U) + [h] H
I don't understand how this is a substitution into the previous line.
The prover will not be giving us G but rather G + H (the blinding factor is 1), so we have to subtract H from what they give us. We do this because auxiliary commitments always have a blinding factor of 1 in the PLONK API, ensuring that under no circumstances will points at infinity emerge inside of a recursive circuit.
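As a toy illustration of that substitution (plain integers stand in for group elements, and the values are hypothetical, not real generators):

```rust
// Toy additive group: i64 stands in for curve points, + for the group op.
fn main() {
    let g: i64 = 42; // hypothetical Commit(g(X); 0), which is never sent
    let h: i64 = 7;  // stand-in for the blinding generator H
    // The prover supplies Commit(g(X); 1) = G + [1] H instead of bare G:
    let supplied = g + h;
    // so the verifier substitutes (supplied - H) wherever G appears:
    assert_eq!(supplied - h, g);
}
```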
I reviewed everything except the first commit from #113.
@@ -122,6 +122,10 @@ pub trait CurveAffine:
    /// The base field over which this elliptic curve is constructed.
    type Base: FieldExt;

    /// Personalization of BLAKE2b hasher used to generate the uniform
    /// random string.
    const BLAKE2B_PERSONALIZATION: &'static [u8; 16];
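A minimal sketch of how such an associated constant is declared and consumed (a toy one-item trait and a fieldless `EpAffine`; the real trait has many more items, and the actual hashing goes through a BLAKE2b implementation):

```rust
// Toy slice of the trait: just the personalization constant.
trait CurveAffine {
    /// 16-byte BLAKE2b personalization for this curve's URS generation.
    const BLAKE2B_PERSONALIZATION: &'static [u8; 16];
}

struct EpAffine; // Pallas affine point type (fields elided)

impl CurveAffine for EpAffine {
    const BLAKE2B_PERSONALIZATION: &'static [u8; 16] = b"halo2_____pallas";
}

fn main() {
    // BLAKE2b personalization strings must be exactly 16 bytes, which the
    // `[u8; 16]` array type enforces at compile time.
    let p = <EpAffine as CurveAffine>::BLAKE2B_PERSONALIZATION;
    assert_eq!(p, b"halo2_____pallas");
    println!("{}", core::str::from_utf8(p).unwrap());
}
```

Tying the constant to the curve type is what the reviewer below is unsure about; it works, but couples hashing configuration to the curve implementation.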
Not sure I like this being a property of the curve implementation, but it will do for now. We will likely need to refactor this later.
new_curve_impl!(Ep, EpAffine, Fp, Fq, b"halo2_____pallas");
new_curve_impl!(Eq, EqAffine, Fq, Fp, b"halo2______vesta");
These definitely will need changing to be more descriptive as to their purpose 🙂
This will change again when we switch to simplified SWU for hash-to-curve.
Co-authored-by: Jack Grigg <jack@electriccoin.co>
These will be updated or restored in #111.
[ECC chip] Fixed- and variable-base scalar multiplication
…ng key into uncompressed Montgomery form (zcash#111)

* feat: read `VerifyingKey` and `ProvingKey` does not require `params` as long as we serialize `params.k()`
* feat: add features "serde-raw" and "raw-unchecked" to serialize/deserialize KZG params, verifying key, and proving key directly into raw bytes in internal memory format, so field elements are stored in Montgomery form `a * R (mod p)` and curve points are stored without compression
* chore: switch to halo2curves 0.3.1 tag
* feat: add enum `SerdeFormat` for user to select serialization/deserialization format of curve and field elements

Co-authored-by: Jonathan Wang <jonathanpwang@users.noreply.github.com>
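A toy illustration of the Montgomery-form idea in that commit message: elements are stored on disk as `a * R (mod p)` rather than `a`, so they round-trip with no conversion cost. The prime, radix, and function names here are all hypothetical, and the `from_mont` conversion is deliberately naive; real implementations use Montgomery reduction and ~256-bit moduli.

```rust
const P: u128 = 1_000_003; // hypothetical toy prime modulus
const R: u128 = 1 << 20;   // Montgomery radix, R > P, gcd(R, P) = 1

// What the raw serialization format would write out.
fn to_mont(a: u128) -> u128 {
    (a * R) % P
}

// Naive conversion back via the modular inverse of R (Fermat, P prime).
fn from_mont(am: u128) -> u128 {
    let r_inv = mod_pow(R, P - 2, P);
    (am * r_inv) % P
}

// Square-and-multiply modular exponentiation.
fn mod_pow(mut b: u128, mut e: u128, m: u128) -> u128 {
    let mut acc = 1u128;
    b %= m;
    while e > 0 {
        if e & 1 == 1 {
            acc = acc * b % m;
        }
        b = b * b % m;
        e >>= 1;
    }
    acc
}

fn main() {
    let a = 123_456u128;
    let stored = to_mont(a);          // bytes on disk hold a*R mod p
    assert_eq!(from_mont(stored), a); // reading back recovers a exactly
    // An "unchecked" deserializer would skip validating stored < P.
    assert!(stored < P);
}
```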
Closes #66.
The goal of this PR is to solve multiple problems:

* `Proof` objects scattered around the codebase for each of the different subprotocols. Some of these proof objects contain vectors of (vectors of) curve points and scalars. This complicates serialization/deserialization of proofs because the lengths of these vectors depend on the configuration of the circuit. We don't want to encode the lengths of vectors inside of proofs because we'll know them at runtime and they do not change. We should instead treat proof objects as opaque byte streams into the verifier.
* There should be nothing in a `Proof` object that isn't also placed in the transcript.

The idea is to refactor the `transcript` module to provide two separate concepts: a `TranscriptWrite` trait that represents something we can write to (at proving time) and a `TranscriptRead` trait that represents something we can read from (at verifying time). Crucially, implementations of these traits are responsible for simultaneously writing to some `std::io::Write` buffer at the same time that they hash things into the transcript, and similarly for `TranscriptRead`/`std::io::Read`.

We can then remove all of the `Proof`-like structures from the codebase.

In addition, we perform a few refactorings of the PLONK implementation to reflect the fact that the verifier needs to temporarily store values in the proof and cannot simply access a `Proof` object anymore. This makes the verifier look a lot more like the prover, which does the same thing for other reasons.

In order to make this easier I had to also change the commitment scheme to more closely match the one from the accumulation scheme paper. That involved three changes:
* We subtract `[v] G_0` from the commitment `P` that we're opening at `x`; the argument from then forward becomes "does the polynomial committed at `P` have a root at `x`", which will be efficient inside of recursive circuits because it's just a fixed-base scalar multiplication, versus the variable-base `P' = P + [v] U` thing that we do in the Halo paper.
* We add a commitment `R` to a random polynomial that also has a root at `x`. The verifier samples random `iota` and computes `P' = P - [v] G_0 + [iota] R`, and the inner product argument is run on `P'`. This makes it so that we don't need to do a Schnorr proof at the end, because `P'` is uniformly distributed in the space of polynomials with a root at `x`.
* We no longer sample `U` for checking the values; we use a fixed `U` and compute a challenge `z` to compute `U' = [z] U`, and use `U'` for that purpose. It was never really necessary for `U` to be sampled from the group, which I didn't realize at the time.
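A minimal self-contained sketch of the read/write transcript split described above. The types here are toys: 32-byte arrays stand in for curve points, and an XOR accumulator stands in for the real transcript hash; the real traits carry more methods and are generic over the curve and hasher. The point is the invariant: every value written to (or read from) the opaque proof byte stream is absorbed into the Fiat-Shamir state at the same moment, so the two can never diverge.

```rust
use std::io::{self, Read, Write};

type Point = [u8; 32]; // toy stand-in for a serialized curve point

// Prover side: writing a value commits it to both the proof bytes
// and the transcript hash simultaneously.
trait TranscriptWrite: Write {
    fn write_point(&mut self, p: &Point) -> io::Result<()>;
    fn squeeze_challenge(&mut self) -> [u8; 32];
}

// Verifier side mirror: absorb each value as it is read back out.
trait TranscriptRead: Read {
    fn read_point(&mut self) -> io::Result<Point>;
    fn squeeze_challenge(&mut self) -> [u8; 32];
}

// Toy "hash": XOR-accumulate absorbed bytes (NOT a real sponge).
struct ToyTranscript<T> {
    state: [u8; 32],
    inner: T,
}

impl<T> ToyTranscript<T> {
    fn new(inner: T) -> Self {
        Self { state: [0; 32], inner }
    }
    fn absorb(&mut self, bytes: &[u8; 32]) {
        for i in 0..32 {
            self.state[i] ^= bytes[i];
        }
    }
}

impl<W: Write> Write for ToyTranscript<W> {
    fn write(&mut self, buf: &[u8]) -> io::Result<usize> {
        self.inner.write(buf)
    }
    fn flush(&mut self) -> io::Result<()> {
        self.inner.flush()
    }
}

impl<W: Write> TranscriptWrite for ToyTranscript<W> {
    fn write_point(&mut self, p: &Point) -> io::Result<()> {
        self.absorb(p); // hash into the transcript...
        self.inner.write_all(p) // ...and write to the proof stream
    }
    fn squeeze_challenge(&mut self) -> [u8; 32] {
        self.state
    }
}

impl<R: Read> Read for ToyTranscript<R> {
    fn read(&mut self, buf: &mut [u8]) -> io::Result<usize> {
        self.inner.read(buf)
    }
}

impl<R: Read> TranscriptRead for ToyTranscript<R> {
    fn read_point(&mut self) -> io::Result<Point> {
        let mut p = [0u8; 32];
        self.inner.read_exact(&mut p)?; // read from the opaque stream...
        self.absorb(&p); // ...and hash into the transcript
        Ok(p)
    }
    fn squeeze_challenge(&mut self) -> [u8; 32] {
        self.state
    }
}

fn main() -> io::Result<()> {
    // Prover writes a commitment and derives a challenge...
    let mut prover = ToyTranscript::new(Vec::new());
    prover.write_point(&[7u8; 32])?;
    let c1 = prover.squeeze_challenge();
    let proof: Vec<u8> = prover.inner; // opaque bytes, no length prefixes

    // ...and the verifier replays the byte stream and gets the same one.
    let mut verifier = ToyTranscript::new(&proof[..]);
    let p = verifier.read_point()?;
    assert_eq!(p, [7u8; 32]);
    assert_eq!(verifier.squeeze_challenge(), c1);
    Ok(())
}
```

Because the stream carries no length prefixes, the verifier drives deserialization from the circuit configuration it already knows, which is exactly the "opaque byte streams into the verifier" goal stated above.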