
Squareness vs oddness tie-breaker for public keys #191

Closed
real-or-random opened this issue Jan 28, 2020 · 8 comments

@real-or-random commented Jan 28, 2020

Splitting discussions out. This one about the performance impact on signing, without precomputed public key data.

  • With squareness tie-breaker, signing needs to compute (privkey*G), compute its affine X coordinate (= computing inv(Z), squaring it, and multiplying with X), compute the squareness of its Y coordinate (= computing Y*Z, and the jacobi symbol of it), normalize X, and then conditionally negate the privkey.

  • With oddness tie-breaker, signing needs to compute (privkey*G), compute its affine X coordinate (=computing inv(Z), squaring it, and multiplying with X), compute the affine Y coordinate (=computing 1/Z^3, and multiplying with Y), normalize X, normalize Y, and then conditionally negate the privkey.

The current PR does one unnecessary multiplication, as it needlessly computes the affine Y coordinate. After fixing that, the speed gain from going from the square to the even tiebreaker amounts to trading 1 field multiplication and 1 normalization for 1 jacobi computation, whose cost is dominated by the jacobi symbol itself.
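The asymmetry between the two tie-breakers can be sketched numerically. In the sketch below, `is_square` stands in for the jacobi-symbol test (implemented here via Euler's criterion for simplicity) and `is_odd` for the parity test; the names and structure are illustrative, not the library's API:

```python
# secp256k1 field prime
P = 2**256 - 2**32 - 977

def is_square(a: int) -> bool:
    # Euler's criterion: a is a quadratic residue mod P iff a^((P-1)/2) == 1.
    # A full modular exponentiation (or jacobi-symbol computation) like this
    # dominates the cost of the squareness tie-breaker.
    return pow(a, (P - 1) // 2, P) == 1

def is_odd(a: int) -> bool:
    # The oddness tie-breaker only needs the parity of the normalized
    # (affine) Y coordinate: a single bit test, after one extra field
    # multiplication (Y * 1/Z^3) and a normalization.
    return a % 2 == 1
```

The parity test is essentially free once Y is affine and normalized, which is why the trade comes down to one multiplication plus normalization versus one jacobi computation.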

I don't know if we care about this ~5us speedup at signing time (when you do care, you probably want signing with precomputed public key data, which avoids this cost in both cases), but there seems to be basically no gain apart from consistency with R. That consistency is arguably of little value: handling of R is generally limited in scope to the signing code, while public keys are obviously exposed to applications (and things like the taproot tweak already break their perfect black-box nature).

Originally posted by @sipa in bitcoin-core/secp256k1#558

@real-or-random (Author) commented Jan 28, 2020

> I don't know if we care about this ~5us speedup at signing time (when you do care, you probably want signing with precomputed public key data, which avoids this cost in both cases), but there seems to be basically no gain apart from consistency with R. That consistency is arguably of little value: handling of R is generally limited in scope to the signing code, while public keys are obviously exposed to applications (and things like the taproot tweak already break their perfect black-box nature).

I think the question is not so much whether we care. We do care about the details, and here it seems to me that switching to oddness is faster, and it's basically free because consistency buys us nothing here.

  • As you point out, handling R is fully internally anyway.
  • I would care about consistency if it were needed to avoid implementing two algorithms. But in this case, the oddness test is a one-line algorithm.
  • Consistency with existing keys is then actually the nicer kind of consistency: as you point out, public keys (and to some extent their tie-breaker) are user-facing, and sticking to the convention that people already know, and that is mathematically simpler, is probably better for implementers.
@sipa (Owner) commented Jan 28, 2020

@roconnor-blockstream pointed out on IRC that the even-tiebreaker also adds a normalization step for the verifier.

@gmaxwell commented Jan 28, 2020

Careful with too much public key consistency. That's how you get people blowing their security by using the same key/nonce with ECDSA and Schnorr, or burning their coins by writing uncompressed pubkeys into the output. :P

I don't think this one matters too much one way or another. It's nice if the flagged public key can just be an ordinary 'compressed' public key.

Looks like 'oddness' (yuck) for the pubkey would just plain be faster, because jacobi is slower than a multiply (unlike for R), and we get the square and inverse for free as a byproduct of the X projection to affine. Even if you assume that the pubkey is going to be precomputed in signing (as it should be), that precomputation would still be faster.

@sipa and half a negate.

@sipa (Owner) commented Jan 28, 2020

So, I think we should change BIP340 to use implicit-even as tiebreaker. To avoid inconsistently breaking potential code people may have written already, we should probably also change the challenge hash tag. How about "Schnorr/secp256k1" or so?
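The deliberate break via the challenge hash tag can be sketched with BIP340-style tagged hashing; the tag string below is the one floated in this comment, not necessarily what was finally adopted, and the message bytes are a placeholder:

```python
import hashlib

def tagged_hash(tag: str, msg: bytes) -> bytes:
    # BIP340-style tagged hash: SHA256(SHA256(tag) || SHA256(tag) || msg).
    tag_hash = hashlib.sha256(tag.encode()).digest()
    return hashlib.sha256(tag_hash + tag_hash + msg).digest()

# Using the newly proposed tag; any existing code still hashing with the old
# tag produces a different challenge, so old and new signatures are
# incompatible by construction rather than subtly mismatched.
challenge = tagged_hash("Schnorr/secp256k1", b"R_x || P_x || m")
```

Changing the tag makes the incompatibility explicit: every signature under the old scheme simply fails to verify under the new one.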

@sipa (Owner) commented Jan 28, 2020

This also means changing BIP341 to change the flag in the control block to be the odd/even bit of Q, rather than its squareness.

Implementation-wise, I wonder if this means we can just drop a bunch of the x-only code and use the existing pubkey tweaking functions in the consensus verifier.

@jonasnick commented Jan 28, 2020

I share the slight preference for implicit-even as tiebreaker because it simplifies the scheme (by avoiding the jacobi symbol), despite the additional verification branch. Also, implementation-wise this makes the full/auxiliary pubkey type identical to the existing one and there's no need for a tiebreaker flag.

As for compatibility between auxiliary/flagged and compressed pubkeys, it seems like compressed pubkeys are already compatible with x-only pubkeys: an adversary can easily convert between the two. At least I'm not able to come up with a non-adversarial scenario or practical attack where this matters.
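The convertibility noted here is trivial byte surgery; a minimal sketch (function names are illustrative, not any library's API):

```python
def xonly_from_compressed(compressed: bytes) -> bytes:
    # A 33-byte compressed key is a 0x02/0x03 parity prefix followed by the
    # 32-byte X coordinate; dropping the prefix yields the x-only key.
    assert len(compressed) == 33 and compressed[0] in (2, 3)
    return compressed[1:]

def compressed_from_xonly(xonly: bytes) -> bytes:
    # With the implicit-even convention, an x-only key corresponds to the
    # compressed key with the even-Y prefix (0x02).
    assert len(xonly) == 32
    return b"\x02" + xonly
```

Since anyone holding either encoding can produce the other, the two formats give no meaningful domain separation against an adversary.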

@ajtowns commented Jan 29, 2020

I think the only reason squareness was chosen over evenness was because we'd already specced has_square_y(P) and P = lift_x(x) conversion functions for dealing with R. Switching to evenness sounds like a good tradeoff if it makes implementation/use a bit easier.
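The two conversion conventions mentioned here can be sketched side by side; this is a standalone illustration over secp256k1's field, not the BIP's reference code:

```python
P = 2**256 - 2**32 - 977  # secp256k1 field prime

def lift_x_even(x: int):
    # Decompress x to the point with even Y (the implicit-even convention).
    y_sq = (pow(x, 3, P) + 7) % P
    y = pow(y_sq, (P + 1) // 4, P)      # square root works since P % 4 == 3
    if y * y % P != y_sq:
        return None                      # x is not on the curve
    return (x, y if y % 2 == 0 else P - y)

def lift_x_square(x: int):
    # The original square-Y convention: pick the Y that is itself a quadratic
    # residue. For P % 4 == 3, pow(y_sq, (P+1)//4, P) already returns the
    # root that is a square, so no conditional negation is needed here.
    y_sq = (pow(x, 3, P) + 7) % P
    y = pow(y_sq, (P + 1) // 4, P)
    if y * y % P != y_sq:
        return None
    return (x, y)
```

The even-Y variant needs only a parity check and conditional negation on the caller's side, which is the implementation simplification the switch buys.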

@sipa (Owner) commented Feb 2, 2020

Fixed by #192.

@sipa closed this Feb 2, 2020