challenger.create_challenge() doesn't need the full key, just the public key #1
Comments
Hey, Riad. Great work on this. It's really impressive.
Absolutely. This was just the initial port to make sure everything works (and also for me to understand the scheme better). I have an implementation as well as a binding to openssl for RSA (which supports standard OAEP). So I'm planning to change things around a lot, including changing the API regarding RSA keys, etc. I have a general rule that whenever I implement a crypto algorithm in JS, I always try to have an accompanying C implementation for speed (the JS is essentially treated as a fallback on platforms with no native build support). So figuring out how to distill this down into a smaller C implementation (at least for verification) should be fun.
Thanks for contributing so much. Looking forward to it too.
Thanks for the kind words :) I've now pushed the quick touches to libGooPy to clarify the above stuff.

Speaking of C: agreed, the C implementation should be fun! I had been thinking about porting to C++, but I haven't gotten around to it yet. I'm not sure what your plans are, but just in case you're thinking about using GMP, there's a slightly obscure twist that I may as well mention. (This is a bit out of left field, and probably I'm telling you something you already know, but I figured better to mention than not.) Since V relies on P to provide the prime, the primality test V runs has to be sound even against adversarially chosen inputs. The reason C/GMP brought this to mind is that, as that paper indicates, GMP doesn't implement a primality test that's sound against adversarially generated primes---so unfortunately we can't just use GMP's test off the shelf.

(From my benchmarking, we should be able to get Strong Lucas within ~3-4x of Rabin-Miller. Though the GMP folks have optimized the heck out of their stuff, so it might take some effort to get that close to their code!)
Funny you should mention that: I was planning on using gmp. In fact, I was planning on rewriting my whole bcrypto library with a libnettle backend (which includes a stripped down version of gmp, but presumably with primality tests somewhere since it can generate RSA keys). I hadn't taken the time to look into which primality tests it ships with though. That's good to know. It's probably better to implement our own thing anyway. When we're dealing with consensus code it's best to have complete control over everything.
Hmm, Dan sounded confident that just a few rounds of miller-rabin would be sufficient. Possible I misunderstood him, though. Primality testing is definitely weird to have on a blockchain. I'm worried it can be a DoS vector in a number of ways. I figure we'll have to think about this more anyway. There are probably more gotchas (e.g. the RNG used by the miller-rabin function should be deterministic, lest we run into a consensus fault).
Very interesting. Will check out.
Also, as an aside... to your credit again, I think porting your code actually uncovered a bug in v8 (you may recognize your ClassGroupOps#reduce function in the code snippet): https://bugs.chromium.org/p/v8/issues/detail?id=8380
Ah, yes! I spoke with Dan about this. My understanding from our conversation is that a pseudoprime might be sufficient, but we will have to do some work to convince ourselves of this fact. As a practical matter, combining R-M with a Lucas test might actually be faster than running R-M by itself, because you get to trade a few tens of R-M iterations for one or two Lucas iterations. Specifically, for a randomly chosen composite, you need about 64 iterations of R-M to get 2^-128-ish certainty of primality. In contrast, 16 R-M iterations and two Strong Lucas tests should be ~3x faster and should give more certainty, especially for adversarially-chosen prime candidates. (In fact, there's good reason to believe that no composite on the order of 128 bits will pass both R-M and Strong Lucas. It would be astonishing---and probably a publishable result in number theory---if anyone managed to find one.) As to derandomizing the primality test: this is a good point, and I am sure there's a lot I don't understand about this setting, but I suspect it shouldn't be necessary. Both R-M and Strong Lucas have only one-sided error: they will never incorrectly report that a prime number is composite.
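To make the one-sided-error point concrete, here is a minimal Python sketch of Miller-Rabin over caller-chosen bases (an illustration, not the goosig implementation): a prime always passes no matter which bases you feed it, while 2047 = 23 * 89, the smallest base-2 strong pseudoprime, shows why a single fixed base isn't enough on its own.

```python
def miller_rabin(n, bases):
    """Strong probable-prime test. False means definitely composite;
    True means no base in `bases` witnessed compositeness."""
    if n < 2:
        return False
    if n % 2 == 0:
        return n == 2
    # Write n - 1 = d * 2^s with d odd.
    d, s = n - 1, 0
    while d % 2 == 0:
        d //= 2
        s += 1
    for a in bases:
        a %= n
        if a == 0:
            continue
        x = pow(a, d, n)
        if x in (1, n - 1):
            continue
        for _ in range(s - 1):
            x = x * x % n
            if x == n - 1:
                break
        else:
            return False  # `a` witnesses that n is composite
    return True

print(miller_rabin(2047, [2]))     # True: 2047 = 23 * 89 fools base 2
print(miller_rabin(2047, [2, 3]))  # False: base 3 catches it
```

Note the one-sided error in action: for a prime input the function returns True for every choice of bases, so derandomizing only affects which composites might slip through.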
woof, everyone always tells me I'm a troublemaker... :) (but obviously, this bug is to your credit. Nicely spotted.)
Ah, I see. That sounds good to me then. I wrote a quick and dirty C implementation with gmp: https://github.com/handshake-org/goosig/blob/master/src/goo/goo.c -- it tries to avoid allocations as much as possible by keeping preallocated integers resident in memory. Pretty much everything that was treated as temporary in the python/js impls is preallocated now. It appears to be 20x faster than the JS version and 12x faster than the python version for verification. Average verification time is 1.3ms (using gmp's default primality test). Hopefully we can get that below 1ms by switching to a new primality test (instead of mpz_probab_prime_p).
Unknown unknowns. Better safe than sorry.
Very cool! Sounds like it's well positioned to be sub-1ms. I don't have time in the next couple days, but as long as it's all right with you I'd be happy to give a quick read over the code. I'm sure I'll learn a few tricks 👍 (I'm also happy to contribute some code. For example, I've got a GMP-based Tonelli-Shanks implementation that's reasonably well tested, and I'm also happy to implement strong Lucas, etc.)
Totally understood. But as I'm sure you know, if done wrong there is significant danger in making the primality testing deterministic, because it could undermine the soundness of the proof system. Specifically, it may allow a signer to craft a proof with a composite where the verifier expects a prime.

But we're using Fiat-Shamir already, so there's an easy way out (and I'm guessing this is what you were already planning to do): V should just seed a PRNG with a hash of the parameters, the public key, the message, and the signature, then use that PRNG to generate whatever randomness the primality tests need. (In fact, since V will already have seeded a PRNG to generate the Fiat-Shamir challenge, it could just reuse that PRNG for the primality tests.)
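As a sketch of that seeding idea (the function name, domain tag, and serialization here are hypothetical placeholders, not the actual libGooPy/goosig transcript format), the verifier could derive its Miller-Rabin bases deterministically from a hash of the whole transcript:

```python
import hashlib

def derive_mr_bases(params, pubkey, msg, sig, count, nbits=128):
    """Derive `count` Miller-Rabin bases from a hash of the transcript,
    so every verifier draws the same 'random' bases for a given proof.
    All byte-string inputs are assumed to be canonical serializations."""
    seed = hashlib.sha256(b"goosig/mr" + params + pubkey + msg + sig).digest()
    bases, counter = [], 0
    while len(bases) < count:
        # Simple hash-counter DRBG: SHA-256(seed || counter).
        h = hashlib.sha256(seed + counter.to_bytes(8, "big")).digest()
        a = int.from_bytes(h[: nbits // 8], "big")
        if a >= 2:
            bases.append(a)
        counter += 1
    return bases

# Same transcript -> same bases on every verifier; any change reseeds them.
b1 = derive_mr_bases(b"p", b"k", b"m", b"s", 4)
assert b1 == derive_mr_bases(b"p", b"k", b"m", b"s", 4)
assert derive_mr_bases(b"p", b"k", b"m", b"x", 4) != b1
```

Because the bases depend on the signature itself, a signer grinding for a composite that fools these particular bases has to redo the search every time the signature changes, which is the property the Fiat-Shamir trick buys here.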
Of course! I think it really needs some review. I've been working on it for the past few days now. The C implementation now supports challenging and signing (benchmarks). I've also generated some test vectors. It'd be nice to make our implementations compatible so we can verify everything is working correctly. There's only
Anyway, aside from perf, I think all we need now is to figure out the primality tests.
Mmmm, they could grind a composite that passes a deterministic miller-rabin? How difficult would that be in terms of computation? If we use both MR and Lucas, wouldn't they have to grind one which passes Lucas as well? Personally, I would rather have a chance of theft than a chance of consensus faults (depends on how high the chance of theft is, though).
If we really want to mitigate this kind of attack we could seed the PRNG with a recent block hash (though that's maybe outside the scope of this protocol on its own), although that may cause weird issues with reorgs. There's also an off chance that a proof in the mempool may be temporarily invalidated until a later block. I think that's an ugly fix. Wouldn't seeding with the parameters still lend itself to grinding composites?
Sorry, busy last few days. I'll have more time to go over things on Friday, but in the meantime, a few very quick comments:
Sorry for the scattershot responses. Like I said above, I'll have more time to think about this on Friday.
Thanks for the response. That paper is interesting. I wasn't aware primality tests were so easily gameable. It also seemed to be singing the praises of golang's Baillie-PSW implementation, so I ported it to C and javascript almost line-for-line. The primality test I've implemented is now basically what golang's crypto module uses to generate RSA keys. Right now, it's 16 rounds of deterministically seeded miller-rabin (plus another round for golang's forced base-2 check), followed by a strong Lucas test. Adding both of these hurt perf a little bit (now seeing ~1.45-1.5ms average verification time), but it's not too bad. We can probably find other ways to optimize. Anyway, I'm pretty happy with the library right now. I want to fuzz it more, but I think it's getting there.
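The C/JS port itself isn't reproduced here, but a minimal Python sketch of the same Baillie-PSW shape (one base-2 Miller-Rabin round plus a strong Lucas test with Selfridge's parameters, roughly the structure golang's ProbablyPrime uses) would look something like this; it illustrates the structure, not the actual goosig code:

```python
import math

def jacobi(a, n):
    # Binary Jacobi symbol (a/n) for odd n > 0.
    a %= n
    result = 1
    while a:
        while a % 2 == 0:
            a //= 2
            if n % 8 in (3, 5):
                result = -result
        a, n = n, a
        if a % 4 == 3 and n % 4 == 3:
            result = -result
        a %= n
    return result if n == 1 else 0

def strong_lucas_prp(n):
    # Strong Lucas probable-prime test, Selfridge's "method A" parameters.
    if n < 2 or n % 2 == 0:
        return n == 2
    r = math.isqrt(n)
    if r * r == n:
        return False  # perfect squares never give Jacobi(D, n) == -1
    # First D in 5, -7, 9, -11, ... with Jacobi(D, n) == -1.
    D = 5
    while True:
        j = jacobi(D, n)
        if j == -1:
            break
        if j == 0 and abs(D) != n:
            return False  # gcd(D, n) > 1, so n is composite
        D = -D - 2 if D > 0 else -D + 2
    P, Q = 1, (1 - D) // 4
    # Write n + 1 = d * 2^s with d odd.
    d, s = n + 1, 0
    while d % 2 == 0:
        d //= 2
        s += 1
    # Compute U_d, V_d (mod n) with a binary double-and-add ladder.
    U, V, Qk = 1, P, Q          # U_1, V_1, Q^1
    for bit in bin(d)[3:]:      # bits of d after the leading 1
        # Double: index k -> 2k.
        U, V, Qk = U * V % n, (V * V - 2 * Qk) % n, Qk * Qk % n
        if bit == '1':
            # Add one: 2k -> 2k + 1 (halving mod odd n via parity fix).
            U, V = P * U + V, D * U + P * V
            if U % 2:
                U += n
            if V % 2:
                V += n
            U, V, Qk = U // 2 % n, V // 2 % n, Qk * Q % n
    # Pass if U_d == 0, or V_{d * 2^r} == 0 for some 0 <= r < s.
    if U == 0 or V == 0:
        return True
    for _ in range(s - 1):
        V = (V * V - 2 * Qk) % n
        if V == 0:
            return True
        Qk = Qk * Qk % n
    return False

def baillie_psw(n):
    # One base-2 Miller-Rabin round, then one strong Lucas test.
    if n < 2 or n % 2 == 0:
        return n == 2
    d, s = n - 1, 0
    while d % 2 == 0:
        d //= 2
        s += 1
    x = pow(2, d, n)
    if x not in (1, n - 1):
        for _ in range(s - 1):
            x = x * x % n
            if x == n - 1:
                break
        else:
            return False
    return strong_lucas_prp(n)
```

The appeal of this combination is that the two tests have disjoint known failure sets: no composite has ever been found that passes both, even though pseudoprimes for each test individually are easy to list.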
Consensus faults are an issue if a primality test can incorrectly say a composite is prime due to the result of a non-deterministic RNG. Nodes could (at least temporarily) accept an invalid block. More concerning than an invalid block would be an invalid TX in a miner's mempool -- they may unknowingly mine an invalid transaction and cost themselves a bunch of money in electricity.
I agree. I really don't like dealing with block hashes. Every time I've tried to incorporate the use of a recent blockhash in a consensus rule I always regret it (Handshake currently has one mechanism that absolutely needs this, unfortunately). It makes reorgs and mempool maintenance nonsensical. I don't plan on going that route.
Really, this is probably super overkill :) We should chat with Dan about it, but I'm confident that we can at least reduce the number of random base R-M iterations to something in the 3 to 7 range (and keep the base-2 R-M and the Lucas tests). Even that's probably still overkill.
Ah, finally I see the issue: it's that a malicious signer might cause a consensus fault by generating a proof that a nontrivial fraction of verifiers will reject. Sorry for being thick here.
That sounds good to me.
Yeah, it's sort of the inverse of what you were considering. I probably wasn't explaining it well.
Also an update, I've ported all the tests to C: https://github.com/handshake-org/goosig/blob/master/src/goo/goo.c#L2794 (the higher level operations get tested in the JS test suite, but I wanted lower level unit tests for other functions). All passing. I also added some tests for miller-rabin and lucas. I think once we decide on the finer details of the protocol, we can probably have this code audited by someone. I've also been doing some rudimentary fuzzing (something I want to improve later), nothing of note yet. Seems stable.
Closing for now. We can move discussion into other issues probably.
First, thanks for inviting me to collaborate! I look forward to meeting y'all in person, too :)
One quick comment: in libGooPy I played it a bit fast and loose with RSAKey as it pertains to the challenger. The challenger (as you'd expect) only needs the public key, while the signer needs both the public and the private key.
I'll push an update now to libGooPy that makes this clearer. Specifically, I'll add a get_public_key() method to the RSAKey class, and pass the result of that to the GooSigTokGen object. Does this make sense?