This repository has been archived by the owner on Jul 24, 2023. It is now read-only.

why subtract one in integer_chip #1

Open
JohnWick2ETH opened this issue May 15, 2022 · 2 comments

JohnWick2ETH commented May 15, 2022

Why subtract one when computing v1 in `integer_chip.add_constraints_for_mul_equation_on_limb0()`?

AFAIK, `v1 * 2^(2b) = u1 + v0`.

`let u1 = v0 - one + limbs[2].value - rem.limbs_le[2].value`

lanbones (Contributor) commented May 16, 2022

To avoid underflow when computing u0, we borrowed one from v1.
Consider the case where
`limbs[1].value < rem.limbs_le[1].value || (limbs[1].value == rem.limbs_le[1].value && limbs[0].value < rem.limbs_le[0].value)`
See: `+ self.helper.limb_modulus_exps[2];`
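To illustrate the borrow being described, here is a minimal standalone sketch in plain integers (not the chip's actual code; the limb base `B`, the function name, and the limb width are all invented for the example). When the low limb of the minuend is smaller than the low limb of the subtrahend, a naive limb-wise subtraction goes negative; adding the limb modulus to the low limb and repaying the borrowed one in the next limb keeps every partial result non-negative:

```rust
// Hypothetical sketch of limb-wise subtraction with a borrow.
// B plays the role of the limb modulus 2^b (b = 16 here, chosen arbitrarily).
const B: i64 = 1 << 16;

// Subtract two 2-limb little-endian numbers, forcing a borrow from limb 1
// so that the limb-0 partial result never underflows.
fn sub_limbs_with_borrow(a: [i64; 2], b: [i64; 2]) -> [i64; 2] {
    // Naive a[0] - b[0] can be negative; add B (borrow one unit from limb 1).
    let u0 = a[0] - b[0] + B;
    // Repay the borrow: subtract the extra one in the next limb.
    let u1 = a[1] - b[1] - 1;
    [u0, u1]
}

fn main() {
    // a = 3*B + 5 and b = 1*B + 9: a > b overall, but a's low limb (5)
    // is smaller than b's low limb (9), so a borrow is required.
    let a = [5, 3];
    let b = [9, 1];
    let [u0, u1] = sub_limbs_with_borrow(a, b);

    // Recombining the limbs must give the true difference a - b.
    assert_eq!(u1 * B + u0, (a[1] * B + a[0]) - (b[1] * B + b[0]));
    // The borrow keeps the low-limb partial result non-negative.
    assert!(u0 >= 0);
    println!("u0 = {u0}, u1 = {u1}");
}
```

In the chip this same idea appears as the `+ self.helper.limb_modulus_exps[...]` term added on one side of the constraint and the `- one` term in the next limb's equation: the constraint system cannot represent a negative intermediate value, so the borrow is baked into the equations up front.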

JohnWick2ETH (Author) commented

Why would it happen that u0 is negative?

29988122 reopened this Mar 29, 2023