
using smaller key sizes and adjusting the calculation #133

Closed
amlwwalker opened this issue Jul 15, 2022 · 2 comments

amlwwalker commented Jul 15, 2022

Hey, so I am very close to something that now works fully, which I'm really pleased with. To be honest, most of the work has been converting 2D arrays to 1D arrays.
The one last thing I'm running into is:

    console.log("kernels length", kernels.length)
    for (let i = 0, l = arraysToSend.length; i < l; i++) {
      const image3232Chunk = seal.CipherText()
      image3232Chunk.load(context, arraysToSend[i]) //load the current image chunk
      const encryptedFillArray = seal.CipherText()
      encryptedFillArray.load(context, savedFillArray) //encrypted load of zeros to store the result
      for (let j = 0, end = kernels.length; j < end; j++) {
          const tmp = evaluator.multiplyPlain(image3232Chunk, kernels[j]) // ciphertext x plaintext kernel value
          evaluator.add(encryptedFillArray, tmp, encryptedFillArray) // accumulate the product into the running result
      }
      const send = encryptedFillArray.save()
      convEncArray.push(send)
    }

In this case the first line prints: kernels length 114
If I use

const heConfig = {
    keySize: 4096,
    bitSize: 20
}

then I get the correct value after I decrypt. However, 2048 or 1024 gives me garbage out. My thinking is that my multiplyPlain and add calls are building up too much noise in the data, and that's what causes the garbage when I decrypt. My initial guess is that I am applying too many operations to the encryptedFillArray (my final resulting array for each iteration) and it becomes a mess when the keySize is too small - I don't think tmp is the problem here.
What options do I have to allow myself to use smaller key sizes?
My first guess was relinearization, but I think the Galois keys are very big and I'd rather not send one over the wire. So then I thought: could I create an array of encryptedFillArrays and add them back together afterwards, rather than applying the maths to the same one again and again?
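
For reference, this is roughly how I've been checking how much headroom is left (just a sketch - it assumes BFV, that Decryptor exposes invariantNoiseBudget, and that I have the secret key locally; the variables are the same ones as in the loop above):

// Sketch: check the remaining invariant noise budget (BFV only).
// Assumes `context`, `secretKey`, `image3232Chunk` and `encryptedFillArray` from the snippet above.
const decryptor = seal.Decryptor(context, secretKey)

console.log('budget before:', decryptor.invariantNoiseBudget(image3232Chunk), 'bits')
// ... run the multiplyPlain / add loop ...
console.log('budget after:', decryptor.invariantNoiseBudget(encryptedFillArray), 'bits')
// If this hits 0 with keySize 2048 or 1024, that would explain the garbage on decrypt.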

The reason I want to use smaller key sizes is that I want to send the base64-exported ciphertexts (.save()) over HTTP, and smaller key sizes mean less data to send.

also...

UNRELATED:
Final and unrelated question: I'm actually trying to calculate an average - I need to multiply each value, then sum the products and divide by the number of values in kernels[x]. I'm not sure how to do that in a 1D array, i.e. choosing the correct elements (only a subset of the values in the cipherText array), and secondly I'm not sure if I can do division at all (even multiplying by the inverse?) so that I end up with 1 value rather than kernels[j].length values. (I'm doing a Gaussian blur on the data, which is an average.) I suspect I may have to do this part after decryption?
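
Roughly what I mean by picking a subset of slots - I guess something like multiplying by a 0/1 mask? (Sketch only, assuming batching via BatchEncoder; the indices are just an example.)

// Sketch: keep only some slots of a batched ciphertext by multiplying with a 0/1 mask.
// `context`, `evaluator` and `cipherText` are assumed to already exist; the indices are illustrative.
const batchEncoder = seal.BatchEncoder(context)

const mask = new Int32Array(batchEncoder.slotCount) // all zeros
for (const idx of [0, 1, 2]) mask[idx] = 1          // 1s in the slots I want to keep

const maskPlain = batchEncoder.encode(mask)
const selected = evaluator.multiplyPlain(cipherText, maskPlain)
// `selected` now holds only the chosen values; every other slot is zero.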

Thanks!

amlwwalker (Author) commented

Also - what's the best way to generate the public key from a secret key?

s0l0ist (Owner) commented Jul 20, 2022

You could try lowering the bitSize in:

const heConfig = {
    keySize: 4096,
    bitSize: 20 // lower to 17 or 16 when using PolyModulusDegree < 4096
}

I'm not sure what the contents of your ciphers are, or why you're using multiple ciphertexts (not that it's incorrect to do so, but it changes the context of what you can do). If you're using batching (storing polyModulusDegree number of values in each ciphertext - i.e. 4096 values when using a polyModulusDegree of 4096), there is a sumElements function that will sum all items in a single ciphertext. It does require you to generate GaloisKeys, which are very large, but you can tune them to reduce the size - check out the discussion over in #129. After retrieving the sum, you could 'divide' by multiplying by the inverse, but that really only works if you're using CKKS.
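
Roughly, the sum-then-scale step could look like this (a sketch only - it assumes CKKS, and the exact sumElements/createGaloisKeys signatures may differ slightly depending on your node-seal version; `keyGenerator`, `cipherText`, `N` and `scale` stand in for your own objects and encoding parameters):

// Sketch: sum every slot of a ciphertext, then 'divide' by N via multiply-by-inverse (CKKS).
const galoisKeys = keyGenerator.createGaloisKeys() // large - see #129 for ways to shrink them

const summed = evaluator.sumElements(cipherText, galoisKeys, seal.SchemeType.ckks)

// Encode 1/N into every slot and multiply - this only makes sense with CKKS.
const ckksEncoder = seal.CKKSEncoder(context)
const invN = ckksEncoder.encode(
  Float64Array.from({ length: ckksEncoder.slotCount }, () => 1 / N),
  scale
)
const average = evaluator.multiplyPlain(summed, invN)
// Every slot of `average` now holds (sum of all original slots) / N.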

Regarding public key generation, you should always create the PublicKey from the KeyGenerator that was used to generate the SecretKey. If instead you want to create a PublicKey from a previously generated SecretKey, you can pass the secret key instance in when constructing the KeyGenerator. Any PublicKeys generated that way will work with the SecretKey used to create the KeyGenerator instance.
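
For example (a sketch, assuming a recent node-seal where the method is createPublicKey; `existingSecretKey` is whatever SecretKey instance you already have loaded):

// Sketch: derive a PublicKey from an already-existing SecretKey.
const keyGenerator = seal.KeyGenerator(context, existingSecretKey)
const publicKey = keyGenerator.createPublicKey()
// Anything encrypted with `publicKey` can be decrypted with `existingSecretKey`.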
