How to get keyset representation that is compatible with BigQuery? #373
Comments
I never tried this, but most likely you should use BinaryKeysetWriter without base64 encoding (just the raw binary encoding).
I am only base64 encoding it so that I can copy it and paste it into this issue and into BigQuery. When loading the value in BigQuery I do parse it with
Ah, OK, I see now. The code you have creates a new Aead and encrypts the keyset with it, so you will not be able to read it back like this. Instead, you should write it using CleartextKeysetHandle.
Yeah, I just tried doing this, and it actually seems to work. I can't say I understand it, but after many hours of attempts I will gladly accept it as a solution :)

```python
import base64
import io

import tink
from tink import cleartext_keyset_handle

out = io.BytesIO()
writer = tink.BinaryKeysetWriter(out)
cleartext_keyset_handle.write(writer, keyset_handle)
out.seek(0)
print(base64.b64encode(out.read()))
```
The BigQuery documentation says this:
How can I, in my Python application, create a keyset that is compatible?
I have tried the following:
Which produces the following output.
Neither of these seem to be compatible with BigQuery.
For reference, the BigQuery KEYS.NEW_KEYSET('AEAD_AES_GCM_256') function returns something that looks like this:

This value has a different length, and if I try to use the values from my Python code in BigQuery, it just gives an error like this:
```
AEAD.DECRYPT_STRING failed: Keyset deserialization failed: Error reading keyset data: Could not parse the input stream as a Keyset-proto.
```
I understand that my current code is wrong. But what would be the correct way to do this?