As discovered while implementing tuple streaming support in ephemeral (carbynestack/ephemeral#27), tuples are reserved, and thus consumed, twice under conditions that have not been analyzed further.

Expectation

The computation returns a new amphora secret (✔) and tuples are consumed for each of the threads (here 2000 gfp multiplication triples in total) in ephemeral (❌).

Observation
Ephemeral returned the expected result
Castor lists only 1000 gfp multiplication triples consumed:

ephemeral
2022-07-28T05:41:53.037Z DEBUG io/tuple_streamer.go:230 Fetched new tuples from Castor {"gameID": "da04b169-b95b-462b-b7fe-8cadb89d6e66", "TupleType": {"Name":"MULTIPLICATION_TRIPLE_GFP","PreprocessingName":"Triples","SpdzProtocol":{"Descriptor":"SPDZ gfp","Shorthand":"p"}}, "ThreadNr": 1, "RequestID": "fc73125d-6d77-3fe1-8c75-2198a1e17c3d"}
2022-07-28T05:41:53.112Z DEBUG io/tuple_streamer.go:230 Fetched new tuples from Castor {"gameID": "da04b169-b95b-462b-b7fe-8cadb89d6e66", "TupleType": {"Name":"MULTIPLICATION_TRIPLE_GFP","PreprocessingName":"Triples","SpdzProtocol":{"Descriptor":"SPDZ gfp","Shorthand":"p"}}, "ThreadNr": 2, "RequestID": "1f17caa0-6b61-357e-8a4a-e25caa209d47"}
castor
[...]
2022-07-28 05:41:52.737 DEBUG 1 --- [io-10100-exec-7] i.c.c.s.p.t.MinioTupleStore : Starting download from S3 for key 5e8c28ae-0054-4e31-a23c-8327f01d8b15 from byte 193920 to byte 289920
2022-07-28 05:41:52.738 DEBUG 1 --- [io-10100-exec-5] i.c.c.s.r.ReservationRestController : Received update for reservation #1f17caa0-6b61-357e-8a4a-e25caa209d47_multiplicationtriple_gfp to status UNLOCKED
2022-07-28 05:41:52.739 DEBUG 1 --- [io-10100-exec-5] i.c.c.s.p.c.ReservationCachingService : updating reservation 1f17caa0-6b61-357e-8a4a-e25caa209d47_multiplicationtriple_gfp
2022-07-28 05:41:52.740 DEBUG 1 --- [io-10100-exec-5] i.c.c.s.p.c.ReservationCachingService : object in cache at castor-reservation-store::1f17caa0-6b61-357e-8a4a-e25caa209d47_multiplicationtriple_gfp is Reservation(reservationId=1f17caa0-6b61-357e-8a4a-e25caa209d47_multiplicationtriple_gfp, tupleType=multiplicationtriple_gfp, reservations=[ReservationElement(tupleChunkId=5e8c28ae-0054-4e31-a23c-8327f01d8b15, reservedTuples=1000, startIndex=2020)], status=LOCKED)
2022-07-28 05:41:52.741 DEBUG 1 --- [io-10100-exec-5] i.c.c.s.p.c.ReservationCachingService : reservation updated
2022-07-28 05:41:52.768 DEBUG 1 --- [o-10100-exec-10] i.c.c.s.p.t.MinioTupleStore : Starting download from S3 for key 5e8c28ae-0054-4e31-a23c-8327f01d8b15 from byte 193920 to byte 289920
2022-07-28 05:45:19.804 DEBUG 1 --- [ool-2-thread-22] i.c.c.s.d.WaitForReservationCallable : No reservation was found for id 7b2b3571-bc1e-4de4-a603-67a23b6fa219_inputmask_gfp.
2022-07-28 05:45:19.859 DEBUG 1 --- [ool-2-thread-22] i.c.c.s.d.WaitForReservationCallable : No reservation was found for id 7b2b3571-bc1e-4de4-a603-67a23b6fa219_inputmask_gfp.
2022-07-28 05:45:19.864 DEBUG 1 --- [io-10100-exec-5] i.c.c.s.p.c.ReservationCachingService : persisting reservation Reservation(reservationId=7b2b3571-bc1e-4de4-a603-67a23b6fa219_inputmask_gfp, tupleType=inputmask_gfp, reservations=[ReservationElement(tupleChunkId=c3a4bbd8-7517-43e9-9712-67e436e57854, reservedTuples=20, startIndex=20)], status=LOCKED)
2022-07-28 05:45:19.865 DEBUG 1 --- [io-10100-exec-5] i.c.c.s.p.c.ReservationCachingService : put in database at castor-reservation-store::7b2b3571-bc1e-4de4-a603-67a23b6fa219_inputmask_gfp
[...]
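The two identical MinioTupleStore downloads can be cross-checked against the reservation metadata. Assuming 32 bytes per gfp share (128-bit value plus 128-bit MAC) and three shares per multiplication triple, i.e. 96 bytes per triple (an assumption, but one that is consistent with the offsets logged above), the reservation element (startIndex=2020, reservedTuples=1000) maps exactly onto the logged byte range 193920 to 289920:

```go
package main

import "fmt"

// A gfp multiplication triple consists of three shares (a, b, c). Assuming
// 128-bit share values with 128-bit MACs (32 bytes per share), one triple
// occupies 96 bytes; this tuple size is an assumption made for this sketch.
const (
	bytesPerShare   = 32
	sharesPerTriple = 3
	bytesPerTriple  = bytesPerShare * sharesPerTriple // 96
)

// byteRange maps a reservation element's (startIndex, reservedTuples) onto
// the byte range that is downloaded from the tuple chunk in object storage.
func byteRange(startIndex, reservedTuples int) (from, to int) {
	from = startIndex * bytesPerTriple
	to = from + reservedTuples*bytesPerTriple
	return from, to
}

func main() {
	// Reservation element from the logs: startIndex=2020, reservedTuples=1000.
	from, to := byteRange(2020, 1000)
	fmt.Println(from, to) // prints 193920 289920, matching both downloads
}
```

Since both requests carry the same reservation element, both downloads cover exactly the same 1000 triples.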
Analyzing the logs from the scenario described, it can be seen that ephemeral fetches tuples from castor for two different threads, using the reservation (request) IDs fc73125d-6d77-3fe1-8c75-2198a1e17c3d and 1f17caa0-6b61-357e-8a4a-e25caa209d47. Both requests are processed by castor independently but reference the exact same tuples. With this, the same tuples are consumed twice while being counted only once for consumption.

Reproduce Issue

Tuples available in this scenario were:

In this example, the following two secrets were uploaded:

cat test_multi.mpc | cs ephemeral execute ephemeral-generic.default \
  -i 0cd12e4c-e104-4846-9b06-d808690dc199 \
  -i c716f2f8-53b8-4350-88ac-b46661d2ca76

carbynestack/carbynestack#40 resolves this issue: a quite old version of castor, in which concurrency issue #10 had not yet been resolved, was used. - simple as that 😓