I've been inspecting my project with Go's race detector and stumbled upon a data race caused by the p2p package.
Severity
The core functionality of the library does not appear to be affected by the race, but it makes it harder to inspect apps that use the library for their own data races, because the reports act as "false positives", so to speak.
How to reproduce
Let's start with the basic garbled example:
Shell 1:
cd apps/garbled
go run --race . -e -i 800000 examples/millionaire.mpcl
Shell 2:
cd apps/garbled
go run --race . -i 750000 examples/millionaire.mpcl
Output in Shell 1:
- In1: a{1,0}i64:int64
+ In2: b{1,0}i64:int64
- Out: %_{0,1}b1:bool1
- In: [800000]
Listening for connections at :8080
New connection from 127.0.0.1:61520
p2p.NewConn
==================
WARNING: DATA RACE
Read at 0x00c00037e0e8 by main goroutine:
github.com/markkurossi/mpc/circuit.Evaluator()
/Users/rozag/workspace/mpc/circuit/evaluator.go:115 +0xf44
main.evaluatorMode()
/Users/rozag/workspace/mpc/apps/garbled/main.go:359 +0x28d
main.main()
/Users/rozag/workspace/mpc/apps/garbled/main.go:311 +0x2ce6
Previous write at 0x00c00037e0e8 by goroutine 14:
github.com/markkurossi/mpc/p2p.(*Conn).writer()
/Users/rozag/workspace/mpc/p2p/protocol.go:94 +0x244
github.com/markkurossi/mpc/p2p.NewConn.func1()
/Users/rozag/workspace/mpc/p2p/protocol.go:80 +0x39
Goroutine 14 (running) created at:
github.com/markkurossi/mpc/p2p.NewConn()
/Users/rozag/workspace/mpc/p2p/protocol.go:80 +0x304
main.evaluatorMode()
/Users/rozag/workspace/mpc/apps/garbled/main.go:358 +0x244
main.main()
/Users/rozag/workspace/mpc/apps/garbled/main.go:311 +0x2ce6
==================
Result[0]: false
Output in Shell 2:
+ In1: a{1,0}i64:int64
- In2: b{1,0}i64:int64
- Out: %_{0,1}b1:bool1
- In: [750000]
p2p.NewConn
==================
WARNING: DATA RACE
Read at 0x00c000424178 by main goroutine:
github.com/markkurossi/mpc/circuit.Garbler()
/Users/rozag/workspace/mpc/circuit/garbler.go:139 +0xb24
main.garblerMode()
/Users/rozag/workspace/mpc/apps/garbled/main.go:381 +0x13b
main.main()
/Users/rozag/workspace/mpc/apps/garbled/main.go:318 +0x2eb5
Previous write at 0x00c000424178 by goroutine 14:
github.com/markkurossi/mpc/p2p.(*Conn).writer()
/Users/rozag/workspace/mpc/p2p/protocol.go:94 +0x244
github.com/markkurossi/mpc/p2p.NewConn.func1()
/Users/rozag/workspace/mpc/p2p/protocol.go:80 +0x39
Goroutine 14 (running) created at:
github.com/markkurossi/mpc/p2p.NewConn()
/Users/rozag/workspace/mpc/p2p/protocol.go:80 +0x304
main.garblerMode()
/Users/rozag/workspace/mpc/apps/garbled/main.go:378 +0xb3
main.main()
/Users/rozag/workspace/mpc/apps/garbled/main.go:318 +0x2eb5
==================
Result[0]: false
Found 1 data race(s)
The root cause
The problem seems to be that p2p.Conn.Stats.Sent and p2p.Conn.Stats.Recvd are read and written from multiple goroutines without any synchronization. One of those goroutines is the one started in p2p.NewConn.
Proposed solution
Use sync/atomic.Uint64 instead of raw uint64 fields in p2p.IOStats.