We kindly thank (in no particular order):
- Artem Babenko and Vladimir Aliev for helpful discussions and editorial review of the paper,
- Jacob R. Steeves for discussions on RPC frameworks, NAT traversal, and peer-to-peer technologies,
- Dmitry Afanasiev for his guidance on networking and communication technologies,
- Lidi Zheng and the grpc-aio contributors for their awesome framework and this PR,
- Brian Muller for his implementations of kademlia and rpcudp,
- Alexander Sherbakov for helpful discussions on PC and server component architecture,
- Our early adopters, contributors, and reviewers.
We also want to reference several projects that pursue similar ideas:
- BitTensor — a decentralized deep learning ecosystem with an incentive mechanism: like hivemind, but peers are rewarded for their contributions to other peers.
- GShard — a paper by Dmitry Lepikhin et al. that demonstrates the effectiveness of huge Mixture-of-Experts models on conventional HPC hardware. The authors train models 4 times the size of GPT-3 on thousands of TPUv3 devices.
- Also doing research in decentralized deep learning? Let us know!