Choco-SGD

This repository provides code for communication-efficient decentralized ML training, covering both deep learning models (compatible with PyTorch) and traditional convex machine learning models.

We provide code for the main experiments in the papers listed under References below.

Please refer to the folders convex_code and dl_code for more details.
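For orientation, here is a minimal NumPy sketch of one synchronous CHOCO-SGD iteration as described in the ICML 2019 paper: a local SGD step, a compressed exchange of the difference between each node's iterate and its publicly known copy, and a gossip step on those copies. This is an illustration only, not the repository's implementation; the function names (`choco_sgd_step`, `topk_compress`) and the choice of a top-k compressor are ours.

```python
import numpy as np

def topk_compress(v, k):
    """Top-k sparsification: keep the k largest-magnitude entries, zero the rest."""
    out = np.zeros_like(v)
    idx = np.argsort(np.abs(v))[-k:]
    out[idx] = v[idx]
    return out

def choco_sgd_step(x, x_hat, W, grads, lr=0.1, gamma=0.5, k=1):
    """One synchronous CHOCO-SGD iteration over all n nodes.

    x     : (n, d) local iterates, one row per node
    x_hat : (n, d) publicly known (compressed) copies of the iterates
    W     : (n, n) symmetric, doubly stochastic mixing (gossip) matrix
    grads : (n, d) stochastic gradients at the current iterates
    """
    n = x.shape[0]
    # 1) Local SGD step on each node.
    x = x - lr * grads
    # 2) Each node compresses and "broadcasts" the difference to its public copy.
    q = np.array([topk_compress(x[i] - x_hat[i], k) for i in range(n)])
    x_hat = x_hat + q
    # 3) Gossip on the public copies, scaled by the consensus step size gamma.
    #    Since W is row-stochastic, W @ x_hat - x_hat = sum_j w_ij (x_hat_j - x_hat_i).
    x = x + gamma * (W @ x_hat - x_hat)
    return x, x_hat

if __name__ == "__main__":
    # Toy run: f_i(x) = ||x||^2 / 2 on every node, so grads = x and nodes shrink to 0.
    rng = np.random.default_rng(0)
    n, d = 4, 10
    x = rng.normal(size=(n, d))
    x_hat = np.zeros((n, d))
    W = np.full((n, n), 1.0 / n)  # fully connected topology for simplicity
    for _ in range(200):
        x, x_hat = choco_sgd_step(x, x_hat, W, grads=x, lr=0.05, gamma=0.5, k=2)
    print("disagreement across nodes:", np.linalg.norm(x - x.mean(axis=0)))
```

Only the compressed differences q are communicated, which is what makes the scheme communication-efficient; the consensus step size gamma trades off convergence speed against the compression error.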

References

If you use the code, please cite the following papers:

@inproceedings{koloskova2019choco,
  title     = {Decentralized Stochastic Optimization and Gossip Algorithms with Compressed Communication},
  author    = {Anastasia Koloskova and Sebastian U. Stich and Martin Jaggi},
  booktitle = {ICML 2019 - Proceedings of the 36th International Conference on Machine Learning},
  publisher = {PMLR},
  volume    = {97},
  pages     = {3479--3487},
  year      = {2019},
  url       = {http://proceedings.mlr.press/v97/koloskova19a.html}
}

and

@inproceedings{koloskova2020decentralized,
  title     = {Decentralized Deep Learning with Arbitrary Communication Compression},
  author    = {Anastasia Koloskova* and Tao Lin* and Sebastian U. Stich and Martin Jaggi},
  booktitle = {ICLR 2020 - International Conference on Learning Representations},
  year      = {2020},
  url       = {https://openreview.net/forum?id=SkgGCkrKvH}
}

About

Decentralized SGD and Consensus with Communication Compression: https://arxiv.org/abs/1907.09356
