Network-Distributed Algorithm Experiments

This repository contains all code needed to reproduce the experiments in "Communication-Efficient Distributed Optimization in Networks with Gradient Tracking" [PDF].

Due to the random data generation procedure, the resulting graphs may differ from those that appear in the paper, but the conclusions remain the same.

If you find this code useful, please cite our paper:

@article{li2019communication,
  title={Communication-Efficient Distributed Optimization in Networks with Gradient Tracking},
  author={Li, Boyue and Cen, Shicong and Chen, Yuxin and Chi, Yuejie},
  journal={arXiv preprint arXiv:1909.05844},
  year={2019}
}

Requirements

  • Python 3.6
  • Required packages are listed in requirements.txt.

Experiments

  • Linear regression (Fig. 1): file exp_linear_regression.py
  • Logistic regression (Fig. 2): file exp_logistic_regression.py
  • Computation-communication trade-off for Network-SVRG (Fig. 3): file exp_svrg_iter_grads.py
  • Network topology (Fig. 4): file exp_dane_svrg_topology.py
  • Neural networks (Fig. 5): file exp_nn.py
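To reproduce a figure, a minimal sketch of the workflow might look like the following, assuming each exp_*.py script runs standalone from the repository root (invocation details not verified here):

```shell
# Install the dependencies listed in requirements.txt (Python 3.6).
pip install -r requirements.txt

# Run one experiment script, e.g. the linear regression experiment (Fig. 1).
# Each exp_*.py file corresponds to one figure, as listed above.
python exp_linear_regression.py
```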