This directory contains source code for reproducing the results in the ICML publication Eichner, Koren, McMahan, Srebro, Talwar, "Semi-Cyclic Stochastic Gradient Descent". While these experiments are highly relevant to the federated learning setting, they use standard SGD (without partitioning the data among users), and so they use vanilla TF rather than TFF.
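
For context, the training script here (cyclic_bag_log_reg.py) uses plain v1-style TensorFlow SGD via `tf.compat.v1.train.GradientDescentOptimizer`. The following is a minimal, illustrative sketch of that kind of setup, presumably a bag-of-words logistic regression judging by the file name; the vocabulary size, variable names, and learning rate are assumptions and are not taken from the actual script.

```python
# Minimal sketch of a vanilla TF (v1-style) SGD training step for a
# bag-of-words logistic regression; all names and dimensions are illustrative.
import tensorflow as tf

tf.compat.v1.disable_eager_execution()

VOCAB_SIZE = 10000  # assumed vocabulary size for the bag-of-words features

x = tf.compat.v1.placeholder(tf.float32, [None, VOCAB_SIZE], name='bag_of_words')
y = tf.compat.v1.placeholder(tf.float32, [None, 1], name='label')

w = tf.compat.v1.get_variable('w', [VOCAB_SIZE, 1])
b = tf.compat.v1.get_variable('b', [1])
logits = tf.matmul(x, w) + b

loss = tf.reduce_mean(
    tf.nn.sigmoid_cross_entropy_with_logits(labels=y, logits=logits))

# Plain stochastic gradient descent; the semi-cyclic structure comes from how
# the data is ordered and fed, not from the optimizer itself.
train_op = tf.compat.v1.train.GradientDescentOptimizer(
    learning_rate=0.1).minimize(loss)

with tf.compat.v1.Session() as sess:
    sess.run(tf.compat.v1.global_variables_initializer())
    # sess.run(train_op, feed_dict={x: batch_x, y: batch_y})  # one SGD step
```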

Running the experiments

The experiments can be run with run_experiment.sh, which will:

  • download the Sentiment140 dataset
  • preprocess it
  • train + evaluate models for the parameter settings used in the paper.
    • This step will take several hours on a single machine.
    • Each run persists its configuration and results to a separate log file.

Analyzing the results

logs_analysis.ipynb is a Jupyter notebook that parses the log files, analyzes the results, and plots them. This notebook was used to produce the figures shown in the publication.
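
The exact log format is defined by the training script, but a hypothetical sketch of the kind of parsing and plotting the notebook performs might look like the following; the log directory, file pattern, and line format below are assumptions, so adjust them to what the experiment runs actually write.

```python
# Hypothetical sketch of parsing per-run log files and plotting accuracy
# curves; the real notebook's parsing logic depends on the actual log format.
import glob
import re

import matplotlib.pyplot as plt

runs = {}
for path in glob.glob('logs/*.txt'):  # assumed log location and extension
    accuracies = []
    with open(path) as f:
        for line in f:
            # Assumed line format, e.g. "eval accuracy: 0.78"; change the
            # pattern to match whatever the training script writes.
            m = re.search(r'accuracy[:=]\s*([0-9.]+)', line)
            if m:
                accuracies.append(float(m.group(1)))
    runs[path] = accuracies

for path, accuracies in runs.items():
    plt.plot(accuracies, label=path)
plt.xlabel('evaluation step')
plt.ylabel('accuracy')
plt.legend()
plt.show()
```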
