
Predicting income with the Census Income Dataset using Keras

This is the open-source Keras version of the Census sample. The sample runs both as standalone Keras code and on Cloud ML Engine.

Download the data

The Census Income Data Set that this sample uses for training is hosted by the UC Irvine Machine Learning Repository. We have hosted the data on Google Cloud Storage in a slightly cleaned form:

  • Training file is adult.data.csv
  • Evaluation file is adult.test.csv
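
As a quick sanity check of the data format, here is a minimal sketch of parsing one row with the standard csv module. The column names follow the UCI Adult (Census Income) dataset and are an assumption about the cleaned CSVs, which carry no header row:

```python
import csv
import io

# Column names from the UCI Adult (Census Income) dataset; the cleaned CSVs
# are assumed to follow this column order with no header row.
CSV_COLUMNS = [
    "age", "workclass", "fnlwgt", "education", "education_num",
    "marital_status", "occupation", "relationship", "race", "gender",
    "capital_gain", "capital_loss", "hours_per_week", "native_country",
    "income_bracket",
]

def read_census_rows(fileobj):
    """Yield one dict per CSV row, keyed by column name."""
    for row in csv.reader(fileobj):
        yield dict(zip(CSV_COLUMNS, (field.strip() for field in row)))

# Tiny inline example standing in for a line of the training file.
sample = ("39, State-gov, 77516, Bachelors, 13, Never-married, Adm-clerical, "
          "Not-in-family, White, Male, 2174, 0, 40, United-States, <=50K\n")
rows = list(read_census_rows(io.StringIO(sample)))
```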



Virtual environment

Virtual environments are strongly recommended but not required. Installing this sample's dependencies in a new virtual environment lets you run the sample without changing the global Python packages on your system.

There are two options for creating a virtual environment:

  • Install virtualenv
    • Create a virtual environment: virtualenv census_keras
    • Activate it: source census_keras/bin/activate
  • Install Miniconda
    • Create a conda environment: conda create --name census_keras python=2.7
    • Activate it: source activate census_keras

Install dependencies

  • Install gcloud
  • Install the Python dependencies: pip install --upgrade -r requirements.txt
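
The commands in the following sections reference a handful of environment variables. A minimal setup might look like the sketch below; the file names, step count, and output path are placeholders, not values from the sample:

```shell
# Placeholder values -- substitute your own paths.
export TRAIN_FILE=adult.data.csv       # local copy of the training data (assumed name)
export EVAL_FILE=adult.test.csv        # local copy of the evaluation data
export TRAIN_STEPS=2000                # arbitrary example step count
export JOB_DIR=census_keras_output     # local dir (or a gs:// path for cloud runs)
```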

Using local python

You can run the Keras code locally:

python -m trainer.task --train-files $TRAIN_FILE \
                       --eval-files $EVAL_FILE \
                       --job-dir $JOB_DIR \
                       --train-steps $TRAIN_STEPS
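
The flags passed to trainer.task above suggest argument parsing along these lines. This is a sketch inferred from the command, not the sample's actual code; the types and default are assumptions:

```python
import argparse

def parse_args(argv=None):
    # Flag names taken from the commands in this README; types/defaults are assumptions.
    parser = argparse.ArgumentParser(description="Census Keras trainer (sketch)")
    parser.add_argument("--train-files", required=True, nargs="+",
                        help="Path(s) to the training CSV file(s)")
    parser.add_argument("--eval-files", required=True, nargs="+",
                        help="Path(s) to the evaluation CSV file(s)")
    parser.add_argument("--job-dir", required=True,
                        help="Where to write checkpoints and the exported model")
    parser.add_argument("--train-steps", type=int, default=1000,
                        help="Number of training steps to run")
    return parser.parse_args(argv)

args = parse_args([
    "--train-files", "adult.data.csv",
    "--eval-files", "adult.test.csv",
    "--job-dir", "output",
    "--train-steps", "2000",
])
```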

Training using gcloud local

You can run Keras training locally using gcloud:

gcloud ml-engine local train --package-path trainer \
                             --module-name trainer.task \
                             -- \
                             --train-files $TRAIN_FILE \
                             --eval-files $EVAL_FILE \
                             --job-dir $JOB_DIR \
                             --train-steps $TRAIN_STEPS

Prediction using gcloud local

You can run prediction locally on the SavedModel exported from the Keras HDF5 model:

python preprocess.py sample.json
gcloud ml-engine local predict --model-dir=$JOB_DIR/export \
                               --json-instances sample.json
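
The --json-instances flag expects a file with one JSON value per line. A sketch of writing such a file follows; the feature values here are made up for illustration, since the real instances come from the sample's own preprocessing step:

```python
import json

# Illustrative only: each line of sample.json is one JSON value. Here we
# assume a flat list of already-numericized feature values.
instance = [39.0, 77516.0, 13.0, 2174.0, 0.0, 40.0]  # made-up numeric features
line = json.dumps(instance)

with open("sample.json", "w") as f:
    f.write(line + "\n")
```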

Training using Cloud ML Engine

You can train the model on Cloud ML Engine:

gcloud ml-engine jobs submit training $JOB_NAME \
                                    --stream-logs \
                                    --runtime-version 1.4 \
                                    --job-dir $JOB_DIR \
                                    --package-path trainer \
                                    --module-name trainer.task \
                                    --region us-central1 \
                                    -- \
                                    --train-files $GCS_TRAIN_FILE \
                                    --eval-files $GCS_EVAL_FILE \
                                    --train-steps $TRAIN_STEPS
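
Job names must be unique within a project, so $JOB_NAME is commonly generated with a timestamp. A sketch of that convention (the prefix is an arbitrary example):

```python
import datetime

def make_job_name(prefix="census_keras"):
    # Append a timestamp so repeated submissions don't collide.
    return "{}_{}".format(prefix,
                          datetime.datetime.now().strftime("%Y%m%d_%H%M%S"))

job_name = make_job_name()
```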

Prediction using Cloud ML Engine

You can perform prediction on Cloud ML Engine by following the steps below.

Create a model on Cloud ML Engine:

gcloud ml-engine models create keras_model --regions us-central1

Export the model binaries:

export MODEL_BINARIES=$JOB_DIR/export
Deploy the model to the prediction service

gcloud ml-engine versions create v1 --model keras_model --origin $MODEL_BINARIES --runtime-version 1.4

Create a processed sample from the data:

python preprocess.py sample.json

Run the online prediction:

gcloud ml-engine predict --model keras_model --version v1 --json-instances sample.json