Hyperparameter Optimization with IBM Watson Studio

This project includes sample code that shows how to train a model with TensorFlow and the Deep Learning service within IBM Watson Studio. The sample demonstrates how to use hyperparameter optimization (HPO) in experiments to easily find the best-quality model.

As a starting point, TensorFlow for Poets is used to classify images of flowers via transfer learning. HPO is used to optimize the number of training steps.

This is a screenshot of IBM Watson Studio with a training definition and one hyperparameter 'how_many_training_steps' with values between 100 and 2000.


Prerequisites

Get a free IBM Cloud Lite account (no time restriction, no credit card required).

Create an instance of the Machine Learning service. From the credentials, note the user name, password, and instance id.
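
The relevant fields look roughly like this (a sketch; the field names are an assumption based on the Watson Machine Learning credentials format of the time, so verify them against your own instance):

{
  "url": "https://ibm-watson-ml.mybluemix.net",
  "username": "xxx",
  "password": "xxx",
  "instance_id": "xxx"
}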

Install the IBM Cloud CLI with the machine learning plugin and set environment variables by following these instructions.
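
For example (a sketch; the plugin name and variable names are assumptions based on the Watson Machine Learning CLI instructions of the time, so double-check them against the linked instructions):

$ bx plugin install machine-learning
$ export ML_USERNAME=xxx
$ export ML_PASSWORD=xxx
$ export ML_INSTANCE=xxx
$ export ML_ENV=https://ibm-watson-ml.mybluemix.net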

Create an instance of the Cloud Object Storage service and create HMAC credentials by following these instructions. Make sure to use 'Writer' or 'Manager' access and note the aws_access_key_id and aws_secret_access_key for a later step.
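
In the created credentials, the HMAC keys appear under 'cos_hmac_keys':

"cos_hmac_keys": {
  "access_key_id": "xxx",
  "secret_access_key": "xxx"
}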

Install and configure the AWS CLI by following these instructions.
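
For example, create a profile called 'ibm_cos' (the profile name used by the commands below) and enter the HMAC keys when prompted:

$ pip install awscli
$ aws configure --profile ibm_cos
AWS Access Key ID [None]: xxx
AWS Secret Access Key [None]: xxx
Default region name [None]: us-geo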

Training of the Model

Clone this repo:

$ git clone https://github.com/nheidloff/hyperparameter-optimization-ibm-watson-studio.git

Create two buckets (use unique names):

$ aws --endpoint-url=http://s3-api.dal-us-geo.objectstorage.softlayer.net --profile ibm_cos s3 mb s3://nh-flowers-input
$ aws --endpoint-url=http://s3-api.dal-us-geo.objectstorage.softlayer.net --profile ibm_cos s3 mb s3://nh-flowers-output
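
To verify that both buckets exist:

$ aws --endpoint-url=http://s3-api.dal-us-geo.objectstorage.softlayer.net --profile ibm_cos s3 ls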

Download and extract MobileNet and the images:

$ cd hyperparameter-optimization-ibm-watson-studio/data
$ wget http://download.tensorflow.org/models/mobilenet_v1_2018_02_22/mobilenet_v1_0.25_224.tgz
$ tar xvzf mobilenet_v1_0.25_224.tgz 
$ curl http://download.tensorflow.org/example_images/flower_photos.tgz \
    | tar xz -C tf_files
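
After extraction there is one directory per flower category:

$ ls tf_files/flower_photos
daisy  dandelion  LICENSE.txt  roses  sunflowers  tulips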

Upload MobileNet and the data to the input bucket (use your unique bucket name):

$ cd xxx/hyperparameter-optimization-ibm-watson-studio/data
$ aws --endpoint-url=http://s3-api.dal-us-geo.objectstorage.softlayer.net --profile ibm_cos s3 cp . s3://nh-flowers-input/ --recursive 
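
To verify the upload, list the objects in the bucket:

$ aws --endpoint-url=http://s3-api.dal-us-geo.objectstorage.softlayer.net --profile ibm_cos s3 ls s3://nh-flowers-input/ --recursive | head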

Define the experiment, the training definition, and the hyperparameter:

Open IBM Watson Studio and create a new project (choose the 'Complete' option).

From the 'Assets' tab create a new experiment as done in this screenshot.

Create a training definition as done in this screenshot. You can copy and paste the command from tf-train.yaml.
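
The command invokes retrain.py from TensorFlow for Poets. A rough sketch (see tf-train.yaml for the exact version; ${DATA_DIR} and ${RESULT_DIR} are assumptions here, based on the variables the Deep Learning service provides to training runs):

$ python retrain.py \
    --image_dir ${DATA_DIR}/tf_files/flower_photos \
    --architecture mobilenet_0.25_224 \
    --how_many_training_steps 500 \
    --output_graph ${RESULT_DIR}/retrained_graph.pb \
    --output_labels ${RESULT_DIR}/retrained_labels.txt

During the experiment, HPO replaces the value of --how_many_training_steps with a value from the defined range for each training run.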

Define the hyperparameter 'how_many_training_steps' as done in this screenshot.

Run the experiment with the training runs.

Download the saved models, the logs, and the results:

$ cd xxx/hyperparameter-optimization-ibm-watson-studio/output
$ aws --endpoint-url=http://s3-api.dal-us-geo.objectstorage.softlayer.net --profile ibm_cos s3 sync s3://nh-flowers-output .

Next Steps

To learn more about HPO, check out the documentation. As an alternative to the web interface, experiments can also be run from Python notebooks and via CLIs.
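
For example, a training run can be submitted from the command line (a hypothetical sketch, assuming the Watson Machine Learning CLI plugin and a model zip; verify the exact arguments against the documentation):

$ bx ml train tf-model.zip tf-train.yaml
$ bx ml list training-runs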
