Batch Scoring Deep Learning Models With Azure Machine Learning

Overview

As described in the associated page on the Azure Reference Architecture center, this repository uses the scenario of applying style transfer to a video (a collection of images). The architecture can be generalized to any batch scoring scenario that uses deep learning. An alternative solution using Azure Kubernetes Service can be found here.

Design

Reference Architecture Diagram

The above architecture works as follows:

  1. A video file is uploaded to blob storage.
  2. The new file triggers a Logic App to send a request to the published AML pipeline endpoint (a minimal request sketch follows this list).
  3. The pipeline processes the video, applies style transfer with MPI, and postprocesses the video.
  4. Once the pipeline completes, the output is saved back to blob storage.
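In practice, the Logic App step amounts to an authenticated HTTP POST against the published pipeline's REST endpoint. The sketch below shows the equivalent call from Python using the AzureML SDK for authentication; the endpoint URL and experiment name are placeholders, and the notebooks in this repo set up the real values for you.

```python
import requests
from azureml.core.authentication import InteractiveLoginAuthentication

# Placeholder: use the REST endpoint printed when the pipeline is published.
pipeline_endpoint = "<published-pipeline-rest-endpoint>"

# Get an AAD bearer token for the request. A service principal
# (ServicePrincipalAuthentication) produces the same header and is what an
# automated caller such as the Logic App would typically use.
auth = InteractiveLoginAuthentication()
headers = auth.get_authentication_header()

# Submit a run of the published pipeline under a named experiment.
response = requests.post(
    pipeline_endpoint,
    headers=headers,
    json={"ExperimentName": "style-transfer-batch-scoring"},  # placeholder name
)
response.raise_for_status()
print("Submitted pipeline run:", response.json().get("Id"))
```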

What is Neural Style Transfer

Example media (see the repository for the full-size versions): a style image, an input/content video, and the stylized output video.
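Neural style transfer re-renders the content of one image (here, each video frame) in the visual style of another image by matching convolutional feature statistics. As a minimal illustration of the underlying idea (not code from this repository's scripts), the Gram matrix below is the channel-correlation statistic that most style losses are built on:

```python
import torch

def gram_matrix(features: torch.Tensor) -> torch.Tensor:
    """Channel-correlation ("style") statistic of a CNN activation map.

    features has shape (batch, channels, height, width).
    """
    b, c, h, w = features.size()
    flat = features.view(b, c, h * w)
    # Correlate every channel with every other channel, then normalize.
    return torch.bmm(flat, flat.transpose(1, 2)) / (c * h * w)
```

A style loss compares these Gram matrices between the stylized output and the style image across several layers, while a separate content loss keeps the output close to the input frame.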

Prerequisites

Local/Working Machine:

Accounts:

While it is not required, the Azure Storage Explorer is also useful for inspecting your storage account.

Setup

  1. Clone the repo: git clone https://github.com/Azure/Batch-Scoring-Deep-Learning-Models-With-AML
  2. cd into the repo
  3. Set up your conda environment from the environment.yml file: conda env create -f environment.yml - this creates a conda environment called batchscoringdl_aml
  4. Activate your environment: conda activate batchscoringdl_aml
  5. Log in to Azure using the az cli: az login

Steps

Run through the following notebooks:

  1. Test the scripts
  2. Set up AML
  3. Develop & publish the AML pipeline (a publish sketch follows this list)
  4. Deploy Logic Apps
  5. Clean up
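Notebook 3 builds the pipeline from its individual steps and publishes it so the Logic App can call it over REST. As a rough sketch of what that publish step looks like with the AzureML SDK (step, script, and compute names below are placeholders, not the repo's actual values):

```python
from azureml.core import Workspace
from azureml.pipeline.core import Pipeline
from azureml.pipeline.steps import PythonScriptStep

# Assumes an AML workspace config (config.json) in the working directory.
ws = Workspace.from_config()

# Hypothetical single step standing in for the preprocess / style-transfer /
# postprocess steps that the notebook actually defines.
score_step = PythonScriptStep(
    name="style_transfer",
    script_name="style_transfer.py",     # placeholder script
    source_directory="scripts",          # placeholder directory
    compute_target="gpu-cluster",        # placeholder compute target name
)

pipeline = Pipeline(workspace=ws, steps=[score_step])
published = pipeline.publish(
    name="style-transfer-batch-scoring",  # placeholder name
    description="Batch scoring video frames with neural style transfer",
)
print("REST endpoint:", published.endpoint)
```

The printed REST endpoint is what the Logic App (and the request sketch in the Design section) posts to.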

Clean up

To clean up your working directory, run the clean_up.sh script that comes with this repo. It removes all temporary directories that were generated, as well as any configuration files (such as Dockerfiles) created during the tutorials. The script will not remove the .env file.

To clean up your Azure resources, you can simply delete the resource group that all your resources were deployed into. This can be done in the az cli using the command az group delete --name <name-of-your-resource-group>, or in the portal. If you want to keep certain resources, you can also use the az cli or the Azure portal to cherry-pick the ones you want to deprovision. Finally, you should also delete the service principal using the az ad sp delete command.

All the steps above are covered in the final notebook.

Contributing

This project welcomes contributions and suggestions. Most contributions require you to agree to a Contributor License Agreement (CLA) declaring that you have the right to, and actually do, grant us the rights to use your contribution. For details, visit https://cla.microsoft.com.

When you submit a pull request, a CLA-bot will automatically determine whether you need to provide a CLA and decorate the PR appropriately (e.g., label, comment). Simply follow the instructions provided by the bot. You will only need to do this once across all repos using our CLA.

This project has adopted the Microsoft Open Source Code of Conduct. For more information see the Code of Conduct FAQ or contact opencode@microsoft.com with any additional questions or comments.

Related projects

Microsoft AI GitHub: find other best-practice projects and Azure AI designed patterns in our central repository.
