---
page_type: sample
languages:
- azurecli
products:
- azure-machine-learning
description: Top-level directory for official Azure Machine Learning CLI sample code.
---

# Azure Machine Learning CLI (v2) (preview) examples


Welcome to the Azure Machine Learning examples repository!

## Prerequisites

1. An Azure subscription. If you don't have an Azure subscription, create a free account before you begin.
2. A terminal. Install and set up the CLI (v2) before you begin.

## Getting started

1. Install and set up the CLI (v2).
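
The CLI (v2) is distributed as the `ml` extension to the Azure CLI. A minimal setup sketch, assuming the Azure CLI is already installed; the subscription, resource group, and workspace names are placeholders to substitute with your own:

```bash
# Sign in and select the subscription to use.
az login
az account set --subscription "<SUBSCRIPTION_ID>"

# Install the Azure ML CLI (v2) extension.
az extension add --name ml --yes

# Set defaults so each example doesn't need --resource-group/--workspace-name.
az configure --defaults group="<RESOURCE_GROUP>" workspace="<WORKSPACE_NAME>"

# Verify the extension is available.
az ml --help
```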

## Examples

### Scripts

|path|status|
|-|-|
|`batch-score-rest.sh`|batch-score-rest|
|`batch-score.sh`|batch-score|
|`deploy-local-endpoint.sh`|deploy-local-endpoint|
|`deploy-managed-online-endpoint-access-resource-sai.sh`|deploy-managed-online-endpoint-access-resource-sai|
|`deploy-managed-online-endpoint-access-resource-uai.sh`|deploy-managed-online-endpoint-access-resource-uai|
|`deploy-managed-online-endpoint-mlflow.sh`|deploy-managed-online-endpoint-mlflow|
|`deploy-managed-online-endpoint.sh`|deploy-managed-online-endpoint|
|`deploy-moe-autoscale.sh`|deploy-moe-autoscale|
|`deploy-r.sh`|deploy-r|
|`deploy-rest.sh`|deploy-rest|
|`deploy-safe-rollout-k8s-online-endpoints.sh`|deploy-safe-rollout-k8s-online-endpoints|
|`deploy-safe-rollout-online-endpoints.sh`|deploy-safe-rollout-online-endpoints|
|`deploy-tfserving.sh`|deploy-tfserving|
|`deploy-torchserve.sh`|deploy-torchserve|
|`deploy-triton-managed-online-endpoint.sh`|deploy-triton-managed-online-endpoint|
|`misc.sh`|misc|
|`mlflow-uri.sh`|mlflow-uri|
|`train-rest.sh`|train-rest|
|`train.sh`|train|
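
Each script is a self-contained bash walkthrough of one scenario. A minimal sketch of running one, assuming the defaults above are configured and your working directory is this `cli` directory; any script from the table works the same way:

```bash
# Run one sample end to end (substitute any script name from the table).
bash deploy-managed-online-endpoint.sh
```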

### Jobs (`jobs`)

|path|status|description|
|-|-|-|
|`jobs/pipelines-with-components/nyc_taxi_data_regression/job.yml`|jobs/pipelines-with-components/nyc_taxi_data_regression/job|no description|
|`jobs/pipelines/cifar-10/job.yml`|jobs/pipelines/cifar-10/job|no description|
|`jobs/pipelines/nyc-taxi/job.yml`|jobs/pipelines/nyc-taxi/job|no description|
|`jobs/single-step/dask/nyctaxi/job.yml`|jobs/single-step/dask/nyctaxi/job|This sample shows how to run a distributed Dask job on Azure ML. The 24 GB NYC Taxi dataset is read in CSV format by a 4-node Dask cluster, processed, and then written as job output in Parquet format.|
|`jobs/single-step/julia/iris/job.yml`|jobs/single-step/julia/iris/job|Train a Flux model on the Iris dataset using the Julia programming language.|
|`jobs/single-step/lightgbm/iris/job-sweep.yml`|jobs/single-step/lightgbm/iris/job-sweep|Run a hyperparameter sweep job for LightGBM on the Iris dataset.|
|`jobs/single-step/lightgbm/iris/job.yml`|jobs/single-step/lightgbm/iris/job|Train a LightGBM model on the Iris dataset.|
|`jobs/single-step/pytorch/cifar-distributed/job.yml`|jobs/single-step/pytorch/cifar-distributed/job|Train a basic convolutional neural network (CNN) with PyTorch on the CIFAR-10 dataset, distributed via PyTorch.|
|`jobs/single-step/pytorch/iris/job.yml`|jobs/single-step/pytorch/iris/job|Train a neural network with PyTorch on the Iris dataset.|
|`jobs/single-step/pytorch/word-language-model/job.yml`|jobs/single-step/pytorch/word-language-model/job|Train a multi-layer RNN (Elman, GRU, or LSTM) on a language modeling task with PyTorch.|
|`jobs/single-step/r/accidents/job.yml`|jobs/single-step/r/accidents/job|Train a GLM using R on the accidents dataset.|
|`jobs/single-step/r/iris/job.yml`|jobs/single-step/r/iris/job|Train an R model on the Iris dataset.|
|`jobs/single-step/scikit-learn/diabetes/job.yml`|jobs/single-step/scikit-learn/diabetes/job|Train a scikit-learn LinearRegression model on the Diabetes dataset.|
|`jobs/single-step/scikit-learn/iris-notebook/job.yml`|jobs/single-step/scikit-learn/iris-notebook/job|Train a scikit-learn SVM on the Iris dataset using a custom Docker container build with a notebook via papermill.|
|`jobs/single-step/scikit-learn/iris/job-docker-context.yml`|jobs/single-step/scikit-learn/iris/job-docker-context|Train a scikit-learn SVM on the Iris dataset using a custom Docker container build.|
|`jobs/single-step/scikit-learn/iris/job-sweep.yml`|jobs/single-step/scikit-learn/iris/job-sweep|Sweep hyperparameters for training a scikit-learn SVM on the Iris dataset.|
|`jobs/single-step/scikit-learn/iris/job.yml`|jobs/single-step/scikit-learn/iris/job|Train a scikit-learn SVM on the Iris dataset.|
|`jobs/single-step/spark/nyctaxi/job.yml`|jobs/single-step/spark/nyctaxi/job|This sample shows how to run a single-node Spark job on Azure ML. The 47 GB NYC Taxi dataset is read in Parquet format by a 1-node Spark cluster, processed, and then written as job output in Parquet format.|
|`jobs/single-step/tensorflow/mnist-distributed-horovod/job.yml`|jobs/single-step/tensorflow/mnist-distributed-horovod/job|Train a basic neural network with TensorFlow on the MNIST dataset, distributed via Horovod.|
|`jobs/single-step/tensorflow/mnist-distributed/job.yml`|jobs/single-step/tensorflow/mnist-distributed/job|Train a basic neural network with TensorFlow on the MNIST dataset, distributed via TensorFlow.|
|`jobs/single-step/tensorflow/mnist/job.yml`|jobs/single-step/tensorflow/mnist/job|Train a basic neural network with TensorFlow on the MNIST dataset.|
|`jobs/basics/hello-code.yml`|jobs/basics/hello-code|no description|
|`jobs/basics/hello-iris-datastore-file.yml`|jobs/basics/hello-iris-datastore-file|no description|
|`jobs/basics/hello-iris-datastore-folder.yml`|jobs/basics/hello-iris-datastore-folder|no description|
|`jobs/basics/hello-iris-file.yml`|jobs/basics/hello-iris-file|no description|
|`jobs/basics/hello-iris-folder.yml`|jobs/basics/hello-iris-folder|no description|
|`jobs/basics/hello-iris-literal.yml`|jobs/basics/hello-iris-literal|no description|
|`jobs/basics/hello-mlflow.yml`|jobs/basics/hello-mlflow|no description|
|`jobs/basics/hello-notebook.yml`|jobs/basics/hello-notebook|no description|
|`jobs/basics/hello-pipeline-abc.yml`|jobs/basics/hello-pipeline-abc|no description|
|`jobs/basics/hello-pipeline-io.yml`|jobs/basics/hello-pipeline-io|no description|
|`jobs/basics/hello-pipeline-settings.yml`|jobs/basics/hello-pipeline-settings|no description|
|`jobs/basics/hello-pipeline.yml`|jobs/basics/hello-pipeline|no description|
|`jobs/basics/hello-sweep.yml`|jobs/basics/hello-sweep|Hello sweep job example.|
|`jobs/basics/hello-world-env-var.yml`|jobs/basics/hello-world-env-var|no description|
|`jobs/basics/hello-world-input.yml`|jobs/basics/hello-world-input|no description|
|`jobs/basics/hello-world-org.yml`|jobs/basics/hello-world-org|no description|
|`jobs/basics/hello-world-output-data.yml`|jobs/basics/hello-world-output-data|no description|
|`jobs/basics/hello-world-output.yml`|jobs/basics/hello-world-output|no description|
|`jobs/basics/hello-world.yml`|jobs/basics/hello-world|no description|
|`jobs/pipelines-with-components/basics/1a_e2e_local_components/pipeline.yml`|jobs/pipelines-with-components/basics/1a_e2e_local_components/pipeline|Dummy train-score-eval pipeline with local components|
|`jobs/pipelines-with-components/basics/1b_e2e_registered_components/pipeline.yml`|jobs/pipelines-with-components/basics/1b_e2e_registered_components/pipeline|E2E dummy train-score-eval pipeline with registered components|
|`jobs/pipelines-with-components/basics/1c_e2e_inline_components/pipeline.yml`|jobs/pipelines-with-components/basics/1c_e2e_inline_components/pipeline|E2E dummy train-score-eval pipeline with components defined inline in pipeline job|
|`jobs/pipelines-with-components/basics/2a_basic_component/pipeline.yml`|jobs/pipelines-with-components/basics/2a_basic_component/pipeline|Hello World component example|
|`jobs/pipelines-with-components/basics/2b_component_with_input_output/pipeline.yml`|jobs/pipelines-with-components/basics/2b_component_with_input_output/pipeline|Component with inputs and outputs|
|`jobs/pipelines-with-components/basics/3a_basic_pipeline/pipeline.yml`|jobs/pipelines-with-components/basics/3a_basic_pipeline/pipeline|Basic Pipeline Job with 3 Hello World components|
|`jobs/pipelines-with-components/basics/3b_pipeline_with_data/pipeline.yml`|jobs/pipelines-with-components/basics/3b_pipeline_with_data/pipeline|no description|
|`jobs/pipelines-with-components/basics/4a_local_data_input/pipeline.yml`|jobs/pipelines-with-components/basics/4a_local_data_input/pipeline|Example of using data in a local folder as pipeline input|
|`jobs/pipelines-with-components/basics/4b_datastore_datapath_uri_folder/pipeline.yml`|jobs/pipelines-with-components/basics/4b_datastore_datapath_uri_folder/pipeline|Example of using a data folder from a Workspace Datastore as pipeline input|
|`jobs/pipelines-with-components/basics/4c_datastore_datapath_uri_file/pipeline.yml`|jobs/pipelines-with-components/basics/4c_datastore_datapath_uri_file/pipeline|Example of using a data file from a Workspace Datastore as pipeline input|
|`jobs/pipelines-with-components/basics/4d_dataset_input/pipeline.yml`|jobs/pipelines-with-components/basics/4d_dataset_input/pipeline|Example of using data from a Dataset as pipeline input|
|`jobs/pipelines-with-components/basics/4e_web_url_input/pipeline.yml`|jobs/pipelines-with-components/basics/4e_web_url_input/pipeline|Example of using a file hosted at a web URL as pipeline input|
|`jobs/pipelines-with-components/basics/5a_env_public_docker_image/pipeline.yml`|jobs/pipelines-with-components/basics/5a_env_public_docker_image/pipeline|no description|
|`jobs/pipelines-with-components/basics/5b_env_registered/pipeline.yml`|jobs/pipelines-with-components/basics/5b_env_registered/pipeline|no description|
|`jobs/pipelines-with-components/basics/5c_env_conda_file/pipeline.yml`|jobs/pipelines-with-components/basics/5c_env_conda_file/pipeline|no description|
|`jobs/pipelines-with-components/basics/6a_tf_hello_world/pipeline.yml`|jobs/pipelines-with-components/basics/6a_tf_hello_world/pipeline|Prints the environment variable (`$TF_CONFIG`) useful for scripts running in a TensorFlow training environment|
|`jobs/pipelines-with-components/basics/6c_pytorch_hello_world/pipeline.yml`|jobs/pipelines-with-components/basics/6c_pytorch_hello_world/pipeline|Prints the environment variables useful for scripts running in a PyTorch training environment|
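
Each row is a job specification that can be submitted with a single command. A minimal sketch, assuming the defaults above are configured; any YAML path from the table can be substituted:

```bash
# Submit a job defined in YAML.
az ml job create --file jobs/basics/hello-world.yml

# Stream the logs of a submitted job (its name appears in the create output).
az ml job stream --name <JOB_NAME>
```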

### Endpoints (`endpoints`)

|path|status|description|
|-|-|-|
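
This table is currently empty; the endpoint YAMLs under `endpoints` are exercised by the `deploy-*` scripts listed above. As a sketch, an endpoint spec can also be applied directly, assuming a CLI version that provides `az ml online-endpoint`; the path below is a hypothetical placeholder for a file under `endpoints/`:

```bash
# Create a managed online endpoint from a YAML spec (hypothetical path).
az ml online-endpoint create --file endpoints/online/managed/sample/endpoint.yml
```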

### Resources (`resources`)

|path|status|description|
|-|-|-|
|`resources/compute/cluster-basic.yml`|resources/compute/cluster-basic|no description|
|`resources/compute/cluster-location.yml`|resources/compute/cluster-location|no description|
|`resources/compute/cluster-low-priority.yml`|resources/compute/cluster-low-priority|no description|
|`resources/compute/cluster-minimal.yml`|resources/compute/cluster-minimal|no description|
|`resources/compute/cluster-ssh-password.yml`|resources/compute/cluster-ssh-password|no description|
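
Each compute YAML can be applied with the compute subcommand. A minimal sketch, assuming the defaults above are configured:

```bash
# Create an Azure ML compute cluster from a YAML spec.
az ml compute create --file resources/compute/cluster-basic.yml
```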

### Assets (`assets`)

|path|status|description|
|-|-|-|
|`assets/dataset/cloud-file-https.yml`|assets/dataset/cloud-file-https|Dataset created from a file in cloud using https URL.|
|`assets/dataset/cloud-file-wasbs.yml`|assets/dataset/cloud-file-wasbs|Dataset created from a file in cloud using wasbs URL.|
|`assets/dataset/cloud-file.yml`|assets/dataset/cloud-file|Dataset created from file in cloud.|
|`assets/dataset/cloud-folder-https.yml`|assets/dataset/cloud-folder-https|Dataset created from folder in cloud using https URL.|
|`assets/dataset/cloud-folder-wasbs.yml`|assets/dataset/cloud-folder-wasbs|Dataset created from folder in cloud using wasbs URL.|
|`assets/dataset/cloud-folder.yml`|assets/dataset/cloud-folder|Dataset created from folder in cloud.|
|`assets/dataset/iris-csv-example.yml`|assets/dataset/iris-csv-example|no description|
|`assets/dataset/local-file.yml`|assets/dataset/local-file|Dataset created from local file.|
|`assets/dataset/local-folder.yml`|assets/dataset/local-folder|Dataset created from local folder.|
|`assets/dataset/public-file-https.yml`|assets/dataset/public-file-https|Dataset created from a publicly available file using https URL.|
|`assets/environment/docker-context.yml`|assets/environment/docker-context|no description|
|`assets/environment/docker-image-plus-conda.yml`|assets/environment/docker-image-plus-conda|Environment created from a Docker image plus Conda environment.|
|`assets/environment/docker-image.yml`|assets/environment/docker-image|Environment created from a Docker image.|
|`assets/model/local-file.yml`|assets/model/local-file|Model created from local file.|
|`assets/model/local-mlflow.yml`|assets/model/local-mlflow|Model created from local MLflow model directory.|
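
Each asset type has a matching create command. A sketch, assuming the defaults above are configured; note that `az ml dataset` is the preview-era name, and later CLI versions expose datasets as `az ml data` instead:

```bash
# Register assets from their YAML specs.
az ml dataset create --file assets/dataset/cloud-file.yml
az ml environment create --file assets/environment/docker-image.yml
az ml model create --file assets/model/local-file.yml
```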

## Contents

|directory|description|
|-|-|
|`assets`|asset YAML examples (datasets, environments, models)|
|`endpoints`|endpoint YAML examples|
|`jobs`|job YAML examples|
|`resources`|resource YAML examples (compute)|

## Contributing

We welcome contributions and suggestions! Please see the contributing guidelines for details.

## Code of Conduct

This project has adopted the Microsoft Open Source Code of Conduct. Please see the code of conduct for details.

## Reference