# Automating Deployments

Azure DevOps has lots of built-in tasks you can use to simplify your pipelines - you can build .NET apps, build Docker images, and work with Key Vault storage. Those are all great, but they mean your pipeline only works when you run it in Azure DevOps. An alternative approach is to put all your logic in PowerShell or Bash scripts in your source repo, and just use the pipeline to run those scripts. You don't make full use of all the features in Azure DevOps, but you can run the scripts locally, and if you move to a different service like GitHub Actions you don't need to rewrite all your logic.

## Reference

## Scripts and Pipeline Parameters

To make your pipelines as flexible as possible, you'll want to parameterize them so you can run the same pipeline with different settings. The pipeline syntax separates parameters, which appear in the UI, from variables, which can be computed. Variable values can be copied from parameters, and they get surfaced as environment variables inside scripts:
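
The real lab file is labs/pipelines/pipelines/parameters.yml; as a rough sketch only (the parameter and variable names here are illustrative, not copied from the lab file), a parameterized definition looks something like this:

```yaml
# illustrative sketch - not the actual labs/pipelines/pipelines/parameters.yml
parameters:
  - name: environment        # shows as a dropdown in the run UI
    type: string
    values:
      - dev
      - test
      - prod
    # deliberately no default value - the run fails unless one is chosen
  - name: releaseTag
    type: string
    default: 'v1.0'

variables:
  # variables can be computed from parameters
  fullTag: '${{ parameters.environment }}-${{ parameters.releaseTag }}'

steps:
  - task: PowerShell@2
    inputs:
      targetType: inline
      # pipeline variables surface as environment variables inside scripts
      script: Write-Output "Full tag is: $env:FULLTAG"
```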

📋 Create a new pipeline using this definition and run it. What is the job output?

Not sure?

Remember new pipelines need to be created in the DevOps UI. Open the Pipelines menu and click New pipeline. Follow the setup to use your git repo in Azure DevOps and set the path to labs/pipelines/pipelines/parameters.yml.

When you first create a pipeline you can run it straight away, but you don't see the normal run UI, and the run will fail because all parameters are required and one of them doesn't have a default value in the YAML.


Click to run the pipeline again. How do the parameters show in the UI? When you set them all and run the pipeline, you'll see the output. It just prints the values you selected, but it shows how pipeline parameters can be propagated to PowerShell scripts.

## Use the Azure CLI in a Pipeline

Now we're ready to start building pipelines which run scripts to create and manage Azure resources with the Azure CLI.

But first we need to connect Azure DevOps to our Azure subscription, so the az command can authenticate and our scripts have permission to create resources.

You do that by creating a service connection in Azure DevOps. Follow this guide to create an Azure Resource Manager service connection and call it spn-az-devops.

📋 Open the service connection details in Azure DevOps - there's a link to see the security roles in the Azure Portal. What role does the service connection have?

Not sure?

Open Project settings and then Service connections. Under Azure Resource Manager you'll see your connection and you can click Manage service connection roles to go to the Azure Portal.

You're taken to the generic roles page, which isn't very helpful. Click on Role Assignments and search for dotnetaz. You'll see your service connection with a random name containing the project name.

It's been assigned the Contributor role.


Automatically creating a service connection sets up a generic role for you. If you need more control you can manually create the identity in Azure first and then associate it with your DevOps project - that lets you assign more restricted or more generous roles.
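
As a sketch of the manual route, you could create the service principal yourself with the Azure CLI and scope it to a single resource group (the name, subscription ID and RG here are placeholders, not values from the lab):

```
az ad sp create-for-rbac --name spn-az-devops-manual --role Contributor --scopes "/subscriptions/<subscription-id>/resourceGroups/<rg-name>"
```

The command outputs an appId, password and tenant, which you'd then enter when manually creating the service connection in your DevOps project.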

The Azure CLI task is one you do want to use - it runs a PowerShell or Bash script, but it authenticates the Azure CLI first using your service connection:
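
The lab file is run-azure-cli.yml; a minimal sketch of the task looks something like this (the inline script is illustrative, the service connection name is the one created above):

```yaml
# illustrative sketch of an Azure CLI task step
steps:
  - task: AzureCLI@2
    inputs:
      azureSubscription: 'spn-az-devops'   # service connection to authenticate with
      scriptType: 'pscore'                 # PowerShell Core; 'bash' also works
      scriptLocation: 'inlineScript'
      inlineScript: |
        # the az CLI is already authenticated by the task at this point
        az account list -o table
```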

📋 Create and run a new pipeline using this definition. Does it authenticate to Azure correctly?

Not sure?

Create a new pipeline and set the path to labs/pipelines/pipelines/run-azure-cli.yml.

When the run completes you should see the list of your Azure subscriptions in the output.


The service connection has permission to create resources, so everything we've done with az commands we can now do in pipelines.

## Provisioning Azure resources in a Pipeline

This script (create-services.ps1) creates the resources we would need to deploy a containerized application:

If you're not familiar with PowerShell, it should still be fairly clear what's happening. We create an RG, then an ACR instance to store images, and then an AKS cluster. The echo commands print friendly output in the job, and because all the options use environment variables we can run the same script locally.
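
A simplified sketch of the pattern (the environment variable names here are assumptions, not the real script's variables):

```powershell
# simplified sketch, not the actual create-services.ps1 - env var names are assumed
echo "Creating resource group: $env:RG_NAME"
az group create -n $env:RG_NAME -l $env:LOCATION

echo "Creating container registry: $env:ACR_NAME"
az acr create -g $env:RG_NAME -n $env:ACR_NAME --sku Basic

echo "Creating AKS cluster: $env:AKS_NAME"
az aks create -g $env:RG_NAME -n $env:AKS_NAME --node-count $env:NODE_COUNT --node-vm-size $env:VM_SIZE --generate-ssh-keys
```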

This pipeline sets up parameters for all the variables and then runs the PowerShell script within an Azure CLI task:
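
Roughly, the shape is parameters feeding variables, with the script run through the Azure CLI task (again a sketch - the parameter names and script path are assumptions, the real file is linked in the exercise below):

```yaml
# illustrative sketch - the real file is labs/pipelines/pipelines/create-services.yml
parameters:
  - name: region
    type: string
    default: 'eastus'
  - name: nodeCount
    type: number
    default: 2

variables:
  # variables surface as environment variables for the script
  LOCATION: ${{ parameters.region }}
  NODE_COUNT: ${{ parameters.nodeCount }}

steps:
  - task: AzureCLI@2
    inputs:
      azureSubscription: 'spn-az-devops'
      scriptType: 'pscore'
      scriptLocation: 'scriptPath'
      scriptPath: 'labs/pipelines/create-services.ps1'   # assumed path
```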

📋 Create a new pipeline using this definition. Run it and supply parameters to create a 2-node cluster. Monitor the output to check everything runs.

Not sure how?

Create a new pipeline and set the path to labs/pipelines/pipelines/create-services.yml.

When you run it you can choose a region and VM size as well as selecting the number of nodes.


The job should run successfully; if it fails you'll see a clear error message in the job logs. When it's done you can browse to the Azure Portal and you'll see your new RG - if you didn't change the name in the pipeline parameters it will be labs-pipeline. In there you'll see the ACR instance and the AKS cluster.

## Lab

This setup isn't quite ready to use though. You can push Docker images to your ACR instance if you log in, but the AKS cluster needs to be configured so it can use that ACR instance to pull images. Check the az acr commands to see how you can attach the ACR instance to the AKS cluster, and add the command you need to the create-services.ps1 script. Run the pipeline again - does it work?

Stuck? Try hints or check the solution.


## Cleanup

Don't delete the RGs if you're continuing with the hackathon - if you do then there will be no VMs for Azure DevOps to use for pipeline runs, and no AKS cluster to deploy to.

If you're not continuing, then you can delete the RG that the pipeline created and the RG with the DevOps VMs:

```
az group delete -y -n labs-devops

az group delete -y -n labs-pipeline
```