Azure Pipelines

  • Continuous integration & delivery service
  • You can group pipeline steps into jobs, and jobs into stages.
    • Building, testing, and release can be automated using stages
  • Previously, Azure DevOps had two different types of pipelines:
    • Build pipeline that produces an artifact (defined in a .yaml file)
    • Release pipeline that takes the artifact and deploys it (UI only)
      • This is now the "old way"; the new way uses a single .yaml pipeline for both
  • A job is a series of tasks that run sequentially on the same agent
    • It has tasks, e.g. install packages & build solution
      • Manual Intervention task
        • Pauses an active workflow to do a manual task, can have instructions
        • Can automatically resume or reject the deployment on timeout
    • Builds run on an agent (a VM or a container); you can also design your own agents.
    • Timeouts before job cancelled:
      • Forever on self-hosted agents
      • 360 minutes (6 hours) on Microsoft-hosted agents with a public project or repository
      • 60 minutes on Microsoft-hosted agents with a private project or repository.
  • Each pipeline has a source version control system
    • Can be GitHub, GitHub Enterprise, Azure Repos Git & TFVC, Bitbucket Cloud, Subversion
      • 💡 GitHub App is the recommended authentication method with GitHub.
    • You also have an option called "Other Git" to use e.g. an on-prem Git server.
  • Has native support for Python, Java, JavaScript, PHP, Ruby, C#, C++ and Go
  • Using secrets:
    • ❗ Don't set secret variables in your YAML file.
    • Use the script's environment to pass secrets to the script (see the sketch below).
    • Operating systems often log the commands of the processes they run, and you wouldn't want the log to include a secret that you passed in as an input.
  • Supports visualization of test coverage results from the Cobertura & JaCoCo coverage tools.
    • ❗ Only those 2!
  • Use the secure files library to store files such as signing certificates and SSH keys on the server.
  • You can use runtime parameters to prompt for input when running a pipeline manually (example below).
  • Retention policy
    • Used to configure how long runs and releases are to be retained by the system
    • 💡 You can use the Copy Files task to save your build and artifact data for longer than what is set in the retention policies
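  • Passing a secret to a script (a minimal sketch, assuming a secret variable named ApiToken has already been defined in the pipeline settings UI or a variable group; the variable name and URL are illustrative):

        steps:
        - script: |
            # The secret only reaches the script through the mapped environment variable
            curl -H "Authorization: Bearer $API_TOKEN" https://example.com/api/health
          env:
            API_TOKEN: $(ApiToken) # explicitly map the secret into the script's environment
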
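  • Runtime parameters (a minimal sketch; the environment parameter and its values are illustrative):

        parameters:
        - name: environment
          displayName: Target environment
          type: string
          default: dev
          values:
          - dev
          - prod

        steps:
        - script: echo Deploying to ${{ parameters.environment }}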

Variables

Variable

  • Store values used in the pipeline

  • Can be scoped to specific stages.

  • There are some predefined build variables

    • E.g. System.HostType, Build.ArtifactStagingDirectory, Agent.HomeDirectory.
  • Usage

      variables:
        configuration: debug
      steps:
      - task: MSBuild@1 # Use them once
        inputs:
          configuration: $(configuration) # Use the variable
      - task: MSBuild@1 # Use them again
        inputs:
          configuration: $(configuration) # Use the variable

Variable Groups

  • Used to store values that are available across multiple pipelines.
  • Also used to store secrets and other values that might need to be passed into a YAML pipeline.
    • Usage:

        variables:
          - group: my-variable-group
        steps:
          - script: echo $(myhello) # uses macro syntax
          - script: echo $[variables.myhello] # uses runtime expression

Triggers

  • Pipelines are triggered by triggers, e.g. commits to the repository, webhooks, etc.
  • Defined in the .yaml file, but ❗ the YAML continuous integration trigger can be overridden in the pipeline settings.
  • Batch changes
    • While a build is running, the system stacks up other incoming changes and builds them all at once.

    • In yaml:

        trigger:
          batch: true # stack changes while a build is in progress and build them together

Filters

  • You can filter triggers by path, branch or tag.
    • E.g.:

        trigger:
          branches:
            include: [ 'master', 'release/*' ]
          paths:
            exclude: [ '/Code/Previous' ]
          tags:
            include: [ '*' ]

Pipeline as code

  • Triggers, agents, and all tasks are defined in a YAML file.
    • It's version-controlled in the repository.

    • Schema:

        name: string  # build numbering format
        resources:
          pipelines: [ pipelineResource ]
          containers: [ containerResource ]
          repositories: [ repositoryResource ]
        variables: # several syntaxes, see specific section
        trigger: trigger
        pr: pr
        stages: [ stage | templateReference ]
      • You can organize steps into jobs and jobs into stages (see the multi-stage sketch below).
        • If you have a single stage, you can define jobs directly: jobs: [ job | templateReference ]
        • If you have a single job, you can define steps directly: steps: [ script | bash | pwsh | powershell | checkout | task | templateReference ]
    • Example:

        name: $(Date:yyyyMMdd)$(Rev:.r)
        variables:
          var1: value1
        jobs:
        - job: One
          steps:
          - script: echo First step!
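
    • A multi-stage layout (a minimal sketch; the stage and job names are illustrative):

        stages:
        - stage: Build
          jobs:
          - job: BuildJob
            steps:
            - script: echo Building...
        - stage: Test
          dependsOn: Build
          jobs:
          - job: TestJob
            steps:
            - script: echo Running tests...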

Agents

  • You can use Microsoft-hosted agents (see the pool selection sketch below).
    • Maintenance & upgrades are taken care of by Microsoft
    • Each run gets a fresh VM/container, which is discarded after one use
  • You can also build your own agent as self-hosted agent.
    • You control the VM & container
    • Can run in Azure or on-prem
    • You need to create a Personal Access Token with the Agent Pools scope to configure your agent.
    • 💡 Good way to integrate on-prem & cloud systems
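  • Choosing an agent in YAML (a minimal sketch; 'MyOnPremPool' is a hypothetical self-hosted pool name):

        # Microsoft-hosted agent: pick an image from the hosted pool
        pool:
          vmImage: 'ubuntu-latest'

        # Self-hosted agent: reference your own pool by name instead
        # pool:
        #   name: 'MyOnPremPool'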

Agent pools

  • Organize agents into agent pools for easier management.
  • Pools are scoped to the entire organization and are visible across it
  • Default agent pools
    • Default pool: Use it to register self-hosted agents that you've set up.
    • Azure Pipelines hosted pool with various Windows, Linux, and macOS images.
      • Examples 📝:
        • Ubuntu 16.04: Run jobs on a Linux-based VM
        • macOS: Build & release on macOS Mojave
          • ❗ For builds and releases running on Microsoft-provided macOS agents, your data will be transferred to third-party data centers in the US or EU (official docs).
        • Windows:
          • Windows 2019 with VS2019
          • VS2017
          • Hosted: Older versions of Visual Studio installed on Windows Server 2012
          • Hosted Windows Container
            • Removed after March 23, 2020
        • The list here can be outdated, check the latest from the docs
  • 📝 Permissions:
    • Organization- or collection-level:
      • Reader: can view agent pool & its agents to e.g. monitor health
      • Service Accounts: Can create an agent pool in a project
      • Administrator: Can also register & unregister agents from an organization agent pool
    • Project-level:
      • Reader: can view agent pool & its agents to e.g. monitor health
      • User: Can use the pool when authoring pipelines.
      • Creator: Can use the pool when authoring build or release pipelines.
      • Administrator: Can manage membership for all roles of the pool, as well as view and use the pools

Static Code Analyzer tools in Azure Pipelines

  • You can run source code analyzer tools as part of your build pipeline.
  • For Java based projects you have tools such as PMD, CheckStyle and FindBugs
  • In the Azure Pipelines yaml file, use the Maven task and simply set three parameters (see the sketch below):
    • checkStyleRunAnalysis: true, pmdRunAnalysis: true, findBugsRunAnalysis: true
    • In the build summary you can then see the information in the Code Analysis Report section
      • Analysis results are also published as an artifact.
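  • A minimal sketch of a Maven task with the analyzers enabled (the pom.xml path and goal are illustrative):

        steps:
        - task: Maven@3
          inputs:
            mavenPomFile: 'pom.xml'   # assumed to be at the repository root
            goals: 'package'
            checkStyleRunAnalysis: true
            pmdRunAnalysis: true
            findBugsRunAnalysis: true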

Environment

  • Collection of resources that can be targeted by deployments from a pipeline.

    • E.g. VMs in the primary and secondary sites
      • You run a script on each VM to register it in an environment
  • Can include Kubernetes clusters, Azure web apps, virtual machines, databases.

  • E.g. Dev, Test, QA, Staging, and Production.

  • E.g. the following deployment job targets an environment:

      - stage: deploy
        jobs:
        - deployment: DeployWeb
          displayName: deploy Web App
          pool:
            vmImage: 'ubuntu-latest'
          # creates an environment if it doesn't exist
          environment: 'smarthotel-dev'
          strategy:
            runOnce:
              deploy:
                steps:
                - script: echo Hello world

Approvals

  • Manually control when a stage should run using approval from user(s)
  • Can be pre-deployment or post-deployment approvals
  • Defined at the environment level or in the (legacy) release pipeline UI
  • 💡 Commonly used to control deployments to production environments.
  • You can assign a timeout to the approval so that it's automatically rejected when the time runs out.

Parallel Jobs

  • Parallel job = the capacity to run one job at a time in your organization (see the sketch below)
    • Pipeline = collection of jobs
    • Each job consumes a parallel job that runs on an agent
  • When there aren't enough parallel jobs available for your organization, the jobs are queued up and run one after the other.
  • Free tier
    • Public project:
      • 10 free Microsoft-hosted parallel jobs that can run up to 360 minutes each time
      • Unlimited self-hosted parallel jobs
    • Private project:
      • 1 free Microsoft hosted parallel job, can run up to 60 minutes each time
        • Limit of 30 hours per month
      • 1 free self-hosted parallel job (with unlimited minutes)
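  • Jobs without dependencies run in parallel when enough parallel jobs are available (a minimal sketch; the job names are illustrative):

        jobs:
        - job: Linux
          pool:
            vmImage: 'ubuntu-latest'
          steps:
          - script: echo Running on Linux
        - job: Windows
          pool:
            vmImage: 'windows-latest'
          steps:
          - script: echo Running on Windows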

Service connections

  • Allow you to connect to external and remote services to execute tasks in a job (usage sketch below).
  • Service connections are created at project scope
    • So a service connection created in one project is not visible in another project.
  • Common examples:
    • ARM, Bitbucket Cloud, Chef, Docker Registry, GitHub, External Git, Jenkins, Kubernetes, npm, SSH, Subversion, Visual Studio App Center...
    • Generic service connection to any other type of service or application
      • Consists of "Connection Name", "Server URL", "User name" and "Password/Token Key" parameters
  • You can secure a service connection at three different permission levels:
    1. User: Creator, Reader, User, Administrator
    2. Pipeline: control which YAML pipelines are authorized to use this service connection
    3. Project: allows cross project sharing of service connections
  • You can create custom service connections using service endpoints
    • Consists of "Service name", "Description", "Server URL", "Certificates or tokens", "User names" and "passwords"
    • See official docs for more information.
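  • Using a service connection from a pipeline (a minimal sketch; 'my-arm-connection' and the app name are hypothetical, and the task assumes an Azure Resource Manager connection):

        steps:
        - task: AzureWebApp@1
          inputs:
            azureSubscription: 'my-arm-connection' # name of the ARM service connection
            appName: 'my-web-app'
            package: '$(Build.ArtifactStagingDirectory)/**/*.zip'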

Integrate with third party