- Create a Service Principal for Terraform named `TerraformSP`:

  ```bash
  az ad sp create-for-rbac --role="Contributor" --name="TerraformSP"
  ```

  This command outputs five values: `appId`, `displayName`, `name`, `password`, and `tenant`. Insert this information into terraform/environments/test/main.tf, where `client_id` and `client_secret` are `appId` and `password`, respectively, `tenant_id` is the Azure Tenant ID, and `subscription_id` is the Azure Subscription ID. Both of the latter can be retrieved from `az account show` (`subscription_id` is the `id` key of its output).

  ```hcl
  provider "azurerm" {
    tenant_id       = "${var.tenant_id}"
    subscription_id = "${var.subscription_id}"
    client_id       = "${var.client_id}"
    client_secret   = "${var.client_secret}"
    features {}
  }
  ```

  Since it is not good practice to expose this sensitive Azure account information in the public GitHub repo, we define these values in the Terraform variable definition file terraform/environments/test/terraform.tfvars and do not track that file in the repo. Instead, we upload it to Pipelines >> Library >> Secure files and download it in the Azure Pipelines YAML config file azure-pipelines.yml with a DownloadSecureFile@1 task.
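  The secure-file download step can be sketched in azure-pipelines.yml roughly as follows; the step name `tfvars` and the copy destination are illustrative assumptions, while `DownloadSecureFile@1` and its `secureFile` input are the standard task schema:

  ```yaml
  steps:
    # Download terraform.tfvars from Pipelines >> Library >> Secure files
    - task: DownloadSecureFile@1
      name: tfvars                      # referenced below as $(tfvars.secureFilePath)
      displayName: 'Download terraform.tfvars'
      inputs:
        secureFile: 'terraform.tfvars'  # the name given when uploading the secure file

    # Copy the downloaded file next to main.tf so terraform picks it up
    - script: cp $(tfvars.secureFilePath) terraform/environments/test/terraform.tfvars
      displayName: 'Copy terraform.tfvars into place'
  ```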
- Configure the storage account and state backend. For the sake of simplicity, run the bash script config_storage_account.sh on the local computer. Then replace the values below in terraform/environments/test/main.tf with the output from the Azure CLI, in a block such as:

  ```hcl
  terraform {
    backend "azurerm" {
      resource_group_name  = "${var.resource_group}"
      storage_account_name = "tstate12785"
      container_name       = "tstate"
      key                  = "terraform.tfstate"
    }
  }
  ```
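  For reference, a script like config_storage_account.sh boils down to a few Azure CLI calls along these lines; this is a hedged sketch, and the resource group name, location, and account/container names here are assumptions, not necessarily the script's actual values:

  ```bash
  #!/bin/bash
  # Sketch: provision a storage account to hold the Terraform state backend.
  RESOURCE_GROUP="udacity-ensuring-quality-releases"   # assumed name
  STORAGE_ACCOUNT="tstate12785"                        # must be globally unique
  CONTAINER="tstate"

  az group create --name "$RESOURCE_GROUP" --location eastus
  az storage account create --resource-group "$RESOURCE_GROUP" \
    --name "$STORAGE_ACCOUNT" --sku Standard_LRS --encryption-services blob
  ACCOUNT_KEY=$(az storage account keys list --resource-group "$RESOURCE_GROUP" \
    --account-name "$STORAGE_ACCOUNT" --query '[0].value' -o tsv)
  az storage container create --name "$CONTAINER" \
    --account-name "$STORAGE_ACCOUNT" --account-key "$ACCOUNT_KEY"

  # Echo the values to paste into the backend block in main.tf
  echo "storage_account_name: $STORAGE_ACCOUNT"
  echo "container_name: $CONTAINER"
  ```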
- Fill in the correct information in terraform/environments/test/main.tf and the corresponding modules.
- Install the Terraform Azure Pipelines extension by Microsoft DevLabs.
- Create a new service connection via Project Settings >> Service connections >> New service connection >> Azure Resource Manager >> Next >> Service principal (automatic) >> Next >> choose the correct subscription, and name the new Azure Resource Manager service connection `azurerm-sc`. This name will be used in azure-pipelines.yml.
- Add TerraformTaskV1@0 tasks to azure-pipelines.yml to perform `terraform init` and `terraform apply`, so that they run on the Azure Pipelines build agent as if running on the local computer.
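  The init and apply steps can be sketched roughly as below. The `backendAzureRm*` input names follow the DevLabs extension's azurerm schema for this task version, but treat the exact input names as version-dependent assumptions, and the resource group and storage account values as the ones produced earlier:

  ```yaml
  - task: TerraformTaskV1@0
    displayName: 'terraform init'
    inputs:
      provider: 'azurerm'
      command: 'init'
      workingDirectory: 'terraform/environments/test'
      backendServiceArm: 'azurerm-sc'
      backendAzureRmResourceGroupName: 'udacity-ensuring-quality-releases'  # assumed
      backendAzureRmStorageAccountName: 'tstate12785'
      backendAzureRmContainerName: 'tstate'
      backendAzureRmKey: 'terraform.tfstate'

  - task: TerraformTaskV1@0
    displayName: 'terraform apply'
    inputs:
      provider: 'azurerm'
      command: 'apply'
      workingDirectory: 'terraform/environments/test'
      environmentServiceNameAzureRM: 'azurerm-sc'
  ```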
- Build the FakeRestAPI artifact by archiving the entire fakerestapi directory into a zip file in the artifact staging directory and publishing it as a pipeline artifact.
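  A sketch of this build stage using the standard ArchiveFiles@2 and PublishPipelineArtifact@1 tasks; the source path, zip file name, and artifact name are assumptions based on the layout described in this README:

  ```yaml
  - task: ArchiveFiles@2
    displayName: 'Archive fakerestapi'
    inputs:
      rootFolderOrFile: '$(System.DefaultWorkingDirectory)/fakerestapi'
      includeRootFolder: false
      archiveType: 'zip'
      archiveFile: '$(Build.ArtifactStagingDirectory)/fakerestapi-$(Build.BuildId).zip'

  - task: PublishPipelineArtifact@1
    displayName: 'Publish fakerestapi artifact'
    inputs:
      targetPath: '$(Build.ArtifactStagingDirectory)/fakerestapi-$(Build.BuildId).zip'
      artifactName: 'drop-fakerestapi'   # assumed artifact name
  ```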
- Deploy the FakeRestAPI artifact to the Terraform-deployed Azure App Service. The deployed webapp URL is https://nqualityapp-appservice.azurewebsites.net/, where `nqualityapp-appservice` is the Azure App Service resource name in lowercase letters.
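  The deploy step can be sketched with the AzureWebApp@1 task, reusing the `azurerm-sc` service connection; the package path and zip name below are assumptions about how the build stage named and published the artifact:

  ```yaml
  - task: AzureWebApp@1
    displayName: 'Deploy FakeRestAPI to App Service'
    inputs:
      azureSubscription: 'azurerm-sc'
      appName: 'nqualityapp-appservice'
      # Assumed artifact download location and zip name:
      package: '$(Pipeline.Workspace)/drop-fakerestapi/fakerestapi-$(Build.BuildId).zip'
  ```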
- After Terraform deploys the virtual machine in Azure Pipelines, we need to manually register that virtual machine via Pipelines >> Environments >> TEST >> Add resource >> select "Virtual machines" >> Next >> under Operating system, select "Linux". Then copy the registration script, manually log in to the virtual machine over SSH, paste the script into the console, and run it. The registration script makes the deployed Linux virtual machine an Azure Pipelines agent, so Azure Pipelines can run bash commands on it.
- Create an Azure Log Analytics workspace. Run deploy_log_analytics_workspace.sh, or directly call:

  ```bash
  az deployment group create --resource-group udacity-ensuring-quality-releases --name deploy-log --template-file deploy_log_analytics_workspace.json
  ```

  and provide a string value for the parameter `workspaceName`, say `udacity-ensuring-quality-releases-log`.
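  With the parameter supplied inline rather than at the interactive prompt, the command would look like this (workspace name as suggested above; `--parameters key=value` is the standard az way to pass ARM template parameters):

  ```bash
  az deployment group create \
    --resource-group udacity-ensuring-quality-releases \
    --name deploy-log \
    --template-file deploy_log_analytics_workspace.json \
    --parameters workspaceName=udacity-ensuring-quality-releases-log
  ```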
- Install the Log Analytics agent on Linux computers. Follow the instructions to install the agent using the wrapper script:

  ```bash
  az vm extension set --resource-group nouman_quality --vm-name nqualityapp-vm --name OmsAgentForLinux --publisher Microsoft.EnterpriseCloud.Monitoring --protected-settings "{'workspaceKey':'<YOUR WORKSPACE PRIMARY KEY>'}" --settings "{'workspaceId':'<YOUR WORKSPACE ID>'}"
  ```

  on the Terraform-deployed VM. Both the ID and the primary key of the Log Analytics workspace can be found under Settings >> Agents management of the workspace, and they can be set as secret variables for the pipeline. After the Log Analytics agent has been installed on the deployed VM, Settings >> Agents management should indicate "1 Linux computers connected".
- Collect custom logs with the Log Analytics agent in Azure Monitor.
- Verify the Azure Monitor logs collected from the Log Analytics agent installed on the deployed VM.