
Kubernetes on Azure using Terraform


This project aims to show a simple example of how you can set up a fully featured Kubernetes (k8s) cluster on Azure using Terraform.

What does it create?

The template deploys a resource group containing an AKS cluster, a Log Analytics workspace, a managed Redis cache, and a Container Monitoring solution.

The connection details for Redis and the Log Analytics workspace are then injected into the Kubernetes cluster as Secrets, and a DaemonSet is created to host the Container Monitoring solution agent.

A Service Principal is also created for use by the Kubernetes cluster.
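As an illustration, injecting the Redis connection details into the cluster as a Kubernetes Secret from Terraform can be sketched roughly as follows. Resource and attribute names here are hypothetical, not necessarily those used in this repository; hostname and primary_access_key are standard azurerm_redis_cache outputs.

```hcl
# Hypothetical sketch: resource names are illustrative, not this repo's exact config.
resource "kubernetes_secret" "redis_connection" {
  metadata {
    name = "redis-connection"
  }

  # Pull the connection details straight from the managed Redis cache resource.
  data {
    redis_host = "${azurerm_redis_cache.cache.hostname}"
    redis_key  = "${azurerm_redis_cache.cache.primary_access_key}"
  }
}
```

Pods can then consume these values as environment variables or mounted files without the credentials ever appearing in application manifests.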


Required Tooling

  • Terraform
  • Azure CLI
  • Community Kubernetes provider v1.0.7

Note: The HashiCorp-maintained Kubernetes provider is currently missing some resource types, such as DaemonSets. Luckily, a community fork is maintained with these additional resources. Once the HashiCorp provider is updated, this requirement can be dropped.
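For example, the community fork exposes a kubernetes_daemonset resource that the HashiCorp provider lacked at the time. A minimal sketch (the names and image below are illustrative, not the exact ones used in this repository) looks like:

```hcl
# Hypothetical sketch of a DaemonSet via the community provider.
resource "kubernetes_daemonset" "monitoring_agent" {
  metadata {
    name = "monitoring-agent"
  }

  spec {
    template {
      metadata {
        labels {
          app = "monitoring-agent"
        }
      }

      spec {
        # One agent container is scheduled on every node in the cluster.
        container {
          name  = "agent"
          image = "microsoft/oms"
        }
      }
    }
  }
}
```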


Deploying

  1. Log in to the Azure CLI with az login
  2. Clone this repository and cd into the directory
  3. Create a variables.tfvars file and add an SSH key and username for logging into the k8s agent nodes:
linux_admin_username = ""

linux_admin_ssh_publickey = "ssh-rsa AAAasdfasdc2EasdfasdfAAABAQC+b42lMQef/l5D8c7kcNZNf6m37bdfITpUVcfakerFT/UAWAjym5rxda0PwdkasdfasdfasdfasdfVspDGCYWvHpa3M9UMM6cgdlq+R4ISif4W04yeOmjkRR5j9pcasdfasdfasdfW6PJcgw7IyWIWSONYCSNK6Tk5Yki3N+nAvIxU34+YxPTOpRw42w1AcuorsomethinglikethisnO15SGqFhNagUP/wV/18fvwENt3hsukiBmZ21aP8YqoFWuBg3 james@something"
  4. Download the community Kubernetes provider by running the bootstrap script for your platform (Linux, Mac, or Windows)
  5. Run terraform init, then terraform plan -var-file=variables.tfvars to see what will be created. If it looks good, run terraform apply -var-file=variables.tfvars to create your cluster
  6. Run az aks list and az aks get-credentials to access your cluster
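If you don't already have a key pair to use in step 3, one way to generate a dedicated one is shown below. The file name aks_node_key is an arbitrary choice, not something the template requires.

```shell
# Generate a dedicated RSA key pair for the AKS agent nodes;
# -N "" sets an empty passphrase, -C attaches a comment.
ssh-keygen -t rsa -b 2048 -f ./aks_node_key -N "" -C "aks-admin"

# Print the public key so it can be pasted into variables.tfvars.
cat ./aks_node_key.pub
```

Paste the printed ssh-rsa line into linux_admin_ssh_publickey in variables.tfvars, and keep the private key for SSH access to the agent nodes.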


Least privilege

The sp_least_privilidge option configures the Service Principal used by AKS with a limited set of permissions. This is experimental and untested, so only enable it if you're happy to be surprised. Note also that AKS assigns the Contributor role to the Service Principal on the MC_* resource group, so this role assignment needs to be removed manually after the Terraform template has run.
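Removing that role assignment can be done with the Azure CLI. This is only a sketch: the service principal app id and the MC_* resource group name are placeholders you must take from your own deployment.

```shell
# <sp-app-id> and the MC_* group name are placeholders from your deployment.
az role assignment delete \
  --assignee <sp-app-id> \
  --role Contributor \
  --resource-group MC_myResourceGroup_myAKSCluster_westeurope
```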


FAQ

  1. Why haven't you used modules to organize the template? We originally avoided them to keep things simple and easily readable for those new to Terraform. This has since changed, and modules are now used for some components.

  2. I receive the error Error: kubernetes_daemonset.container_agent: Provider doesn't support resource: kubernetes_daemonset: Make sure you have downloaded the community edition of the Kubernetes provider, named it correctly, and stored it in the current directory. Then clear the stale provider cache by running rm -r .terraform in the root directory and rerun the correct bootstrap script.

  3. I receive the error * provider.azurerm: No valid (unexpired) Azure CLI Auth Tokens found. Please run az login.: Run any az command that talks to Azure and it will refresh the token. For example, run az group list, then retry the Terraform command.
