Kubernetes Cluster in Google Kubernetes Engine (GKE)

Terraform configuration for deploying a Kubernetes cluster to Google Kubernetes Engine (GKE) on Google Cloud Platform (GCP).

Introduction

This Terraform configuration deploys a Kubernetes cluster into Google's managed Kubernetes service, Google Kubernetes Engine (GKE). It replicates what a GCP customer could do with the gcloud container clusters create CLI command.

It uses the Google Cloud provider's google_container_cluster resource to create an entire Kubernetes cluster in GKE, including the required VMs, networks, and other GCP constructs.
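
As a rough illustration (not the exact code in this repository, and with illustrative resource and variable names), such a resource might look like this:

```hcl
# Illustrative sketch only; the configuration in this repository sets
# additional attributes and may use different names.
resource "google_container_cluster" "k8s_cluster" {
  name               = "k8s-cluster"
  location           = var.gcp_zone            # zonal cluster
  initial_node_count = var.initial_node_count  # nodes in the default node pool

  node_config {
    machine_type = var.node_machine_type       # e.g. n1-standard-1
  }
}
```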

This Terraform configuration gets the GCP credentials from a Vault server.
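
A minimal sketch of how that lookup might be wired up is shown below, assuming the credentials are stored in Vault's KV secrets engine at the path described in the prerequisites; the exact data source and provider arguments in this repository may differ.

```hcl
# Illustrative sketch: read the GCP credentials JSON from Vault and pass it
# to the Google provider. The Vault token is supplied via the VAULT_TOKEN
# environment variable.
provider "vault" {
  address = var.vault_addr
}

data "vault_generic_secret" "gcp_credentials" {
  path = "secret/${var.vault_user}/gcp/credentials"
}

provider "google" {
  project     = var.gcp_project
  region      = var.gcp_region
  credentials = data.vault_generic_secret.gcp_credentials.data[var.gcp_project]
}
```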

Deployment Prerequisites

  1. Sign up for a free Google Cloud Platform account.
  2. Follow the instructions on Google's Kubernetes Engine Quickstart page to create or select a project in your account, enable the Google Kubernetes Engine API in your project, and enable billing for your project. When creating your project, we recommend using a globally unique project name so that your project name and project ID will be identical. This will avoid confusion when adding your GCP credentials file to Vault.
  3. Follow these instructions to download an authentication JSON file for your project which Terraform will use when provisioning resources to your GCP project.
  4. Set up a Vault server if you do not already have access to one and determine your username, password, and associated Vault token. See the Vault Provisioning Guide for options for setting up Vault servers.
  5. We assume that the Userpass auth method is enabled on your Vault server. If it is not, that is OK; you will log in to the Vault UI with your Vault token instead of your username. Wherever the Terraform-specific instructions below ask you to specify your Vault username, just make one up for yourself.
  6. Your Vault username and token will need to have a Vault policy like sample-policy.hcl associated with them. You could use that file after changing "roger" to your username and renaming it to <username>-policy.hcl (a rough sketch of such a policy appears after this list). Run vault write sys/policy/<username> policy=@<username>-policy.hcl to import the policy to your Vault server. Then run vault write auth/userpass/users/<username> policies="<username>" to associate the policy with your username. (If you already have other policies associated with the user, be sure to include them in the comma-separated list of policies.) To create a new token and associate the policy with it, run vault token-create -display-name="<username>-token" -policy="<username>".
  7. Log in to the UI of your Vault server or use the Vault CLI to paste the contents of your GCP authentication JSON file into secret/<vault_username>/gcp/credentials. Note that this is the path to the secret and that the entire contents of the file will be added to a single key, named after your GCP project ID, underneath this single secret. If using the Vault CLI, you would run vault write secret/<vault_username>/gcp/credentials <project_id>=<project_auth_json_contents>, providing the actual contents of the JSON file as the value of the key. Ideally, you will have created a GCP project with a globally unique name so that the project name and the project ID are identical. If they differ, be sure to use the project ID, not the project name.
  8. If you do not already have a Terraform Enterprise (TFE) account, request one from sales@hashicorp.com.
  9. After getting access to your TFE account, create an organization in it. Click the Cancel button when prompted to create a new workspace.
  10. Configure your TFE organization to connect to GitHub. See this doc.
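
For reference, here is a rough sketch of the kind of policy mentioned in prerequisite 6, assuming the workspace only needs access to the user's own secrets path; sample-policy.hcl in this directory is the authoritative version.

```hcl
# Illustrative policy only; see sample-policy.hcl for the exact rules.
# It grants the user full access to their own secrets path, which this
# configuration reads the GCP credentials from.
path "secret/roger/*" {
  capabilities = ["create", "read", "update", "delete", "list"]
}
```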

Deployment Steps

Execute the following steps to deploy your Kubernetes cluster to GKE.

  1. Fork this repository by clicking the Fork button in the upper right corner of the screen and selecting your own personal GitHub account or organization.
  2. Clone the fork to your laptop by running git clone https://github.com/<your_github_account>/terraform-guides.git.
  3. Create a workspace in your TFE organization called k8s-cluster-gke.
  4. Configure the k8s-cluster-gke workspace to connect to the fork of this repository in your own GitHub account.
  5. Click the "More options" link and set the Terraform Working Directory to "infrastructure-as-code/k8s-cluster-gke".
  6. On the Variables tab of your workspace, add the following Terraform variables (illustrative declarations for them are sketched after this list):
    gcp_project           # The name of the GCP project you are using
    gcp_region            # Valid GCP region, e.g. us-east1
    gcp_zone              # Valid GCP zone, e.g. us-east1-b
    initial_node_count    # Default 1
    node_machine_type     # Default n1-standard-1
    environment           # Should be dev. Could be any other value needed, but make sure to align environments properly
    vault_addr            # Address of the Vault server, e.g. http://<vault_server_dns/ip>:8200
    vault_user            # Username to log in as when adding secrets
    
  7. On the same Variables tab, add an environment variable named VAULT_TOKEN and set it to your Vault token. Be sure to mark VAULT_TOKEN as sensitive so that other people cannot read it.
  8. Click the "Queue Plan" button in the upper right corner of your workspace.
  9. On the Latest Run tab, you should see a new run. If the plan succeeds, you can review it and verify that the GKE cluster will be created when you apply the plan.
  10. Click the "Confirm and Apply" button to actually provision your GKE cluster.
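
For reference, a hedged sketch of how the workspace variables from step 6 might be declared in the configuration's variables.tf follows; the descriptions and defaults here are illustrative, not the repository's exact file.

```hcl
# Illustrative variable declarations matching the workspace variables above;
# descriptions and defaults are approximations.
variable "gcp_project" {
  description = "Name of the GCP project to deploy into"
}

variable "gcp_region" {
  description = "GCP region, e.g. us-east1"
}

variable "gcp_zone" {
  description = "GCP zone, e.g. us-east1-b"
}

variable "initial_node_count" {
  default = 1
}

variable "node_machine_type" {
  default = "n1-standard-1"
}

variable "environment" {
  default = "dev"
}

variable "vault_addr" {
  description = "Address of the Vault server, e.g. http://<vault_server>:8200"
}

variable "vault_user" {
  description = "Vault username under which the GCP credentials are stored"
}
```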

You will see outputs representing the URLs to access your GKE cluster in the Google Console, the FQDN of your cluster, TLS certs/keys for your cluster, the Vault Kubernetes authentication backend, the Vault address, and your Vault username. You will need these when using Terraform's Kubernetes Provider to provision Kubernetes pods and services in other workspaces that use your cluster. However, if you configure a workspace against the Terraform code in the k8s-services directory of this repository to provision your pods and services, the outputs will automatically be used by that workspace.
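
As a rough illustration of how such outputs are exposed for other workspaces (the actual output names and values in this configuration may differ), a few output declarations might look like this:

```hcl
# Illustrative output declarations only; the configuration in this repository
# exposes more outputs and may use different names.
output "k8s_endpoint" {
  value = google_container_cluster.k8s_cluster.endpoint
}

output "client_certificate" {
  value     = google_container_cluster.k8s_cluster.master_auth.0.client_certificate
  sensitive = true
}

output "vault_user" {
  value = var.vault_user
}
```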

You can also validate that the cluster was created in the Google Console.

Cleanup

Execute the following steps in your workspace to delete your Kubernetes cluster and associated resources from GKE.

  1. On the Variables tab of your workspace, add the environment variable CONFIRM_DESTROY with value 1.
  2. At the bottom of the Settings tab of your workspace, click the "Queue destroy plan" button to have TFE perform a destroy run.
  3. On the Latest Run tab of your workspace, make sure that the Plan was successful and then click the "Confirm and Apply" button to actually destroy your GKE cluster and other resources that were provisioned by Terraform.
  4. If, for any reason, you do not see the "Confirm and Apply" button even though the plan was successful, delete your cluster from inside the Google Console. Doing that will destroy all the resources that Terraform provisioned.