Let's enable the kube-linter task in our pipeline.
- KubeLinter has a Tekton Task on Tekton Hub, so let's grab it and add the Task to our cluster. Feel free to explore what the Task will be doing.

    ```bash
    curl -sLo /projects/tech-exercise/tekton/templates/tasks/kube-linter.yaml \
      https://raw.githubusercontent.com/tektoncd/catalog/main/task/kube-linter/0.1/kube-linter.yaml
    ```

    ```bash
    # commit this so ArgoCD will sync it
    cd /projects/tech-exercise
    git add .
    git commit -m "☎️ ADD - kube-linter task ☎️"
    git push
    ```
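    Once Argo CD has synced the change, you can confirm the Task has landed on the cluster. This is an optional check; it assumes your Tekton resources are synced into the `<TEAM_NAME>-ci-cd` namespace, so adjust if yours differ:

    ```bash
    # optional sanity check - the namespace is an assumption, adjust to your setup
    oc get task kube-linter -n <TEAM_NAME>-ci-cd
    ```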
- We could run the kube-linter task with all default checks in our pipeline, but this would fail the build. So let's do the naughty thing and run with a restricted set of checks. Add the following step to our `maven-pipeline.yaml` (stored in `/projects/tech-exercise/tekton/templates/pipelines/maven-pipeline.yaml`):

    ```yaml
    # Kube-linter
    - name: kube-linter
      runAfter:
        - fetch-app-repository
      taskRef:
        name: kube-linter
      workspaces:
        - name: source
          workspace: shared-workspace
      params:
        - name: manifest
          value: "$(params.APPLICATION_NAME)/$(params.GIT_BRANCH)/chart"
        - name: default_option
          value: do-not-auto-add-defaults
        - name: includelist
          value: "no-extensions-v1beta,no-readiness-probe,no-liveness-probe,dangling-service,mismatching-selector,writable-host-mount"
    ```
    Be sure to update the `maven` task in the pipeline as well, so its `runAfter` is the `kube-linter` task 💪💪💪

    ⛷️ NOTE ⛷️ - If you've completed the SonarQube step, you need to set `runAfter` to `analysis-check`.
    You should have a pipeline definition like this:

    ```yaml
    - name: kube-linter
      runAfter:
        - fetch-app-repository
    ...
    - name: maven
      taskRef:
        name: maven
      runAfter: # <== make sure you update this 💪💪
        - kube-linter # check the NOTE above ❗ this could be `analysis-check` as well.
      params:
        - name: WORK_DIRECTORY
          value: "$(params.APPLICATION_NAME)/$(params.GIT_BRANCH)"
    ...
    ```
- Check our changes into git:

    ```bash
    cd /projects/tech-exercise
    # git add, commit, push your changes..
    git add .
    git commit -m "🐺 ADD - kube-linter checks 🐺"
    git push
    ```
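    Once Argo CD has synced the pipeline, you can sanity-check the task ordering with the `tkn` CLI. A quick optional check, assuming the pipeline object is named `maven-pipeline` and lives in the `<TEAM_NAME>-ci-cd` namespace:

    ```bash
    # describe the pipeline - the TASKS section shows each task and its ordering
    tkn pipeline describe maven-pipeline -n <TEAM_NAME>-ci-cd
    ```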
- Trigger a pipeline build:

    ```bash
    cd /projects/pet-battle-api
    git commit --allow-empty -m "🐺 test kube-linter step 🐺"
    git push
    ```
🪄 Watch the pipeline run with the kube-linter task.
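If you prefer the terminal, you can follow the run from there too. A sketch, assuming the `tkn` CLI is installed and your runs execute in the `<TEAM_NAME>-ci-cd` namespace:

```bash
# follow the logs of the most recent PipelineRun (-L = last, -f = follow)
tkn pipelinerun logs -L -f -n <TEAM_NAME>-ci-cd
```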
Let's run through a scenario where we break/fix the build with kube-linter.
- Edit `maven-pipeline.yaml` again and add `required-label-owner` to the `includelist` on the kube-linter task:

    ```yaml
    - name: includelist
      value: "no-extensions-v1beta,no-readiness-probe,no-liveness-probe,dangling-service,mismatching-selector,writable-host-mount,required-label-owner"
    ```
- Check in these changes and trigger a pipeline run:

    ```bash
    cd /projects/tech-exercise
    # git add, commit, push your changes..
    git add .
    git commit -m "🐺 ADD - kube-linter required-label-owner check 🐺"
    git push
    ```

    If you get an error like `error: failed to push some refs to..`, run `git pull` and then push your changes again using the commands above.
    Make an empty commit to trigger the pipeline:

    ```bash
    cd /projects/pet-battle-api
    git commit --allow-empty -m "🩴 test required-label-owner check 🩴"
    git push
    ```
- Wait for the pipeline to sync and trigger a pet-battle-api build. This should now fail.
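    You can confirm the failure from the command line as well, again assuming your runs live in the `<TEAM_NAME>-ci-cd` namespace:

    ```bash
    # the most recent run should report a Failed status
    tkn pipelinerun list -n <TEAM_NAME>-ci-cd
    ```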
- We can take a look at the error and replicate it on the command line:

    ```bash
    cd /projects/pet-battle-api
    kube-linter lint chart --do-not-auto-add-defaults \
      --include no-extensions-v1beta,no-readiness-probe,no-liveness-probe,dangling-service,mismatching-selector,writable-host-mount,required-label-owner
    ```
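    If you are curious which other checks you could add to the `includelist`, kube-linter can enumerate its built-in checks along with whether each one is enabled by default:

    ```bash
    # list every built-in check with its description and default status
    kube-linter checks list
    ```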
- The linter is complaining that we're missing a label on our resources - let's fix our deployment by adding an owner label using Helm. Edit the `pet-battle-api/chart/values.yaml` file and add a value for `owner`:

    ```yaml
    # Owner value
    owner: <TEAM_NAME>
    ```
- In Helm land, the `_helpers.tpl` file allows us to define variables and chunks of YAML that can easily be reused across all resources in a chart. Let's update our label definitions in there to fix the kube-linter issue. Edit `pet-battle-api/chart/templates/_helpers.tpl` and add the `owner` label in two places - where we define "pet-battle-api.labels" and where we define "mongodb.labels". In both, append it below `app.kubernetes.io/managed-by: {{ .Release.Service }}`:

    ```yaml
    owner: {{ .Values.owner }}
    ```

    So it looks like this:

    ```yaml
    ...
    {{- end }}
    app.kubernetes.io/managed-by: {{ .Release.Service }}
    owner: {{ .Values.owner }}
    {{- end }}
    ...
    ```
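    Before committing, you can render the chart locally to confirm the label now appears on every resource. A quick sketch, assuming the `helm` CLI is available in your workspace (`test-render` is just a throwaway release name):

    ```bash
    cd /projects/pet-battle-api
    # render the chart and check the owner label is templated onto resources
    helm template test-render chart | grep "owner:"
    ```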
- We can now trigger the pipeline with a new version. Edit the pet-battle-api `pom.xml` found in the root of the `pet-battle-api` project and update the `version` number. The pipeline will update the `chart/Chart.yaml` with these versions for us. Increment and change the version number to suit:

    ```xml
    <artifactId>pet-battle-api</artifactId>
    <version>1.3.2</version>
    ```

    You can also run this bit of code to do the replacement if you are feeling uber lazy!

    ```bash
    cd /projects/pet-battle-api
    mvn -ntp versions:set -DnewVersion=1.3.2
    ```
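    To double-check the bump took effect, you can ask Maven to print the resolved project version; this uses the standard `help` plugin:

    ```bash
    cd /projects/pet-battle-api
    # prints just the project version - should now read 1.3.2
    mvn -ntp help:evaluate -Dexpression=project.version -q -DforceStdout
    ```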
- We can run the kube-linter command again to verify the fix, then check these changes in:

    ```bash
    cd /projects/pet-battle-api
    git add .
    git commit -m "🐈 ADD - kube-linter owner labels 🐈"
    git push
    ```
🪄 Observe the pet-battle-api pipeline running successfully again.