This repository has been archived by the owner on Oct 24, 2023. It is now read-only.

chore: add support for k8s version 1.11.8 #615

Merged
mboersma merged 3 commits into Azure:master from add-k8s-1.11.8 on Mar 1, 2019

Conversation

mboersma (Member)

Reason for Change:

See https://github.com/kubernetes/kubernetes/blob/master/CHANGELOG-1.11.md#changelog-since-v1117

Issue Fixed:

Requirements:

  • uses conventional commit messages
  • includes documentation
  • adds unit tests
  • tested upgrade from previous version (see the sketch after this list)
  • Windows assets zip file uploaded to production blob store
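
For the upgrade-testing requirement above, a minimal sketch of how such a check is commonly run with the aks-engine CLI. The subscription, resource group, and api-model path are placeholders, and the exact invocation is an assumption rather than a procedure taken from this PR:

# Hypothetical smoke test: upgrade an existing 1.11.7 cluster to 1.11.8.
# All identifiers below are placeholders; authentication flags are omitted.
aks-engine upgrade \
  --subscription-id "$AZURE_SUBSCRIPTION_ID" \
  --resource-group my-test-rg \
  --location westus2 \
  --api-model _output/my-test-cluster/apimodel.json \
  --upgrade-version 1.11.8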

Notes:


codecov bot commented Feb 28, 2019

Codecov Report

Merging #615 into master will not change coverage.
The diff coverage is n/a.

@@           Coverage Diff           @@
##           master     #615   +/-   ##
=======================================
  Coverage   56.69%   56.69%           
=======================================
  Files          91       91           
  Lines       13905    13905           
=======================================
  Hits         7884     7884           
  Misses       5355     5355           
  Partials      666      666

@@ -95,7 +95,7 @@ for TILLER_VERSION in ${TILLER_VERSIONS}; do
   pullContainerImage "docker" "gcr.io/kubernetes-helm/tiller:v${TILLER_VERSION}"
 done
 
-CLUSTER_AUTOSCALER_VERSIONS="1.13.1 1.12.2 1.3.4 1.3.3 1.2.2 1.1.2"
+CLUSTER_AUTOSCALER_VERSIONS="1.13.1 1.12.2 1.3.7 1.3.3 1.2.2 1.1.2"
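
For context on the hunk above: this provisioning script pre-pulls component images so they are cached in the VHD. A minimal sketch of how the CLUSTER_AUTOSCALER_VERSIONS list is presumably consumed, mirroring the tiller loop; the registry path and the pullContainerImage stand-in are assumptions, not code from this PR:

#!/usr/bin/env bash

# Assumed helper: the real pullContainerImage is defined elsewhere in the
# provisioning script; this stand-in just delegates to the named CLI tool.
pullContainerImage() {
  local cliTool=$1 imageToPull=$2
  "${cliTool}" pull "${imageToPull}"
}

CLUSTER_AUTOSCALER_VERSIONS="1.13.1 1.12.2 1.3.7 1.3.3 1.2.2 1.1.2"
for CLUSTER_AUTOSCALER_VERSION in ${CLUSTER_AUTOSCALER_VERSIONS}; do
  # Cache each listed release locally (the registry path is an assumption).
  pullContainerImage "docker" "k8s.gcr.io/cluster-autoscaler:v${CLUSTER_AUTOSCALER_VERSION}"
done

Retaining older versions in the list matters because images already baked into the VHD remain usable by existing clusters, which is the subject of the review thread below.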
Member

Let's feel free to add component versions to the VHD, but retain old versions as well.

Member Author

There are no references to cluster-autoscaler 1.3.4 any longer, so I dropped it from the VHD list.

Member Author

Actually, I guess 1.3.3 is also unused now.

Member Author

I reverted the removal; we need to audit AKS for usage as well before aging things out of the VHD image.

jackfrancis (Member)

So we actually do want cluster-autoscaler v1.3.7 in the VHD, but we don't need the k8s version itself, as that modification already landed.

However, since we've already kicked off the VHD build, what is the urgency of bumping cluster-autoscaler from 1.3.4 to 1.3.7?

mboersma (Member, Author)

what is the urgency of bumping cluster-autoscaler to 1.3.7 from 1.3.4?

Here are the changes: kubernetes/autoscaler@cluster-autoscaler-1.3.4...cluster-autoscaler-1.3.7. This in particular looks desirable: kubernetes/autoscaler@fae2e71

CecileRobertMichon (Contributor) left a comment

/lgtm

jackfrancis (Member)

lgtm pending E2E

acs-bot commented Feb 28, 2019

[APPROVALNOTIFIER] This PR is APPROVED

This pull-request has been approved by: CecileRobertMichon, mboersma

The full list of commands accepted by this bot can be found here.

The pull request process is described here.

Needs approval from an approver in each of these files:
  • OWNERS [CecileRobertMichon,mboersma]

Approvers can indicate their approval by writing /approve in a comment
Approvers can cancel approval by writing /approve cancel in a comment

@mboersma mboersma merged commit 2e0a9aa into Azure:master Mar 1, 2019
@mboersma mboersma deleted the add-k8s-1.11.8 branch March 1, 2019 01:05
mboersma added a commit that referenced this pull request Mar 1, 2019
juhacket pushed a commit to juhacket/aks-engine that referenced this pull request Mar 14, 2019