fix get-kube authorization headers #88383

Merged
merged 1 commit into from Feb 21, 2020
15 changes: 2 additions & 13 deletions cluster/get-kube-binaries.sh
@@ -143,17 +143,6 @@ function sha1sum_file() {
fi
}

-# Get default service account credentials of the VM.
-GCE_METADATA_INTERNAL="http://metadata.google.internal/computeMetadata/v1/instance"
-function get-credentials {
-  curl "${GCE_METADATA_INTERNAL}/service-accounts/default/token" -H "Metadata-Flavor: Google" -s | python -c \
-    'import sys; import json; print(json.loads(sys.stdin.read())["access_token"])'
-}
-
-function valid-storage-scope {
-  curl "${GCE_METADATA_INTERNAL}/service-accounts/default/scopes" -H "Metadata-Flavor: Google" -s | grep -E "auth/devstorage|auth/cloud-platform"
-}

function download_tarball() {
local -r download_path="$1"
local -r file="$2"
@@ -168,8 +157,8 @@ function download_tarball() {
# if the url belongs to GCS API we should use oauth2_token in the headers
curl_headers=""
if { [[ "${KUBERNETES_PROVIDER:-gce}" == "gce" ]] || [[ "${KUBERNETES_PROVIDER}" == "gke" ]] ; } &&
-   [[ "$url" =~ ^https://storage.googleapis.com.* ]] && valid-storage-scope ; then
Member Author:

I don't think we ever needed the valid-storage-scope check: if the SA doesn't have the scope, then we pass no auth, so the resource must be publicly readable ...?

This could (?) bite us later, but only for non-public resources (security patch testing?); if it does, I expect to have fun figuring out a follow-up with @krzyzacy ;-)

We should not be using the metadata server for this.

Contributor:

(If and only if you enable Workload Identity, the metadata server is part of the pod and it is safe to use.)

Otherwise this makes jobs non-portable, as they have implicit dependencies on the details of the node the pod gets scheduled on (which we do not want).
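A minimal sketch of the kind of guard this comment implies, assuming a hypothetical helper (`token_source` is not part of either script): the metadata endpoint is the one the removed functions queried, and the probe is injectable so the control flow can be exercised offline.

```shell
#!/usr/bin/env bash
# Hypothetical sketch: only trust the metadata server when it actually
# answers (as it would inside a pod with Workload Identity enabled).
probe_metadata() {
  curl -s -f -m 2 -H "Metadata-Flavor: Google" \
    "http://metadata.google.internal/computeMetadata/v1/instance/" >/dev/null 2>&1
}

# $1 (optional): probe function name, injectable for offline testing.
token_source() {
  local probe="${1:-probe_metadata}"
  if "${probe}"; then
    echo "metadata-server"   # metadata server is reachable and serving tokens
  else
    echo "gcloud"            # fall back to local gcloud credentials
  fi
}
```

On a developer laptop the probe fails within the two-second timeout and the gcloud fallback is chosen.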

Member Author:

The previous check asked whether the default SA for the metadata server had storage scopes before passing the auth header; if it didn't, no headers were passed.

I'm not sure that makes sense regardless of Workload Identity. It would mean you get an error about not being authenticated instead of an error about the auth lacking IAM permission.

I'm not sure this ever made sense.

-    curl_headers="Authorization: Bearer $(get-credentials)"
+   [[ "$url" =~ ^https://storage.googleapis.com.* ]]; then
+    curl_headers="Authorization: Bearer $(gcloud auth print-access-token)"
fi
curl ${curl_headers:+-H "${curl_headers}"} -fL --retry 3 --keepalive-time 2 "${url}" -o "${download_path}/${file}"
elif [[ $(which wget) ]]; then
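The behavioral change can be distilled into a pure function (a sketch; `auth_header_for` and `stub_token` are hypothetical names, with `stub_token` standing in for `gcloud auth print-access-token`): after this PR the header is attached to every storage.googleapis.com download on gce/gke, instead of only when the VM's default SA advertised a storage scope.

```shell
#!/usr/bin/env bash
# Sketch of the post-PR decision logic; the scope check is gone entirely.
stub_token() { echo "stub-token"; }

auth_header_for() {
  local url="$1" provider="${2:-gce}"
  local curl_headers=""
  if { [[ "${provider}" == "gce" ]] || [[ "${provider}" == "gke" ]] ; } &&
     [[ "${url}" =~ ^https://storage.googleapis.com.* ]]; then
    curl_headers="Authorization: Bearer $(stub_token)"
  fi
  echo "${curl_headers}"
}

auth_header_for "https://storage.googleapis.com/kubernetes-release/x.tar.gz"
# → Authorization: Bearer stub-token
auth_header_for "https://example.com/x.tar.gz"
# → (empty: no header for non-GCS URLs)
```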
11 changes: 2 additions & 9 deletions cluster/get-kube.sh
@@ -122,13 +122,6 @@ function create_cluster {
)
}

-# Get default service account credentials of the VM.
-GCE_METADATA_INTERNAL="http://metadata.google.internal/computeMetadata/v1/instance"
-function get-credentials {
-  curl "${GCE_METADATA_INTERNAL}/service-accounts/default/token" -H "Metadata-Flavor: Google" -s | python -c \
-    'import sys; import json; print(json.loads(sys.stdin.read())["access_token"])'
-}
-
-function valid-storage-scope {
-  curl "${GCE_METADATA_INTERNAL}/service-accounts/default/scopes" -H "Metadata-Flavor: Google" -s | grep -E "auth/devstorage|auth/cloud-platform"
-}
@@ -242,8 +235,8 @@ if "${need_download}"; then
# if the url belongs to GCS API we should use oauth2_token in the headers
curl_headers=""
if { [[ "${KUBERNETES_PROVIDER:-gce}" == "gce" ]] || [[ "${KUBERNETES_PROVIDER}" == "gke" ]] ; } &&
-   [[ "$kubernetes_tar_url" =~ ^https://storage.googleapis.com.* ]] && valid-storage-scope ; then
-    curl_headers="Authorization: Bearer $(get-credentials)"
+   [[ "$kubernetes_tar_url" =~ ^https://storage.googleapis.com.* ]] ; then
+    curl_headers="Authorization: Bearer $(gcloud auth print-access-token)"
fi
curl ${curl_headers:+-H "${curl_headers}"} -fL --retry 3 --keepalive-time 2 "${kubernetes_tar_url}" -o "${file}"
elif [[ $(which wget) ]]; then
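Both scripts rely on bash's `${var:+word}` alternate-value expansion so that `-H` is only passed to curl when a header was actually set. A standalone illustration (a sketch that echoes the argv instead of invoking curl; `build_cmd` is a hypothetical name):

```shell
#!/usr/bin/env bash
# `${curl_headers:+-H "${curl_headers}"}` expands to nothing when
# curl_headers is empty, so anonymous downloads get no -H flag at all.
build_cmd() {
  local curl_headers="$1"
  echo curl ${curl_headers:+-H "${curl_headers}"} -fL "https://example.com/f"
}
build_cmd ""
# → curl -fL https://example.com/f
build_cmd "Authorization: Bearer t"
# → curl -H Authorization: Bearer t -fL https://example.com/f
```

The inner quotes inside the expansion keep the whole header value as a single argument, which is why the scripts can get away with leaving the outer expansion unquoted.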