[Ops] Fix GCS bucket access for future buildkite agents #174756

Merged 24 commits into main from fix-gcs-access-in-elastic-buildkite on Feb 7, 2024

Changes from 17 commits

Commits (24)
430772d
chore: render gcloud credentials on pre-command
delanni Jan 9, 2024
8c6f64c
chore: activate privileged gcloud account with access to gcs buckets
delanni Jan 11, 2024
06d3171
chore: activate service account before storybook upload
delanni Jan 11, 2024
2b09a55
Merge branch 'main' into fix-gcs-access-in-elastic-buildkite
delanni Jan 25, 2024
1e09bd4
chore: introduce service account activation snippet, make use of it
delanni Jan 26, 2024
e40a31c
chore: fix vault_fns import
delanni Jan 26, 2024
af7818a
Merge branch 'main' into fix-gcs-access-in-elastic-buildkite
delanni Feb 1, 2024
bef73a9
chore: change service account script to work with impersonation
delanni Feb 2, 2024
8228842
chore: wire in service account impersonation before gsutil calls
delanni Feb 2, 2024
c672923
chore: fix vault_get usage
delanni Feb 2, 2024
9c0a021
chore: only activate gcloud account after secrets were read
delanni Feb 2, 2024
843488a
chore: add stack trace to failed vault_gets
delanni Feb 2, 2024
4040093
chore: don't activate sa-proxy account in pre-command, only before im…
delanni Feb 2, 2024
a8dacbb
chore: fix typo in account-switch call
delanni Feb 2, 2024
4ceb8a9
Merge branch 'main' into fix-gcs-access-in-elastic-buildkite
delanni Feb 5, 2024
8e3ec5f
chore: remove manifest upload from ES Serverless build - it's not use…
delanni Feb 5, 2024
b12ce66
chore: rename sa-proxy key's variable, set up bazel cache with a key …
delanni Feb 5, 2024
96caabe
chore: tidy up acitvate_service_account.sh
delanni Feb 5, 2024
d4e571b
chore: remove print_stack, it doesn't work as expected
delanni Feb 5, 2024
f1dbec2
chore: delete account activation for headless chrome builds, as it is…
delanni Feb 5, 2024
5a91a62
Merge branch 'main' into fix-gcs-access-in-elastic-buildkite
delanni Feb 5, 2024
46f7bed
chore: fix path to activating the correct service accounts
delanni Feb 6, 2024
9767b22
chore: import vault_fns always relatively
delanni Feb 6, 2024
c274d87
chore: log out, and de-impersonalize after commands
delanni Feb 6, 2024
80 changes: 80 additions & 0 deletions .buildkite/scripts/common/activate_service_account.sh
@@ -0,0 +1,80 @@
#!/usr/bin/env bash

set -euo pipefail

source .buildkite/scripts/common/vault_fns.sh

ARG0="${1:-}"

if [[ -z "$ARG0" ]]; then
  echo "Usage: $0 <bucket_name|email>"

Review comment (Contributor Author, delanni):

I set up this script to make it easy to configure impersonation before calling gcloud storage / gsutil commands, without requiring callers to know the e-mail addresses of the service accounts associated with each bucket.

Do you think it's okay to keep this bucket name -> service account mapping (from line 49) to make the script handier to use, with the e-mail form available as a default/fallback?

  exit 1
elif [[ "$ARG0" == "-" ]]; then
  echo "Unsetting impersonation"
  gcloud config unset auth/impersonate_service_account
  exit 0
fi

GCLOUD_USER=$(gcloud auth list --filter="status=ACTIVE" --format="value(account)")
GCLOUD_EMAIL_POSTFIX="elastic-kibana-ci.iam.gserviceaccount.com"
GCLOUD_SA_PROXY_EMAIL="kibana-ci-sa-proxy@$GCLOUD_EMAIL_POSTFIX"


if [[ "$GCLOUD_USER" != "$GCLOUD_SA_PROXY_EMAIL" ]]; then
  if [[ -x "$(command -v gcloud)" ]]; then
    AUTH_RESULT=$(gcloud auth activate-service-account --key-file="$KIBANA_SERVICE_ACCOUNT_PROXY_KEY" || echo "FAILURE")
    if [[ "$AUTH_RESULT" == "FAILURE" ]]; then
      echo "Failed to activate service account $GCLOUD_SA_PROXY_EMAIL."
      exit 1
    else
      echo "Activated service account $GCLOUD_SA_PROXY_EMAIL"
    fi
  else
    echo "gcloud is not installed, cannot activate service account $GCLOUD_SA_PROXY_EMAIL."
    exit 1
  fi
fi

# Check if the arg is a service account e-mail or a bucket name
EMAIL=""
if [[ "$ARG0" =~ ^[a-zA-Z0-9._%+-]+@[a-zA-Z0-9.-]+\.[a-zA-Z]{2,}$ ]]; then
  EMAIL="$ARG0"
elif [[ "$ARG0" =~ ^gs:// ]]; then
  BUCKET_NAME="${ARG0:5}"
else
  BUCKET_NAME="$ARG0"
fi

if [[ -z "$EMAIL" ]]; then
  case "$BUCKET_NAME" in
    "elastic-kibana-coverage-live")
      EMAIL="kibana-ci-access-coverage@$GCLOUD_EMAIL_POSTFIX"
      ;;
    "kibana-ci-es-snapshots-daily")
      EMAIL="kibana-ci-access-es-snapshots@$GCLOUD_EMAIL_POSTFIX"
      ;;
    "kibana-so-types-snapshots")
      EMAIL="kibana-ci-access-so-snapshots@$GCLOUD_EMAIL_POSTFIX"
      ;;
    "kibana-performance")
      EMAIL="kibana-ci-access-perf-stats@$GCLOUD_EMAIL_POSTFIX"
      ;;
    "ci-artifacts.kibana.dev")
      EMAIL="kibana-ci-access-artifacts@$GCLOUD_EMAIL_POSTFIX"
      ;;
    "headless_shell_staging")
      EMAIL="kibana-ci-access-chrome-builds@$GCLOUD_EMAIL_POSTFIX"
      ;;
    *)
      EMAIL="$BUCKET_NAME@$GCLOUD_EMAIL_POSTFIX"
      ;;
  esac
fi


echo "Impersonating $EMAIL"

# Activate the service account
gcloud config set auth/impersonate_service_account "$EMAIL"

echo "Activated service account $EMAIL"
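
A minimal usage sketch, as context for the review comment above: a hypothetical CI step activates the service account mapped to a bucket, runs gsutil, then unsets the impersonation. The step name and object paths are illustrative only and are not part of this PR.

#!/usr/bin/env bash
# Hypothetical CI step: upload an artifact to a CI bucket via impersonation.
set -euo pipefail

# A gs:// URL (or bare bucket name) is resolved to the mapped service account;
# an e-mail address can be passed directly instead.
.buildkite/scripts/common/activate_service_account.sh gs://kibana-performance

# The gsutil call relies on the impersonation configured above.
gsutil cp target/report.json "gs://kibana-performance/reports/$BUILDKITE_BUILD_ID.json"

# Passing "-" unsets the impersonation afterwards.
.buildkite/scripts/common/activate_service_account.sh -
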
8 changes: 5 additions & 3 deletions .buildkite/scripts/common/setup_bazel.sh
@@ -2,6 +2,8 @@

source .buildkite/scripts/common/util.sh

echo '--- Setting up bazel'

echo "[bazel] writing .bazelrc"
cat <<EOF > $KIBANA_DIR/.bazelrc
# Generated by .buildkite/scripts/common/setup_bazel.sh
@@ -27,16 +29,16 @@ if [[ "$BAZEL_CACHE_MODE" == "gcs" ]]; then

echo "[bazel] using GCS bucket: $BAZEL_BUCKET"

cat <<EOF >> $KIBANA_DIR/.bazelrc
cat <<EOF >> $KIBANA_DIR/.bazelrc
build --remote_cache=https://storage.googleapis.com/$BAZEL_BUCKET
build --google_default_credentials
build --google_credentials=$BAZEL_REMOTE_CACHE_CREDENTIALS_FILE
EOF
fi

if [[ "$BAZEL_CACHE_MODE" == "populate-local-gcs" ]]; then
echo "[bazel] enabling caching with GCS buckets for local dev"

cat <<EOF >> $KIBANA_DIR/.bazelrc
cat <<EOF >> $KIBANA_DIR/.bazelrc
build --remote_cache=https://storage.googleapis.com/kibana-local-bazel-remote-cache
build --google_credentials=$BAZEL_LOCAL_DEV_CACHE_CREDENTIALS_FILE
EOF
47 changes: 2 additions & 45 deletions .buildkite/scripts/common/util.sh
@@ -1,5 +1,7 @@
#!/usr/bin/env bash

source .buildkite/scripts/common/vault_fns.sh

is_pr() {
[[ "${GITHUB_PR_NUMBER-}" ]] && return
false
@@ -170,48 +172,3 @@ npm_install_global() {
download_artifact() {
retry 3 1 timeout 3m buildkite-agent artifact download "$@"
}

# TODO: remove after https://github.com/elastic/kibana-operations/issues/15 is done
if [[ "${VAULT_ADDR:-}" == *"secrets.elastic.co"* ]]; then
VAULT_PATH_PREFIX="secret/kibana-issues/dev"
VAULT_KV_PREFIX="secret/kibana-issues/dev"
IS_LEGACY_VAULT_ADDR=true
else
VAULT_PATH_PREFIX="secret/ci/elastic-kibana"
VAULT_KV_PREFIX="kv/ci-shared/kibana-deployments"
IS_LEGACY_VAULT_ADDR=false
fi
export IS_LEGACY_VAULT_ADDR

vault_get() {
key_path=$1
field=$2

fullPath="$VAULT_PATH_PREFIX/$key_path"

if [[ -z "${2:-}" || "${2:-}" =~ ^-.* ]]; then
retry 5 5 vault read "$fullPath" "${@:2}"
else
retry 5 5 vault read -field="$field" "$fullPath" "${@:3}"
fi
}

vault_set() {
key_path=$1
shift
fields=("$@")


fullPath="$VAULT_PATH_PREFIX/$key_path"

# shellcheck disable=SC2068
retry 5 5 vault write "$fullPath" ${fields[@]}
}

vault_kv_set() {
kv_path=$1
shift
fields=("$@")

vault kv put "$VAULT_KV_PREFIX/$kv_path" "${fields[@]}"
}
78 changes: 78 additions & 0 deletions .buildkite/scripts/common/vault_fns.sh
@@ -0,0 +1,78 @@
#!/bin/bash

# TODO: remove after https://github.com/elastic/kibana-operations/issues/15 is done
if [[ "${VAULT_ADDR:-}" == *"secrets.elastic.co"* ]]; then
  VAULT_PATH_PREFIX="secret/kibana-issues/dev"
  VAULT_KV_PREFIX="secret/kibana-issues/dev"
  IS_LEGACY_VAULT_ADDR=true
else
  VAULT_PATH_PREFIX="secret/ci/elastic-kibana"
  VAULT_KV_PREFIX="kv/ci-shared/kibana-deployments"
  IS_LEGACY_VAULT_ADDR=false
fi
export IS_LEGACY_VAULT_ADDR

print_stack() {
  local i=0
  local FRAMES=${#BASH_LINENO[@]}
  for ((i=1; i < FRAMES; i++)); do
    local frame=${BASH_LINENO[i]}
    local source=${BASH_SOURCE[i]}
    echo " at $source:$frame" >&2
  done
}

retry() {
  local retries=$1; shift
  local delay=$1; shift
  local attempts=1

  until "$@"; do
    retry_exit_status=$?
    echo "Exited with $retry_exit_status" >&2
    if (( retries == "0" )); then
      return $retry_exit_status
    elif (( attempts == retries )); then
      echo "Failed $attempts retries" >&2
      print_stack
      return $retry_exit_status
    else
      echo "Retrying $((retries - attempts)) more times..." >&2
      attempts=$((attempts + 1))
      sleep "$delay"
    fi
  done
}

vault_get() {
  key_path=${1:-}
  field=${2:-}

  fullPath="$VAULT_PATH_PREFIX/$key_path"

  if [[ -z "$field" || "$field" =~ ^-.* ]]; then
    retry 5 5 vault read "$fullPath" "${@:2}"
  else
    retry 5 5 vault read -field="$field" "$fullPath" "${@:3}"
  fi
}

vault_set() {
  key_path=$1
  shift
  fields=("$@")

  fullPath="$VAULT_PATH_PREFIX/$key_path"

  # shellcheck disable=SC2068
  retry 5 5 vault write "$fullPath" ${fields[@]}
}

vault_kv_set() {
  kv_path=$1
  shift
  fields=("$@")

  vault kv put "$VAULT_KV_PREFIX/$kv_path" "${fields[@]}"
}
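
For reference, a short sketch of how these helpers are consumed. The vault_get call mirrors the pre_command.sh change further down in this diff; the retry example is illustrative only.

#!/usr/bin/env bash
set -euo pipefail
source .buildkite/scripts/common/vault_fns.sh

# vault_get <key_path> <field> reads one field from "$VAULT_PATH_PREFIX/<key_path>",
# retrying the vault read up to 5 times with a 5-second delay.
KIBANA_SERVICE_ACCOUNT_PROXY_KEY="$(mktemp -d)/kibana-gcloud-service-account.json"
vault_get kibana-ci-sa-proxy-key key | base64 -d > "$KIBANA_SERVICE_ACCOUNT_PROXY_KEY"

# retry <retries> <delay> <command...> also wraps arbitrary commands.
retry 3 5 gsutil ls gs://kibana-performance > /dev/null
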
10 changes: 10 additions & 0 deletions .buildkite/scripts/lifecycle/pre_command.sh
@@ -167,6 +167,16 @@ BAZEL_LOCAL_DEV_CACHE_CREDENTIALS_FILE="$HOME/.kibana-ci-bazel-remote-cache-loca
export BAZEL_LOCAL_DEV_CACHE_CREDENTIALS_FILE
vault_get kibana-ci-bazel-remote-cache-local-dev service_account_json > "$BAZEL_LOCAL_DEV_CACHE_CREDENTIALS_FILE"

# Export key for accessing bazel remote cache's GCS bucket
BAZEL_REMOTE_CACHE_CREDENTIALS_FILE="$HOME/.kibana-ci-bazel-remote-cache-gcs.json"
export BAZEL_REMOTE_CACHE_CREDENTIALS_FILE
vault_get kibana-ci-bazel-remote-cache-sa-key key | base64 -d > "$BAZEL_REMOTE_CACHE_CREDENTIALS_FILE"

# Setup GCS Service Account Proxy for CI
KIBANA_SERVICE_ACCOUNT_PROXY_KEY="$(mktemp -d)/kibana-gcloud-service-account.json"
export KIBANA_SERVICE_ACCOUNT_PROXY_KEY
vault_get kibana-ci-sa-proxy-key key | base64 -d > "$KIBANA_SERVICE_ACCOUNT_PROXY_KEY"

PIPELINE_PRE_COMMAND=${PIPELINE_PRE_COMMAND:-".buildkite/scripts/lifecycle/pipelines/$BUILDKITE_PIPELINE_SLUG/pre_command.sh"}
if [[ -f "$PIPELINE_PRE_COMMAND" ]]; then
source "$PIPELINE_PRE_COMMAND"
7 changes: 4 additions & 3 deletions .buildkite/scripts/steps/archive_so_migration_snapshot.sh
@@ -3,15 +3,16 @@ set -euo pipefail

.buildkite/scripts/bootstrap.sh

SO_MIGRATIONS_SNAPSHOT_FOLDER=kibana-so-types-snapshots
SO_MIGRATIONS_SNAPSHOT_BUCKET="gs://kibana-so-types-snapshots"
SNAPSHOT_FILE_PATH="${1:-target/plugin_so_types_snapshot.json}"

echo "--- Creating snapshot of Saved Object migration info"
node scripts/snapshot_plugin_types snapshot --outputPath "$SNAPSHOT_FILE_PATH"

echo "--- Uploading as ${BUILDKITE_COMMIT}.json"
SNAPSHOT_PATH="${SO_MIGRATIONS_SNAPSHOT_FOLDER}/${BUILDKITE_COMMIT}.json"
gsutil cp "$SNAPSHOT_FILE_PATH" "gs://$SNAPSHOT_PATH"
SNAPSHOT_PATH="${SO_MIGRATIONS_SNAPSHOT_BUCKET}/${BUILDKITE_COMMIT}.json"
.buildkite/scripts/common/activate_service_account.sh "$SO_MIGRATIONS_SNAPSHOT_BUCKET"
gsutil cp "$SNAPSHOT_FILE_PATH" "$SNAPSHOT_PATH"

buildkite-agent annotate --context so_migration_snapshot --style success \
'Saved Object type snapshot is available at <a href="https://storage.cloud.google.com/'"$SNAPSHOT_PATH"'">'"$SNAPSHOT_PATH"'</a>'
(additional file; name not shown)
@@ -6,6 +6,7 @@ set -euo pipefail
gsutil -m cp -r gs://elastic-bekitzur-kibana-coverage-live/previous_pointer/previous.txt . || echo "### Previous Pointer NOT FOUND?"

# TODO: Activate after the above is removed
#.buildkite/scripts/common/activate_service_account.sh gs://elastic-kibana-coverage-live
#gsutil -m cp -r gs://elastic-kibana-coverage-live/previous_pointer/previous.txt . || echo "### Previous Pointer NOT FOUND?"

if [ -e ./previous.txt ]; then
(additional file; name not shown)
@@ -12,4 +12,5 @@ collectPrevious
# TODO: Safe to remove this after 2024-03-01 (https://github.com/elastic/kibana/issues/175904)
gsutil cp previous.txt gs://elastic-bekitzur-kibana-coverage-live/previous_pointer/

.buildkite/scripts/common/activate_service_account.sh gs://elastic-kibana-coverage-live
gsutil cp previous.txt gs://elastic-kibana-coverage-live/previous_pointer/
(additional file; name not shown)
@@ -27,5 +27,6 @@ uploadRest() {

echo "--- Uploading static site"

.buildkite/scripts/common/activate_service_account.sh gs://elastic-kibana-coverage-live
uploadBase
uploadRest
(additional file; name not shown)
@@ -7,9 +7,6 @@ source .buildkite/scripts/common/util.sh
BASE_ES_SERVERLESS_REPO=docker.elastic.co/elasticsearch-ci/elasticsearch-serverless
TARGET_IMAGE=docker.elastic.co/kibana-ci/elasticsearch-serverless:latest-verified

ES_SERVERLESS_BUCKET=kibana-ci-es-serverless-images
MANIFEST_FILE_NAME=latest-verified.json

SOURCE_IMAGE_OR_TAG=$1
if [[ $SOURCE_IMAGE_OR_TAG =~ :[a-zA-Z_-]+$ ]]; then
# $SOURCE_IMAGE_OR_TAG was a full image
@@ -67,42 +64,11 @@ docker logout docker.elastic.co
echo "Image push to $TARGET_IMAGE successful."
echo "Promotion successful! Henceforth, thou shall be named Sir $TARGET_IMAGE"

MANIFEST_UPLOAD_PATH="Skipped"
if [[ "${PUBLISH_MANIFEST:-}" =~ ^(1|true)$ && "$SOURCE_IMAGE_OR_TAG" =~ ^git-[0-9a-fA-F]{12}$ ]]; then
echo "--- Uploading latest-verified manifest to GCS"
cat << EOT >> $MANIFEST_FILE_NAME
{
"build_url": "$BUILDKITE_BUILD_URL",
"kibana_commit": "$BUILDKITE_COMMIT",
"kibana_branch": "$BUILDKITE_BRANCH",
"elasticsearch_serverless_tag": "$SOURCE_IMAGE_OR_TAG",
"elasticsearch_serverless_image_url": "$SOURCE_IMAGE",
"elasticsearch_serverless_commit": "TODO: this currently can't be decided",
"elasticsearch_commit": "$ELASTIC_COMMIT_HASH",
"created_at": "`date`",
"timestamp": "`FORCE_COLOR=0 node -p 'Date.now()'`"
}
EOT

gsutil -h "Cache-Control:no-cache, max-age=0, no-transform" \
cp $MANIFEST_FILE_NAME "gs://$ES_SERVERLESS_BUCKET/$MANIFEST_FILE_NAME"
gsutil acl ch -u AllUsers:R "gs://$ES_SERVERLESS_BUCKET/$MANIFEST_FILE_NAME"
MANIFEST_UPLOAD_PATH="<a href=\"https://storage.googleapis.com/$ES_SERVERLESS_BUCKET/$MANIFEST_FILE_NAME\">$MANIFEST_FILE_NAME</a>"

elif [[ "${PUBLISH_MANIFEST:-}" =~ ^(1|true)$ ]]; then
echo "--- Skipping upload of latest-verified manifest to GCS, ES Serverless build tag is not pointing to a hash"
elif [[ "$SOURCE_IMAGE_OR_TAG" =~ ^git-[0-9a-fA-F]{12}$ ]]; then
echo "--- Skipping upload of latest-verified manifest to GCS, flag was not provided"
else
echo "--- Skipping upload of latest-verified manifest to GCS, no flag and hash provided"
fi

echo "--- Annotating build with info"
cat << EOT | buildkite-agent annotate --style "success"
<h2>Promotion successful!</h2>
<br/>New image: $TARGET_IMAGE
<br/>Source image: $SOURCE_IMAGE
<br/>Kibana commit: <a href="https://github.com/elastic/kibana/commit/$BUILDKITE_COMMIT">$BUILDKITE_COMMIT</a>
<br/>Elasticsearch commit: <a href="https://github.com/elastic/elasticsearch/commit/$ELASTIC_COMMIT_HASH">$ELASTIC_COMMIT_HASH</a>
<br/>Manifest file: $MANIFEST_UPLOAD_PATH
EOT
1 change: 1 addition & 0 deletions .buildkite/scripts/steps/es_snapshots/create_manifest.ts
@@ -103,6 +103,7 @@ interface ManifestEntry {
set -euo pipefail

echo '--- Upload files to GCS'
.buildkite/scripts/common/activate_service_account.sh ${BASE_BUCKET_DAILY}
cd "${destination}"
gsutil -m cp -r *.* gs://${BASE_BUCKET_DAILY}/${DESTINATION}
cp manifest.json manifest-latest.json
(additional file; name not shown)
@@ -38,6 +38,7 @@ import { BASE_BUCKET_DAILY, BASE_BUCKET_PERMANENT } from './bucket_config';
execSync(
`
set -euo pipefail
.buildkite/scripts/common/activate_service_account.sh ${bucket}
cp manifest.json manifest-latest-verified.json
gsutil -h "Cache-Control:no-cache, max-age=0, no-transform" cp manifest-latest-verified.json gs://${BASE_BUCKET_DAILY}/${version}/
rm manifest.json
(additional file; name not shown)
@@ -41,6 +41,9 @@ download_artifact kibana-default-plugins.tar.gz "${OUTPUT_DIR}/" --build "${KIBA
echo "--- Adding commit info"
echo "${BUILDKITE_COMMIT}" > "${OUTPUT_DIR}/KIBANA_COMMIT_HASH"

echo "--- Activating service-account for gsutil to access gs://kibana-performance"
.buildkite/scripts/common/activate_service_account.sh gs://kibana-performance

echo "--- Uploading ${OUTPUT_REL} dir to ${GCS_BUCKET}"
cd "${OUTPUT_DIR}/.."
gsutil -m cp -r "${BUILD_ID}" "${GCS_BUCKET}"
3 changes: 3 additions & 0 deletions .buildkite/scripts/steps/scalability/benchmarking.sh
@@ -19,6 +19,9 @@ rm -rf "${KIBANA_LOAD_TESTING_DIR}"
rm -rf "${GCS_ARTIFACTS_DIR}"

download_artifacts() {
echo Activating service-account for gsutil to access gs://kibana-performance
.buildkite/scripts/common/activate_service_account.sh gs://kibana-performance

mkdir -p "${GCS_ARTIFACTS_DIR}"

gsutil cp "$GCS_BUCKET/latest" "${GCS_ARTIFACTS_DIR}/"