feat: Add vendor label to deployed third-party resources #9

Merged · 8 commits · Feb 2, 2024
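The `stackable.tech/vendor: Stackable` label added throughout this PR makes every deployed third-party resource selectable with one label selector. A minimal sketch, assuming a cluster where these demos/stacks are installed (the `kubectl` invocation is shown as a comment since it needs a live cluster):

```shell
# Sketch only: against a running cluster, all labeled third-party resources
# could be listed with something like:
#   kubectl get pods,services,configmaps -A -l stackable.tech/vendor=Stackable
# The selector itself is plain key=value syntax:
SELECTOR="stackable.tech/vendor=Stackable"
echo "$SELECTOR"
```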
@@ -39,6 +39,8 @@ data:
kind: SparkApplication
metadata:
name: spark-ingest-into-lakehouse
labels:
stackable.tech/vendor: Stackable
spec:
version: "1.0"
sparkImage:
@@ -61,7 +63,7 @@ data:
# Every merge into statement creates 8 files.
# A parallelism of 8 is enough for the demo; it might need to be increased (or omitted entirely) when merging larger data volumes
spark.sql.shuffle.partitions: "8"

# As of 2023-10-31 the operator does not set this
spark.executor.cores: "4"
volumes:
@@ -70,7 +70,6 @@ data:
# with open("superset-assets.zip", "wb") as f:
# f.write(result.content)


#########################
# IMPORTANT
#########################
@@ -14,9 +14,9 @@ spec:
- name: ingestion-job-spec
mountPath: /tmp/ingestion-job-spec
volumes:
- name: ingestion-job-spec
configMap:
name: create-druid-ingestion-job-spec
- name: ingestion-job-spec
configMap:
name: create-druid-ingestion-job-spec
restartPolicy: OnFailure
backoffLimit: 50
---
1 change: 0 additions & 1 deletion demos/signal-processing/serviceaccount.yaml
@@ -45,4 +45,3 @@ rules:
- pods/exec
verbs:
- create

@@ -96,7 +96,7 @@ kind: ConfigMap
metadata:
name: cm-spark
data:
spark-ad.py: |
spark-ad.py: |-
from pyspark.sql import SparkSession
from pyspark.sql.functions import dayofweek, to_date, to_timestamp, date_format, year, hour, minute, month, when, dayofmonth, dayofweek
from pyspark.sql.functions import concat_ws, substring, concat, lpad, lit
@@ -191,4 +191,4 @@ data:
)

# write via iceberg
df_out.writeTo("prediction.ad.iforest").append()
df_out.writeTo("prediction.ad.iforest").append()
4 changes: 4 additions & 0 deletions stacks/_templates/jupyterhub.yaml
@@ -14,6 +14,8 @@ options:
password: {{ jupyterHubAdminPassword }}
JupyterHub:
authenticator_class: dummy
labels:
stackable.tech/vendor: Stackable
proxy:
service:
type: NodePort
@@ -41,6 +43,8 @@ options:
serviceAccountName: spark
networkPolicy:
enabled: false
extraLabels:
stackable.tech/vendor: Stackable
extraEnv:
HADOOP_CONF_DIR: "/home/jovyan/hdfs"
initContainers:
4 changes: 4 additions & 0 deletions stacks/_templates/minio-distributed-small.yaml
@@ -5,6 +5,10 @@ repo:
url: https://charts.min.io/
version: 5.0.14
options:
additionalLabels:
stackable.tech/vendor: Stackable
podLabels:
stackable.tech/vendor: Stackable
rootUser: admin
rootPassword: {{ minioAdminPassword }}
mode: distributed
4 changes: 4 additions & 0 deletions stacks/_templates/minio-distributed.yaml
@@ -5,6 +5,10 @@ repo:
url: https://charts.min.io/
version: 5.0.14
options:
additionalLabels:
stackable.tech/vendor: Stackable
podLabels:
stackable.tech/vendor: Stackable
rootUser: admin
rootPassword: {{ minioAdminPassword }}
mode: distributed
4 changes: 4 additions & 0 deletions stacks/_templates/minio.yaml
@@ -5,6 +5,10 @@ repo:
url: https://charts.min.io/
version: 5.0.14
options:
additionalLabels:
stackable.tech/vendor: Stackable
podLabels:
stackable.tech/vendor: Stackable
rootUser: admin
rootPassword: {{ minioAdminPassword }}
mode: standalone
4 changes: 4 additions & 0 deletions stacks/_templates/opensearch-dashboards.yaml
@@ -5,13 +5,17 @@ repo:
url: https://opensearch-project.github.io/helm-charts
version: 2.14.0
options:
labels:
stackable.tech/vendor: Stackable
service:
type: NodePort
port: 5601
annotations:
stackable.tech/logging-view-logs: |-
/app/discover?security_tenant=global#/view/logs
stackable.tech/logging-credentials-secret: opensearch-user
labels:
stackable.tech/vendor: Stackable
opensearchAccount:
secret: opensearch-dashboard-user
extraEnvs:
2 changes: 2 additions & 0 deletions stacks/_templates/opensearch.yaml
@@ -5,6 +5,8 @@ repo:
url: https://opensearch-project.github.io/helm-charts
version: 2.16.1
options:
labels:
stackable.tech/vendor: Stackable
config:
opensearch.yml: |
plugins:
2 changes: 2 additions & 0 deletions stacks/_templates/postgresql-airflow.yaml
@@ -5,6 +5,8 @@ repo:
url: https://charts.bitnami.com/bitnami/
version: 13.2.18
options:
commonLabels:
stackable.tech/vendor: Stackable
auth:
username: airflow
password: airflow
2 changes: 2 additions & 0 deletions stacks/_templates/postgresql-druid.yaml
@@ -5,6 +5,8 @@ repo:
url: https://charts.bitnami.com/bitnami/
version: 13.2.18
options:
commonLabels:
stackable.tech/vendor: Stackable
auth:
username: druid
password: druid
2 changes: 2 additions & 0 deletions stacks/_templates/postgresql-hive-iceberg.yaml
@@ -5,6 +5,8 @@ repo:
url: https://charts.bitnami.com/bitnami/
version: 13.2.18
options:
commonLabels:
stackable.tech/vendor: Stackable
auth:
username: hive
password: hive
2 changes: 2 additions & 0 deletions stacks/_templates/postgresql-hive.yaml
@@ -5,6 +5,8 @@ repo:
url: https://charts.bitnami.com/bitnami/
version: 13.2.18
options:
commonLabels:
stackable.tech/vendor: Stackable
auth:
username: hive
password: hive
2 changes: 2 additions & 0 deletions stacks/_templates/postgresql-hivehdfs.yaml
@@ -5,6 +5,8 @@ repo:
url: https://charts.bitnami.com/bitnami/
version: 13.2.18
options:
commonLabels:
stackable.tech/vendor: Stackable
auth:
username: hive
password: hive
2 changes: 2 additions & 0 deletions stacks/_templates/postgresql-hives3.yaml
@@ -5,6 +5,8 @@ repo:
url: https://charts.bitnami.com/bitnami/
version: 13.2.18
options:
commonLabels:
stackable.tech/vendor: Stackable
auth:
username: hive
password: hive
2 changes: 2 additions & 0 deletions stacks/_templates/postgresql-superset.yaml
@@ -5,6 +5,8 @@ repo:
url: https://charts.bitnami.com/bitnami/
version: 13.2.18
options:
commonLabels:
stackable.tech/vendor: Stackable
auth:
username: superset
password: superset
2 changes: 2 additions & 0 deletions stacks/_templates/postgresql-timescaledb.yaml
@@ -5,6 +5,8 @@ repo:
url: https://charts.timescale.com/
version: 0.33.1
options:
podLabels:
stackable.tech/vendor: Stackable
replicaCount: 1
secrets:
credentials:
2 changes: 2 additions & 0 deletions stacks/_templates/prometheus-service-monitor.yaml
@@ -3,6 +3,7 @@ kind: ServiceMonitor
metadata:
name: scrape-label
labels:
stackable.tech/vendor: Stackable
release: prometheus
spec:
endpoints:
@@ -19,6 +20,7 @@ kind: ServiceMonitor
metadata:
name: scrape-minio
labels:
stackable.tech/vendor: Stackable
release: prometheus
spec:
endpoints:
2 changes: 2 additions & 0 deletions stacks/_templates/prometheus.yaml
@@ -5,6 +5,8 @@ repo:
url: https://prometheus-community.github.io/helm-charts
version: 54.2.2
options:
commonMetaLabels:
stackable.tech/vendor: Stackable
prometheus:
prometheusSpec:
storageSpec:
2 changes: 2 additions & 0 deletions stacks/_templates/redis-airflow.yaml
@@ -5,6 +5,8 @@ repo:
url: https://charts.bitnami.com/bitnami/
version: 18.1.6
options:
commonLabels:
stackable.tech/vendor: Stackable
auth:
password: airflow
replica:
4 changes: 4 additions & 0 deletions stacks/_templates/vector-aggregator.yaml
@@ -5,6 +5,10 @@ repo:
url: https://helm.vector.dev
version: 0.26.0
options:
commonLabels:
stackable.tech/vendor: Stackable
podLabels: # Doesn't seem to work?
stackable.tech/vendor: Stackable
role: Aggregator
customConfig:
sources:
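The charts above each expose a different knob for attaching labels (`labels`, `commonLabels`, `podLabels`, `extraLabels`, `additionalLabels`), because each chart only applies the values its own templates reference. That is one plausible reading of the "Doesn't seem to work?" comment on `podLabels` here: if a template never merges that map into the pod spec, the label silently never appears. A minimal sketch of the merge behavior (hypothetical helper, not the Vector chart's actual template logic):

```python
def render_pod_labels(chart_defaults, common_labels, pod_labels,
                      template_uses_pod_labels=True):
    """Merge label maps the way a chart template typically would.

    `template_uses_pod_labels` models whether the chart's pod template
    actually references .Values.podLabels; if not, the values are ignored.
    """
    labels = {**chart_defaults, **common_labels}
    if template_uses_pod_labels:
        labels.update(pod_labels)
    return labels

vendor = {"stackable.tech/vendor": "Stackable"}
# A template that honors podLabels keeps the vendor label:
print(render_pod_labels({"app": "vector"}, {}, vendor))
# One that ignores it silently drops the label:
print(render_pod_labels({"app": "vector"}, {}, vendor,
                        template_uses_pod_labels=False))
```

Checking the rendered pods with a label selector (rather than trusting the values file) is the reliable way to confirm which knob a given chart actually honors.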
2 changes: 1 addition & 1 deletion stacks/jupyterhub-pyspark-hdfs/hdfs.yaml
@@ -44,4 +44,4 @@ spec:
capacity: 5Gi
roleGroups:
default:
replicas: 1
replicas: 1
2 changes: 1 addition & 1 deletion stacks/jupyterhub-pyspark-hdfs/spark_driver_service.yaml
@@ -21,4 +21,4 @@ spec:
port: 4040
protocol: TCP
targetPort: 4040
type: ClusterIP
type: ClusterIP
4 changes: 2 additions & 2 deletions stacks/keycloak-opa-poc/druid.yaml
@@ -71,7 +71,7 @@ spec:
# it seems like the Druid processes cannot handle the OIDC authentication flow.
druid.auth.authenticator.MyBasicMetadataAuthenticator.type: basic
druid.auth.authenticator.MyBasicMetadataAuthenticator.initialInternalClientPassword: '${env:DRUID_SYSTEM_USER_PASSWORD}' # Default password for internal 'druid_system' user
druid.auth.authenticator.MyBasicMetadataAuthenticator.skipOnFailure: "true" # for any non system user, skip to the pac4j authenticator
druid.auth.authenticator.MyBasicMetadataAuthenticator.skipOnFailure: "true" # for any non system user, skip to the pac4j authenticator
druid.auth.authenticator.MyBasicMetadataAuthenticator.authorizerName: OpaAuthorizer

# pac4j authenticator
@@ -84,7 +84,7 @@
druid.auth.pac4j.oidc.clientSecret: '{"type":"environment","variable":"DRUID_CLIENT_SECRET"}'
druid.auth.pac4j.oidc.discoveryURI: '${env:KEYCLOAK_DISCOVERY_URL}'
# druid.auth.pac4j.oidc.oidcClaim: preferred_username # setting doesn't work, but should?

druid.auth.authenticatorChain: '["MyBasicMetadataAuthenticator","pac4j"]'

druid.escalator.type: basic
2 changes: 1 addition & 1 deletion stacks/keycloak-opa-poc/hdfs.yaml
@@ -20,4 +20,4 @@ spec:
journalNodes:
roleGroups:
default:
replicas: 1
replicas: 1
2 changes: 1 addition & 1 deletion stacks/keycloak-opa-poc/opa.yaml
@@ -8,4 +8,4 @@ spec:
productVersion: 0.57.0
servers:
roleGroups:
default: {}
default: {}
2 changes: 1 addition & 1 deletion stacks/keycloak-opa-poc/zookeeper.yaml
@@ -25,4 +25,4 @@ metadata:
name: hdfs-znode
spec:
clusterRef:
name: zk
name: zk
2 changes: 2 additions & 0 deletions stacks/signal-processing/grafana.yaml
@@ -5,6 +5,8 @@ repo:
url: https://grafana.github.io/helm-charts
version: 6.56.1
options:
extraLabels:
stackable.tech/vendor: Stackable
grafana.ini:
analytics:
check_for_updates: false
4 changes: 4 additions & 0 deletions stacks/signal-processing/jupyterhub.yaml
@@ -14,6 +14,8 @@ options:
password: {{ jupyterHubAdminPassword }}
JupyterHub:
authenticator_class: dummy
labels:
stackable.tech/vendor: Stackable
proxy:
service:
type: NodePort
@@ -32,6 +34,8 @@ options:
serviceAccountName: spark
networkPolicy:
enabled: false
extraLabels:
stackable.tech/vendor: Stackable
initContainers:
- name: download-notebook
image: docker.stackable.tech/stackable/tools:1.0.0-stackable23.11.0
2 changes: 1 addition & 1 deletion stacks/signal-processing/secrets.yaml
@@ -5,4 +5,4 @@ metadata:
name: timescale-admin-credentials
stringData:
username: admin
password: {{ postgresAdminPassword }}
password: {{ postgresAdminPassword }}