
[bitnami/airflow] The Airflow setup-db-job is never created by ArgoCD #32804

@sondn98

Description


Name and Version

bitnami/airflow 22.7.2

What architecture are you using?

arm64

What steps will reproduce the bug?

I'm deploying Apache Airflow with the Bitnami Helm chart via ArgoCD. All Airflow resources are deployed successfully except for the setup-db-job, which is responsible for initializing the Airflow database: the Job is never created at all.

After investigating, I found that the default annotations helm.sh/hook and helm.sh/hook-delete-policy prevent the job from being created due to ArgoCD's hook management mechanism. Unfortunately, there is no option to remove these annotations via the values.yaml file.
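For reference, the hard-coded hook annotations on the Job template look roughly like this (a sketch; the exact hook values may differ between chart versions, so check the rendered manifests):

```yaml
apiVersion: batch/v1
kind: Job
metadata:
  name: airflow-setup-db
  annotations:
    # These Helm hook annotations are what ArgoCD's hook handling reacts to;
    # they cannot currently be removed through values.yaml.
    helm.sh/hook: post-install,post-upgrade
    helm.sh/hook-delete-policy: before-hook-creation,hook-succeeded
```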

Would it be a valid change request to allow removing these annotations in the setup-db-job template? The current defaults could remain in values.yaml, along with a note explaining the behavior, so that users can customize or disable the annotations when needed. This would improve compatibility with ArgoCD and other deployment workflows.
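Until the chart exposes this, one possible workaround (untested; the Job name in the patch target is an assumption, and ArgoCD must have Kustomize Helm support enabled) is to render the chart through Kustomize and strip the hook annotations with a JSON patch:

```yaml
# kustomization.yaml - render the chart, then remove the Helm hook
# annotations from the setup-db Job so ArgoCD treats it as a plain resource.
helmCharts:
  - name: airflow
    repo: https://charts.bitnami.com/bitnami
    version: 22.7.2
    releaseName: airflow
    valuesFile: values.yaml
patches:
  - target:
      kind: Job
      name: airflow-setup-db-job   # assumed name; verify against rendered output
    patch: |-
      # "~1" escapes the "/" inside the annotation keys (RFC 6901)
      - op: remove
        path: /metadata/annotations/helm.sh~1hook
      - op: remove
        path: /metadata/annotations/helm.sh~1hook-delete-policy
```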

Are you using any custom parameters or values?

airflow:
  auth:
    username: airflow
    existingSecret: airflow-auth-secret

  executor: CeleryExecutor
  loadExamples: true

  image:
    registry: docker.io
    repository: bitnami/airflow
    tag: 2.10.5-debian-12-r7

  dags:
    enabled: false

  plugins:
    enabled: false

  defaultInitContainers:
    prepareConfig:
      resourcesPreset: "none"
      resources: {}

    waitForDBMigrations:
      resourcesPreset: "none"
      resources: {}

    loadDAGsPlugins:
      resourcesPreset: "none"
      resources: {}

  web:
    baseUrl: airflow.fabricator.work:443
    containerPorts:
      http: 8080
    replicaCount: 1
    resourcesPreset: "none"
    resources: {}

  scheduler:
    replicaCount: 1
    resourcesPreset: "none"
    resources: {}

  dagProcessor:
    enabled: false

  triggerer:
    enabled: false

  worker:
    containerPorts:
      http: 8793
    replicaCount: 1
    resourcesPreset: "none"
    resources: {}

  setupDBJob:
    enabled: true
    backoffLimit: 10
    resourcesPreset: "none"
    resources: {}

  service:
    type: ClusterIP
    ports:
      http: 8080

  ingress:
    enabled: false

  serviceAccount:
    create: true
  rbac:
    create: false

  metrics:
    enabled: true
    image:
      registry: docker.io
      repository: bitnami/statsd-exporter
      tag: 0.28.0-debian-12-r11
    existingConfigmap: "airflow-statsd-mapping-config"
    containerPorts:
      ingest: 9125
      metrics: 9102
    resourcesPreset: "none"
    resources: {}
    service:
      ports:
        ingest: 9125
        metrics: 9102
    serviceMonitor:
      enabled: true
      namespace: "prometheus"
      interval: "30s"
      scrapeTimeout: "30s"
      honorLabels: true

  postgresql:
    enabled: false

  externalDatabase:
    host: postgresql.postgresql.svc.cluster.local
    port: 5432
    user: airflow
    database: airflow
    existingSecret: airflow-pg-secret
    existingSecretPasswordKey: airflow-pg-password

  redis:
    enabled: true
    auth:
      enabled: true
      existingSecret: "airflow-redis-secret"
    architecture: standalone
    master:
      service:
        ports:
          redis: 6379
      resourcesPreset: "none"
      resources: {}

What is the expected behavior?

No response

What do you see instead?

The setup-db-job is never created by ArgoCD.

Additional information

No response

Labels

airflow, solved, stale (15 days without activity), tech-issues (the user has a technical issue about an application)