This issue was moved to a discussion.

You can continue the conversation there. Go to discussion →


Ooops! error. DAG did not load #40917

Closed
1 of 2 tasks
AadarshBhalerao opened this issue Jul 21, 2024 · 2 comments
Labels
area:core · area:webserver (Webserver related issues) · kind:bug · needs-triage

Comments

@AadarshBhalerao

Apache Airflow version

Other Airflow 2 version (please specify below)

If "Other Airflow 2 version" selected, which one?

2.9.2

What happened?

I am using Airflow with Docker. My files are shown below.

docker-compose.yml

# Licensed to the Apache Software Foundation (ASF) under one
# or more contributor license agreements.  See the NOTICE file
# distributed with this work for additional information
# regarding copyright ownership.  The ASF licenses this file
# to you under the Apache License, Version 2.0 (the
# "License"); you may not use this file except in compliance
# with the License.  You may obtain a copy of the License at
#
#   http://www.apache.org/licenses/LICENSE-2.0
#

---
version: '3'
x-airflow-common:
  &airflow-common
  image: apache/airflow:2.9.3
  environment:
    &airflow-common-env
    AIRFLOW__CORE__EXECUTOR: LocalExecutor
    AIRFLOW__DATABASE__SQL_ALCHEMY_CONN: postgresql+psycopg2://airflow:airflow@postgres/airflow
    AIRFLOW__CORE__FERNET_KEY: FB0o_zt4e3Ziq3LdUUO7F2Z95cvFFx16hU8jTeR1ASM=
    AIRFLOW__CORE__LOAD_EXAMPLES: 'false'
    AIRFLOW__API__AUTH_BACKENDS: 'airflow.api.auth.backend.basic_auth'
    AIRFLOW__SCHEDULER__ENABLE_HEALTH_CHECK: 'true'
  volumes:
    - ${AIRFLOW_PROJ_DIR:-.}/dags:/opt/airflow/dags
    - ${AIRFLOW_PROJ_DIR:-.}/logs:/opt/airflow/logs
    - ${AIRFLOW_PROJ_DIR:-.}/config:/opt/airflow/config
    - ${AIRFLOW_PROJ_DIR:-.}/plugins:/opt/airflow/plugins
    - ./requirements.txt:/requirements.txt
  # user: "${AIRFLOW_UID:-50000}:0"
  depends_on:
    &airflow-common-depends-on
    postgres:
      condition: service_healthy

services:
  postgres:
    image: postgres:13
    environment:
      POSTGRES_USER: airflow
      POSTGRES_PASSWORD: airflow
      POSTGRES_DB: airflow
    volumes:
      - postgres-db-volume:/var/lib/postgresql/data
    healthcheck:
      test: ["CMD", "pg_isready", "-U", "airflow"]
      interval: 10s
      retries: 5
      start_period: 5s
    restart: always

  airflow-webserver:
    <<: *airflow-common
    command: bash -c "pip install -r /requirements.txt && airflow webserver"
    ports:
      - "8080:8080"
    healthcheck:
      test: ["CMD", "curl", "--fail", "http://localhost:8080/health"]
      interval: 30s
      timeout: 10s
      retries: 5
      start_period: 30s
    restart: always
    depends_on:
      <<: *airflow-common-depends-on
      airflow-init:
        condition: service_completed_successfully

  airflow-scheduler:
    <<: *airflow-common
    command: scheduler
    healthcheck:
      test: ["CMD", "curl", "--fail", "http://localhost:8974/health"]
      interval: 30s
      timeout: 10s
      retries: 5
      start_period: 30s
    restart: always
    depends_on:
      <<: *airflow-common-depends-on
      airflow-init:
        condition: service_completed_successfully

  airflow-init:
    <<: *airflow-common
    entrypoint: /bin/bash
    # yamllint disable rule:line-length
    command:
      - -c
      - airflow db init &&
        airflow users create
          --role Admin
          --username airflow
          --password airflow
          --email airflow@airflow.com
          --firstname airflow
          --lastname airflow 
    # yamllint enable rule:line-length
    restart: on-failure

volumes:
  postgres-db-volume:

My dag.py:

import logging
from airflow import DAG
from airflow.operators.python import PythonOperator
from datetime import datetime

def get_input(ti, **kwargs):
  logging.info("Fetching inputs from config")
  conf = kwargs['dag_run'].conf
  try:
      value_1 = float(conf.get('value_1'))
      value_2 = float(conf.get('value_2'))
      power_n = int(conf.get('power_n'))
  except (TypeError, ValueError) as e:
      logging.error(f"Invalid input values: {e}")
      raise

  ti.xcom_push(key="value_1", value=value_1)
  ti.xcom_push(key="value_2", value=value_2)
  ti.xcom_push(key="power_n", value=power_n)
  logging.info(f"Received values: {value_1}, {value_2}, and power {power_n}")

def calculate_sum(ti, **kwargs):
  logging.info("Calculating sum")
  value_1 = ti.xcom_pull(key="value_1", task_ids="get_input")
  value_2 = ti.xcom_pull(key="value_2", task_ids="get_input")
  result = value_1 + value_2
  ti.xcom_push(key='sum', value=result)
  logging.info(f'Sum of {value_1} and {value_2} is {result}')

def calculate_subtraction(ti, **kwargs):
  logging.info("Calculating subtraction")
  value_1 = ti.xcom_pull(key="value_1", task_ids="get_input")
  value_2 = ti.xcom_pull(key="value_2", task_ids="get_input")
  result = value_1 - value_2
  ti.xcom_push(key="subtract", value=result)
  logging.info(f'{value_1} - {value_2} is {result}')

def calculate_multiplication_and_power(ti, **kwargs):
  logging.info("Calculating multiplication and power")
  sum_value = ti.xcom_pull(key="sum", task_ids="calculate_sum")
  subtract_value = ti.xcom_pull(key="subtract", task_ids="calculate_subtraction")
  power_n = ti.xcom_pull(key="power_n", task_ids="get_input")
  multiplication_result = sum_value * subtract_value
  power_result = multiplication_result ** power_n
  ti.xcom_push(key='power_result', value=power_result)
  logging.info(f'({sum_value} * {subtract_value}) ^ {power_n} is {power_result}')

default_args = {
  'start_date': datetime(2024, 1, 1),
}

with DAG(
  dag_id='parallel_pipeline_dag_with_power',
  default_args=default_args,
  schedule_interval=None
) as dag:

  task1 = PythonOperator(
      task_id='get_input',
      python_callable=get_input,
  )

  task2 = PythonOperator(
      task_id='calculate_sum',
      python_callable=calculate_sum,
  )

  task3 = PythonOperator(
      task_id='calculate_subtraction',
      python_callable=calculate_subtraction,
  )

  task4 = PythonOperator(
      task_id='calculate_multiplication_and_power',
      python_callable=calculate_multiplication_and_power,
  )

  task1 >> [task2, task3] >> task4
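For reference, the arithmetic these callables implement can be sanity-checked outside Airflow. A minimal plain-Python sketch of the same pipeline logic (the `pipeline` function name is hypothetical; no Airflow dependencies):

```python
def pipeline(value_1: float, value_2: float, power_n: int) -> float:
    # Mirrors calculate_sum, calculate_subtraction, and
    # calculate_multiplication_and_power: ((a + b) * (a - b)) ** n
    total = value_1 + value_2
    difference = value_1 - value_2
    return (total * difference) ** power_n

print(pipeline(3.0, 1.0, 2))  # (4.0 * 2.0) ** 2 = 64.0
```

Triggering the real DAG with a matching conf from the CLI would look like `airflow dags trigger parallel_pipeline_dag_with_power --conf '{"value_1": 3, "value_2": 1, "power_n": 2}'`.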

Once done, I ran `docker-compose up -d`.

Every container started properly. When I opened the webserver on localhost, I was able to log in with the credentials set in the .yml file. My DAG is also present on the listing page, but when I click on it I get an error.

On the UI:

Ooops!
Something bad has happened. For security reasons detailed information about the error is not logged.

  * You should check your webserver logs and retrieve details of this error.

  * When you get the logs, it might explain the reasons, you should also Look for similar issues using:

     * [GitHub Discussions](https://github.com/apache/airflow/discussions)
     * [GitHub Issues](https://github.com/apache/airflow/issues)
     * [Stack Overflow](https://stackoverflow.com/questions/tagged/airflow)
     * the usual search engine you use on a daily basis

    All those resources might help you to find a solution to your problem.

  * if you run Airflow on a Managed Service, consider opening an issue using the service support channels

  * only after you tried it all, and have difficulty with diagnosing and fixing the problem yourself,
    get the logs with errors, describe results of your investigation so far, and consider creating a
    [bug report](https://github.com/apache/airflow/issues/new/choose) including this information.

Python version: redacted
Airflow version: redacted
Node: redacted
-------------------------------------------------------------------------------
Error! Please contact server admin.

And in the airflow-webserver-1 container logs:

2024-07-21 23:56:52 [2024-07-21T18:26:52.251+0000] {app.py:1744} ERROR - Exception on /dags/parallel_pipeline_dag_with_power/grid [GET]
2024-07-21 23:56:52 Traceback (most recent call last):
2024-07-21 23:56:52   File "/home/airflow/.local/lib/python3.12/site-packages/flask/app.py", line 2529, in wsgi_app
2024-07-21 23:56:52     response = self.full_dispatch_request()
2024-07-21 23:56:52                ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
2024-07-21 23:56:52   File "/home/airflow/.local/lib/python3.12/site-packages/flask/app.py", line 1825, in full_dispatch_request
2024-07-21 23:56:52     rv = self.handle_user_exception(e)
2024-07-21 23:56:52          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
2024-07-21 23:56:52   File "/home/airflow/.local/lib/python3.12/site-packages/flask/app.py", line 1823, in full_dispatch_request
2024-07-21 23:56:52     rv = self.dispatch_request()
2024-07-21 23:56:52          ^^^^^^^^^^^^^^^^^^^^^^^
2024-07-21 23:56:52   File "/home/airflow/.local/lib/python3.12/site-packages/flask/app.py", line 1799, in dispatch_request
2024-07-21 23:56:52     return self.ensure_sync(self.view_functions[rule.endpoint])(**view_args)
2024-07-21 23:56:52            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
2024-07-21 23:56:52   File "/home/airflow/.local/lib/python3.12/site-packages/airflow/www/auth.py", line 244, in decorated
2024-07-21 23:56:52     is_authorized = get_auth_manager().is_authorized_dag(
2024-07-21 23:56:52                     ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
2024-07-21 23:56:52   File "/home/airflow/.local/lib/python3.12/site-packages/airflow/providers/fab/auth_manager/fab_auth_manager.py", line 238, in is_authorized_dag
2024-07-21 23:56:52     if (details and details.id) and not self._is_authorized_dag(
2024-07-21 23:56:52                                         ^^^^^^^^^^^^^^^^^^^^^^^^
2024-07-21 23:56:52   File "/home/airflow/.local/lib/python3.12/site-packages/airflow/providers/fab/auth_manager/fab_auth_manager.py", line 409, in _is_authorized_dag
2024-07-21 23:56:52     is_global_authorized = self._is_authorized(method=method, resource_type=RESOURCE_DAG, user=user)
2024-07-21 23:56:52                            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
2024-07-21 23:56:52   File "/home/airflow/.local/lib/python3.12/site-packages/airflow/providers/fab/auth_manager/fab_auth_manager.py", line 390, in _is_authorized
2024-07-21 23:56:52     user_permissions = self._get_user_permissions(user)
2024-07-21 23:56:52                        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
2024-07-21 23:56:52   File "/home/airflow/.local/lib/python3.12/site-packages/airflow/providers/fab/auth_manager/fab_auth_manager.py", line 471, in _get_user_permissions
2024-07-21 23:56:52     return getattr(user, "perms") or []
2024-07-21 23:56:52            ^^^^^^^^^^^^^^^^^^^^^^
2024-07-21 23:56:52   File "/home/airflow/.local/lib/python3.12/site-packages/airflow/providers/fab/auth_manager/models/anonymous_user.py", line 48, in perms
2024-07-21 23:56:52     (perm.action.name, perm.resource.name) for role in self.roles for perm in role.permissions
2024-07-21 23:56:52                                                                               ^^^^^^^^^^^^^^^^
2024-07-21 23:56:52 AttributeError: 'NoneType' object has no attribute 'permissions'
2024-07-21 23:56:52 172.20.0.1 - - [21/Jul/2024:18:26:52 +0000] "GET /dags/parallel_pipeline_dag_with_power/grid HTTP/1.1" 500 1592 "http://localhost:8080/home?status=active" "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/126.0.0.0 Safari/537.36"
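The last frame shows the generator in anonymous_user.py iterating `self.roles` and hitting a role that is `None`. A minimal sketch of that failure mode (hypothetical classes for illustration, not actual Airflow code):

```python
class Role:
    def __init__(self, name, permissions):
        self.name = name
        self.permissions = permissions  # list of (action, resource) pairs

def collect_perms(roles):
    # Mirrors the comprehension in AnonymousUser.perms:
    # iterate every role, then every permission of that role.
    return [perm for role in roles for perm in role.permissions]

# If the role configured for anonymous/public users is looked up by name
# but does not exist in the metadata DB, the roles list can contain None:
try:
    collect_perms([None])
except AttributeError as exc:
    print(exc)  # 'NoneType' object has no attribute 'permissions'
```

This matches the `AttributeError: 'NoneType' object has no attribute 'permissions'` seen above; the request never gets past the authorization check.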

What you think should happen instead?

I should be able to access the DAG to run/test it.

How to reproduce

folder
|- dags
|  |- simple-dag.py
|- logs
|- plugins
|- docker-compose.yml
|- requirements.txt

Content of requirements.txt:

apache-airflow==2.9.2
boto3==1.34.143

Operating System

WSL

Versions of Apache Airflow Providers

2.9.2

Deployment

Docker-Compose

Deployment details

No response

Anything else?

No response

Are you willing to submit PR?

  • Yes I am willing to submit a PR!

Code of Conduct

@AadarshBhalerao added the area:core, kind:bug, and needs-triage labels Jul 21, 2024

boring-cyborg bot commented Jul 21, 2024

Thanks for opening your first issue here! Be sure to follow the issue template! If you are willing to raise PR to address this issue please do so, no need to wait for approval.

@dosubot (bot) added the area:webserver label Jul 21, 2024
@shahar1
Contributor

shahar1 commented Jul 21, 2024

Converting this into a Q&A discussion, as it is not indicative of a specific bug.
I'd suggest using the official docker-compose.yaml for deployment: your customized deployment seems to be missing some components and environment variables, and there are also version inconsistencies (you state Apache Airflow v2.9.2, but the Docker image is v2.9.3).
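One concrete way to resolve the version inconsistency is to pin the image tag to the same release as requirements.txt (or bump both together). A sketch of the relevant compose lines, assuming 2.9.2 is kept:

```yaml
x-airflow-common:
  &airflow-common
  # Keep this tag in sync with the apache-airflow pin in requirements.txt
  # (2.9.2 in this report), or update both to 2.9.3 together.
  image: apache/airflow:2.9.2
```

The official docker-compose.yaml for a given Airflow release can be downloaded from the Airflow documentation site and used as a baseline instead of a hand-rolled file.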

@apache apache locked and limited conversation to collaborators Jul 21, 2024
@shahar1 shahar1 converted this issue into discussion #40918 Jul 21, 2024
