
switch to rocky linux #4092

Merged

add requirements server (42b98cf)
Travis CI - Pull Request failed May 21, 2024 in 36m 12s

Build Failed

This is a pull request build: it ran against the merge commit produced by merging #4092 (switch to rocky linux) into its base branch, dev-10645-python-upgrade-3.10. Any changes pushed to that branch before the build started are also included.
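
The merge commit a pull request build tests can be checked out locally via GitHub's merge ref. A minimal sketch, assuming a clone whose origin remote points at the GitHub repository (the merge ref may no longer resolve once a PR is merged or closed):

# Fetch the PR's synthetic merge commit into a local branch and inspect it
git fetch origin pull/4092/merge:pr-4092-merge
git checkout pr-4092-merge
git log -1 --oneline   # should show the merge of #4092 into its base branch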

Jobs and Stages

This build has 14 jobs, running in four sequential stages.

Stage 1: Build

This stage passed.

Job | Python | ENV | OS | State
18048.1 pip install | 3.10 | DEBIAN_FRONTEND=noninteractive | Linux | passed
18048.2 docker build | 3.10 | DEBIAN_FRONTEND=noninteractive | Linux | passed

Stage 2: Static Code Analysis

This stage passed.

Job | Python | ENV | OS | State
18048.3 flake8 | 3.10 | DEBIAN_FRONTEND=noninteractive | Linux | passed
18048.4 black | 3.10 | DEBIAN_FRONTEND=noninteractive | Linux | passed
18048.5 API Docs | 3.10 | DEBIAN_FRONTEND=noninteractive | Linux | passed

Stage 3: Automated Tests

This stage failed.

Job | Python | ENV | OS | State
18048.6 Spark Integration Tests - test_load_transactions_in_delta_fabs_fpds.py | 3.10 | DEBIAN_FRONTEND=noninteractive | Linux | passed
18048.7 Spark Integration Tests - test_load_transactions_in_delta_lookups.py | 3.10 | DEBIAN_FRONTEND=noninteractive | Linux | passed
18048.8 Spark Integration Tests - test_load_to_from_delta.py | 3.10 | DEBIAN_FRONTEND=noninteractive | Linux | failed
18048.9 Spark Integration Tests - Other | 3.10 | DEBIAN_FRONTEND=noninteractive | Linux | failed
18048.10 Non-Spark Integration Tests | 3.10 | DEBIAN_FRONTEND=noninteractive | Linux | failed
18048.11 Non-Spark Integration Tests - Using Signal Handling | 3.10 | DEBIAN_FRONTEND=noninteractive | Linux | passed
18048.12 Unit Tests | 3.10 | DEBIAN_FRONTEND=noninteractive | Linux | failed
18048.13 Unit Tests - Using Signal Handling | 3.10 | DEBIAN_FRONTEND=noninteractive | Linux | passed

Stage 4: Code Coverage

This stage was canceled.

Job | Python | ENV | OS | State
18048.14 | 3.10 | DEBIAN_FRONTEND=noninteractive | Linux | canceled

Build Configuration

Build Option | Setting
Language | Python
Operating System | Linux (Bionic)
Python Version | 3.10
Build Configuration
{
  "language": "python",
  "os": [
    "linux"
  ],
  "dist": "bionic",
  "python": [
    "3.10"
  ],
  "cache": {
    "pip": true
  },
  "env": [
    "global={:DEBIAN_FRONTEND=>\"noninteractive\"}={:POSTGRES_HOST=>\"localhost\"}={:USASPENDING_DB_HOST=>\"localhost\"}={:USASPENDING_DB_PORT=>\"5432\"}={:USASPENDING_DB_USER=>\"usaspending\"}={:USASPENDING_DB_PASSWORD=>\"usaspender\"}={:USASPENDING_DB_NAME=>\"data_store_api\"}={:DATABASE_URL=>\"postgres://${USASPENDING_DB_USER}:${USASPENDING_DB_PASSWORD}@${USASPENDING_DB_HOST}:${USASPENDING_DB_PORT}/${USASPENDING_DB_NAME}\"}={:DOWNLOAD_DATABASE_URL=>\"postgres://${USASPENDING_DB_USER}:${USASPENDING_DB_PASSWORD}@${USASPENDING_DB_HOST}:${USASPENDING_DB_PORT}/${USASPENDING_DB_NAME}\"}={:DJANGO_SETTINGS_MODULE=>\"'usaspending_api.settings'\"}={:ES_SCHEME=>\"http\"}={:ES_HOST=>\"localhost\"}={:ES_PORT=>\"9200\"}={:ES_HOSTNAME=>\"${ES_SCHEME}://${ES_HOST}:${ES_PORT}\"}={:BROKER_DB_HOST=>\"localhost\"}={:BROKER_DB_PORT=>\"5432\"}={:BROKER_DB_USER=>\"admin\"}={:BROKER_DB_PASSWORD=>\"root\"}={:BROKER_DB_NAME=>\"data_broker\"}={:DATA_BROKER_DATABASE_URL=>\"postgres://${BROKER_DB_USER}:${BROKER_DB_PASSWORD}@${BROKER_DB_HOST}:${BROKER_DB_PORT}/${BROKER_DB_NAME}\"}={:DATA_BROKER_SRC_PATH=>\"\\\"${TRAVIS_BUILD_DIR}/../data-act-broker-backend\\\"\"}={:BROKER_REPO_URL=>\"https://github.com/fedspendingtransparency/data-act-broker-backend.git\"}={:BROKER_REPO_BRANCH=>\"$(if [ \\\"${TRAVIS_EVENT_TYPE}\\\" = \\\"pull_request\\\" ] && [ ! -z \\\"`git ls-remote --heads ${BROKER_REPO_URL} ${TRAVIS_BRANCH}`\\\" ]; then echo \\\"${TRAVIS_BRANCH}\\\"; else echo \\\"qat\\\"; fi;)\"}={:BROKER_REPO_FOLDER=>\"${DATA_BROKER_SRC_PATH}\"}={:BROKER_DOCKER_IMAGE=>\"dataact-broker-backend\"}={:GRANTS_API_KEY=>\"${GRANTS_API_KEY}\"}={:MINIO_DATA_DIR=>\"${HOME}/Development/data/usaspending/docker/usaspending-s3\"}={:MINIO_HOST=>\"localhost\"}={:PYTEST_XDIST_NUMPROCESSES=>\"4\"}={:COLUMNS=>\"240\"}={:TRAVIS_JOB_INDEX=>\"\\\"$(echo $TRAVIS_JOB_NUMBER | cut -d'.' -f2)\\\"\"}"
  ],
  "jobs": {
    "include": [
      {
        "stage": "Build",
        "name": "pip install",
        "workspaces": {
          "create": {
            "name": "pip",
            "paths": [
              "$HOME/virtualenv/",
              "$TRAVIS_BUILD_DIR/usaspending_api.egg_info/"
            ]
          }
        },
        "before_install": [
          ""
        ],
        "install": [
          "travis_retry pip install setuptools==65.5.0",
          "travis_retry pip install .[dev]",
          "travis_retry pip install coveralls"
        ],
        "before_script": [
          ""
        ],
        "script": [
          ""
        ]
      },
      {
        "name": "docker build",
        "workspaces": {
          "create": {
            "name": "docker",
            "paths": [
              "docker_images/"
            ]
          }
        },
        "before_install": [
          ""
        ],
        "install": [
          "echo \"Using ${BROKER_REPO_BRANCH} branch from ${BROKER_REPO_URL}\"",
          "travis_retry git clone --branch ${BROKER_REPO_BRANCH} --single-branch --depth 1 ${BROKER_REPO_URL} ${BROKER_REPO_FOLDER}",
          "docker build -t ${BROKER_DOCKER_IMAGE} ${BROKER_REPO_FOLDER}"
        ],
        "before_script": [
          ""
        ],
        "script": [
          ""
        ],
        "before_cache": [
          "mkdir -p docker_images",
          "docker save -o docker_images/${BROKER_DOCKER_IMAGE}.tar ${BROKER_DOCKER_IMAGE}"
        ]
      },
      {
        "stage": "Static Code Analysis",
        "name": "flake8",
        "workspaces": {
          "use": [
            "pip"
          ]
        },
        "before_install": [
          ""
        ],
        "install": [
          ""
        ],
        "before_script": [
          ""
        ],
        "script": [
          "flake8"
        ]
      },
      {
        "name": "black",
        "workspaces": {
          "use": [
            "pip"
          ]
        },
        "before_install": [
          ""
        ],
        "install": [
          ""
        ],
        "before_script": [
          ""
        ],
        "script": [
          "black --check --diff ."
        ]
      },
      {
        "name": "API Docs",
        "workspaces": {
          "use": [
            "pip"
          ]
        },
        "before_install": [
          ""
        ],
        "install": [
          "travis_retry npm install --global dredd@13.1.2"
        ],
        "before_script": [
          ""
        ],
        "script": [
          "python manage.py check_for_endpoint_documentation",
          "dredd > dredd-results.txt && echo '! grep -E \"^[warn:|error:]\" dredd-results.txt' | bash"
        ]
      },
      {
        "stage": "Automated Tests",
        "name": "Spark Integration Tests - test_load_transactions_in_delta_fabs_fpds.py",
        "env": [
          {
            "PYTEST_XDIST_NUMPROCESSES": "0"
          },
          {
            "PYTEST_SETUP_TEST_DATABASES": "true"
          },
          {
            "PYTEST_PRELOAD_SPARK_JARS": "true"
          },
          {
            "PYTEST_INCLUDE_GLOB": "'test_*.py *_test.py'"
          },
          {
            "PYTEST_EXCLUDE_GLOB": ""
          },
          {
            "PYTEST_MATCH_EXPRESSION": "test_load_transactions_in_delta_fabs_fpds.py"
          },
          {
            "PYTEST_MARK_EXPRESSION": "spark"
          }
        ],
        "workspaces": {
          "use": [
            "pip",
            "docker"
          ],
          "create": {
            "name": "ws1",
            "paths": [
              "coverage.*.xml"
            ]
          }
        }
      },
      {
        "name": "Spark Integration Tests - test_load_transactions_in_delta_lookups.py",
        "env": [
          {
            "PYTEST_XDIST_NUMPROCESSES": "0"
          },
          {
            "PYTEST_SETUP_TEST_DATABASES": "true"
          },
          {
            "PYTEST_PRELOAD_SPARK_JARS": "true"
          },
          {
            "PYTEST_INCLUDE_GLOB": "'test_*.py *_test.py'"
          },
          {
            "PYTEST_EXCLUDE_GLOB": ""
          },
          {
            "PYTEST_MATCH_EXPRESSION": "test_load_transactions_in_delta_lookups.py"
          },
          {
            "PYTEST_MARK_EXPRESSION": "spark"
          }
        ],
        "workspaces": {
          "use": [
            "pip",
            "docker"
          ],
          "create": {
            "name": "ws2",
            "paths": [
              "coverage.*.xml"
            ]
          }
        }
      },
      {
        "name": "Spark Integration Tests - test_load_to_from_delta.py",
        "env": [
          {
            "PYTEST_XDIST_NUMPROCESSES": "2"
          },
          {
            "PYTEST_SETUP_TEST_DATABASES": "true"
          },
          {
            "PYTEST_PRELOAD_SPARK_JARS": "true"
          },
          {
            "PYTEST_INCLUDE_GLOB": "'test_*.py *_test.py'"
          },
          {
            "PYTEST_EXCLUDE_GLOB": ""
          },
          {
            "PYTEST_MATCH_EXPRESSION": "test_load_to_from_delta.py"
          },
          {
            "PYTEST_MARK_EXPRESSION": "spark"
          }
        ],
        "workspaces": {
          "use": [
            "pip",
            "docker"
          ],
          "create": {
            "name": "ws3",
            "paths": [
              "coverage.*.xml"
            ]
          }
        }
      },
      {
        "name": "Spark Integration Tests - Other",
        "env": [
          {
            "PYTEST_XDIST_NUMPROCESSES": "4"
          },
          {
            "PYTEST_SETUP_TEST_DATABASES": "true"
          },
          {
            "PYTEST_PRELOAD_SPARK_JARS": "true"
          },
          {
            "PYTEST_INCLUDE_GLOB": "'test_*.py *_test.py'"
          },
          {
            "PYTEST_EXCLUDE_GLOB": ""
          },
          {
            "PYTEST_MATCH_EXPRESSION": "'(not test_load_to_from_delta.py and not test_load_transactions_in_delta_lookups.py and not test_load_transactions_in_delta_fabs_fpds.py)'"
          },
          {
            "PYTEST_MARK_EXPRESSION": "spark"
          }
        ],
        "workspaces": {
          "use": [
            "pip",
            "docker"
          ],
          "create": {
            "name": "ws4",
            "paths": [
              "coverage.*.xml"
            ]
          }
        }
      },
      {
        "name": "Non-Spark Integration Tests",
        "env": [
          {
            "PYTEST_SETUP_TEST_DATABASES": "true"
          },
          {
            "PYTEST_PRELOAD_SPARK_JARS": "false"
          },
          {
            "PYTEST_INCLUDE_GLOB": "**/tests/integration/*"
          },
          {
            "PYTEST_EXCLUDE_GLOB": ""
          },
          {
            "PYTEST_MATCH_EXPRESSION": ""
          },
          {
            "PYTEST_MARK_EXPRESSION": "'(not spark and not signal_handling)'"
          }
        ],
        "workspaces": {
          "use": [
            "pip",
            "docker"
          ],
          "create": {
            "name": "ws5",
            "paths": [
              "coverage.*.xml"
            ]
          }
        }
      },
      {
        "name": "Non-Spark Integration Tests - Using Signal Handling",
        "env": [
          {
            "PYTEST_XDIST_NUMPROCESSES": "0"
          },
          {
            "PYTEST_SETUP_TEST_DATABASES": "true"
          },
          {
            "PYTEST_PRELOAD_SPARK_JARS": "false"
          },
          {
            "PYTEST_INCLUDE_GLOB": "**/tests/integration/*"
          },
          {
            "PYTEST_EXCLUDE_GLOB": ""
          },
          {
            "PYTEST_MATCH_EXPRESSION": ""
          },
          {
            "PYTEST_MARK_EXPRESSION": "'(signal_handling and not spark)'"
          }
        ],
        "workspaces": {
          "use": [
            "pip",
            "docker"
          ],
          "create": {
            "name": "ws6",
            "paths": [
              "coverage.*.xml"
            ]
          }
        }
      },
      {
        "name": "Unit Tests",
        "env": [
          {
            "PYTEST_SETUP_TEST_DATABASES": "false"
          },
          {
            "PYTEST_PRELOAD_SPARK_JARS": "false"
          },
          {
            "PYTEST_INCLUDE_GLOB": "'test_*.py *_test.py'"
          },
          {
            "PYTEST_EXCLUDE_GLOB": "**/tests/integration/*"
          },
          {
            "PYTEST_MATCH_EXPRESSION": ""
          },
          {
            "PYTEST_MARK_EXPRESSION": "'(not spark and not database and not elasticsearch and not signal_handling)'"
          }
        ],
        "workspaces": {
          "use": [
            "pip",
            "docker"
          ],
          "create": {
            "name": "ws7",
            "paths": [
              "coverage.*.xml"
            ]
          }
        }
      },
      {
        "name": "Unit Tests - Using Signal Handling",
        "env": [
          {
            "PYTEST_XDIST_NUMPROCESSES": "0"
          },
          {
            "PYTEST_SETUP_TEST_DATABASES": "false"
          },
          {
            "PYTEST_PRELOAD_SPARK_JARS": "false"
          },
          {
            "PYTEST_INCLUDE_GLOB": "'test_*.py *_test.py'"
          },
          {
            "PYTEST_EXCLUDE_GLOB": "**/tests/integration/*"
          },
          {
            "PYTEST_MATCH_EXPRESSION": ""
          },
          {
            "PYTEST_MARK_EXPRESSION": "'(signal_handling)'"
          }
        ],
        "workspaces": {
          "use": [
            "pip",
            "docker"
          ],
          "create": {
            "name": "ws8",
            "paths": [
              "coverage.*.xml"
            ]
          }
        }
      },
      {
        "stage": "Code Coverage",
        "env": [
          {
            "IS_CODE_COVERAGE_REPORT": "true"
          }
        ],
        "workspaces": {
          "use": [
            "ws1",
            "ws2",
            "ws3",
            "ws4",
            "ws5",
            "ws6",
            "ws7",
            "ws8"
          ]
        },
        "before_install": [
          ""
        ],
        "install": [
          ""
        ],
        "before_script": [
          "travis_retry curl -L https://codeclimate.com/downloads/test-reporter/test-reporter-latest-linux-amd64 > ./cc-test-reporter",
          "chmod +x ./cc-test-reporter"
        ],
        "script": [
          "ls -lh coverage*",
          "for cf in coverage.*.xml; do ./cc-test-reporter format-coverage --prefix $TRAVIS_BUILD_DIR --input-type coverage.py --output coverage/codeclimate.$(echo \"$cf\" | cut -d'.' -f2).xml coverage.$(echo \"$cf\" | cut -d'.' -f2).xml; done",
          "ls coverage/",
          "./cc-test-reporter sum-coverage --output - --parts $(find . -maxdepth 1 -name 'coverage.*.xml' | wc -l) ./coverage/codeclimate.*.xml | ./cc-test-reporter upload-coverage --input -"
        ]
      }
    ]
  },
  "before_install": [
    "docker load -i docker_images/*.tar || true"
  ],
  "install": [
    "echo \"Using ${BROKER_REPO_BRANCH} branch from ${BROKER_REPO_URL}\"",
    "travis_retry git clone --branch ${BROKER_REPO_BRANCH} --single-branch --depth 1 ${BROKER_REPO_URL} ${BROKER_REPO_FOLDER}"
  ],
  "before_script": [
    "travis_retry curl -L https://codeclimate.com/downloads/test-reporter/test-reporter-latest-linux-amd64 > ./cc-test-reporter",
    "chmod +x ./cc-test-reporter",
    "./cc-test-reporter before-build",
    "make docker-compose-up-usaspending args=\"-d usaspending-db usaspending-es\"",
    "ttl=30; echo \"Try DB conn from container for $ttl seconds\"; until [ $ttl -le 0 ] || psql $DATABASE_URL -c 'select 1 where 1=1'; do echo $ttl; ((ttl--)); sleep 1; done; [ $ttl -gt 0 ]",
    "ttl=30; echo \"Try ES conn from container for $ttl seconds\"; until [ $ttl -le 0 ] || curl --silent -XGET --fail $ES_HOSTNAME; do echo $ttl; ((ttl--)); sleep 1; done; [ $ttl -gt 0 ]",
    "psql postgres://${USASPENDING_DB_USER}:${USASPENDING_DB_PASSWORD}@${USASPENDING_DB_HOST}:${USASPENDING_DB_PORT}/postgres -c \"ALTER USER ${USASPENDING_DB_USER} SET search_path TO public,raw,int,temp,rpt\"",
    "psql postgres://${USASPENDING_DB_USER}:${USASPENDING_DB_PASSWORD}@${USASPENDING_DB_HOST}:${USASPENDING_DB_PORT}/postgres -c \"CREATE USER ${BROKER_DB_USER} PASSWORD '${BROKER_DB_PASSWORD}' SUPERUSER\"",
    "psql postgres://${USASPENDING_DB_USER}:${USASPENDING_DB_PASSWORD}@${USASPENDING_DB_HOST}:${USASPENDING_DB_PORT}/postgres -c \"CREATE ROLE readonly;\"",
    "psql postgres://${USASPENDING_DB_USER}:${USASPENDING_DB_PASSWORD}@${USASPENDING_DB_HOST}:${USASPENDING_DB_PORT}/postgres -c \"\\copy (\n    SELECT\n      'GRANT USAGE ON SCHEMA ' || nspname || ' TO readonly; '\n    || 'GRANT SELECT ON ALL TABLES IN SCHEMA ' || nspname || ' TO readonly; '\n    || 'ALTER DEFAULT PRIVILEGES IN SCHEMA ' || nspname || ' GRANT SELECT ON TABLES TO readonly; '\n    FROM pg_namespace WHERE nspname IN ('raw','int','rpt','temp','public')\n  ) TO grant_to_readonly.sql;\"\n",
    "psql postgres://${USASPENDING_DB_USER}:${USASPENDING_DB_PASSWORD}@${USASPENDING_DB_HOST}:${USASPENDING_DB_PORT}/postgres -c \"\\i grant_to_readonly.sql\"",
    "if [ \"${PYTEST_SETUP_TEST_DATABASES}\" = true ]; then pytest --create-db --reuse-db --numprocesses ${PYTEST_XDIST_NUMPROCESSES} --no-cov --disable-warnings -r=fEs --verbosity=3 --capture=no --log-cli-level=WARNING --show-capture=log 2> /dev/null 'usaspending_api/tests/integration/test_setup_of_test_dbs.py::test_trigger_test_db_setup'; fi;",
    "if [ \"${PYTEST_PRELOAD_SPARK_JARS}\" = true ]; then pytest --no-cov --disable-warnings -r=fEs --verbosity=3 'usaspending_api/tests/integration/test_setup_of_spark_dependencies.py::test_preload_spark_jars'; fi;",
    "psql postgres://${USASPENDING_DB_USER}:${USASPENDING_DB_PASSWORD}@${USASPENDING_DB_HOST}:${USASPENDING_DB_PORT}/postgres -c \"\\l\"",
    "mkdir -p \"${MINIO_DATA_DIR}\"",
    "mkdir -p \"${HOME}/.ivy2\"",
    "make docker-compose-up-s3 args=\"-d\""
  ],
  "script": [
    "stty cols 240",
    "cd ${TRAVIS_BUILD_DIR}",
    "return $(pytest --collect-only --quiet --ignore-glob='**/tests/integration/*' -m '(spark or database or elasticsearch)' --no-cov --disable-warnings | grep '^usaspending_api.*$' | wc -l)",
    "test $? -gt 0 && echo 'Failing because integration tests would be improperly captured as unit tests. Run the previous pytest command locally to figure out which to move to a **/tests/integration/ folder'",
    "pytest --override-ini=python_files=\"${PYTEST_INCLUDE_GLOB}\" --ignore-glob=\"${PYTEST_EXCLUDE_GLOB}\" -m \"${PYTEST_MARK_EXPRESSION}\" -k \"${PYTEST_MATCH_EXPRESSION}\" --cov=usaspending_api --cov-report term --cov-report xml:coverage.$TRAVIS_JOB_INDEX.xml --reuse-db -r=fEs --numprocesses ${PYTEST_XDIST_NUMPROCESSES} --dist worksteal --verbosity=1 --durations 50"
  ]
}
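
For reference, a failed test job can be approximated locally by exporting that job's env block and then running the shared script step from the configuration above. A minimal sketch for the Unit Tests job (18048.12), assuming the dev dependencies are installed and the databases started in before_script are up; the values are copied from the config, the single-quote wrapping is lightly adapted for a plain shell, and the job index used to name the coverage file is illustrative:

# Env block for the "Unit Tests" job (copied from the config above)
export PYTEST_XDIST_NUMPROCESSES=4      # global default; this job sets no override
export PYTEST_INCLUDE_GLOB='test_*.py *_test.py'
export PYTEST_EXCLUDE_GLOB='**/tests/integration/*'
export PYTEST_MATCH_EXPRESSION=''
export PYTEST_MARK_EXPRESSION='not spark and not database and not elasticsearch and not signal_handling'
export TRAVIS_JOB_INDEX=12              # illustrative; only names the coverage XML

# Shared "script" step from the config, reflowed onto one command
pytest --override-ini=python_files="${PYTEST_INCLUDE_GLOB}" \
       --ignore-glob="${PYTEST_EXCLUDE_GLOB}" \
       -m "${PYTEST_MARK_EXPRESSION}" \
       -k "${PYTEST_MATCH_EXPRESSION}" \
       --cov=usaspending_api --cov-report term \
       --cov-report "xml:coverage.${TRAVIS_JOB_INDEX}.xml" \
       --reuse-db -rfEs \
       --numprocesses "${PYTEST_XDIST_NUMPROCESSES}" --dist worksteal \
       --verbosity=1 --durations 50

The same pattern applies to the other failed jobs (18048.8, 18048.9, 18048.10) by substituting their env blocks; the Spark jobs additionally expect the Docker services and Spark JARs prepared in before_script.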