[Dev-10645] Upgrade python to version 3.10.12 #4079

Status: Open. Wants to merge 79 commits into base branch qat.

Commits (79):
d423ca1
DEV-10756 updated config files to ensure compatibility with python ve…
ayubshahab Mar 18, 2024
b6f5d4a
Updated setuptools version in travis.yml
ayubshahab Mar 18, 2024
6a0e203
Added changes from DEV-10745-update-library-imports as updating the c…
ayubshahab Mar 18, 2024
602c120
DEV-10756 removed unused imports that were being flagged by flake8
ayubshahab Mar 18, 2024
1acf500
DEV-10756 reformatted 138 files that were flagged by the new version …
ayubshahab Mar 18, 2024
58d29e4
DEV-10756 updated the formatting in setup.py per black requirements
ayubshahab Mar 18, 2024
42527f5
Revert "DEV-10756 removed unused imports that were being flagged by f…
ayubshahab Mar 19, 2024
4c361ab
DEV-10804 removed unused imports flagged by flake8. This ticket has n…
ayubshahab Mar 19, 2024
332b469
DEV-10805 Many files were flagged as needing to be reformatted after …
ayubshahab Mar 19, 2024
f4e4813
DEV-10748 The newer version of Django was resulting in the psycopg2.e…
ayubshahab Mar 20, 2024
a442f27
DEV-10748 The Django version 3.2.25 (3.2.*) was compatible with pytho…
ayubshahab Mar 20, 2024
505261a
DEV-10748 reverted Django library version back to 3.2.*
ayubshahab Mar 20, 2024
7120817
Merge pull request #4048 from fedspendingtransparency/dev-10756-updat…
ayubshahab Mar 20, 2024
119f90d
Merge pull request #4049 from fedspendingtransparency/dev-10804-flake…
ayubshahab Mar 20, 2024
ad9bb16
Merge pull request #4050 from fedspendingtransparency/dev-10805-black…
ayubshahab Mar 20, 2024
f6572a0
DEV-10748 Updated to Django 5.0.3 from 3.2.*
ayubshahab Mar 25, 2024
ffbd6ca
update psycopg2
boozallendanny Mar 27, 2024
a935b72
usedforsecurity
boozallendanny Mar 27, 2024
5262f29
DEV-10748 Auto-spawned new migrations from Django 5.0.3
ayubshahab Mar 27, 2024
a4cc009
DEV-10748 Fix for InvalidCursorName error
ayubshahab Mar 27, 2024
ee34cc0
DEV-10645 Fixed flake8 and black formatting issues
ayubshahab Mar 28, 2024
566f27f
Merge pull request #4052 from fedspendingtransparency/dev-10748-fix-u…
ayubshahab Apr 19, 2024
b1df20c
Pipe 528 fix for test_trigger_test_db_setup error
ayubshahab May 1, 2024
a9c649f
Merge pull request #4080 from fedspendingtransparency/pipe-528-test_t…
ayubshahab May 1, 2024
a11e1dc
Merge branch 'qat' into dev-10645-python-upgrade-3.10
ayubshahab May 1, 2024
827b67c
Pipe-527 initial commit for format_exception() failure in testing suite
ayubshahab May 1, 2024
ec9b822
PIPE-527 black formatting fix
ayubshahab May 1, 2024
0c98388
PIPE-527 fix for FiscalDate attribute change from .quarter to .fiscal…
ayubshahab May 2, 2024
cdb7c16
PIPE-527 added django.setup for format_exception() error
ayubshahab May 6, 2024
800e707
PIPE-527 upgraded django to 4.1.13
ayubshahab May 6, 2024
ab0a3de
Merge pull request #4081 from fedspendingtransparency/pipe-527-format…
ayubshahab May 7, 2024
b3cdccf
PIPE-521 upgraded spark version
ayubshahab May 7, 2024
a72ff35
Pipe 521 upgraded spark version
ayubshahab May 10, 2024
ddd171c
Pipe 521 updated try catch exception check to match terminal output
ayubshahab May 10, 2024
00f7d64
PIPE 521 added fix for try catch exception error
ayubshahab May 10, 2024
64c60a1
Pipe-521 final commit for PIPE 521
ayubshahab May 10, 2024
4661673
Merge pull request #4083 from fedspendingtransparency/pipe-521-spark-…
ayubshahab May 10, 2024
dcfeb4e
Changes Python version from 3.10.13 to 3.10.12
May 15, 2024
efaa896
PIPE-524 fix for load_rosetta command failure
ayubshahab May 16, 2024
928d3cd
Merge pull request #4087 from fedspendingtransparency/ftr/dev-10645-3…
ayubshahab May 16, 2024
e8a8577
Black formatting fixed
ayubshahab May 16, 2024
21d445c
PIPE-524 potential fix for Database queries to 'data_broker' are not …
ayubshahab May 16, 2024
c9352d9
modified test_load_to_from_delta.py
ayubshahab May 21, 2024
15fc16f
database queries to data_broker are not allowed fix
ayubshahab May 21, 2024
dd2052b
testing
ayubshahab May 21, 2024
a6f1a09
pipe-526
ayubshahab May 21, 2024
089b5f8
modified travis file to not timeout after 10 mins of inactivity
ayubshahab May 21, 2024
245014d
Updated .travis.yml
ayubshahab May 21, 2024
1b70c06
Added travis wait to Spark Integration Tests - Other
ayubshahab May 21, 2024
cc26d04
switch to rocky linux
boozallendanny May 21, 2024
42b98cf
add requirements server
boozallendanny May 21, 2024
4e28af9
PIPE-524 fix for violates foreign key constraint
ayubshahab May 23, 2024
1ba4ed8
Black and Flake8 formatting changes
ayubshahab May 23, 2024
57451bc
PIPE-524 Took out unnecessary comments
ayubshahab May 24, 2024
00ab025
reverted psycopg2 back to psycopg2-binary as there were tests additio…
ayubshahab May 24, 2024
db11173
PIPE-524 modified travis file scripts
ayubshahab May 24, 2024
6a405dd
PIPE-524 reverted to 1b70c06329a4b9c49256cfdc1845f9b1f01c4d0c
ayubshahab May 28, 2024
8368c3a
PIPE-524 removed travis_wait and added back for failing tests
ayubshahab May 28, 2024
c905a60
Merge pull request #4091 from fedspendingtransparency/pipe-524-non-sp…
ayubshahab May 28, 2024
3117799
PIPE-529 fix for test_download_transactions
ayubshahab May 29, 2024
b8b9955
Merge pull request #4099 from fedspendingtransparency/pipe-529-databa…
ayubshahab May 29, 2024
8d9eb8f
PIPE-526 set ddtrace to 0.47.0
ayubshahab May 31, 2024
5c24f86
Merge pull request #4101 from fedspendingtransparency/pipe-526-unit-t…
ayubshahab May 31, 2024
5f50bd0
psycopg2 update
boozallendanny Jun 7, 2024
236efc7
update elasticsearch
boozallendanny Jun 7, 2024
8dab2e7
fips in filter_hash
boozallendanny Jun 7, 2024
2211ed7
Merge pull request #4092 from fedspendingtransparency/mod/dockerfile-…
boozallendanny Jun 7, 2024
3309489
Fix for Travis Timeout Issues
ayubshahab Jun 13, 2024
efad351
Merge pull request #4112 from fedspendingtransparency/dev-11022-travi…
ayubshahab Jun 14, 2024
e725136
Merge branch 'qat' into dev-10645-python-upgrade-3.10
ayubshahab Jun 14, 2024
4c7fa60
Merge branch 'qat' into dev-10645-python-upgrade-3.10
collinwr Jun 14, 2024
9cad8fd
testing dockerfile
boozallendanny Jun 14, 2024
edd3b64
perl
boozallendanny Jun 14, 2024
58e9aef
another tes
boozallendanny Jun 14, 2024
c0b0d4f
perl
boozallendanny Jun 14, 2024
c3d59a4
add to path
boozallendanny Jun 14, 2024
254b61c
combine
boozallendanny Jun 14, 2024
af034e0
update for local bah
boozallendanny Jul 8, 2024
2d754b5
extra line
boozallendanny Jul 8, 2024
Files changed:
2 changes: 1 addition & 1 deletion .env.template
@@ -12,7 +12,7 @@
 # config data classes (e.g. DefaultConfig in default.py and/or LocalConfig local.py)
 ########################################################################################################################
 # ==== [Python] ====
-PYTHON_VERSION=3.8.16
+PYTHON_VERSION=3.10.12

 # ==== [App] ====
 # MATVIEW_SQL_DIR has to be inside of the project (check the docker-compose file)
6 changes: 3 additions & 3 deletions .pre-commit-config.yaml
@@ -1,12 +1,12 @@
 exclude: /(\.git|\.venv|venv|migrations)/
 repos:
   - repo: https://github.com/pre-commit/pre-commit-hooks
-    rev: v2.4.0
+    rev: v4.5.0
     hooks:
       - id: debug-statements
       - id: flake8
   - repo: https://github.com/psf/black
-    rev: 20.8b1
+    rev: 24.2.0
     hooks:
       - id: black
-        language_version: python3.8
+        language_version: python3.10
12 changes: 8 additions & 4 deletions .travis.yml
@@ -5,7 +5,7 @@
 dist: bionic
 language: python

 python:
-  - '3.8'
+  - '3.10'

 cache: pip
@@ -55,7 +55,7 @@ jobs:
         - $TRAVIS_BUILD_DIR/usaspending_api.egg_info/
       before_install: "" # override default to no-op
       install:
-        - travis_retry pip install setuptools==60.8.2
+        - travis_retry pip install setuptools==65.5.0
         - travis_retry pip install .[dev]
         - travis_retry pip install coveralls
       before_script: "" # override default to no-op
@@ -155,7 +155,9 @@
           name: ws3
           paths:
             - coverage.*.xml
-      # Inherits "global" job phases defined below (e.g. before_install, install, before_script, script, etc.)
+      # Inherits "global" job phases defined below (e.g. before_install, install, before_script, script, etc.)
+      script:
+        - travis_wait 30 pytest --override-ini=python_files="${PYTEST_INCLUDE_GLOB}" --ignore-glob="${PYTEST_EXCLUDE_GLOB}" -m "${PYTEST_MARK_EXPRESSION}" -k "${PYTEST_MATCH_EXPRESSION}" --cov=usaspending_api --cov-report term --cov-report xml:coverage.$TRAVIS_JOB_INDEX.xml --reuse-db -r=fEs --numprocesses ${PYTEST_XDIST_NUMPROCESSES} --dist worksteal --verbosity=1 --durations=0
     - name: Spark Integration Tests - Other
       env:
         - PYTEST_XDIST_NUMPROCESSES=4
@@ -172,6 +174,8 @@
           paths:
             - coverage.*.xml
       # Inherits "global" job phases defined below (e.g. before_install, install, before_script, script, etc.)
+      script:
+        - travis_wait 30 pytest --override-ini=python_files="${PYTEST_INCLUDE_GLOB}" --ignore-glob="${PYTEST_EXCLUDE_GLOB}" -m "${PYTEST_MARK_EXPRESSION}" -k "${PYTEST_MATCH_EXPRESSION}" --cov=usaspending_api --cov-report term --cov-report xml:coverage.$TRAVIS_JOB_INDEX.xml --reuse-db -r=fEs --numprocesses ${PYTEST_XDIST_NUMPROCESSES} --dist worksteal --verbosity=1 --durations=0
     - name: Non-Spark Integration Tests
       env:
         - PYTEST_SETUP_TEST_DATABASES=true
@@ -306,4 +310,4 @@ script:
   - return $(pytest --collect-only --quiet --ignore-glob='**/tests/integration/*' -m '(spark or database or elasticsearch)' --no-cov --disable-warnings | grep '^usaspending_api.*$' | wc -l)
   - test $? -gt 0 && echo 'Failing because integration tests would be improperly captured as unit tests. Run the previous pytest command locally to figure out which to move to a **/tests/integration/ folder'
   # Must manually set --numprocesses on Travis CI VMs; can't use auto (see: https://github.com/pytest-dev/pytest-xdist/pull/317)
-  - pytest --override-ini=python_files="${PYTEST_INCLUDE_GLOB}" --ignore-glob="${PYTEST_EXCLUDE_GLOB}" -m "${PYTEST_MARK_EXPRESSION}" -k "${PYTEST_MATCH_EXPRESSION}" --cov=usaspending_api --cov-report term --cov-report xml:coverage.$TRAVIS_JOB_INDEX.xml --reuse-db -r=fEs --numprocesses ${PYTEST_XDIST_NUMPROCESSES} --dist worksteal --verbosity=1 --durations 50
+  - pytest --override-ini=python_files="${PYTEST_INCLUDE_GLOB}" --ignore-glob="${PYTEST_EXCLUDE_GLOB}" -m "${PYTEST_MARK_EXPRESSION}" -k "${PYTEST_MATCH_EXPRESSION}" --cov=usaspending_api --cov-report term --cov-report xml:coverage.$TRAVIS_JOB_INDEX.xml --reuse-db -r=fEs --numprocesses ${PYTEST_XDIST_NUMPROCESSES} --dist worksteal --verbosity=1 --durations=0
32 changes: 24 additions & 8 deletions Dockerfile
@@ -6,21 +6,35 @@

 # See docker-compose.yml file and README.md for docker-compose information

-FROM centos:7
+FROM rockylinux:8

 # Build ARGs
-ARG PYTHON_VERSION=3.8.16
+ARG PYTHON_VERSION=3.10.12

 WORKDIR /dockermount

-RUN yum -y update && yum clean all
+# update to use centos official mirrors only
+RUN sed -i '/#baseurl/s/^#//g' /etc/yum.repos.d/Rocky-*
+RUN sed -i '/mirrorlist/s/^/#/g' /etc/yum.repos.d/Rocky-*
+
+RUN dnf -y update
 # sqlite-devel added as prerequisite for coverage python lib, used by pytest-cov plugin
-RUN yum -y install wget gcc openssl-devel bzip2-devel libffi libffi-devel zlib-devel sqlite-devel xz-devel
-RUN yum -y groupinstall "Development Tools"
+RUN dnf -y install gcc openssl-devel bzip2-devel libffi-devel zlib-devel wget make
+RUN dnf -y groupinstall "Development Tools"
+
+RUN dnf install epel-release -y
+RUN dnf --enablerepo=powertools install perl-IPC-Run -y
+
+
 ##### Install PostgreSQL 13 client (psql)
-RUN yum -y install https://download.postgresql.org/pub/repos/yum/reporpms/EL-7-x86_64/pgdg-redhat-repo-latest.noarch.rpm
-RUN yum -y install postgresql13
+## Import and install not working on local BAH computers
+#RUN rpm --import https://download.postgresql.org/pub/repos/yum/keys/RPM-GPG-KEY-PGDG-AARCH64-RHEL8
+#RUN dnf -y install https://download.postgresql.org/pub/repos/yum/reporpms/EL-8-x86_64/pgdg-redhat-repo-latest.noarch.rpm
+
+RUN dnf -y module enable postgresql:13
+RUN dnf -y install postgresql
+RUN dnf -y install postgresql-devel


 ##### Building python 3.x
 WORKDIR /usr/src
@@ -35,7 +49,9 @@ RUN echo "$(python3 --version)"
 ##### Copy python packaged
 WORKDIR /dockermount
 COPY requirements/ /dockermount/requirements/
-RUN python3 -m pip install -r requirements/requirements.txt
+RUN export PATH=$PATH:/usr/pgsql-13/bin && python3 -m pip install -r requirements/requirements.txt
+
+RUN python3 -m pip install -r requirements/requirements-server.txt ansible==2.9.15 awscli

 ##### Copy the rest of the project files into the container
 COPY . /dockermount
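Several pieces of this Dockerfile hunk fit together: with the requirements moving from psycopg2-binary to source-built psycopg2 (see the requirements diff below), pip needs pg_config and the libpq headers at install time, which is what postgresql-devel and the PATH export appear to provide. A minimal sketch for verifying the compiled module inside the image (no database connection required; this only exercises the C extension):

import psycopg2
import psycopg2.extensions

# Both calls succeed only if the C extension compiled and linked against libpq.
print("psycopg2", psycopg2.__version__)
print("libpq", psycopg2.extensions.libpq_version())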
6 changes: 3 additions & 3 deletions Dockerfile.spark
@@ -4,9 +4,9 @@ FROM centos:7
 # Forcing back to python 3.8 to be in sync with local dev env.
 # Can't run driver and worker on different Python versions when driver is local dev machine and not spark-submit container
 #ARG PYTHON_VERSION=3.8.10
-ARG PYTHON_VERSION=3.8.16
-ARG HADOOP_VERSION=3.3.1
-ARG SPARK_VERSION=3.2.1
+ARG PYTHON_VERSION=3.10.12
+ARG SPARK_VERSION=3.5.0
+ARG HADOOP_VERSION=3.3.4
 ARG PROJECT_LOG_DIR=/logs

 RUN yum -y update && yum clean all
10 changes: 5 additions & 5 deletions Makefile
@@ -25,7 +25,7 @@ endif
 # default ENV_CODE to lcl if not set
 ENV_CODE ?= lcl
 # default version if not set in .env or an env var
-PYTHON_VERSION ?= 3.8.16
+PYTHON_VERSION ?= 3.10.12
 venv_name := usaspending-api
 docker_compose_file := docker-compose.yml
 dockerfile_for_spark := Dockerfile.spark
@@ -151,7 +151,7 @@ clean-all: confirm-clean-all ## Remove all tmp artifacts and artifacts created
 ifeq ($(strip ${dry-run}),'false')
 	rm -f .python-version
 	rm -rf .venv
-	@git clean -xfd --exclude='\.env' --exclude='\.envrc' --exclude='\.idea/' --exclude='spark-warehouse/'
+	@git clean -xfd --exclude='\.env' --exclude='\.envrc' --exclude='\.idea/' --exclude='spark-warehouse/' --exclude='\.vscode/'
 	deactivate || true
 	#if command -v deactivate &> /dev/null; then deactivate; fi;
 else # this is a dry-run, spit out what would be removed
@@ -237,7 +237,7 @@ docker-compose-spark-submit: ## Run spark-submit from within local docker contai
 	-e DATABASE_URL=${DATABASE_URL} \
 	spark-submit \
 	--driver-memory "2g" \
-	--packages org.postgresql:postgresql:42.2.23,io.delta:delta-core_2.12:1.2.1,org.apache.hadoop:hadoop-aws:3.3.1,org.apache.spark:spark-hive_2.12:3.2.1 \
+	--packages org.postgresql:postgresql:42.2.23,io.delta:delta-spark_2.12:3.2.0,org.apache.hadoop:hadoop-aws:3.3.4,org.apache.spark:spark-hive_2.12:3.2.1 \
 	${if ${python_script}, \
 	${python_script}, \
 	/project/manage.py ${django_command} \
@@ -248,7 +248,7 @@ localhost-spark-submit: ## Run spark-submit from with localhost as the driver an
 	SPARK_LOCAL_IP=127.0.0.1 \
 	spark-submit \
 	--driver-memory "2g" \
-	--packages org.postgresql:postgresql:42.2.23,io.delta:delta-core_2.12:1.2.1,org.apache.hadoop:hadoop-aws:3.3.1,org.apache.spark:spark-hive_2.12:3.2.1 \
+	--packages org.postgresql:postgresql:42.2.23,io.delta:delta-spark_2.12:3.2.0,org.apache.hadoop:hadoop-aws:3.3.4,org.apache.spark:spark-hive_2.12:3.2.1 \
 	${if ${python_script}, \
 	${python_script}, \
 	manage.py ${django_command} \
@@ -257,7 +257,7 @@
 .PHONY: pyspark-shell
 pyspark-shell: ## Launch a local pyspark REPL shell with all of the packages and spark config pre-set
 	SPARK_LOCAL_IP=127.0.0.1 pyspark \
-	--packages org.postgresql:postgresql:42.2.23,io.delta:delta-core_2.12:1.2.1,org.apache.hadoop:hadoop-aws:3.3.1 \
+	--packages org.postgresql:postgresql:42.2.23,io.delta:delta-spark_2.12:3.2.0,org.apache.hadoop:hadoop-aws:3.3.4 \
 	--conf spark.sql.extensions=io.delta.sql.DeltaSparkSessionExtension \
 	--conf spark.sql.catalog.spark_catalog=org.apache.spark.sql.delta.catalog.DeltaCatalog \
 	--conf spark.hadoop.fs.s3a.endpoint=localhost:${MINIO_PORT} \
2 changes: 1 addition & 1 deletion README.md
@@ -27,7 +27,7 @@ _**If not using Docker, you'll need to install app components on your machine:**
 - Linux users already know their package manager (`yum`, `apt`, `pacman`, etc.)
 - [`PostgreSQL`](https://www.postgresql.org/download/) version 13.x (with a dedicated `data_store_api` database)
 - [`Elasticsearch`](https://www.elastic.co/downloads/elasticsearch) version 7.1
-- `Python` version 3.8 environment
+- `Python` version 3.10 environment
   - Highly recommended to use a virtual environment. There are various tools and associated instructions depending on preferences
   - See [Required Python Libraries](#required-python-libraries) for an example using `pyenv`
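With the interpreter pin moving to 3.10, a quick check in a fresh shell can catch a stale virtualenv before less readable failures appear. A minimal sketch (the exact check and error text are illustrative, not part of the README):

import sys

# The project now requires a 3.10.x interpreter (see setup.py's python_requires).
if sys.version_info[:2] != (3, 10):
    raise SystemExit(f"Expected Python 3.10.x, found {sys.version.split()[0]}")
print("Python version OK:", sys.version.split()[0])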
6 changes: 3 additions & 3 deletions docker-compose.yml
@@ -153,7 +153,7 @@ services:
       - usaspending # must pass --profile usaspending to docker-compose for this to come up
       - test
       - ci
-    image: docker.elastic.co/elasticsearch/elasticsearch:7.1.1
+    image: docker.elastic.co/elasticsearch/elasticsearch:7.8.0
     container_name: usaspending-es
     environment:
       - node.name=usaspending-es
@@ -168,8 +168,8 @@
       /bin/sh -c "
       if [ ! -d /usr/share/elasticsearch/plugins/mapper-murmur3 ]; then
         # Certificate problem workaround when on VPN - wget without checking cert, then install from local filesystem
-        wget --no-check-certificate https://artifacts.elastic.co/downloads/elasticsearch-plugins/mapper-murmur3/mapper-murmur3-7.1.1.zip
-        ./bin/elasticsearch-plugin install file:///usr/share/elasticsearch/mapper-murmur3-7.1.1.zip
+        curl https://artifacts.elastic.co/downloads/elasticsearch-plugins/mapper-murmur3/mapper-murmur3-7.8.0.zip -O
+        ./bin/elasticsearch-plugin install file:///usr/share/elasticsearch/mapper-murmur3-7.8.0.zip
       fi
       /usr/local/bin/docker-entrypoint.sh"
     ulimits:
2 changes: 2 additions & 0 deletions manage.py
@@ -13,6 +13,8 @@
 # exceptions on Python
 try:
     import django  # noqa
+
+    django.setup()
 except ImportError:
     raise ImportError(
         "Couldn't import Django. Are you sure it's installed and "
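The two added lines correspond to the PIPE-527 commit above ("added django.setup for format_exception() error"): django.setup() populates the app registry before anything imports models. A hedged illustration of the same pattern in a standalone script (the settings module name below is a placeholder, not this repo's):

import os

import django

# Placeholder settings path: substitute the project's real settings module.
os.environ.setdefault("DJANGO_SETTINGS_MODULE", "myproject.settings")
django.setup()  # loads installed apps so model imports don't raise AppRegistryNotReady

from django.apps import apps  # noqa: E402  (import must follow setup())

print(f"{len(apps.get_models())} models registered")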
72 changes: 36 additions & 36 deletions requirements/requirements-app.txt
@@ -1,36 +1,36 @@
-asyncpg==0.21.*
-attrs==20.*
-boto3==1.16.*
-certifi==2020.12.5
-dataclasses-json==0.5.*
-ddtrace==0.46.0
-dj-database-url==0.5.0
-django-cors-headers==3.11.*
-django-debug-toolbar==3.4.*
-django-extensions==3.1.*
-django-spaghetti-and-meatballs==0.4.0
-Django==3.2.*
-django_cte==1.2.*
-djangorestframework==3.13.*
-docutils==0.15.2
-drf-api-tracking==1.8.0
-drf-extensions==0.7.*
-elasticsearch-dsl==7.1.0
-elasticsearch==7.1.0
-et-xmlfile==1.0.1
-filelock==3.0.12
-fiscalyear==0.2.0
-Markdown<3.0
-marshmallow==3.14.1
-numpy==1.20.*
-openpyxl==2.4.7
-pandas==1.3.*
-psutil==5.6.*
-psycopg2-binary==2.8.*
-py-gfm==0.1.4
-pydantic[dotenv]==1.9.0
-python-json-logger==0.1.9
-requests==2.25.*
-retrying==1.3.3
-urllib3==1.26.*
-xlrd3==1.0.0
+asyncpg==0.29.0
+attrs==23.2.0
+boto3==1.34.58
+certifi==2024.2.2
+dataclasses-json==0.6.4
+ddtrace==0.47.0
+dj-database-url==2.1.0
+django-cors-headers==4.3.1
+django-debug-toolbar==4.3.0
+django-extensions==3.2.3
+django-spaghetti-and-meatballs==0.4.2
+Django==4.1.13
+django_cte==1.3.2
+djangorestframework==3.14.0
+docutils==0.20.1
+drf-api-tracking==1.8.4
+drf-extensions==0.7.1
+elasticsearch-dsl==7.4.1
+elasticsearch==7.10.*
+et-xmlfile==1.1.0
+filelock==3.13.1
+fiscalyear==0.4.0
+Markdown==3.5.2
+marshmallow==3.21.1
+numpy==1.26.4
+openpyxl==3.1.2
+pandas==2.2.1
+psutil==5.9.8
+psycopg2==2.9.9
+py-gfm==2.0.0
+pydantic[dotenv]==1.9.*
+python-json-logger==2.0.7
+requests==2.31.0
+retrying==1.3.4
+urllib3==1.26.18
+xlrd3==1.1.0
24 changes: 12 additions & 12 deletions requirements/requirements-dev.txt
@@ -1,16 +1,16 @@
-black==20.8b1
-click==8.0.4
-docker==5.0.3
+black==24.2.0
+click==8.1.7
+docker==7.0.0
 dredd-hooks==0.2.0
-flake8==3.8.4
-importlib-metadata<5 # need to pin <5 while flake8 is on 3.8.4
+flake8==7.0.0
+importlib-metadata==7.0.2
 mock==5.1.*
-model-bakery==1.13.*
-pip==23.2.*
-pre-commit==1.20.0
-pyspark==3.2.1
-pytest==7.4.*
+model-bakery==1.17.*
+pip==23.0.1
+pre-commit==3.6.2
+pyspark==3.5.1
+pytest==8.0.*
 pytest-cov==4.1.*
-pytest-django==4.2.* # Going > 4.2 yields multi-db errors
+pytest-django==4.8.*
 pytest-pretty==1.2.*
-pytest-xdist==3.3.*
+pytest-xdist==3.5.*
4 changes: 3 additions & 1 deletion setup.py
@@ -4,6 +4,7 @@
 - Details of setuptools.setup parameters:
 - https://packaging.python.org/en/latest/guides/distributing-packages-using-setuptools/#setup-args
 """
+
 import pathlib

 from setuptools import find_packages, setup
@@ -38,7 +39,7 @@
     ),
     long_description=(_PROJECT_ROOT_DIR / "README.md").read_text(encoding="utf-8"),
     long_description_content_type="text/markdown",
-    python_requires="==3.8.*",
+    python_requires="==3.10.*",
     license=(_PROJECT_ROOT_DIR / "LICENSE").read_text(encoding="utf-8"),
     packages=find_packages(),
     include_package_data=True,  # see MANIFEST.in for what is included
@@ -49,6 +50,7 @@
         "Programming Language :: Python",
         "Programming Language :: Python :: 3",
         "Programming Language :: Python :: 3.8",
+        "Programming Language :: Python :: 3.10",
         "Programming Language :: Python :: 3 :: Only",
     ],
 )
(file name not captured)
@@ -49,7 +49,7 @@ class Meta:

 class AppropriationAccountBalancesManager(models.Manager):
     def get_queryset(self):
-        """ Get only records from the last submission per TAS per fiscal year. """
+        """Get only records from the last submission per TAS per fiscal year."""
         return super(AppropriationAccountBalancesManager, self).get_queryset().filter(final_of_fy=True)
(file name not captured)
@@ -88,7 +88,7 @@ def model_instances():

 @pytest.mark.django_db
 def test_list_budget_functions_unique(model_instances, client):
-    """ Ensure the list_budget_functions endpoint returns unique values """
+    """Ensure the list_budget_functions endpoint returns unique values"""
     response = client.get("/api/v2/budget_functions/list_budget_functions/")

     assert response.status_code == status.HTTP_200_OK
@@ -98,7 +98,7 @@

 @pytest.mark.django_db
 def test_list_budget_subfunctions_unique(model_instances, client):
-    """ Ensure the list_budget_subfunctions endpoint returns unique values """
+    """Ensure the list_budget_subfunctions endpoint returns unique values"""
     response = client.post(
         "/api/v2/budget_functions/list_budget_subfunctions/", content_type="application/json", data=json.dumps({})
     )
@@ -110,7 +110,7 @@

 @pytest.mark.django_db
 def test_list_budget_subfunctions_filter(model_instances, client):
-    """ Ensure the list_budget_subfunctions endpoint filters by the budget_function_code """
+    """Ensure the list_budget_subfunctions endpoint filters by the budget_function_code"""
     response = client.post(
         "/api/v2/budget_functions/list_budget_subfunctions/",
         content_type="application/json",
(file name not captured)
@@ -135,7 +135,7 @@ def test_federal_account_spending_by_category(client, financial_spending_data):
     assert "results" in resp.json()
     results = resp.json()["results"]
     assert len(results)
-    for (k, v) in results.items():
+    for k, v in results.items():
         assert isinstance(k, str)
         assert hasattr(v, "__pow__")  # is a number
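These last file diffs are mechanical reformats from the black 24.2.0 upgrade rather than behavior changes. A small illustrative sketch (not code from this repo) of the two patterns the newer black enforces:

# 1. Docstring normalization: padding spaces inside docstrings are stripped.
def before():
    """ Old style tolerated by black 20.8b1. """


def after():
    """New style as black 24.x rewrites it."""


# 2. Redundant parentheses around tuple-unpacking targets are removed.
pairs = {"a": 1, "b": 2}
for (k, v) in pairs.items():  # before
    pass
for k, v in pairs.items():  # after
    pass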