[WIP] move to poetry (#831)
* [WIP] move to poetry

* [WIP] verbose in CI

* Fix pandas bug

* Bypass faulty type checking for pymysql connection

* Update dependencies

* Changes to test against CodeBuild unit tests and distribution build

* Remove editable install for docs

* Fix typing

Co-authored-by: Abdel Jaidi <jaidisido@gmail.com>
maxispeicher and jaidisido committed Aug 10, 2021
1 parent 32065e2 commit 1f4e5ef
Showing 24 changed files with 4,352 additions and 169 deletions.
4 changes: 4 additions & 0 deletions .bumpversion.cfg
@@ -4,6 +4,10 @@ commit = False
tag = False
tag_name = {new_version}

+[bumpversion:file:pyproject.toml]
+
+[bumpversion:file:test_infra/pyproject.toml]
+
[bumpversion:file:README.md]

[bumpversion:file:CONTRIBUTING_COMMON_ERRORS.md]
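The two new `[bumpversion:file:...]` entries point bump2version at the version strings that now live in Poetry metadata. The pyproject.toml added by this PR is not shown in this excerpt; a rough sketch of the block that bumpversion would patch (name aside, the version, pins and extras here are illustrative, not the actual file):

```toml
# Hypothetical excerpt of pyproject.toml – values are placeholders
[tool.poetry]
name = "awswrangler"
version = "2.10.0"          # the string [bumpversion:file:pyproject.toml] rewrites
description = "Pandas on AWS."

[tool.poetry.dependencies]
python = ">=3.6.2, <3.10"
pyodbc = { version = "~4.0.30", optional = true }

[tool.poetry.extras]
sqlserver = ["pyodbc"]
```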
4 changes: 4 additions & 0 deletions .flake8
@@ -0,0 +1,4 @@
+[flake8]
+max-line-length = 120
+extend-ignore = E203, W503
+exclude = .git,__pycache__,docs/source/conf.py,old,build,dist,.venv,.venv2,.tox,dev,.env,.coverage
4 changes: 3 additions & 1 deletion .github/workflows/minimal-tests.yml
@@ -29,7 +29,9 @@ jobs:
- name: Install Requirements
run: |
python -m pip install --upgrade pip
-pip install -U -r requirements-dev.txt
+python -m pip install poetry==1.1.7
+poetry config virtualenvs.create false --local
+poetry install --extras "sqlserver" -vvv
- name: Test Metadata
run: pytest tests/test_metadata.py
- name: Test Session
4 changes: 3 additions & 1 deletion .github/workflows/static-checking.yml
@@ -27,7 +27,9 @@ jobs:
- name: Install Requirements
run: |
python -m pip install --upgrade pip
-pip install -U -r requirements-dev.txt
+python -m pip install poetry==1.1.7
+poetry config virtualenvs.create false --local
+poetry install --extras "sqlserver" -vvv
- name: mypy check
run: mypy --install-types --non-interactive awswrangler
- name: Flake8 Lint
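In both workflows, `poetry config virtualenvs.create false --local` tells Poetry to install straight into the runner's interpreter instead of creating its own virtualenv. The `--local` flag writes that setting to a repository-level poetry.toml rather than the global config, which is why poetry.toml gets git-ignored in the next file. The generated file is roughly:

```toml
# poetry.toml, as written by `poetry config virtualenvs.create false --local`
[virtualenvs]
create = false
```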
4 changes: 4 additions & 0 deletions .gitignore
@@ -26,6 +26,10 @@ share/python-wheels/
*.egg
MANIFEST

+# poetry
+poetry.toml
+envs.toml
+
# PyInstaller
# Usually these files are written by a python script from a template
# before PyInstaller builds the exe, so as to inject date/other infos into it.
18 changes: 9 additions & 9 deletions CONTRIBUTING.md
@@ -92,7 +92,7 @@ You can choose from three different environments to test your fixes/changes, bas
### Mocked test environment

* Pick up a Linux or MacOS.
-* Install Python 3.7, 3.8 or 3.9
+* Install Python 3.7, 3.8 or 3.9 with [poetry](https://github.com/python-poetry/poetry) for package management
* Fork the AWS Data Wrangler repository and clone that into your development environment
* Go to the project's directory create a Python's virtual environment for the project

@@ -104,7 +104,7 @@ or

* Install dependencies:

-``pip install -r requirements-dev.txt``
+``poetry install --extras "sqlserver"``

* Run the validation script:

@@ -123,7 +123,7 @@ or
**DISCLAIMER**: Make sure you know what you are doing. These steps will charge some services on your AWS account and require a minimum security skill to keep your environment safe.

* Pick up a Linux or MacOS.
-* Install Python 3.7, 3.8 or 3.9
+* Install Python 3.7, 3.8 or 3.9 with [poetry](https://github.com/python-poetry/poetry) for package management
* Fork the AWS Data Wrangler repository and clone that into your development environment
* Go to the project's directory create a Python's virtual environment for the project

@@ -135,15 +135,15 @@ or

* Install dependencies:

-``pip install -r requirements-dev.txt``
+``poetry install --extras "sqlserver"``

* Go to the ``test_infra`` directory

``cd test_infra``

* Install CDK dependencies:

-``pip install -r requirements.txt``
+``poetry install``

* [OPTIONAL] Set AWS_DEFAULT_REGION to define the region the Data Lake Test environment will deploy into. You may want to choose a region which you don't currently use:

@@ -184,25 +184,25 @@ or
**DISCLAIMER**: This environment contains Aurora MySQL, Aurora PostgreSQL and Redshift (single-node) clusters which will incur cost while running.

* Pick up a Linux or MacOS.
-* Install Python 3.7, 3.8 or 3.9
+* Install Python 3.7, 3.8 or 3.9 with [poetry](https://github.com/python-poetry/poetry) for package management
* Fork the AWS Data Wrangler repository and clone that into your development environment
* Go to the project's directory create a Python's virtual environment for the project

`python -m venv .venv && source .venv/bin/activate`

* Then run the command bellow to install all dependencies:

-``pip install -r requirements-dev.txt``
+``poetry install --extras "sqlserver"``

* Go to the ``test_infra`` directory

``cd test_infra``

* Install CDK dependencies:

-``pip install -r requirements.txt``
+``poetry install``

-* [OPTIONAL] Set AWS_DEFAULT_REGION to define the region the Full Test envrioment will deploy into. You may want to choose a region which you don't currently use:
+* [OPTIONAL] Set AWS_DEFAULT_REGION to define the region the Full Test environment will deploy into. You may want to choose a region which you don't currently use:

``export AWS_DEFAULT_REGION=ap-northeast-1``

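Putting the updated steps together, a local setup for the mocked test environment might look like the sketch below (the fork URL is a placeholder, and the Poetry version pin simply mirrors CI; the docs themselves don't pin it):

```bash
# Sketch of the mocked-test setup described above
git clone https://github.com/<your-user>/aws-data-wrangler.git
cd aws-data-wrangler
python3 -m venv .venv && source .venv/bin/activate

pip install --upgrade pip
pip install poetry==1.1.7                  # same version CI pins; any 1.1.x should work
poetry install --extras "sqlserver"        # installs awswrangler plus dev dependencies into the active venv

# the CDK test infrastructure has its own pyproject.toml
cd test_infra && poetry install
```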
15 changes: 0 additions & 15 deletions MANIFEST.in

This file was deleted.

2 changes: 1 addition & 1 deletion awswrangler/_utils.py
@@ -227,7 +227,7 @@ def chunkify(lst: List[Any], num_chunks: int = 1, max_length: Optional[int] = No
if not lst:
return []
n: int = num_chunks if max_length is None else int(math.ceil((float(len(lst)) / float(max_length))))
-np_chunks = np.array_split(lst, n) # type: ignore
+np_chunks = np.array_split(lst, n)
return [arr.tolist() for arr in np_chunks if len(arr) > 0]


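For context, the helper whose `# type: ignore` is dropped here just delegates to `np.array_split`. A standalone sketch of the behavior, copied from the hunk above with an illustrative call:

```python
import math
from typing import Any, List, Optional

import numpy as np


def chunkify(lst: List[Any], num_chunks: int = 1, max_length: Optional[int] = None) -> List[List[Any]]:
    """Split lst into num_chunks chunks, or into ceil(len(lst) / max_length) chunks when max_length is set."""
    if not lst:
        return []
    n: int = num_chunks if max_length is None else int(math.ceil(float(len(lst)) / float(max_length)))
    np_chunks = np.array_split(lst, n)  # newer numpy stubs type this call, so the ignore is no longer needed
    return [arr.tolist() for arr in np_chunks if len(arr) > 0]


print(chunkify(list(range(7)), max_length=3))  # [[0, 1, 2], [3, 4], [5, 6]]
```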
5 changes: 2 additions & 3 deletions building/build-wheel.sh
@@ -2,6 +2,5 @@
set -ex

pushd ..
rm -rf *.egg-info build dist/*.whl
python setup.py bdist_wheel
rm -rf *.egg-info build
rm -rf dist/*.whl
poetry build -f wheel
14 changes: 5 additions & 9 deletions building/lambda/Dockerfile
@@ -12,17 +12,13 @@ RUN yum install -y \
ninja-build \
${py_dev}

-RUN pip3 install --upgrade pip six cython cmake hypothesis
+RUN pip3 install --upgrade pip six cython cmake hypothesis poetry==1.1.7

-ADD requirements.txt /root/
-RUN pip3 install -r /root/requirements.txt
+WORKDIR /root

-ADD requirements-dev.txt /root/
-# Removing "-e ." installation
-RUN head -n -3 /root/requirements-dev.txt > /root/temp.txt
-RUN mv /root/temp.txt /root/requirements-dev.txt
-RUN pip3 install -r /root/requirements-dev.txt
+COPY pyproject.toml poetry.lock ./
+RUN poetry config virtualenvs.create false --local && poetry install --no-root

-RUN rm -rf /root/requirements*
+RUN rm -f pyproject.toml poetry.lock

ENTRYPOINT ["/bin/sh"]
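Note the `--no-root` flag on the new install line: it installs only the locked dependencies, not the awswrangler package itself, which keeps the layer-building image independent of the library code. For comparison (both commands assume a checkout with pyproject.toml and poetry.lock present):

```bash
# What the Dockerfile does: dependencies from poetry.lock only, no awswrangler itself
poetry install --no-root

# What CONTRIBUTING.md describes for development: the package plus dev dependencies
poetry install --extras "sqlserver"
```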
6 changes: 3 additions & 3 deletions building/lambda/build-docker-images.sh
@@ -1,8 +1,8 @@
#!/usr/bin/env bash
set -ex

-cp ../../requirements.txt .
-cp ../../requirements-dev.txt .
+cp ../../pyproject.toml .
+cp ../../poetry.lock .

# Python 3.6
docker build \
@@ -28,4 +28,4 @@ docker build \
--build-arg py_dev=python38-devel \
.

-rm -rf requirements*
+rm -rf pyproject.toml poetry.lock
9 changes: 3 additions & 6 deletions building/publish.sh
@@ -2,9 +2,6 @@
set -ex

pushd ..
-rm -fr build dist .egg awswrangler.egg-info
-python3.6 setup.py bdist_egg
-python3.6 setup.py bdist_wheel
-python3.6 setup.py sdist
-twine upload dist/*
-rm -fr build dist .egg awswrangler.egg-info
+rm -fr dist
+poetry publish --build
+rm -fr dist
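`poetry publish --build` replaces the whole setup.py/twine pipeline: it builds the sdist and wheel into dist/ and uploads them in a single step. A hedged sketch of how it might be exercised locally (the token setup is an assumption, not part of this PR):

```bash
# One-time credentials; the env var name is illustrative
poetry config pypi-token.pypi "$PYPI_API_TOKEN"

# Build sdist + wheel into dist/ and simulate the upload without touching PyPI
poetry publish --build --dry-run
```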
2 changes: 1 addition & 1 deletion docs/environment.yml
@@ -11,4 +11,4 @@ dependencies:
- sphinx==4.0.3
- sphinx_bootstrap_theme
- IPython
-- -e ..
+- ..
