
Commit

Merge de54c2d into 15b8b85
Kirill888 committed Dec 23, 2019
2 parents 15b8b85 + de54c2d commit 027f91f
Showing 29 changed files with 1,184 additions and 301 deletions.
65 changes: 65 additions & 0 deletions .github/workflows/docker-test-runner.yml
Original file line number Diff line number Diff line change
@@ -0,0 +1,65 @@
name: Docker (test runner)

on:
  push:
    paths:
      - 'docker/**'
      - '.github/workflows/docker-test-runner.yml'
      - 'setup.py'

env:
  ORG: opendatacube
  IMAGE: datacube-tests
  BUILDER_TAG: _build_cache
  DOCKER_USER: gadockersvc


jobs:
  build:
    runs-on: ubuntu-latest

    steps:
      - uses: actions/checkout@v1
      - name: Set up Python
        uses: actions/setup-python@v1
        with:
          python-version: '3.6'

      # This is just to get the dependencies right; we do not keep datacube in the final image
      - name: Build datacube source distribution
        run: |
          mkdir -p ./docker/dist/
          find ./docker/dist/ -type f -delete
          python setup.py sdist --dist-dir ./docker/dist/
          ls -lh ./docker/dist/
      - name: Pull docker cache
        run: |
          docker pull ${ORG}/${IMAGE}:latest || true
          docker pull ${ORG}/${IMAGE}:${BUILDER_TAG} || true
      - name: Build Test Runner Docker
        run: |
          # Build and cache the first stage (env_builder)
          docker build \
            --target env_builder \
            --cache-from ${ORG}/${IMAGE}:${BUILDER_TAG} \
            --tag ${ORG}/${IMAGE}:${BUILDER_TAG} \
            ./docker/
          # Now build the second stage, making sure the first stage comes from cache
          docker build \
            --cache-from ${ORG}/${IMAGE}:${BUILDER_TAG} \
            --cache-from ${ORG}/${IMAGE}:latest \
            --tag ${ORG}/${IMAGE}:latest \
            ./docker/
      - name: DockerHub Push
        if: |
          github.ref == 'refs/heads/master' ||
          github.ref == 'refs/heads/kk-gh-actions'
        run: |
          echo "Login to DockerHub as ${DOCKER_USER}"
          echo "${{ secrets.DockerPassword }}" | docker login -u "${DOCKER_USER}" --password-stdin
          docker push ${ORG}/${IMAGE}:${BUILDER_TAG}
          docker push ${ORG}/${IMAGE}:latest
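The `docker pull … || true` lines in the cache step rely on a subtlety: CI run steps execute with fail-fast shell options, so pulling a cache image that does not exist yet (as on the very first run) would otherwise abort the whole job. A minimal docker-free sketch of the idiom:

```shell
#!/bin/sh
set -e   # abort on any failing command, as CI run steps typically do

status="not reached"
# "false" stands in for a "docker pull" of a cache image that does not
# exist yet; "|| true" turns the failure into a success so the build can
# proceed and simply miss the cache on the first run.
false || true
status="survived missing cache"
echo "$status"
```

On the first run both pulls fail harmlessly and the build starts cold; every later run reuses the `_build_cache` stage via `--cache-from`.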
94 changes: 94 additions & 0 deletions .github/workflows/main.yml
@@ -0,0 +1,94 @@
name: build

on:
  pull_request:
    paths:
      - '**'

  push:
    paths:
      - '**'
      - '!.github/**'
      - '.github/workflows/main.yml'
      - '!docker/**'
      - '!examples/**'
      - '!docs/**'
      - '!contrib/**'

env:
  DKR: opendatacube/datacube-tests:latest
jobs:
  main:
    runs-on: ubuntu-latest

    steps:
      - uses: actions/checkout@v1
        with:
          fetch-depth: 0

      - name: Pull Docker
        run: |
          docker pull ${DKR}
      - name: Check Code Style
        run: |
          docker run --rm \
            -v $(pwd):/src/datacube-core \
            -e SKIP_DB=yes \
            ${DKR} \
            pycodestyle tests integration_tests examples utils --max-line-length 120
      - name: Lint Code
        run: |
          docker run --rm \
            -v $(pwd):/src/datacube-core \
            -e SKIP_DB=yes \
            ${DKR} \
            pylint -j 2 --reports no datacube datacube_apps
      - name: Run Tests
        run: |
          docker run --rm \
            -v $(pwd):/src/datacube-core \
            ${DKR} \
            pytest -r a \
              --cov datacube \
              --cov-report=xml \
              --doctest-ignore-import-errors \
              --durations=5 \
              datacube \
              tests \
              datacube_apps \
              integration_tests
      - name: Upload coverage to Codecov
        uses: codecov/codecov-action@v1
        with:
          token: ${{ secrets.CodeCovToken }}
          file: ./coverage.xml
          fail_ci_if_error: true

      - name: Build Packages
        run: |
          docker run --rm \
            -v $(pwd):/src/datacube-core \
            -e SKIP_DB=yes \
            ${DKR} \
            python setup.py bdist_wheel sdist
          ls -lh ./dist/
      - name: Publish to dea packages repo
        if: |
          github.ref == 'refs/heads/develop'
          && github.event_name == 'push'
        run: |
          echo "Using Keys: ...${AWS_ACCESS_KEY_ID:(-4)}/...${AWS_SECRET_ACCESS_KEY:(-4)}"
          aws s3 ls ${S3_DST}
          aws s3 cp ./dist/datacube-*whl "${S3_DST}/"
          aws s3 cp ./dist/datacube-*tar.gz "${S3_DST}/"
        env:
          AWS_ACCESS_KEY_ID: ${{ secrets.AWS_ACCESS_KEY_ID }}
          AWS_SECRET_ACCESS_KEY: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
          S3_DST: 's3://datacube-core-deployment/datacube/'
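The publish step's `echo "Using Keys: ...${AWS_ACCESS_KEY_ID:(-4)}..."` uses bash's negative-offset substring expansion to log only the last four characters of each secret, confirming which key is in use without leaking it. A small sketch of that expansion with a dummy key (the value below is illustrative, not a real credential):

```shell
#!/bin/bash
# Dummy value for illustration only -- not a real credential.
AWS_ACCESS_KEY_ID="AKIAIOSFODNN7EXAMPLE"

# ${VAR:(-4)} is bash substring expansion with a negative offset: it
# yields the last four characters. The parentheses (or a space, as in
# ${VAR: -4}) stop bash from parsing it as the ${VAR:-word} default form.
masked="...${AWS_ACCESS_KEY_ID:(-4)}"
echo "$masked"   # prints ...MPLE
```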
1 change: 0 additions & 1 deletion .travis.yml
@@ -12,7 +12,6 @@ env:

language: python
python:
- "3.5"
- "3.6"

cache:
71 changes: 0 additions & 71 deletions Dockerfile

This file was deleted.

69 changes: 8 additions & 61 deletions README.rst
@@ -82,72 +82,19 @@ Developer setup
- Otherwise copy ``integration_tests/agdcintegration.conf`` to
``~/.datacube_integration.conf`` and edit to customise.

Docker
======

Docker for the Open Data Cube is in the early stages of development,
and more documentation and examples of how to use it will be forthcoming
soon. For now, you can build and run this Docker image from
this repository as documented below.
Alternatively, one can use the ``opendatacube/datacube-tests`` Docker image to
run tests. This image includes a database server pre-configured for running
integration tests. Add the ``--with-docker`` command line option as the first
argument to the ``./check-code.sh`` script.

Example Usage
~~~~~~~~~~~~~
A number of environment variables can be used to configure the Open Data
Cube. Some of these are built into the application itself, while others are
specific to Docker and are used to create a configuration file when the
container is launched.

You can build the image with a command like this:

``docker build --tag opendatacube:local .``

And it can then be run with this command:

``docker run --rm opendatacube:local``

If you don't need to build (and you shouldn't), you can run it from
a pre-built image with:

``docker run --rm opendatacube/datacube-core``

An example of starting a container with environment variables is as follows:

.. code-block:: bash

    docker run \
        --rm \
        -e DATACUBE_CONFIG_PATH=/opt/custom-config.conf \
        -e DB_DATABASE=mycube \
        -e DB_HOSTNAME=localhost \
        -e DB_USERNAME=postgres \
        -e DB_PASSWORD=secretpassword \
        -e DB_PORT=5432 \
        opendatacube/datacube-core

Additionally, you can run an Open Data Cube Docker container along with
Postgres using the Docker Compose file. For example, you can run
``docker-compose up`` and it will start up the Postgres server and Open
Data Cube next to it. To run commands in ODC, you can use ``docker-compose
run odc datacube -v system init`` or ``docker-compose run odc datacube --version``.


Environment Variables
~~~~~~~~~~~~~~~~~~~~~
Most of the below environment variables should be self-explanatory, and none
are required (although it is recommended that you set them).
::

    - ``DATACUBE_CONFIG_PATH`` - the path for the config file
      for writing (also used by ODC for reading)
    - ``DB_DATABASE`` - the name of the postgres database
    - ``DB_HOSTNAME`` - the hostname of the postgres database
    - ``DB_USERNAME`` - the username of the postgres database
    - ``DB_PASSWORD`` - the password used for the postgres database
    - ``DB_PORT`` - the port that the postgres database is exposed on
./check-code.sh --with-docker integration_tests


.. |Build Status| image:: https://travis-ci.org/opendatacube/datacube-core.svg?branch=develop
:target: https://travis-ci.org/opendatacube/datacube-core
.. |Build Status| image:: https://github.com/opendatacube/datacube-core/workflows/build/badge.svg
:target: https://github.com/opendatacube/datacube-core/actions
.. |Coverage Status| image:: https://coveralls.io/repos/opendatacube/datacube-core/badge.svg?branch=develop&service=github
:target: https://coveralls.io/github/opendatacube/datacube-core?branch=develop
.. |Documentation Status| image:: https://readthedocs.org/projects/datacube-core/badge/?version=latest
8 changes: 8 additions & 0 deletions check-code.sh
@@ -4,6 +4,14 @@
set -eu
set -x

if [ "${1:-}" == "--with-docker" ]; then
    shift
    exec docker run \
         -v $(pwd):/src/datacube-core \
         opendatacube/datacube-tests:latest \
         ./check-code.sh $@
fi

pycodestyle tests integration_tests examples utils --max-line-length 120

pylint -j 2 --reports no datacube datacube_apps
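The new wrapper in ``check-code.sh`` leans on three shell details: ``${1:-}`` expands to an empty string when no argument is given (so the test is safe even under ``set -u``), ``shift`` drops the flag once it has been recognised, and ``$@`` forwards the remaining arguments to the re-invoked script inside the container. A docker-free sketch of the same dispatch pattern (the function name is illustrative):

```shell
#!/bin/sh
# Illustrative stand-in for check-code.sh's flag handling.
run_checks() {
    if [ "${1:-}" = "--with-docker" ]; then
        shift                    # drop the flag itself
        echo "in docker: $*"     # stands in for the "exec docker run" re-invocation
        return 0
    fi
    echo "locally: $*"
}

run_checks --with-docker integration_tests   # prints: in docker: integration_tests
run_checks tests integration_tests           # prints: locally: tests integration_tests
```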
3 changes: 3 additions & 0 deletions datacube/__main__.py
@@ -0,0 +1,3 @@
if __name__ == "__main__":
    from .config import auto_config
    auto_config()
