Porting recent develop (1.8.x) changes forward into develop-1.9 branch. (#1459)

* GitHub actions fixes (#1433)

* GitHub action fixes, backported from develop-1.9

* Updates to whats_new.rst

* Capitalise "Dependabot" in whats_new.rst

* Remove duplicate GHA Workflow for docs

We used to deploy docs previews to Netlify, but that's now done by
Read the Docs. The old workflow should be removed.

* [pre-commit.ci] pre-commit autoupdate

updates:
- [github.com/adrienverge/yamllint.git: v1.30.0 → v1.31.0](https://github.com/adrienverge/yamllint.git/compare/v1.30.0...v1.31.0)

* Display error message instead of help message if a required argument isn't provided

* update whats_new

* print error message as well as usage info
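
For context, the datacube CLI is built on click, and the behaviour described here mirrors click's standard handling of a missing required option — a minimal sketch with hypothetical command and option names::

    import click

    @click.command()
    @click.option("--product", required=True, help="Product to operate on.")
    def cli(product):
        """Toy stand-in for a datacube subcommand."""
        click.echo(f"operating on {product}")

    if __name__ == "__main__":
        # Run without --product, this prints the usage line followed by
        # "Error: Missing option '--product'." rather than the full help text.
        cli()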

* add license hook to pre-commit

* update license template and instructions, and whats_new

* [pre-commit.ci] auto fixes from pre-commit.com hooks

for more information, see https://pre-commit.ci

* add > and < to lark grammar

* refine logic

* use timestamp 0 as lowest bound instead of hardcoded date

* update whats_new

* update doco

* update whats_new

* Allow open date range in dc load and find_datasets (#1443)

* support open ended date range in query init

* allow open ended time ranges in load() and find_datasets(), also simplify logic for cli

* update doco and whats_new

* get end of datetime.now() to avoid failing tests due to second mismatches

* Minor update to documentation

Even with open bounds, dates are still inclusive of the start and end dates. Minor update to wording to make this clearer
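
A sketch of the resulting query behaviour (the product name is hypothetical, and a configured index is assumed)::

    import datacube

    dc = datacube.Datacube()

    # All data from the start of 2000 onward -- the given bound stays inclusive.
    data = dc.load(product="ls8_example", time=("2000", None))

    # All datasets up to and including the end of 2001.
    datasets = dc.find_datasets(product="ls8_example", time=(None, "2001"))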

---------

Co-authored-by: Ariana Barzinpour <ariana.barzinpour@ga.gov.au>
Co-authored-by: Robbi Bishop-Taylor <Robbi.BishopTaylor@ga.gov.au>

* [pre-commit.ci] pre-commit autoupdate

updates:
- [github.com/adrienverge/yamllint.git: v1.31.0 → v1.32.0](https://github.com/adrienverge/yamllint.git/compare/v1.31.0...v1.32.0)

* add archive_less_mature option to add and update

* update whats_new

* add warning message in memory driver

* Pass X and Y Scale factors through to rasterio.warp.reproject. (#1450)

* Pass X and Y Scale factors through to rasterio.warp.reproject. Update whats_new

* Update PR number in whats_new.rst

* Remove unused import.

* Cleanup.

* Cleanup.

* Should probably just add it to the dictionary tbh.

* Respond to Kirill's comments.
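
rasterio forwards unrecognised keyword arguments to GDAL as warp options, which is what makes this change possible — a self-contained sketch, with grids chosen arbitrarily and coordinates borrowed from the ``load()`` docstring elsewhere in this diff::

    import numpy as np
    import rasterio.warp
    from affine import Affine
    from rasterio.crs import CRS

    src = np.random.rand(256, 256).astype("float32")
    dst = np.zeros((128, 128), dtype="float32")

    # Extra keyword arguments become GDAL warp options; XSCALE/YSCALE pin the
    # scale factors instead of letting GDAL derive them from the transforms.
    rasterio.warp.reproject(
        source=src,
        destination=dst,
        src_transform=Affine(30.0, 0.0, 1516200.0, 0.0, -30.0, -3867350.0),
        src_crs=CRS.from_epsg(3577),
        dst_transform=Affine(60.0, 0.0, 1516200.0, 0.0, -60.0, -3867350.0),
        dst_crs=CRS.from_epsg(3577),
        resampling=rasterio.warp.Resampling.average,
        XSCALE=1,
        YSCALE=1,
    )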

* remove lineage from docs

* move archive_less_mature to abstract and allow for postgres

* move find dupes logic into a separate function

* update whats_new

* allow for a bit of leniency in datetime comparison when searching for less mature, add test case
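
The leniency boils down to treating two timestamps as equal when they fall within a small buffer of each other — a simplified sketch (a later commit in this series settles on a 500 ms buffer)::

    from datetime import datetime, timedelta

    LENIENCY = timedelta(milliseconds=500)

    def same_acquisition_time(a: datetime, b: datetime) -> bool:
        """Match timestamps that differ only by sub-second rounding."""
        return abs(a - b) <= LENIENCY

    # Metadata rounded to the nearest second still matches:
    assert same_acquisition_time(datetime(2023, 1, 1, 0, 0, 0),
                                 datetime(2023, 1, 1, 0, 0, 0, 300_000))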

* update whats_new

* properly add new files

* fix failing tests

* refactor doc_to_ds without adding dataset logic

* Add missing PR's to whats_new.rst and prepare for 1.8.13 release. (#1453)

* Fix gha pypi publishing condition (#1454)

* fix gha pypi publishing condition

* update whats_new

---------

Co-authored-by: Ariana Barzinpour <ariana.barzinpour@ga.gov.au>

* update Ubuntu installation instructions

* update wordlist

* update readme

* update wordlist again

* add a bit more info on db env variables; other misc improvements

* update barebones metadata type requirements

* fix typos, update wordlist

* fix some wording

* update integration db names

* rename agdcintegration.conf

* Always use XSCALE=1,YSCALE=1 in warp. (#1457)

* Use XSCALE=1,YSCALE=1 in both warp code-paths.

* remove data preparation page, add links to indexing guide

* fix typo, del data preparation scripts page

* increase buffer to 500ms

* [pre-commit.ci] auto fixes from pre-commit.com hooks

for more information, see https://pre-commit.ci

---------

Co-authored-by: Damien Ayers <damien@omad.net>
Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
Co-authored-by: Ariana Barzinpour <ariana.barzinpour@ga.gov.au>
Co-authored-by: Ariana-B <40238244+Ariana-B@users.noreply.github.com>
Co-authored-by: Robbi Bishop-Taylor <Robbi.BishopTaylor@ga.gov.au>
6 people committed Jun 21, 2023
1 parent 2a7cca7 commit b198cc4
Showing 260 changed files with 2,885 additions and 1,479 deletions.
46 changes: 0 additions & 46 deletions .github/workflows/build-docs.yml

This file was deleted.

11 changes: 10 additions & 1 deletion .github/workflows/main.yml
@@ -27,6 +27,13 @@ jobs:
with:
fetch-depth: 0

- name: Config
id: cfg
run: |
if "${GITHUB_REF}" in "refs/tags/"*; then
echo "push_pypi=yes" >> $GITHUB_OUTPUT
fi
- uses: dorny/paths-filter@v2
id: changes
if: |
@@ -112,10 +119,11 @@ jobs:
ls -lh ./dist/
twine check ./dist/*
EOF
- name: Publish to PyPi
if: |
github.event_name == 'push'
&& github.ref == 'refs/heads/pypi/publish'
&& steps.cfg.outputs.push_pypi == 'yes'
run: |
if [ -n "${TWINE_PASSWORD}" ]; then
docker run --rm \
@@ -132,6 +140,7 @@
else
echo "Skipping upload as 'PyPiToken' is not set"
fi
env:
TWINE_PASSWORD: ${{ secrets.PyPiToken }}

17 changes: 16 additions & 1 deletion .pre-commit-config.yaml
@@ -1,6 +1,6 @@
repos:
- repo: https://github.com/adrienverge/yamllint.git
rev: v1.30.0
rev: v1.32.0
hooks:
- id: yamllint
- repo: https://github.com/pre-commit/pre-commit-hooks
@@ -22,3 +22,18 @@ repos:
rev: v3.0.0a5 # Use the sha / tag you want to point at
hooks:
- id: pylint
- repo: https://github.com/Lucas-C/pre-commit-hooks
rev: v1.5.1
hooks:
- id: forbid-crlf
- id: remove-crlf
- id: forbid-tabs
- id: remove-tabs
args: [--whitespaces-count, '2']
- id: insert-license
files: ./(.*).py$
args:
- --license-filepath
- license-template.txt
- --use-current-year
- --no-extra-eol
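
The inserted header matches the copyright blocks visible in the per-file changes below; with ``--use-current-year`` the hook keeps the closing year current. The template text itself presumably holds the uncommented lines (the hook adds the ``#`` comment prefix)::

    This file is part of the Open Data Cube, see https://opendatacube.org for more information

    Copyright (c) 2015-2023 ODC Contributors
    SPDX-License-Identifier: Apache-2.0
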
32 changes: 21 additions & 11 deletions README.rst
@@ -49,12 +49,12 @@ Developer setup

- ``git clone https://github.com/opendatacube/datacube-core.git``

2. Create a Python environment for using the ODC. We recommend `conda <https://docs.conda.io/en/latest/miniconda.html>`__ as the
2. Create a Python environment for using the ODC. We recommend `Mambaforge <https://mamba.readthedocs.io/en/latest/user_guide/mamba.html>`__ as the
easiest way to handle Python dependencies.

::

conda create -f conda-environment.yml
mamba env create -f conda-environment.yml
conda activate cubeenv

3. Install a develop version of datacube-core.
@@ -72,26 +72,34 @@ Developer setup
pre-commit install

5. Run unit tests + PyLint
``./check-code.sh``

(this script approximates what is run by Travis. You can
alternatively run ``pytest`` yourself). Some test dependencies may need to be installed, attempt to install these using:

Install test dependencies using:

``pip install --upgrade -e '.[test]'``

If install for these fails please lodge them as issues.
If install for these fails, please lodge them as issues.

Run unit tests with:

``./check-code.sh``

(this script approximates what is run by GitHub Actions. You can
alternatively run ``pytest`` yourself).

6. **(or)** Run all tests, including integration tests.

``./check-code.sh integration_tests``

- Assumes a password-less Postgres database running on localhost called

``agdcintegration``
``pgintegration``

- Otherwise copy ``integration_tests/agdcintegration.conf`` to
- Otherwise copy ``integration_tests/integration.conf`` to
``~/.datacube_integration.conf`` and edit to customise.

- For instructions on setting up a password-less Postgres database, see
the `developer setup instructions <https://datacube-core.readthedocs.io/en/latest/installation/setup/ubuntu.html#postgres-database-configuration>`__.


Alternatively one can use the ``opendatacube/datacube-tests`` docker image to run
tests. This docker includes database server pre-configured for running
@@ -103,11 +111,13 @@ to ``./check-code.sh`` script.
./check-code.sh --with-docker integration_tests


To run individual test in docker container
To run individual tests in a docker container

::

docker run -ti -v /home/ubuntu/datacube-core:/code opendatacube/datacube-tests:latest pytest integration_tests/test_filename.py::test_function_name
docker build --tag=opendatacube/datacube-tests-local --no-cache --progress plain -f docker/Dockerfile .

docker run -ti -v $(pwd):/code opendatacube/datacube-tests-local:latest pytest integration_tests/test_filename.py::test_function_name


Developer setup on Ubuntu
2 changes: 1 addition & 1 deletion datacube/__init__.py
@@ -1,6 +1,6 @@
# This file is part of the Open Data Cube, see https://opendatacube.org for more information
#
# Copyright (c) 2015-2020 ODC Contributors
# Copyright (c) 2015-2023 ODC Contributors
# SPDX-License-Identifier: Apache-2.0
"""
Datacube
2 changes: 1 addition & 1 deletion datacube/__main__.py
@@ -1,6 +1,6 @@
# This file is part of the Open Data Cube, see https://opendatacube.org for more information
#
# Copyright (c) 2015-2020 ODC Contributors
# Copyright (c) 2015-2023 ODC Contributors
# SPDX-License-Identifier: Apache-2.0
if __name__ == "__main__":
from .config import auto_config
2 changes: 1 addition & 1 deletion datacube/api/__init__.py
@@ -1,6 +1,6 @@
# This file is part of the Open Data Cube, see https://opendatacube.org for more information
#
# Copyright (c) 2015-2020 ODC Contributors
# Copyright (c) 2015-2023 ODC Contributors
# SPDX-License-Identifier: Apache-2.0
"""
Modules for the Storage and Access Query API
11 changes: 8 additions & 3 deletions datacube/api/core.py
@@ -1,6 +1,6 @@
# This file is part of the Open Data Cube, see https://opendatacube.org for more information
#
# Copyright (c) 2015-2021 ODC Contributors
# Copyright (c) 2015-2023 ODC Contributors
# SPDX-License-Identifier: Apache-2.0
import logging
import uuid
@@ -236,12 +236,17 @@ def load(self, product=None, measurements=None, output_crs=None, resolution=None
x=(1516200, 1541300), y=(-3867375, -3867350), crs='EPSG:3577'
The ``time`` dimension can be specified using a tuple of datetime objects or strings with
``YYYY-MM-DD hh:mm:ss`` format. Data will be loaded inclusive of the start and finish times. E.g::
The ``time`` dimension can be specified using a single or tuple of datetime objects or strings with
``YYYY-MM-DD hh:mm:ss`` format. Data will be loaded inclusive of the start and finish times.
A ``None`` value in the range indicates an open range, with the provided date serving as either the
upper or lower bound. E.g::
time=('2000-01-01', '2001-12-31')
time=('2000-01', '2001-12')
time=('2000', '2001')
time=('2000')
time=('2000', None) # all data from 2000 onward
time=(None, '2000') # all data up to and including 2000
For 3D datasets, where the product definition contains an ``extra_dimension`` specification,
these dimensions can be queried using that dimension's name. E.g.::
2 changes: 1 addition & 1 deletion datacube/api/grid_workflow.py
@@ -1,6 +1,6 @@
# This file is part of the Open Data Cube, see https://opendatacube.org for more information
#
# Copyright (c) 2015-2020 ODC Contributors
# Copyright (c) 2015-2023 ODC Contributors
# SPDX-License-Identifier: Apache-2.0
import logging
import xarray
10 changes: 7 additions & 3 deletions datacube/api/query.py
@@ -1,6 +1,6 @@
# This file is part of the Open Data Cube, see https://opendatacube.org for more information
#
# Copyright (c) 2015-2020 ODC Contributors
# Copyright (c) 2015-2023 ODC Contributors
# SPDX-License-Identifier: Apache-2.0
"""
Storage Query and Access API module
@@ -136,8 +136,8 @@ def __init__(self, index=None, product=None, geopolygon=None, like=None, **searc
if time_coord is not None:
self.search['time'] = _time_to_search_dims(
(pandas_to_datetime(time_coord.values[0]).to_pydatetime(),
pandas_to_datetime(time_coord.values[-1]).to_pydatetime()
+ datetime.timedelta(milliseconds=1)) # TODO: inclusive time searches
pandas_to_datetime(time_coord.values[-1]).to_pydatetime()
+ datetime.timedelta(milliseconds=1)) # TODO: inclusive time searches
)

@property
@@ -350,7 +350,11 @@ def _time_to_search_dims(time_range):
if hasattr(tr_end, 'isoformat'):
tr_end = tr_end.isoformat()

if tr_start is None:
tr_start = datetime.datetime.fromtimestamp(0)
start = _to_datetime(tr_start)
if tr_end is None:
tr_end = datetime.datetime.now().strftime("%Y-%m-%d")
end = _to_datetime(pandas.Period(tr_end)
.end_time
.to_pydatetime())
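
In effect the hunk above normalises open bounds like this — a simplified sketch of the new logic (the real code also funnels the start through ``_to_datetime``)::

    import datetime
    import pandas

    def resolve_open_time_range(tr_start, tr_end):
        """Fill in open bounds the way _time_to_search_dims now does (simplified)."""
        if tr_start is None:
            # Unix epoch as the lowest bound, instead of a hardcoded date.
            tr_start = datetime.datetime.fromtimestamp(0)
        if tr_end is None:
            tr_end = datetime.datetime.now().strftime("%Y-%m-%d")
        # pandas.Period(...).end_time expands e.g. "2001" to 2001-12-31 23:59:59.999...,
        # keeping range ends inclusive.
        end = pandas.Period(str(tr_end)).end_time.to_pydatetime()
        return tr_start, end

    print(resolve_open_time_range(None, "2001"))
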
2 changes: 1 addition & 1 deletion datacube/config.py
@@ -1,6 +1,6 @@
# This file is part of the Open Data Cube, see https://opendatacube.org for more information
#
# Copyright (c) 2015-2020 ODC Contributors
# Copyright (c) 2015-2023 ODC Contributors
# SPDX-License-Identifier: Apache-2.0
"""
User configuration.
2 changes: 1 addition & 1 deletion datacube/drivers/__init__.py
@@ -1,6 +1,6 @@
# This file is part of the Open Data Cube, see https://opendatacube.org for more information
#
# Copyright (c) 2015-2020 ODC Contributors
# Copyright (c) 2015-2023 ODC Contributors
# SPDX-License-Identifier: Apache-2.0
"""
This module implements a simple plugin manager for storage and index drivers.
2 changes: 1 addition & 1 deletion datacube/drivers/_tools.py
@@ -1,6 +1,6 @@
# This file is part of the Open Data Cube, see https://opendatacube.org for more information
#
# Copyright (c) 2015-2020 ODC Contributors
# Copyright (c) 2015-2023 ODC Contributors
# SPDX-License-Identifier: Apache-2.0
from threading import Lock
from typing import Any
2 changes: 1 addition & 1 deletion datacube/drivers/_types.py
@@ -1,6 +1,6 @@
# This file is part of the Open Data Cube, see https://opendatacube.org for more information
#
# Copyright (c) 2015-2020 ODC Contributors
# Copyright (c) 2015-2023 ODC Contributors
# SPDX-License-Identifier: Apache-2.0
""" Defines abstract types for IO drivers.
"""
2 changes: 1 addition & 1 deletion datacube/drivers/datasource.py
@@ -1,6 +1,6 @@
# This file is part of the Open Data Cube, see https://opendatacube.org for more information
#
# Copyright (c) 2015-2020 ODC Contributors
# Copyright (c) 2015-2023 ODC Contributors
# SPDX-License-Identifier: Apache-2.0
""" Defines abstract types for IO reader drivers.
"""
2 changes: 1 addition & 1 deletion datacube/drivers/driver_cache.py
@@ -1,6 +1,6 @@
# This file is part of the Open Data Cube, see https://opendatacube.org for more information
#
# Copyright (c) 2015-2020 ODC Contributors
# Copyright (c) 2015-2023 ODC Contributors
# SPDX-License-Identifier: Apache-2.0
import logging
from typing import Dict, Any, Tuple, Iterable
2 changes: 1 addition & 1 deletion datacube/drivers/indexes.py
@@ -1,6 +1,6 @@
# This file is part of the Open Data Cube, see https://opendatacube.org for more information
#
# Copyright (c) 2015-2020 ODC Contributors
# Copyright (c) 2015-2023 ODC Contributors
# SPDX-License-Identifier: Apache-2.0
from typing import List, Optional

2 changes: 1 addition & 1 deletion datacube/drivers/netcdf/__init__.py
@@ -1,6 +1,6 @@
# This file is part of the Open Data Cube, see https://opendatacube.org for more information
#
# Copyright (c) 2015-2020 ODC Contributors
# Copyright (c) 2015-2023 ODC Contributors
# SPDX-License-Identifier: Apache-2.0
from ._write import write_dataset_to_netcdf, create_netcdf_storage_unit
from . import writer as netcdf_writer
2 changes: 1 addition & 1 deletion datacube/drivers/netcdf/_write.py
@@ -1,6 +1,6 @@
# This file is part of the Open Data Cube, see https://opendatacube.org for more information
#
# Copyright (c) 2015-2020 ODC Contributors
# Copyright (c) 2015-2023 ODC Contributors
# SPDX-License-Identifier: Apache-2.0
from pathlib import Path
import logging
2 changes: 1 addition & 1 deletion datacube/drivers/netcdf/driver.py
@@ -1,6 +1,6 @@
# This file is part of the Open Data Cube, see https://opendatacube.org for more information
#
# Copyright (c) 2015-2020 ODC Contributors
# Copyright (c) 2015-2023 ODC Contributors
# SPDX-License-Identifier: Apache-2.0
from urllib.parse import urlsplit

2 changes: 1 addition & 1 deletion datacube/drivers/netcdf/writer.py
@@ -1,6 +1,6 @@
# This file is part of the Open Data Cube, see https://opendatacube.org for more information
#
# Copyright (c) 2015-2020 ODC Contributors
# Copyright (c) 2015-2023 ODC Contributors
# SPDX-License-Identifier: Apache-2.0
"""
Create netCDF4 Storage Units and write data to them
2 changes: 1 addition & 1 deletion datacube/drivers/postgis/__init__.py
@@ -1,6 +1,6 @@
# This file is part of the Open Data Cube, see https://opendatacube.org for more information
#
# Copyright (c) 2015-2020 ODC Contributors
# Copyright (c) 2015-2023 ODC Contributors
# SPDX-License-Identifier: Apache-2.0
"""
Lower-level database access.
2 changes: 1 addition & 1 deletion datacube/drivers/postgis/_api.py
@@ -1,6 +1,6 @@
# This file is part of the Open Data Cube, see https://opendatacube.org for more information
#
# Copyright (c) 2015-2020 ODC Contributors
# Copyright (c) 2015-2023 ODC Contributors
# SPDX-License-Identifier: Apache-2.0

# We often have one-arg-per column, so these checks aren't so useful.
2 changes: 1 addition & 1 deletion datacube/drivers/postgis/_connections.py
@@ -1,6 +1,6 @@
# This file is part of the Open Data Cube, see https://opendatacube.org for more information
#
# Copyright (c) 2015-2020 ODC Contributors
# Copyright (c) 2015-2023 ODC Contributors
# SPDX-License-Identifier: Apache-2.0

# We often have one-arg-per column, so these checks aren't so useful.
2 changes: 1 addition & 1 deletion datacube/drivers/postgis/_core.py
@@ -1,6 +1,6 @@
# This file is part of the Open Data Cube, see https://opendatacube.org for more information
#
# Copyright (c) 2015-2020 ODC Contributors
# Copyright (c) 2015-2023 ODC Contributors
# SPDX-License-Identifier: Apache-2.0
"""
Core SQL schema settings.
