Porting recent develop (1.8.x) changes forward into develop-1.9 branch. #1459

Merged
merged 58 commits on Jun 21, 2023
Commits
730007f
GitHub actions fixes (#1433)
SpacemanPaul Apr 6, 2023
ab1c06b
Remove duplicate GHA Workflow for docs
omad Mar 26, 2023
082933f
[pre-commit.ci] pre-commit autoupdate
pre-commit-ci[bot] Apr 24, 2023
04ffcd6
Display error message instead of help message if a required argument …
May 8, 2023
c435a91
update whats_new
May 8, 2023
8c259c1
print error message as well as usage info
May 8, 2023
84691de
add license hook to pre-commit
May 8, 2023
08c6d70
update license template and instructions, and whats_new
May 8, 2023
9074c89
[pre-commit.ci] auto fixes from pre-commit.com hooks
pre-commit-ci[bot] May 8, 2023
a94629c
add > and < to lark grammar
May 9, 2023
305b90f
refine logic
May 9, 2023
0e9c2b8
use timestamp 0 as lowest bound instead of hardcoded date
May 9, 2023
583901d
update whats_new
May 9, 2023
18c1f2c
update doco
May 10, 2023
df7f48f
update whats_new
May 10, 2023
cae9c04
Merge pull request #1437 from opendatacube/better_cli_errors
Ariana-B May 10, 2023
801c482
Merge branch 'develop' into pre_commit_license_hook
Ariana-B May 10, 2023
e001515
Merge pull request #1438 from opendatacube/pre_commit_license_hook
Ariana-B May 10, 2023
11e9dc6
Merge branch 'develop' into search_open_ranges
Ariana-B May 10, 2023
72ccd88
Merge pull request #1439 from opendatacube/search_open_ranges
Ariana-B May 10, 2023
3506cff
Allow open date range in dc load and find_datasets (#1443)
Ariana-B May 22, 2023
032265a
[pre-commit.ci] pre-commit autoupdate
pre-commit-ci[bot] May 22, 2023
00ce7fe
add archive_less_mature option to add and update
May 30, 2023
6a20bfb
update whats_new
May 30, 2023
0639c58
add warning message in memory driver
May 31, 2023
691e689
Pass X and Y Scale factors through to rasterio.warp.project. (#1450)
SpacemanPaul May 31, 2023
e572834
Merge branch 'develop' into cli_maturity_model
Ariana-B May 31, 2023
f95ae0e
remove lineage from docs
May 31, 2023
e964ed1
move archive_less_mature to abstract and allow for postgres
May 31, 2023
b34d375
Merge pull request #1451 from opendatacube/cli_maturity_model
Ariana-B May 31, 2023
6d4dee7
move find dupes logic into a separate function
May 31, 2023
e3fab2b
update whats_new
May 31, 2023
2807c44
allow for a bit of leniency in datetime comparison when searching for…
Jun 5, 2023
1e65f38
update whats_new
Jun 5, 2023
1b9b89a
properly add new files
Jun 6, 2023
11af4db
fix failing tests
Jun 6, 2023
c6f73fe
refactor doc_to_ds without adding dataset logic
Jun 6, 2023
ec00a7f
Merge pull request #1452 from opendatacube/cli_maturity_model_2
Ariana-B Jun 6, 2023
0dd8292
Add missing PR's to whats_new.rst and prepare for 1.8.13 release. (#1…
SpacemanPaul Jun 6, 2023
c8795e3
Fix gha pypi publishing condition (#1454)
Ariana-B Jun 7, 2023
e3acb39
update ubntu installation instructions
Jun 7, 2023
26ee1b9
update wordlist
Jun 7, 2023
e063f05
update readme
Jun 7, 2023
5a33ad7
update wordlist again
Jun 7, 2023
f8f29c0
add a bit more info on db env variables; other misc improvements
Jun 9, 2023
1a07383
update barebones metadata type requirements
Jun 13, 2023
4d7d587
fix typos, update wordlist
Jun 13, 2023
3a7c322
fix some wording
Jun 14, 2023
c60daae
update integration db names
Jun 14, 2023
075bcda
rename agdcintegration.conf
Jun 14, 2023
7c7f460
Always use XSCALE=1,YSCALE=1 in warp. (#1457)
SpacemanPaul Jun 19, 2023
e632f9a
remove data preparation page, add links to indexing guide
Jun 19, 2023
acac9c0
fix typo, del data preparation scripts page
Jun 20, 2023
d8ccecd
Merge pull request #1455 from opendatacube/doco_improvements
Ariana-B Jun 20, 2023
d7e20c9
increase buffer to 500ms
Jun 21, 2023
148e296
Merge pull request #1458 from opendatacube/increase_maturity_buffer
Ariana-B Jun 21, 2023
2d16eed
Merge branch 'develop' into develop-1.9-fwdport
SpacemanPaul Jun 21, 2023
61b08cf
[pre-commit.ci] auto fixes from pre-commit.com hooks
pre-commit-ci[bot] Jun 21, 2023
46 changes: 0 additions & 46 deletions .github/workflows/build-docs.yml

This file was deleted.

11 changes: 10 additions & 1 deletion .github/workflows/main.yml
@@ -27,6 +27,13 @@ jobs:
with:
fetch-depth: 0

- name: Config
id: cfg
run: |
if [[ "${GITHUB_REF}" == "refs/tags/"* ]]; then
echo "push_pypi=yes" >> $GITHUB_OUTPUT
fi

- uses: dorny/paths-filter@v2
id: changes
if: |
@@ -112,10 +119,11 @@ jobs:
ls -lh ./dist/
twine check ./dist/*
EOF

- name: Publish to PyPi
if: |
github.event_name == 'push'
&& github.ref == 'refs/heads/pypi/publish'
&& steps.cfg.outputs.push_pypi == 'yes'
run: |
if [ -n "${TWINE_PASSWORD}" ]; then
docker run --rm \
@@ -132,6 +140,7 @@
else
echo "Skipping upload as 'PyPiToken' is not set"
fi

env:
TWINE_PASSWORD: ${{ secrets.PyPiToken }}

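The corrected publish condition above gates the PyPI upload on the Config step's tag check rather than on a branch name. The predicate can be sketched in plain Python (the function name is illustrative, not part of the workflow):

```python
def should_push_pypi(github_ref: str) -> bool:
    """Mirror of the workflow's Config step: publish to PyPI only when the
    triggering ref is a tag (refs/tags/...), never for branch or PR pushes."""
    return github_ref.startswith("refs/tags/")
```

This is why pushes to ``develop`` no longer attempt a release: only tag refs satisfy the ``push_pypi == 'yes'`` output.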
17 changes: 16 additions & 1 deletion .pre-commit-config.yaml
@@ -1,6 +1,6 @@
repos:
- repo: https://github.com/adrienverge/yamllint.git
rev: v1.30.0
rev: v1.32.0
hooks:
- id: yamllint
- repo: https://github.com/pre-commit/pre-commit-hooks
@@ -22,3 +22,18 @@ repos:
rev: v3.0.0a5 # Use the sha / tag you want to point at
hooks:
- id: pylint
- repo: https://github.com/Lucas-C/pre-commit-hooks
rev: v1.5.1
hooks:
- id: forbid-crlf
- id: remove-crlf
- id: forbid-tabs
- id: remove-tabs
args: [--whitespaces-count, '2']
- id: insert-license
files: ./(.*).py$
args:
- --license-filepath
- license-template.txt
- --use-current-year
- --no-extra-eol
32 changes: 21 additions & 11 deletions README.rst
@@ -49,12 +49,12 @@ Developer setup

- ``git clone https://github.com/opendatacube/datacube-core.git``

2. Create a Python environment for using the ODC. We recommend `conda <https://docs.conda.io/en/latest/miniconda.html>`__ as the
2. Create a Python environment for using the ODC. We recommend `Mambaforge <https://mamba.readthedocs.io/en/latest/user_guide/mamba.html>`__ as the
easiest way to handle Python dependencies.

::

conda create -f conda-environment.yml
mamba env create -f conda-environment.yml
conda activate cubeenv

3. Install a develop version of datacube-core.
@@ -72,26 +72,34 @@ Developer setup
pre-commit install

5. Run unit tests + PyLint
``./check-code.sh``

(this script approximates what is run by Travis. You can
alternatively run ``pytest`` yourself). Some test dependencies may need to be installed, attempt to install these using:

Install test dependencies using:

``pip install --upgrade -e '.[test]'``

If install for these fails please lodge them as issues.
If install for these fails, please lodge them as issues.

Run unit tests with:

``./check-code.sh``

(this script approximates what is run by GitHub Actions. You can
alternatively run ``pytest`` yourself).

6. **(or)** Run all tests, including integration tests.

``./check-code.sh integration_tests``

- Assumes a password-less Postgres database running on localhost called

``agdcintegration``
``pgintegration``

- Otherwise copy ``integration_tests/agdcintegration.conf`` to
- Otherwise copy ``integration_tests/integration.conf`` to
``~/.datacube_integration.conf`` and edit to customise.

- For instructions on setting up a password-less Postgres database, see
the `developer setup instructions <https://datacube-core.readthedocs.io/en/latest/installation/setup/ubuntu.html#postgres-database-configuration>`__.


Alternatively one can use the ``opendatacube/datacube-tests`` docker image to run
tests. This docker includes database server pre-configured for running
@@ -103,11 +111,13 @@ to ``./check-code.sh`` script.
./check-code.sh --with-docker integration_tests


To run individual test in docker container
To run individual tests in a docker container

::

docker run -ti -v /home/ubuntu/datacube-core:/code opendatacube/datacube-tests:latest pytest integration_tests/test_filename.py::test_function_name
docker build --tag=opendatacube/datacube-tests-local --no-cache --progress plain -f docker/Dockerfile .

docker run -ti -v $(pwd):/code opendatacube/datacube-tests-local:latest pytest integration_tests/test_filename.py::test_function_name


Developer setup on Ubuntu
2 changes: 1 addition & 1 deletion datacube/__init__.py
@@ -1,6 +1,6 @@
# This file is part of the Open Data Cube, see https://opendatacube.org for more information
#
# Copyright (c) 2015-2020 ODC Contributors
# Copyright (c) 2015-2023 ODC Contributors
# SPDX-License-Identifier: Apache-2.0
"""
Datacube
2 changes: 1 addition & 1 deletion datacube/__main__.py
@@ -1,6 +1,6 @@
# This file is part of the Open Data Cube, see https://opendatacube.org for more information
#
# Copyright (c) 2015-2020 ODC Contributors
# Copyright (c) 2015-2023 ODC Contributors
# SPDX-License-Identifier: Apache-2.0
if __name__ == "__main__":
from .config import auto_config
2 changes: 1 addition & 1 deletion datacube/api/__init__.py
@@ -1,6 +1,6 @@
# This file is part of the Open Data Cube, see https://opendatacube.org for more information
#
# Copyright (c) 2015-2020 ODC Contributors
# Copyright (c) 2015-2023 ODC Contributors
# SPDX-License-Identifier: Apache-2.0
"""
Modules for the Storage and Access Query API
11 changes: 8 additions & 3 deletions datacube/api/core.py
@@ -1,6 +1,6 @@
# This file is part of the Open Data Cube, see https://opendatacube.org for more information
#
# Copyright (c) 2015-2021 ODC Contributors
# Copyright (c) 2015-2023 ODC Contributors
# SPDX-License-Identifier: Apache-2.0
import logging
import uuid
@@ -236,12 +236,17 @@ def load(self, product=None, measurements=None, output_crs=None, resolution=None

x=(1516200, 1541300), y=(-3867375, -3867350), crs='EPSG:3577'

The ``time`` dimension can be specified using a tuple of datetime objects or strings with
``YYYY-MM-DD hh:mm:ss`` format. Data will be loaded inclusive of the start and finish times. E.g::
The ``time`` dimension can be specified using a single or tuple of datetime objects or strings with
``YYYY-MM-DD hh:mm:ss`` format. Data will be loaded inclusive of the start and finish times.
A ``None`` value in the range indicates an open range, with the provided date serving as either the
upper or lower bound. E.g::

time=('2000-01-01', '2001-12-31')
time=('2000-01', '2001-12')
time=('2000', '2001')
time=('2000')
time=('2000', None) # all data from 2000 onward
time=(None, '2000') # all data up to and including 2000

For 3D datasets, where the product definition contains an ``extra_dimension`` specification,
these dimensions can be queried using that dimension's name. E.g.::
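The partial date strings documented above (``'2000'``, ``'2000-01'``) are expanded to whole periods before searching; in the real code ``pandas.Period`` does this work. The sketch below approximates that expansion with only the standard library (the helper name is hypothetical):

```python
import calendar
import datetime

def expand_partial_date(value: str, end: bool = False) -> datetime.datetime:
    """Expand a partial date ('2000', '2000-06', '2000-06-15') to the start
    or end of the period it names, roughly as pandas.Period would."""
    parts = [int(p) for p in value.split("-")]
    year = parts[0]
    month = parts[1] if len(parts) > 1 else (12 if end else 1)
    if len(parts) > 2:
        day = parts[2]
    else:
        # end of period -> last day of the month; start -> first day
        day = calendar.monthrange(year, month)[1] if end else 1
    if end:
        return datetime.datetime(year, month, day, 23, 59, 59)
    return datetime.datetime(year, month, day)
```

So ``time=('2000', '2001')`` covers the start of 2000 through the end of 2001 inclusive, which is why ``time=('2000', None)`` can mean "all data from 2000 onward".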
2 changes: 1 addition & 1 deletion datacube/api/grid_workflow.py
@@ -1,6 +1,6 @@
# This file is part of the Open Data Cube, see https://opendatacube.org for more information
#
# Copyright (c) 2015-2020 ODC Contributors
# Copyright (c) 2015-2023 ODC Contributors
# SPDX-License-Identifier: Apache-2.0
import logging
import xarray
10 changes: 7 additions & 3 deletions datacube/api/query.py
@@ -1,6 +1,6 @@
# This file is part of the Open Data Cube, see https://opendatacube.org for more information
#
# Copyright (c) 2015-2020 ODC Contributors
# Copyright (c) 2015-2023 ODC Contributors
# SPDX-License-Identifier: Apache-2.0
"""
Storage Query and Access API module
@@ -136,8 +136,8 @@ def __init__(self, index=None, product=None, geopolygon=None, like=None, **searc
if time_coord is not None:
self.search['time'] = _time_to_search_dims(
(pandas_to_datetime(time_coord.values[0]).to_pydatetime(),
pandas_to_datetime(time_coord.values[-1]).to_pydatetime()
+ datetime.timedelta(milliseconds=1)) # TODO: inclusive time searches
pandas_to_datetime(time_coord.values[-1]).to_pydatetime()
+ datetime.timedelta(milliseconds=1)) # TODO: inclusive time searches
)

@property
@@ -350,7 +350,11 @@ def _time_to_search_dims(time_range):
if hasattr(tr_end, 'isoformat'):
tr_end = tr_end.isoformat()

if tr_start is None:
tr_start = datetime.datetime.fromtimestamp(0)
start = _to_datetime(tr_start)
if tr_end is None:
tr_end = datetime.datetime.now().strftime("%Y-%m-%d")
end = _to_datetime(pandas.Period(tr_end)
.end_time
.to_pydatetime())
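The open-bound handling added to ``_time_to_search_dims`` above can be isolated as a small pure-Python sketch (the helper name is hypothetical; the real code then passes both bounds through ``_to_datetime`` and ``pandas.Period``):

```python
import datetime

def resolve_open_bounds(tr_start, tr_end):
    """A None lower bound falls back to the Unix epoch; a None upper bound
    falls back to today's date, mirroring the diff above."""
    if tr_start is None:
        tr_start = datetime.datetime.fromtimestamp(0)
    if tr_end is None:
        tr_end = datetime.datetime.now().strftime("%Y-%m-%d")
    return tr_start, tr_end
```

Using timestamp 0 as the lower bound (rather than a hardcoded date) is the change called out in the ``use timestamp 0 as lowest bound`` commit.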
2 changes: 1 addition & 1 deletion datacube/config.py
@@ -1,6 +1,6 @@
# This file is part of the Open Data Cube, see https://opendatacube.org for more information
#
# Copyright (c) 2015-2020 ODC Contributors
# Copyright (c) 2015-2023 ODC Contributors
# SPDX-License-Identifier: Apache-2.0
"""
User configuration.
2 changes: 1 addition & 1 deletion datacube/drivers/__init__.py
@@ -1,6 +1,6 @@
# This file is part of the Open Data Cube, see https://opendatacube.org for more information
#
# Copyright (c) 2015-2020 ODC Contributors
# Copyright (c) 2015-2023 ODC Contributors
# SPDX-License-Identifier: Apache-2.0
"""
This module implements a simple plugin manager for storage and index drivers.
2 changes: 1 addition & 1 deletion datacube/drivers/_tools.py
@@ -1,6 +1,6 @@
# This file is part of the Open Data Cube, see https://opendatacube.org for more information
#
# Copyright (c) 2015-2020 ODC Contributors
# Copyright (c) 2015-2023 ODC Contributors
# SPDX-License-Identifier: Apache-2.0
from threading import Lock
from typing import Any
2 changes: 1 addition & 1 deletion datacube/drivers/_types.py
@@ -1,6 +1,6 @@
# This file is part of the Open Data Cube, see https://opendatacube.org for more information
#
# Copyright (c) 2015-2020 ODC Contributors
# Copyright (c) 2015-2023 ODC Contributors
# SPDX-License-Identifier: Apache-2.0
""" Defines abstract types for IO drivers.
"""
2 changes: 1 addition & 1 deletion datacube/drivers/datasource.py
@@ -1,6 +1,6 @@
# This file is part of the Open Data Cube, see https://opendatacube.org for more information
#
# Copyright (c) 2015-2020 ODC Contributors
# Copyright (c) 2015-2023 ODC Contributors
# SPDX-License-Identifier: Apache-2.0
""" Defines abstract types for IO reader drivers.
"""
2 changes: 1 addition & 1 deletion datacube/drivers/driver_cache.py
@@ -1,6 +1,6 @@
# This file is part of the Open Data Cube, see https://opendatacube.org for more information
#
# Copyright (c) 2015-2020 ODC Contributors
# Copyright (c) 2015-2023 ODC Contributors
# SPDX-License-Identifier: Apache-2.0
import logging
from typing import Dict, Any, Tuple, Iterable
2 changes: 1 addition & 1 deletion datacube/drivers/indexes.py
@@ -1,6 +1,6 @@
# This file is part of the Open Data Cube, see https://opendatacube.org for more information
#
# Copyright (c) 2015-2020 ODC Contributors
# Copyright (c) 2015-2023 ODC Contributors
# SPDX-License-Identifier: Apache-2.0
from typing import List, Optional

2 changes: 1 addition & 1 deletion datacube/drivers/netcdf/__init__.py
@@ -1,6 +1,6 @@
# This file is part of the Open Data Cube, see https://opendatacube.org for more information
#
# Copyright (c) 2015-2020 ODC Contributors
# Copyright (c) 2015-2023 ODC Contributors
# SPDX-License-Identifier: Apache-2.0
from ._write import write_dataset_to_netcdf, create_netcdf_storage_unit
from . import writer as netcdf_writer
2 changes: 1 addition & 1 deletion datacube/drivers/netcdf/_write.py
@@ -1,6 +1,6 @@
# This file is part of the Open Data Cube, see https://opendatacube.org for more information
#
# Copyright (c) 2015-2020 ODC Contributors
# Copyright (c) 2015-2023 ODC Contributors
# SPDX-License-Identifier: Apache-2.0
from pathlib import Path
import logging
2 changes: 1 addition & 1 deletion datacube/drivers/netcdf/driver.py
@@ -1,6 +1,6 @@
# This file is part of the Open Data Cube, see https://opendatacube.org for more information
#
# Copyright (c) 2015-2020 ODC Contributors
# Copyright (c) 2015-2023 ODC Contributors
# SPDX-License-Identifier: Apache-2.0
from urllib.parse import urlsplit

2 changes: 1 addition & 1 deletion datacube/drivers/netcdf/writer.py
@@ -1,6 +1,6 @@
# This file is part of the Open Data Cube, see https://opendatacube.org for more information
#
# Copyright (c) 2015-2020 ODC Contributors
# Copyright (c) 2015-2023 ODC Contributors
# SPDX-License-Identifier: Apache-2.0
"""
Create netCDF4 Storage Units and write data to them
2 changes: 1 addition & 1 deletion datacube/drivers/postgis/__init__.py
@@ -1,6 +1,6 @@
# This file is part of the Open Data Cube, see https://opendatacube.org for more information
#
# Copyright (c) 2015-2020 ODC Contributors
# Copyright (c) 2015-2023 ODC Contributors
# SPDX-License-Identifier: Apache-2.0
"""
Lower-level database access.
2 changes: 1 addition & 1 deletion datacube/drivers/postgis/_api.py
@@ -1,6 +1,6 @@
# This file is part of the Open Data Cube, see https://opendatacube.org for more information
#
# Copyright (c) 2015-2020 ODC Contributors
# Copyright (c) 2015-2023 ODC Contributors
# SPDX-License-Identifier: Apache-2.0

# We often have one-arg-per column, so these checks aren't so useful.
2 changes: 1 addition & 1 deletion datacube/drivers/postgis/_connections.py
@@ -1,6 +1,6 @@
# This file is part of the Open Data Cube, see https://opendatacube.org for more information
#
# Copyright (c) 2015-2020 ODC Contributors
# Copyright (c) 2015-2023 ODC Contributors
# SPDX-License-Identifier: Apache-2.0

# We often have one-arg-per column, so these checks aren't so useful.
2 changes: 1 addition & 1 deletion datacube/drivers/postgis/_core.py
@@ -1,6 +1,6 @@
# This file is part of the Open Data Cube, see https://opendatacube.org for more information
#
# Copyright (c) 2015-2020 ODC Contributors
# Copyright (c) 2015-2023 ODC Contributors
# SPDX-License-Identifier: Apache-2.0
"""
Core SQL schema settings.