

Open Data Cube Core



The Open Data Cube Core provides an integrated gridded data analysis environment for decades of analysis-ready Earth observation satellite and related data from multiple satellite and other acquisition systems.


See the user guide for installation and usage of the datacube, and for documentation of the API.

Join our Slack if you need help setting up or using the Open Data Cube.

Please help us to keep the Open Data Cube community open and inclusive by reading and following our Code of Conduct.



Requirements

  • PostgreSQL 9.5+
  • Python 3.5+

Developer setup

  1. Clone:
    • git clone
  2. Create a Python environment to use ODC in; we recommend conda as the easiest way to handle Python dependencies.

    conda create -n odc -c conda-forge python=3.7 datacube pre_commit
    conda activate odc

  3. Install a develop version of datacube-core.

    cd datacube-core
    pip install --upgrade -e .

  4. Install the pre-commit hooks to help follow ODC coding conventions when committing with git.

    pre-commit install

  5. Run unit tests + PyLint.

    (this script approximates what is run by Travis; you can alternatively run pytest yourself)

  6. (or) Run all tests, including integration tests.

    ./ integration_tests

    • Assumes a password-less Postgres database called agdcintegration running on localhost.
    • Otherwise copy integration_tests/agdcintegration.conf to ~/.datacube_integration.conf and edit to customise.
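For reference, a hand-written ~/.datacube_integration.conf might look like the sketch below. The key names here are assumptions based on the standard datacube config format; the shipped integration_tests/agdcintegration.conf is the authoritative template and should be copied instead of this.

```ini
; Illustrative sketch of ~/.datacube_integration.conf — key names assumed;
; copy integration_tests/agdcintegration.conf rather than this.
[datacube]
db_hostname: localhost
db_database: agdcintegration
```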


Docker

Docker for the Open Data Cube is in the early stages of development, and more documentation and examples of how to use it are forthcoming. For now, you can build and run the Docker image from this repository as documented below.

Example Usage

There are a number of environment variables that can be used to configure the Open Data Cube. Some are built into the application itself, while others are specific to Docker and are used to create a configuration file when the container is launched.

You can build the image with a command like this:

docker build --tag opendatacube:local .

And it can then be run with this command:

docker run --rm opendatacube:local

If you don't need to build (and generally you shouldn't need to), you can run a pre-built image with:

docker run --rm opendatacube/datacube-core

An example of starting a container with environment variables is as follows:

docker run \
   --rm \
   -e DATACUBE_CONFIG_PATH=/opt/custom-config.conf \
   -e DB_DATABASE=mycube \
   -e DB_HOSTNAME=localhost \
   -e DB_USERNAME=postgres \
   -e DB_PASSWORD=secretpassword \
   -e DB_PORT=5432 \
   opendatacube/datacube-core
Additionally, you can run an Open Data Cube Docker container along with Postgres using the Docker Compose file. For example, you can run docker-compose up and it will start up the Postgres server and Open Data Cube next to it. To run commands in ODC, you can use docker-compose run odc datacube -v system init or docker-compose run odc datacube --version.
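The repository ships its own docker-compose.yml, which is authoritative. Purely as an illustration of the shape such a file takes, a minimal sketch pairing ODC with Postgres might look like this (service names and values are assumptions, not the shipped file):

```yaml
# Illustrative docker-compose sketch only — see the repository's own
# docker-compose.yml for the real configuration.
version: '3'
services:
  postgres:
    image: postgres:10
    environment:
      POSTGRES_DB: datacube
      POSTGRES_PASSWORD: secretpassword
  odc:
    image: opendatacube/datacube-core
    environment:
      DB_HOSTNAME: postgres      # reach Postgres by its compose service name
      DB_DATABASE: datacube
      DB_USERNAME: postgres
      DB_PASSWORD: secretpassword
      DB_PORT: "5432"
    depends_on:
      - postgres
```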

Environment Variables

Most of the environment variables below should be self-explanatory. None are required, although setting them is recommended.

  • DATACUBE_CONFIG_PATH - the path for the config file for writing (also used by ODC for reading)
  • DB_DATABASE - the name of the postgres database
  • DB_HOSTNAME - the hostname of the postgres database
  • DB_USERNAME - the username of the postgres database
  • DB_PASSWORD - the password used for the postgres database
  • DB_PORT - the port that the postgres database is exposed on
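To make the Docker-side variables concrete, here is a sketch of how a container entrypoint could render them into a datacube config file. This is an illustration, not the entrypoint shipped in this repository; the key names mirror the standard datacube.conf format, and the fallback defaults are assumptions.

```python
import os

def build_datacube_config(env=None):
    """Render datacube.conf contents from the environment variables above.

    Sketch only: the real entrypoint script in this repository may differ,
    and the defaults used when a variable is unset are assumptions.
    """
    env = os.environ if env is None else env
    return (
        "[datacube]\n"
        "db_database: {db}\n"
        "db_hostname: {host}\n"
        "db_username: {user}\n"
        "db_password: {password}\n"
        "db_port: {port}\n"
    ).format(
        db=env.get("DB_DATABASE", "datacube"),
        host=env.get("DB_HOSTNAME", "localhost"),
        user=env.get("DB_USERNAME", "postgres"),
        password=env.get("DB_PASSWORD", ""),
        port=env.get("DB_PORT", "5432"),
    )

if __name__ == "__main__":
    # Write to DATACUBE_CONFIG_PATH so the datacube CLI can find the file.
    with open(os.environ.get("DATACUBE_CONFIG_PATH", "datacube.conf"), "w") as fh:
        fh.write(build_datacube_config())
```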