These directions are for running from a local folder in development, but Explorer will run on any typical Python WSGI server.
Firstly, install Data Cube. Use of a Data Cube conda environment is recommended.
Test that you can run `datacube system check`, and that it's connecting to the correct datacube instance.
Now install the explorer dependencies:
```shell
# These two should come from conda if you're using it, not pypi
conda install fiona shapely

pip install -e .
```
Cache some product summaries:
```shell
nohup cubedash-gen --all &>> summary-gen.log &
```
(This can take a while the first time, depending on your datacube size. The `nohup … &` wrapper runs it in the background.)
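To check on the background run, you can watch the log file it writes (plain shell; the filename comes from the command above):

```shell
# Show the latest generation output written by cubedash-gen
tail -n 20 summary-gen.log
```

Add `-f` to follow the log live.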
Explorer can be run using any typical Python WSGI server, for example:
```shell
pip install gunicorn
gunicorn -b '127.0.0.1:8080' -w 4 cubedash:app
```
Convenience scripts are available for running in development with hot-reload (`./run-dev.sh`) or gunicorn (`./run.sh`). Install the optional deployment dependencies for the latter:

```shell
pip install -e .[deployment]
```
Products will begin appearing one-by-one as the summaries are generated in the background. If impatient, you can manually navigate to a product via its URL.
They are included when installing the test dependencies:
```shell
pip install -e .[test]
```
Run `make lint` to check your changes, and `make format` to format your code.
You may want to configure your editor to run black automatically on file save (see the Black page for directions), or install the pre-commit hook within Git:
A pre-commit config is provided to automatically format and check your code changes. This allows you to catch and fix issues immediately, before you raise a failing pull request (which runs the same checks under Travis).
If you don't use Conda, install pre-commit from pip:
```shell
pip install pre-commit
```
If you do use Conda, install from conda-forge (required because the pip version uses virtualenvs, which are incompatible with Conda's environments):

```shell
conda install pre_commit
```
Now install the pre-commit hook to the current repository:
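pre-commit's standard `install` subcommand registers the hook:

```shell
# Register the hook in this repository's .git/hooks directory
pre-commit install
```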
Your code will now be formatted and validated before each commit. You can also invoke it manually by running `pre-commit run`.
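For example, using pre-commit's standard `run` subcommand (`--all-files` checks the whole tree rather than just staged changes):

```shell
pre-commit run --all-files
```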
### Can I use a different datacube environment?
Set ODC's environment variable before running the server:
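A minimal example, assuming the standard `DATACUBE_ENVIRONMENT` variable and an environment named `staging` in your datacube config:

```shell
# Select the datacube environment, then start the server as usual
export DATACUBE_ENVIRONMENT=staging
gunicorn -b '127.0.0.1:8080' -w 4 cubedash:app
```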
You can always see which environment/settings will be used by running `datacube system check`.

See the ODC documentation for config and datacube environments.
### Can I add custom scripts or text to the page (such as analytics)?

Create one of the following:

- Global include: for `<script>` and other tags at the bottom of every page.
- Footer text include: for human-readable text such as copyright statements.

For example, to add footer text:
```shell
echo "Server <strong>staging-1.test</strong>" > cubedash/templates/include-footer.env.html
```
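A global include can be created the same way; the `include-global.env.html` filename here is assumed by analogy with the footer example, and the script URL is a placeholder:

```shell
# Filename assumed by analogy with the footer include; URL is a placeholder
echo '<script src="https://example.com/analytics.js"></script>' \
    > cubedash/templates/include-global.env.html
```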
`*.env.html` is the naming convention used for environment-specific templates: they are ignored by Git.
### How can I configure the deployment?

Add a settings file to the current directory:
```python
# Default product to display (picks first available)
CUBEDASH_DEFAULT_PRODUCTS = ('ls8_nbar_albers', 'ls7_nbar_albers')

# Which field should we use when grouping products in the top menu?
CUBEDASH_PRODUCT_GROUP_BY_FIELD = 'product_type'
# Ungrouped products will be grouped together in this size.
CUBEDASH_PRODUCT_GROUP_SIZE = 5

# Maximum search results
CUBEDASH_HARD_SEARCH_LIMIT = 100
# Maximum number of source/derived datasets to show
CUBEDASH_PROVENANCE_DISPLAY_LIMIT = 20

# Include load performance metrics in http response.
CUBEDASH_SHOW_PERF_TIMES = False

# Which theme to use (in the cubedash/themes folder)
CUBEDASH_THEME = 'odc'

# Customise '/stac' endpoint information
STAC_ENDPOINT_ID = 'my-odc-explorer'
STAC_ENDPOINT_TITLE = 'My ODC Explorer'
STAC_ENDPOINT_DESCRIPTION = 'Optional Longer description of this endpoint'
STAC_DEFAULT_PAGE_SIZE = 20
STAC_PAGE_SIZE_LIMIT = 1000
```
### Why aren't stylesheets updating?

The CSS is compiled from Sass. Run `make style` to rebuild it after a change, or use your editor to watch for changes (PyCharm will prompt to do so).
### How do I run the integration tests?

The integration tests run against a real PostgreSQL database, which is dropped and recreated between each test method:
Install the test dependencies:
```shell
pip install -e .[test]
```
#### Simple test setup
Set up a database on localhost that doesn't prompt for a password locally (e.g. add credentials to `~/.pgpass`).
The tests should then be runnable with no configuration:
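With that in place, a plain pytest invocation should work (the `integration_tests` directory name is an assumption about this repository's layout):

```shell
pytest integration_tests
```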
#### Custom test configuration (using other hosts, postgres servers)
Add a `.datacube_integration.conf` file to your home directory, in the same format as datacube config files. (You might already have one if you run datacube's integration tests.)
Then run pytest:
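For example (the `integration_tests` directory name is an assumption; `-x` is pytest's standard stop-on-first-failure flag):

```shell
pytest integration_tests -x
```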
**Warning:** all data in this database will be dropped while running tests. Use a separate database from your normal development one.