# Tests

The sections below detail how to run the test suite.


## Docker container for running Python tests

### Introduction

It is difficult if not impossible to install the PDP on a typical development workstation (particularly since the transition to Ubuntu 20.04).

To fill that gap, we've defined Docker infrastructure that allows you to build and run a Docker container for testing that is equivalent to the production environment. The infrastructure is in `docker/local-test/`.

### Instructions

1. Advance prep

   Do each of the following things once per workstation.

   1. Configure Docker user namespace mapping:

      1. Clone `pdp-docker`.

      2. Follow the instructions in the `pdp-docker` documentation: "Setting up Docker namespace remapping (with recommended parameters)".

   2. Create `docker/local-test/env-with-passwords.env` from `docker/local-run/env.env` by adding passwords for the `pcic_meta` and `crmp` databases, as sketched below.
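
      A purely illustrative sketch (the variable names here are hypothetical; keep whatever names `docker/local-run/env.env` actually defines, adding only the password values):

      ```
      # Everything from docker/local-run/env.env, unchanged, plus password
      # entries. The variable names below are hypothetical -- match your env.env.
      PCIC_META_PASSWORD=<password for pcic_meta>
      CRMP_PASSWORD=<password for crmp>
      ```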

2. Build the image

   The image need only be (re)built when:

   1. the project is first cloned, or
   2. the local-test Dockerfile changes.

   To build the image:

   ```bash
   docker-compose -f docker/local-test/docker-compose.yaml build
   ```
    
3. Mount the gluster `/storage` volume

   Mount it locally at `/storage` so that the data files it contains are accessible on your workstation:

   ```bash
   sudo mount -t cifs -o username=XXXX@uvic.ca //pcic-storage.pcic.uvic.ca/storage/ /storage
   ```
    
4. Start the test container

   ```bash
   docker-compose -f ./docker/local-test/docker-compose.yaml run --rm local-test
   ```

   This starts the container, installs the local codebase (which may take over a minute), and gives you a bash shell. You should see a standard bash prompt.

5. Change code and run tests

   Each time you wish to run tests against your local codebase, enter a suitable command at the prompt. For example:

   ```bash
   pytest -v -m "not local_only" --tb=short tests -x
   ```

   Do not stop the container until you have finished all the changes and testing you wish to make in a given session. It is far more time-efficient to run tests inside the same container (avoiding startup time) than to restart the container for each test run.

   Your local codebase is mounted into the container and installed in editable/development mode (`pip install -e .`), so any code changes you make externally (in your local filesystem) are reflected "live" inside the container.

6. Stop the test container

   When you have completed a develop-and-test session and no longer wish to have the test container running, press Ctrl+D at the prompt. The container is stopped and automatically removed.

### Notes and caveats

1. Because the local codebase is mounted into the test container read/write, running tests leaves problematic `__pycache__` junk behind in the host filesystem. This can be cleaned up by running `py3clean .`.

## JavaScript tests

All JS tests are found in the directory `pdp/static/js/__test__`.

No configuration is required to run the Node.js tests. Simply:

```bash
npm run test
```

### Portal-specific tests

When each portal is instantiated, it loads its default dataset, which is specified in each portal's JavaScript code. In order to load a dataset completely and correctly (and therefore an entire portal for testing), the front end makes several queries to data services, all of which must be mocked with information for that particular dataset.

Please note that the testing harness does not set `url_base`, and portals instantiated for testing purposes have a `$(location).href` value of `http://localhost/`. Some portals have "archive" functionality: depending on the URL used to access them, they display two different sets of data but otherwise behave identically. Because the word "archive" is not present in the URL such a portal perceives when instantiated by tests, `pdp_controls.isArchivePortal()` will always determine that the portal is not currently displaying the archived data. The portal will therefore load the non-archive choice as its default dataset for testing, and that is the dataset that needs to be mocked. No portals currently have "archive" functionality, but some will in the future, whenever we make new portals to replace the current ones.
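
As a rough illustration only, wiring these mocks up with Jest might look something like the sketch below; the module path `../data-services` and the promise-returning signatures are assumptions, not the project's actual test setup, and the `mock*` constants stand for the per-query mocks described in the sections that follow.

```js
// Hypothetical sketch only: replace each data-service query the portal makes
// while loading its default dataset with a canned response. The module path
// and signatures are assumptions; the mock* constants are defined elsewhere
// in the test file, as sketched in the sections below.
jest.mock("../data-services", () => ({
  getCatalog: () => Promise.resolve(mockCatalog),
  getRasterAccordionData: () => Promise.resolve(mockMenu),
  getMetadata: () => Promise.resolve(mockMetadata),
  getNcwmsLayerDDS: () => Promise.resolve(mockDDS),
  getNcwmsLayerDAS: () => Promise.resolve(mockDAS),
  getNCWMSLayerCapabilities: () => Promise.resolve(mockCapabilities),
}));
```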

### Backend mocks for portal-specific tests

#### `getCatalog`

A mockup of the backend's `catalog.json`: a JSON object with properties `{[unique_id]: data_url}`. The data URLs should use the `data_root` set in `app-test-helper.js`. The default dataset needs to have an entry, but it does not need to be the only one.
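
A minimal sketch (the `unique_id` and filename are invented; `data_root` is the value set in `app-test-helper.js`):

```js
// Hypothetical catalog entry; the unique_id and filename are invented.
const mockCatalog = {
  default_dataset_unique_id: data_root + "/default_dataset.nc",
};
```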

#### `getRasterAccordionData`

A mockup of the backend's `menu.json`: a JSON object with a portal-specific organization of datasets. The default dataset needs to be represented, but it does not need to be the only entry.
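
Since the organization is portal-specific, only a hypothetical sketch is possible here:

```js
// Hypothetical menu structure; the real menu.json organization (nesting,
// key names) is portal-specific and must match the portal under test.
const mockMenu = {
  "Some model": {
    "Some scenario": {
      "some variable": "default_dataset_unique_id",
    },
  },
};
```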

#### `getMetadata`

A mockup of the backend's `metadata.json`: a JSON object with `units`, `max`, and `min` attributes for the default dataset.
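
For example, with hypothetical values for a temperature variable:

```js
// Hypothetical metadata for the default dataset's variable.
const mockMetadata = {
  units: "degC",
  max: 45.0,
  min: -55.0,
};
```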

#### `getNcwmsLayerDDS`

A mockup of the response to a pyDAP DDS call. It needs to match the time metadata of the default dataset.
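
A hypothetical sketch of such a DDS, as a template-literal constant (the variable name and dimension sizes are invented; the `time` dimension is the part that must agree with the default dataset):

```js
// Hypothetical DAP2 DDS text; variable name and dimension sizes are invented,
// but the time dimension must match the default dataset's time metadata.
const mockDDS = `Dataset {
    Float64 lat[lat = 510];
    Float64 lon[lon = 1068];
    Float64 time[time = 55115];
    Grid {
      Array:
        Float32 tasmax[time = 55115][lat = 510][lon = 1068];
      Maps:
        Float64 time[time = 55115];
        Float64 lat[lat = 510];
        Float64 lon[lon = 1068];
    } tasmax;
} default_dataset.nc;`;
```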

#### `getNcwmsLayerDAS`

A mockup of the response to a pyDAP DAS call. It probably needs to include `lat`, `lon`, `time`, and a variable matching the default dataset's variable, along with their load-bearing attributes such as `units`; "extra" attributes like `REFERENCES` are probably skippable.
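
A hypothetical sketch (the attribute values are invented):

```js
// Hypothetical DAP2 DAS text; values are invented, but units for time and
// the data variable are the kind of "load-bearing" attributes tests need.
const mockDAS = `Attributes {
    lat {
        String units "degrees_north";
    }
    lon {
        String units "degrees_east";
    }
    time {
        String units "days since 1950-01-01";
        String calendar "gregorian";
    }
    tasmax {
        String units "degC";
    }
}`;
```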

### ncWMS mocks for portal-specific tests

Thankfully, full maps do not need to be loaded for tests, but some metadata is needed.

#### `getNCWMSLayerCapabilities`

A representation of the response to ncWMS's GetCapabilities query, a very long XML document. It needs to include, at minimum, the unique ID of the dataset in question, and the default timestamp (which can be found in the portal's initialization JavaScript), in their usual places in the `<Layer>` element.

The XML includes a long list of available palettes; all but the default one specified in the portal's map initialization code are skippable and do not need to be mocked (`default/ferret` for most variables, `default/blueheat` for some precipitation datasets, `default/occam` for some others). A heavily abridged sketch follows.
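
Purely as a hypothetical illustration of the shape of such a mock (the layer name, default timestamp, and time extent are invented; the fragment loosely follows WMS 1.1.1 conventions):

```js
// Hypothetical, heavily abridged GetCapabilities fragment; the layer name,
// default timestamp, and time extent are invented for illustration.
const mockCapabilities = `
  <Layer queryable="1">
    <Name>default_dataset_unique_id/tasmax</Name>
    <Title>tasmax</Title>
    <Dimension name="time" units="ISO8601"/>
    <Extent name="time" default="2000-01-01T00:00:00.000Z">
      1950-01-01T00:00:00.000Z/2100-12-31T00:00:00.000Z/P1D
    </Extent>
    <Style>
      <Name>default/ferret</Name>
      <Title>default/ferret</Title>
    </Style>
  </Layer>`;
```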