The EBI Ontology Tools (consisting of OLS, OxO and Zooma) are available as Docker containers. These containers are provided in two ways:
- Standalone applications: This is for users who want to only run an instance of OLS (or OxO or Zooma), rather than the complete Ontology Tools stack.
- Full Ontology Tools stack: This is for users who want to run the full Ontology Tools stack consisting of OLS, OxO and Zooma.
This repository contains the official Dockerised deployment pipeline for the full Ontology Tools stack. For instructions on running the standalone applications, see the OLS, OxO, or Zooma repositories respectively.
First, install Git, Docker, and Docker Compose. On an Ubuntu server:

```
sudo apt install git docker.io
sudo curl -L "https://github.com/docker/compose/releases/download/1.29.2/docker-compose-$(uname -s)-$(uname -m)" -o /usr/local/bin/docker-compose
sudo chmod +x /usr/local/bin/docker-compose
sudo ln -s /usr/local/bin/docker-compose /usr/bin/docker-compose
```
To use Docker without `sudo`, make sure your user is in the `docker` group. For example, if your username is `spot`:

```
sudo usermod -aG docker spot
```
Next, clone this repository:

```
git clone https://github.com/EBISPOT/ontotools-docker.git
cd ontotools-docker
```
The configuration options for each of the OntoTools can be found in the `config` directory. For example, to change the OLS configuration, edit the files in `config/ols-config`.
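For illustration, an OLS ontology configuration file lists the ontologies to load and where to fetch them from. The snippet below is a minimal, hypothetical sketch; the ontology id and PURL are placeholders, and the exact field names should be checked against the files already shipped in `config/ols-config`:

```yaml
# Illustrative sketch of an OLS ontology config entry (field names and values are
# examples only; compare with the existing files in config/ols-config).
ontologies:
  - id: efo                                           # short identifier used in URLs
    title: Experimental Factor Ontology               # display name in the OLS interface
    ontology_purl: http://www.ebi.ac.uk/efo/efo.owl   # location the OWL file is loaded from
```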
Finally, run the `redeploy.sh` script to deploy the OntoTools stack:

```
./redeploy.sh
```
To update the data in your OntoTools instances, run the `update-data.sh` script:

```
./update-data.sh
```
It is possible to customise several branding options for the OntoTools by editing `docker-compose.yml` (a sketch of how these options might be set follows the list):

- `ols.customisation.debrand` — If set to true, removes the EBI header and footer, documentation, and about page
- `ols.customisation.title` — A custom title for your instance, e.g. "My OLS Instance"
- `ols.customisation.short-title` — A shorter version of the custom title, e.g. "MYOLS"
- `ols.customisation.description` — A description of the instance
- `ols.customisation.org` — The organisation hosting your instance
- `ols.customisation.web` — URL of your organisation's website
- `ols.customisation.twitter` — Handle of your organisation's Twitter account
- `ols.customisation.issuesPage` — URL of your organisation's issue tracker
- `ols.customisation.supportMail` — Email address where people can contact you
- `ols.customisation.hideGraphView` — Set to true to hide the graph view
- `ols.customisation.errorMessage` — Message to show on error pages
- `ols.customisation.ontologyAlias` — A custom word or phrase to use instead of "Ontology", e.g. "Data Dictionary"
- `ols.customisation.ontologyAliasPlural` — As `ontologyAlias` but plural, e.g. "Data Dictionaries"
- `ols.customisation.oxoUrl` — The URL of an OxO instance to link to, with a trailing slash, e.g. `https://www.ebi.ac.uk/spot/oxo/`
- `oxo.customisation.debrand` — If set to true, removes the EBI header and footer, documentation, and about page
- `oxo.customisation.title` — A custom title for your instance, e.g. "My OxO Instance"
- `oxo.customisation.short-title` — A shorter version of the custom title, e.g. "MYOxO"
- `oxo.customisation.description` — A description of the instance
- `oxo.customisation.org` — The organisation hosting your instance
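How these options are wired up depends on the service definitions in `docker-compose.yml`. The snippet below is only an illustrative sketch, assuming the OLS options are passed to the `ols-web` service as environment entries; the service name is taken from the pipeline description later in this README, and the exact mechanism should be checked against the shipped `docker-compose.yml`:

```yaml
# Illustrative sketch only: how branding options might appear in docker-compose.yml.
# The service name and the use of environment entries are assumptions; check the
# shipped docker-compose.yml for where these keys actually live.
services:
  ols-web:
    environment:
      - ols.customisation.debrand=true
      - ols.customisation.title=My OLS Instance
      - ols.customisation.short-title=MYOLS
      - ols.customisation.ontologyAlias=Data Dictionary
```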
The OLS/OxO/Zooma pipeline (just "pipeline" from now on) supports the following workflows:
- Deploying a new OLS/OxO/Zooma instance entirely using Docker containers, via `docker-compose`.
- Re-indexing OLS/OxO when the data changes.
The pipeline performs the following steps, which are encoded as a series of Docker commands in `redeploy.sh`. Note that `update-data.sh` can be used to re-index the data without stopping most of the services; however, because the overall runtime of the script is low, this is not strictly necessary and you can simply always use `redeploy.sh`.
- Start the ols-solr and ols-mongo instances
- Import the OLS config from `config/ols-config`
- Index the ontologies in OLS
- Start the remaining services (ols-web, oxo-solr, oxo-neo4j, oxo-web). It is important that ols-web is not running at indexing time; this is a shortcoming in the OLS architecture and will likely be addressed in future versions.
- Extract all datasets from OLS for OxO processing
- Load the datasets (not the mappings) into OxO Neo4j
- Extract the xref mappings from OLS and export them in OxO format
- Load the mappings into OxO Neo4j
- Index the mappings in Solr
We have started to maintain a list of known custom OLS installations. Please create an issue if you want your installation to be listed as well.