The deployment process is based on Python 3.9+ scripts. It is recommended to set up a virtual environment first.
With venv:
python3 -m venv venv
source venv/bin/activate
With pyenv (get it here):
pyenv virtualenv 3.9.15 composer
pyenv activate composer
With (mini)conda (get it here):
conda create --name composer python=3.9
conda activate composer
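Whichever option you use, a quick sanity check that the environment is active and provides a suitable interpreter:
# the interpreter should live inside the virtual environment and report 3.9 or newer
which python3
python3 --version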
Optional: install docker and docker-compose (get it here)
docker and docker-compose are needed to build and run the composer and database images in a Docker environment. This step is optional.
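If you do install them, you can confirm both tools are available with their version flags:
docker --version
docker-compose --version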
cd backend
# make sure your virtual env is activated
# and install the requirements
pip3 install --upgrade -r requirements.txt
# run the migrations
python3 manage.py migrate
# run the development server (https)
python3 manage.py runsslserver
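runsslserver serves HTTPS with a self-signed certificate. Assuming the default port 8000 (the project may use a different one), you can check that the server responds using curl with certificate verification disabled:
# -k accepts the self-signed development certificate
curl -k https://localhost:8000/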
The command below will start a Docker container that mounts the backend folder into the container. It will also start the Django development server with DEBUG=True
BUILDKIT_PROGRESS=plain docker-compose -f docker-compose-dev.yaml up --build
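To stop and remove the development container again, the usual compose teardown applies:
docker-compose -f docker-compose-dev.yaml down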
The command below will start a Docker container that runs the PostgreSQL database. To use it with your development Django server you need to set the following environment variables in your launch file (or shell):
USE_PG=True
DB_HOST=localhost
DB_PORT=5432
DB_NAME=composer
DB_USER=composer
DB_PASSWORD=composer
To start the database server run this command:
docker-compose --file docker-compose-db.yaml up --build
To stop the database run this command:
docker-compose -f docker-compose-db.yaml down
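If a psql client is installed on the host, you can verify that the database accepts connections with the credentials listed above:
# list the databases on the containerised server
PGPASSWORD=composer psql -h localhost -p 5432 -U composer -d composer -c '\l'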
Example to run the backend using the Docker PostgreSQL database
cd backend
export USE_PG=True
export DB_HOST=localhost
export DB_PORT=5432
export DB_NAME=composer
export DB_USER=composer
export DB_PASSWORD=composer
python ./manage.py runsslserver
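Note that a freshly created PostgreSQL container starts with an empty schema, so apply the migrations (with the same environment variables exported) before using the server:
python3 manage.py migrate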
The git repository comes with some sample NLP data. This data can be ingested using the "ingest_nlp_sentence" management command
cd backend
python3 manage.py ingest_nlp_sentence ./composer/resources/pmc_oai_202209.csv
The git repository comes with some sample Anatomical Entities data. This data can be ingested using the "ingest_anatomical_entities" management command
cd backend
python3 manage.py ingest_anatomical_entities ./composer/resources/anatomical_entities.csv
A superuser will be created with username/password: admin/admin
Browse the Django admin interface to verify the setup (with the default runsslserver settings this is typically at https://localhost:8000/admin/)
To generate the frontend API client, install the OpenAPI generator:
npm install -g @openapitools/openapi-generator-cli
and then run
cd frontend
./genapi.sh
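genapi.sh wraps the generator invocation. Purely as an illustration of what such a call looks like, a typical openapi-generator-cli command is shown below; the schema location, generator name and output path are assumptions here, the script's actual values may differ:
openapi-generator-cli generate -i openapi-schema.yaml -g typescript-axios -o src/apiclient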
Browse to the ORCID developer tools and create a new API client, then configure settings.py with the new key and secret.
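How the key and secret reach settings.py is project specific; if settings.py reads them from the environment, exports along these lines would work (the variable names below are hypothetical, check settings.py for the real ones):
# hypothetical variable names - check settings.py for the real ones
export ORCID_CLIENT_ID=<your ORCID client id>
export ORCID_CLIENT_SECRET=<your ORCID client secret>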
Running on Docker with docker-compose: the command below will start two Docker containers, a backend server and a database server. The backend server has a persistent disk attached for its media files; the database server has a persistent disk attached for the data.
docker-compose up --build
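Once the stack is up, the usual compose commands can be used to inspect it:
# show the running services
docker-compose ps
# follow the logs of all services
docker-compose logs -f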
To stop:
docker-compose down
A superuser will be created with username/password: admin/admin
Browse the Django admin interface to verify the deployment