To build and run Oncodash, you have two options:
- use docker-compose and run from containers (currently not working),
- run from your own local configuration.
These instructions will build and run the backend and frontend web servers from within containers, without touching anything on your operating system.
The composed container runs an nginx proxy server that passes requests to the backend and frontend containers.
- MacOS Docker Desktop
- Windows Docker Desktop
- Linux Docker CE
- docker-compose (not needed for MacOS/Windows)
- Running compose without sudo privileges
# Install dependencies
sudo apt install ca-certificates curl gnupg lsb-release
# Download and install certificates
curl -fsSL https://download.docker.com/linux/ubuntu/gpg | sudo gpg --dearmor -o /usr/share/keyrings/docker-archive-keyring.gpg
# Configure the package repository
echo "deb [arch=$(dpkg --print-architecture) signed-by=/usr/share/keyrings/docker-archive-keyring.gpg] https://download.docker.com/linux/ubuntu $(lsb_release -cs) stable" | sudo tee /etc/apt/sources.list.d/docker.list
# Install packages from the repository
sudo apt update && sudo apt install docker-ce docker-ce-cli containerd.io
# Download the docker-compose executable at version 2.0.1
sudo curl -L "https://github.com/docker/compose/releases/download/v2.0.1/docker-compose-$(uname -s)-$(uname -m)" -o /usr/local/bin/docker-compose
# Make it executable
sudo chmod a+x /usr/local/bin/docker-compose
# Add at least the current user to the group able to execute docker
sudo usermod -aG docker ${USER}
You may need to log out and back in (or restart your shell) for the group change to take effect.
Intel chip:
# Download the image
curl -L "https://desktop.docker.com/mac/main/amd64/Docker.dmg?utm_source=docker&utm_medium=webreferral&utm_campaign=docs-driven-download-mac-amd64" -o Docker.dmg
# Open the package
hdiutil attach Docker.dmg
# Install as an app
cp -a /Volumes/Docker/Docker.app /Applications/
You will be asked to accept the license agreement at first launch.
Let's Encrypt installation:
- go to https://certbot.eff.org/
- select Software:Nginx System:Your-Host-OS
- follow the instruction to install certbot and get certificates
Build the back-end, front-end and nginx docker images:
docker-compose build
Create a development SQLite database inside the container and add tables to it:
docker-compose run --rm backend sh -c "python manage.py makemigrations"
docker-compose run --rm backend sh -c "python manage.py migrate"
Populate a test database with network data (Explainer-app):
docker-compose run --rm backend sh -c "python manage.py flush --no-input"
docker-compose run --rm backend sh -c "python manage.py populate -p /opt/app/path/to/indf.csv"
Note: /opt/app/ points by default to wherever oncodash/backend/ is on your system.
Populate a test database with clinical data and real timeline data. "<clinical filepath>" is the path of the clinical data file and can be downloaded from eduuni. "<timeline filepath>" is the timeline data file and can be downloaded from the eduuni repository (DECIDER/Clinical Data/timeline.csv). The import takes several minutes; to shorten it, you may reduce the timeline file by removing some lines.
docker-compose run --rm backend sh -c "python manage.py import_timelinerecords_and_clinicaldata -clinicalpath <clinical filepath> -timelinepath <timeline filepath>"
If the import shows some warnings, you may restart it with the --errors-details argument to see which rows are affected.
Create an account. Type:
docker-compose run --rm backend sh -c "python manage.py createsuperuser"
and then follow the prompts.
Create CGI and OncoKB accounts to get the corresponding tokens. Set the login email and tokens (OncoKB requires only a token) in backend/backend/settings.py, and also set the CRYPTOCODE password used for encrypting the data:
CGI_LOGIN = ""
CGI_TOKEN = ""
ONCOKB_TOKEN = ""
CRYPTOCODE = ""
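Hard-coding credentials in settings.py works for local development, but a common alternative is to read them from environment variables so secrets stay out of version control. A minimal sketch of that pattern (this is not the current Oncodash implementation, just one way to wire it up):

```python
import os

# Read credentials from environment variables instead of hard-coding them;
# fall back to empty strings so settings.py still loads when they are unset.
CGI_LOGIN = os.environ.get("CGI_LOGIN", "")
CGI_TOKEN = os.environ.get("CGI_TOKEN", "")
ONCOKB_TOKEN = os.environ.get("ONCOKB_TOKEN", "")
CRYPTOCODE = os.environ.get("CRYPTOCODE", "")
```

You would then pass the variables to the container at run time, e.g. via an `environment:` section or an `.env` file in docker-compose.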
Import genomic variants to the database. "<filepath>" is the path of the file containing annotated variants. The expected column separator is the tab character. Optionally, you can filter the data by column and value with --filter <column name> --<filter type> <value>. See --help for the different filter types.
docker-compose run --rm backend sh -c "python manage.py import_genomic_variants --somatic_variants <filepath>"
docker-compose run --rm backend sh -c "python manage.py import_genomic_variants --copy_number_alterations <filepath>"
docker-compose run --rm backend sh -c "python manage.py import_genomic_variants --ascatestimates <filepath>"
docker-compose run --rm backend sh -c "python manage.py import_genomic_variants --oncokb_actionable_targets <filepath>"
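The import commands above expect tab-separated input, and the --filter option keeps only rows matching a column value. A minimal sketch of what such an equality filter conceptually does (the column names and values below are invented for illustration, and this is not the actual Oncodash implementation):

```python
import csv
import io

# A tiny tab-separated sample; real variant files have many more columns.
tsv_data = "gene\tchromosome\nBRCA1\t17\nTP53\t17\nKRAS\t12\n"

def filter_rows(text, column, value):
    """Keep only rows whose `column` equals `value` (an equality filter)."""
    reader = csv.DictReader(io.StringIO(text), delimiter="\t")
    return [row for row in reader if row[column] == value]

rows = filter_rows(tsv_data, "chromosome", "17")
print([row["gene"] for row in rows])  # → ['BRCA1', 'TP53']
```

Checking a few lines of your file this way before a long import can catch a wrong separator or misspelled column name early.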
Query OncoKB and Cancer Genome Interpreter actionable targets per patient, identified by cohort code.
# Run this section only if existing OncoKB and CGI results need to be removed from the Django DB before querying:
docker-compose run --rm backend sh -c "python manage.py shell"
>>> from genomics.models import OncoKBAnnotation, CGIMutation, CGICopyNumberAlteration, CGIDrugPrescriptions
>>> OncoKBAnnotation.objects.all().delete(); CGICopyNumberAlteration.objects.all().delete(); CGIMutation.objects.all().delete(); CGIDrugPrescriptions.objects.all().delete()
docker-compose run --rm backend sh -c "python manage.py genomic_db_query_utils --oncokbcna --actionable --cohortcode=<cohortcode>"
docker-compose run --rm backend sh -c "python manage.py genomic_db_query_utils --oncokbsnv --actionable --cohortcode=<cohortcode>"
docker-compose run --rm backend sh -c "python manage.py genomic_db_query_utils --cgiquery --cna --actionable --cohortcode=<cohortcode>"
docker-compose run --rm backend sh -c "python manage.py genomic_db_query_utils --cgiquery --snv --actionable --cohortcode=<cohortcode>"
Run the images in containers. If you are running the containers for the first time:
docker-compose up -d
Open up the browser at localhost.
Browsable API endpoints are available at localhost/api/explainer/networks/.
Backend:
docker-compose run --rm backend sh -c "python manage.py test && flake8"
Frontend:
docker-compose run --rm nodeserver sh -c "npm test"
- nodejs
- Python >= 3.7
- go to https://certbot.eff.org/
- select Software:Nginx System:Your-Host-OS
- follow the instruction to install certbot and get certificates
Clone the repository and move to the backend directory:
git clone https://github.com/oncodash/oncodash.git
cd oncodash/backend/
Create a virtual environment. You have two options: Python's virtual environments or conda.
python3 -m venv backendEnv
source backendEnv/bin/activate
pip install -U pip
OR
conda create --name backendEnv python=3.7
conda activate backendEnv
Install python dependencies:
pip install -r requirements.txt
Create a development SQLite database and add tables to it:
python manage.py makemigrations
python manage.py migrate
Populate a test database with network data (Explainer-app):
python manage.py flush --no-input
python manage.py populate -p /path/to/indication_table.csv
Populate a test database with clinical data and real timeline data. "<clinical filepath>" is the path of the clinical data file and can be downloaded from eduuni. "<timeline filepath>" is the timeline data file and can be downloaded from the eduuni repository (DECIDER/Clinical Data/timeline.csv). The expected column separator is ";". The import takes several minutes; to shorten it, you can reduce the timeline file by removing some lines.
python manage.py import_timelinerecords_and_clinicaldata -clinicalpath <clinical filepath> -timelinepath <timeline filepath>
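Since the clinical and timeline files use ";" as the column separator, a quick way to sanity-check a file before the long import is to parse a few lines yourself. A small sketch (the column names below are invented for illustration; the actual DECIDER file layout may differ):

```python
import csv
import io

# A small ';'-separated sample mimicking the expected shape of the input;
# the real files use their own column names.
sample = "patient;event;date\nP1;diagnosis;2020-01-15\nP1;surgery;2020-02-10\n"

reader = csv.DictReader(io.StringIO(sample), delimiter=";")
rows = list(reader)
print(len(rows), rows[0]["event"])  # → 2 diagnosis
```

If the header comes back as one long column name, the file is probably using a different separator than ";".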
Create an account. Type:
python manage.py createsuperuser
and then follow the prompts.
Create CGI and OncoKB accounts to get the corresponding tokens. Set the login email and tokens (OncoKB requires only a token) in backend/backend/settings.py, and also set the CRYPTOCODE password used for encrypting the data:
CGI_LOGIN = ""
CGI_TOKEN = ""
ONCOKB_TOKEN = ""
CRYPTOCODE = ""
Import genomic variants to the database. "<filepath>" is the path of the file containing annotated variants. The expected column separator is the tab character. Optionally, you can filter the data by column and value with --filter <column name> --<filter type> <value>. See --help for the different filter types.
python manage.py import_genomic_variants --somatic_variants <filepath>
python manage.py import_genomic_variants --copy_number_alterations <filepath>
python manage.py import_genomic_variants --ascatestimates <filepath>
python manage.py import_genomic_variants --oncokb_actionable_targets <filepath>
Query OncoKB and Cancer Genome Interpreter actionable targets per patient identified by cohort code.
# Run this section only if existing OncoKB and CGI results need to be removed from the Django DB before querying:
python manage.py shell
>>> from genomics.models import OncoKBAnnotation, CGIMutation, CGICopyNumberAlteration, CGIDrugPrescriptions
>>> OncoKBAnnotation.objects.all().delete(); CGICopyNumberAlteration.objects.all().delete(); CGIMutation.objects.all().delete(); CGIDrugPrescriptions.objects.all().delete()
python manage.py genomic_db_query_utils --oncokbcna --actionable --cohortcode=<cohortcode>
python manage.py genomic_db_query_utils --oncokbsnv --actionable --cohortcode=<cohortcode>
python manage.py genomic_db_query_utils --cgiquery --cna --actionable --cohortcode=<cohortcode>
python manage.py genomic_db_query_utils --cgiquery --snv --actionable --cohortcode=<cohortcode>
Move to the oncodash-app directory:
cd oncodash/oncodash-app/
Install node dependencies:
npm install --legacy-peer-deps
Run the back-end development server:
python manage.py runserver 0.0.0.0:8888
Run the front-end development server:
npm start
Open up the browser at localhost:3000/ and log in with the previously created backend credentials. Tested with Chrome.
Browsable API endpoints are available at localhost:8888/api/explainer/networks/.
Backend:
python manage.py test && flake8
Frontend:
npm test